
Tuesday, March 23, 2010

What database works well with 200+GB of data?

Programmer Question

I've been using MySQL (with InnoDB, on Amazon RDS) because it's the sort-of-universal default, but it's been ridiculously underperforming, and tweaking it only delays the inevitable.



The data is mostly relatively short blobs (<1 kB each) of information about hundreds of millions of URLs. There is (or should be; MySQL can't seem to handle it) a very high volume of inserts, updates, and retrievals but few complex queries - not that complex queries wouldn't be useful, but MySQL is so slow that it's far faster to pull the data out, process it locally, and cache the results somewhere.
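For concreteness, here is a minimal sketch of the access pattern being described: point upserts and point lookups of small blobs keyed by URL, against MySQL/InnoDB. The table name, column names, and connection details are assumptions invented for the illustration, not anything from the original question.

import pymysql

# Assumed schema (not from the question):
#   CREATE TABLE url_blobs (
#       url  VARCHAR(255) NOT NULL PRIMARY KEY,
#       data BLOB NOT NULL
#   ) ENGINE=InnoDB;

# Connection details are placeholders for the RDS instance.
conn = pymysql.connect(host="my-rds-endpoint", user="app",
                       password="secret", database="crawl")

def upsert(url, blob):
    # The hot write path: insert a new row or overwrite the existing blob.
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO url_blobs (url, data) VALUES (%s, %s) "
            "ON DUPLICATE KEY UPDATE data = VALUES(data)",
            (url, blob),
        )
    conn.commit()

def lookup(url):
    # The hot read path: a single-row primary-key lookup.
    with conn.cursor() as cur:
        cur.execute("SELECT data FROM url_blobs WHERE url = %s", (url,))
        row = cur.fetchone()
    return row[0] if row else None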



I can keep tweaking MySQL and throwing more hardware at it, but it seems increasingly futile.



So what are the options? SQL, the relational model, etc. are optional - anything will do as long as it's fast, networked, and language-independent.
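Purely to illustrate what "fast, networked, and language-independent" means for this workload, here is the same pair of operations sketched against a networked key-value store. Redis is used only as a stand-in for whatever store ends up being chosen (it is not the linked answer), and the host, port, and key prefix are assumptions made for the sketch.

import redis

# Redis here is only a placeholder for "some fast, networked,
# language-independent key-value store"; host, port, and the
# "url:" key prefix are invented for this example.
r = redis.Redis(host="kv-host", port=6379)

def upsert(url, blob):
    # Insert-or-overwrite the small blob for one URL.
    r.set("url:" + url, blob)

def lookup(url):
    # Point lookup; returns None if the URL has never been stored.
    return r.get("url:" + url)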



Find the answer here
