There's an article that tests the same queries against the same data using PostgreSQL and MySQL, and PostgreSQL comes out consistently faster.
I don't know how good that benchmark is; there's always performance to be gained from tuning. Although I'd always thought the tuning work fell more on the PostgreSQL side than the MySQL side, since PostgreSQL's defaults are meant for slow, memory-constrained servers, while MySQL ships with several sample profiles and it's just a matter of picking the one closest to your server configuration.
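For what it's worth, the usual first pass at PostgreSQL tuning is just a handful of `postgresql.conf` settings. These are real parameters, but the values below are purely illustrative for a box with a few GB of RAM, not recommendations for any particular server:

```
# postgresql.conf -- illustrative values only, scale to your hardware
shared_buffers = 512MB          # default is tiny; ~25% of RAM is a common rule of thumb
effective_cache_size = 2GB      # planner hint: roughly RAM available for disk caching
work_mem = 16MB                 # per-sort/per-hash memory; careful, it's per operation
maintenance_work_mem = 128MB    # speeds up VACUUM, CREATE INDEX, bulk loads
```

None of these require more than a config edit and a restart (or reload, for some), which is part of why the "untuned PostgreSQL" comparison is a bit unfair to it.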
The database tested isn't that large, either. He's got 10GB of data; at my $DAYJOB I work with 400GB (two years' worth or so). I don't think I'll try to get MySQL to load that much data at all. I may try it when MySQL gets to version 6 or 7, but I doubt I'll try it with the current versions. In any case, the queries we run against that database are complex enough that I'm not sure I could express them in MySQL's SQL dialect.
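To give a concrete (hypothetical) example of the kind of query I mean: anything leaning on window functions. PostgreSQL has had them since 8.4, while MySQL didn't get them until 8.0, so analytic queries like a per-account running total simply had no direct equivalent in the MySQL versions of that era. The sketch below uses SQLite (3.25+) via Python's `sqlite3` just to show the query shape with made-up data, not our actual schema:

```python
import sqlite3

# Toy table standing in for a much larger transactions table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account TEXT, day INTEGER, amount INTEGER)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?, ?)",
    [("a", 1, 10), ("a", 2, 5), ("b", 1, 7), ("a", 3, -3), ("b", 2, 2)],
)

# Running total per account, ordered by day -- a window function,
# which old MySQL could only approximate with self-joins or variables.
rows = conn.execute(
    """
    SELECT account, day,
           SUM(amount) OVER (PARTITION BY account ORDER BY day) AS running
    FROM txns
    ORDER BY account, day
    """
).fetchall()
for row in rows:
    print(row)
# -> ('a', 1, 10), ('a', 2, 15), ('a', 3, 12), ('b', 1, 7), ('b', 2, 9)
```

Rewriting that without `OVER (...)` means a correlated subquery or a self-join per row, which is exactly the sort of thing that gets painful at 400GB.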
Yeah, two years or so from now I may try to load one year's worth of data into MySQL and see what the performance is like for the common things we use the database for.