mysql to postgresql, performance questions

From: Corin <wakathane(at)gmail(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Subject: mysql to postgresql, performance questions
Date: 2010-03-18 14:31:18
Message-ID: 4BA23936.2030507@gmail.com
Lists: pgsql-performance
Hi all,

I'm running quite a large social community website (250k users, 16 GB 
database). We are currently preparing a complete relaunch and are thinking 
about switching from MySQL 5.1.37 (InnoDB) to PostgreSQL 8.4.2. The 
database server is a dual dual-core Opteron 2216 with 12 GB of RAM, 
running Debian amd64.

For a first impression I ran a simple query against our users table (a 
snapshot with only ~45,000 records). The table has an index on 
birthday_age [integer]. The test executes the same query 10 times and 
simply discards the results. I ran the tests using a PHP and a Ruby 
script; the results are almost the same.
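For reference, a minimal Ruby sketch of the kind of timing loop described above. The actual query call is left as a placeholder block; in the real script it would issue the SQL via a database driver (e.g. the "pg" gem's conn.exec) and discard the result:

```ruby
# Time a block `runs` times and return the elapsed wall-clock
# milliseconds for each run, using a monotonic clock so the
# measurement is unaffected by system clock adjustments.
def benchmark_ms(runs = 10)
  Array.new(runs) do
    t0 = Process.clock_gettime(Process::CLOCK_MONOTONIC)
    yield # in the real script: conn.exec(query), result discarded
    (Process.clock_gettime(Process::CLOCK_MONOTONIC) - t0) * 1000.0
  end
end

# Placeholder workload standing in for the database query.
timings = benchmark_ms { sleep 0.001 }
timings.each { |t| puts t }
```

Note that a loop like this measures the full client round trip (driver overhead, network, result transfer), not just server-side execution time.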

Unfortunately, MySQL seems to be around 3x as fast as PostgreSQL for this 
simple query. There's no swapping or disk reading involved... everything 
is in RAM.

query
SELECT * FROM users
WHERE birthday_age BETWEEN 12 AND 13
   OR birthday_age BETWEEN 20 AND 22
LIMIT 1000;

mysql
id: 1, select_type: SIMPLE, table: users, type: range,
possible_keys: birthday_age, key: birthday_age, key_len: 1,
ref: NULL, rows: 7572, Extra: Using where
15.104055404663
14.209032058716
18.857002258301
15.714883804321
14.73593711853
15.048027038574
14.589071273804
14.847040176392
15.192985534668
15.115976333618

postgresql
Limit  (cost=125.97..899.11 rows=1000 width=448) (actual time=0.927..4.990 rows=1000 loops=1)
  ->  Bitmap Heap Scan on users  (cost=125.97..3118.00 rows=3870 width=448) (actual time=0.925..3.420 rows=1000 loops=1)
        Recheck Cond: (((birthday_age >= 12) AND (birthday_age <= 13)) OR ((birthday_age >= 20) AND (birthday_age <= 22)))
        ->  BitmapOr  (cost=125.97..125.97 rows=3952 width=0) (actual time=0.634..0.634 rows=0 loops=1)
              ->  Bitmap Index Scan on birthday_age  (cost=0.00..41.67 rows=1341 width=0) (actual time=0.260..0.260 rows=1327 loops=1)
                    Index Cond: ((birthday_age >= 12) AND (birthday_age <= 13))
              ->  Bitmap Index Scan on birthday_age  (cost=0.00..82.37 rows=2611 width=0) (actual time=0.370..0.370 rows=2628 loops=1)
                    Index Cond: ((birthday_age >= 20) AND (birthday_age <= 22))
Total runtime: 5.847 ms
44.173002243042
41.156768798828
39.988040924072
40.470123291016
40.035963058472
40.077924728394
40.94386100769
40.183067321777
39.83211517334
40.256977081299

I also wonder why the reported runtime of 5.847 ms is so different from 
the runtime reported by my scripts (both the PHP and Ruby numbers are 
almost the same). What's the best tool to time queries in PostgreSQL? 
Can this be done from pgAdmin?
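(For reference, EXPLAIN ANALYZE reports only server-side execution time, while psql's built-in \timing meta-command shows the full round trip as seen by the client, including network and result transfer; a session comparing the two might look like this, table and column names taken from the query above:)

```sql
-- \timing prints client-side elapsed time after each statement;
-- EXPLAIN ANALYZE prints the server-side plan and execution time.
\timing on
EXPLAIN ANALYZE
SELECT * FROM users
WHERE birthday_age BETWEEN 12 AND 13
   OR birthday_age BETWEEN 20 AND 22
LIMIT 1000;
```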

Thanks,
Corin


