| From: | Matthew Wakeling <matthew(at)flymine(dot)org> | 
|---|---|
| To: | Balkrishna Sharma <b_ki(at)hotmail(dot)com> | 
| Cc: | pgsql-performance(at)postgresql(dot)org | 
| Subject: | Re: Parallel queries for a web-application \|performance testing | 
| Date: | 2010-06-17 09:41:44 | 
| Message-ID: | alpine.DEB.2.00.1006171032120.2534@aragorn.flymine.org | 
| Lists: | pgsql-performance | 
On Wed, 16 Jun 2010, Balkrishna Sharma wrote:
> Hello, I will have a web application having postgres 8.4+ as backend. At 
> any given time, there will be a max of 1000 parallel web-users interacting 
> with the database (read/write). I wish to do performance testing of 1000 
> simultaneous reads/writes to the database.
When you set up a server that has high throughput requirements, the last 
thing you want to do is use it in a manner that cripples its throughput. 
Don't try and have 1000 parallel Postgres backends - it will process those 
queries slower than the optimal setup. You should aim to have 
approximately ((2 * cpu core count) + effective spindle count) number of 
backends, as that is the point at which throughput is the greatest. You 
can use pgbouncer to achieve this.
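The rule of thumb above can be sketched in shell. The core and spindle counts below are placeholder assumptions for illustration; substitute your own hardware figures:

```shell
#!/bin/sh
# Rough starting point for the backend/pool size, per the
# (2 * cpu cores) + effective spindles rule of thumb.
# Both values are assumptions; replace with your real counts
# (e.g. CORES=$(nproc) on Linux).
CORES=8
SPINDLES=4
POOL=$(( 2 * CORES + SPINDLES ))
echo "Suggested pool size: $POOL backends"
```

With pgbouncer in front of Postgres, a figure like this becomes the pool size, while the 1000 web sessions queue up in front of it instead of each holding a backend.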
> I can do a simple unix script on the postgres server and have parallel 
> updates fired for example with an ampersand at the end. Example:
>
>   echo '\timing \\update "DAPP".emp_data set f1 = 123 where emp_id = 0;' | psql test1 postgres | grep "Time:" | cut -d' ' -f2- >> "/home/user/Documents/temp/logs/$NUM.txt" &
>   pid1=$!
>   echo '\timing \\update "DAPP".emp_data set f1 = 123 where emp_id = 2;' | psql test1 postgres | grep "Time:" | cut -d' ' -f2- >> "/home/user/Documents/temp/logs/$NUM.txt" &
>   pid2=$!
>   echo '\timing \\update "DAPP".emp_data set f1 = 123 where emp_id = 4;' | psql test1 postgres | grep "Time:" | cut -d' ' -f2- >> "/home/user/Documents/temp/logs/$NUM.txt" &
>   pid3=$!
>   .............
Don't do that. The overhead of starting up an echo, a psql, and a grep 
will limit the rate at which these queries can be fired at Postgres, and 
consume quite a lot of CPU. Use a proper benchmarking tool, possibly on a 
different server.
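For example, pgbench (shipped with Postgres) can replay a custom script from many concurrent connections without forking a shell pipeline per query. The script file name, client count, and username below are assumptions for illustration:

```shell
# update.sql (hypothetical file) would contain, in 8.4-era pgbench syntax:
#   \setrandom id 0 999
#   UPDATE "DAPP".emp_data SET f1 = 123 WHERE emp_id = :id;
#
# Run it from 20 client connections for 60 seconds; -n skips the
# built-in vacuum step, since this uses custom tables rather than
# pgbench's own.
pgbench -n -f update.sql -c 20 -T 60 -U testuser test1
```

pgbench reports transactions per second at the end, which is a far more useful throughput figure than per-query \timing lines collected through grep.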
Also, you should be using a different username to "postgres" - that one is 
kind of reserved for superuser operations.
Matthew
-- 
 People who love sausages, respect the law, and work with IT standards 
 shouldn't watch any of them being made.  -- Peter Gutmann