
Re: High CPU Utilization

From: Greg Smith <gsmith(at)gregsmith(dot)com>
To: Joe Uhl <joeuhl(at)gmail(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: High CPU Utilization
Date: 2009-03-17 04:12:42
Lists: pgsql-performance
On Mon, 16 Mar 2009, Joe Uhl wrote:

> Now when I run vmstat 1 30 it looks very different (below).

That looks much better.  Obviously you'd like some more headroom on the 
CPU situation than you're seeing, but that's way better than having so 
much time spent waiting for I/O.

> max_connections = 1000
> work_mem = 30MB

Be warned that you need to be careful with this combination.  If all 1000 
connections were to sort something at once, you could end up with >30GB 
worth of RAM used for that purpose.  It's probably quite unlikely that 
will happen, but 30MB is on the high side with that many connections.
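To put a number on that worst case, here is a quick back-of-envelope sketch
(values taken from the settings quoted above; the "one sort per backend"
simplification is mine, since a single query can actually contain several
sort or hash nodes, each entitled to its own work_mem):

```python
# Worst-case sort memory if every connection sorts at once,
# each sort node using its full work_mem allowance.
max_connections = 1000
work_mem_mb = 30

worst_case_gb = max_connections * work_mem_mb / 1024
print(f"worst case: {worst_case_gb:.1f} GB")  # ~29.3 GB with one sort node per backend
```

A query plan with two or three sort/hash steps per connection pushes that
well past 30GB, which is why the combination deserves caution.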

I wonder if your pool might work better, in terms of lowering total CPU 
usage, if you reduced the number of incoming connections.  Each connection 
adds some overhead, and now that you've got the I/O situation under better 
control you might get by with fewer simultaneous ones.  Something to 
consider.
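If the pool in question is pgbouncer or something similar, the relevant
knobs look roughly like this; the parameter names are pgbouncer's, but the
values are purely illustrative, not a recommendation:

```ini
; Hypothetical pgbouncer.ini fragment: shrink the server-side pool while
; still letting many clients connect and queue.
[pgbouncer]
max_client_conn = 1000    ; clients can still connect and wait...
default_pool_size = 50    ; ...but only this many server backends run at once
pool_mode = transaction   ; return connections to the pool between transactions
```

The idea is that a smaller set of busy backends often gets more total work
done than a large set of mostly idle ones, because each backend carries
per-connection overhead.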

* Greg Smith gsmith(at)gregsmith(dot)com Baltimore, MD

