
Re: 100 simultaneous connections, critical limit?

From: Andrew McMillan <andrew(at)catalyst(dot)net(dot)nz>
To: Jón Ragnarsson <jonr(at)physicallink(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: 100 simultaneous connections, critical limit?
Date: 2004-01-15 02:57:08
Message-ID: 1074135427.3198.213.camel@kant.mcmillan.net.nz
Lists: pgsql-performance

On Thu, 2004-01-15 at 01:48, Jón Ragnarsson wrote:
> I am writing a website that will probably have some traffic.
> Right now I wrap every .php page in pg_connect() and pg_close().
> Then I read somewhere that Postgres only supports 100 simultaneous
> connections (default). Is that a limitation? Should I use some other
> method when writing code for high-traffic website?

Whether the overhead of pg_connect()/pg_close() has a noticeable effect
on your application depends on what you do in between them.  To be
honest, I never call the second one myself - PHP closes a
non-persistent connection automatically when the page is finished.
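
For illustration, here is a minimal sketch of the two per-page
patterns (the connection string and query are made up):

```php
<?php
// Per-page pattern: a fresh backend connection for every request.
// pg_close() is optional - PHP closes non-persistent connections
// automatically when the script ends.
$conn = pg_connect("host=localhost dbname=mydb user=web");
$result = pg_query($conn, "SELECT now()");
// ... render the page ...
pg_close($conn); // optional

// Persistent pattern: the connection survives across requests,
// one per Apache process, so the connection-startup cost is paid
// only once per process rather than once per page view.
$pconn = pg_pconnect("host=localhost dbname=mydb user=web");
$result = pg_query($pconn, "SELECT now()");
?>
```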

I have developed some applications which try to be as fast as
possible.  For those I either use pg_pconnect(), so you have one DB
connection per Apache process, or I use DBBalancer, where you have a
pool of connections and pg_connect() is _actually_ connecting to
DBBalancer in a very low-overhead manner, with the pool of real
connections out the back.  (I am the Debian package maintainer for
DBBalancer.)

You may also want to consider differentiating based on whether the
application is writing to the database or not.  Pooling and persistent
connections can give weird side-effects if transaction scoping is
bollixed in the application - a second page view re-using an earlier
connection which was serving a different page could find itself in the
middle of an unexpected transaction.  Temp tables are one thing that can
bite you here.
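
A sketch of that failure mode, with an invented table and a defensive
pattern that is my suggestion rather than anything DBBalancer or PHP
does for you:

```php
<?php
// Page A - starts a transaction on a persistent connection, then the
// script hits a fatal error before COMMIT/ROLLBACK is ever reached.
$conn = pg_pconnect("host=localhost dbname=mydb user=web");
pg_query($conn, "BEGIN");
pg_query($conn, "UPDATE accounts SET balance = balance - 10 WHERE id = 1");
// ... fatal error here: the transaction is left open on the backend ...

// Page B - later reuses the same persistent connection and is now
// silently inside page A's unfinished transaction.  A defensive
// ROLLBACK at the top of the page clears any leftover state:
$conn = pg_pconnect("host=localhost dbname=mydb user=web");
@pg_query($conn, "ROLLBACK"); // harmless if no transaction is open
?>
```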

There are a few database pooling solutions out there. Using pg_pconnect
is the simplest of these, DBBalancer fixes some of its issues, and
others go further still.

Another point to consider is that database pooling will give you the
biggest performance increase if your queries are all returning small
datasets.  If you return large datasets it can potentially make things
worse (depending on implementation) through double-handling of the data.

As others have said too: 100 is just a configuration setting in
postgresql.conf - not a hard limit of the implementation.
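
The relevant line in postgresql.conf (default shown; raise it and
restart the postmaster for the change to take effect):

```
# postgresql.conf
max_connections = 100   # raise as needed; each connection slot uses
                        # some shared memory, so don't set it absurdly
                        # high without also reviewing shared_buffers
```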

Cheers,
					Andrew McMillan.
-------------------------------------------------------------------------
Andrew @ Catalyst .Net .NZ  Ltd,  PO Box 11-053,  Manners St,  Wellington
WEB: http://catalyst.net.nz/             PHYS: Level 2, 150-154 Willis St
DDI: +64(4)916-7201       MOB: +64(21)635-694      OFFICE: +64(4)499-2267
              How many things I can do without! - Socrates
-------------------------------------------------------------------------
