Re: Running PostgreSQL as fast as possible no matter the consequences

From: Dimitri Fontaine <dimitri(at)2ndQuadrant(dot)fr>
To: "Lello\, Nick" <nick(dot)lello(at)rentrakmail(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Running PostgreSQL as fast as possible no matter the consequences
Date: 2010-11-08 15:57:14
Message-ID: m2mxpj6c6d.fsf@2ndQuadrant.fr
Lists: pgsql-performance

"Lello, Nick" <nick(dot)lello(at)rentrakmail(dot)com> writes:
> A bigger gain can probably be had if you have a tightly controlled
> suite of queries that will be run against the database and you can
> spend the time to tune each to ensure it performs no sequential scans
> (ie: Every query uses index lookups).

Given a fixed pool of queries, you can prepare them in advance so that
you don't usually pay the parsing and planning costs. I've found that
planning is easily more expensive than execution when all the data
fits in RAM.
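For example, a session can prepare a hot query once and then execute it
repeatedly with only the execution cost (a minimal sketch; the table,
statement name, and parameter here are hypothetical, not from the thread):

```sql
-- Parse + plan cost is paid once, at PREPARE time.
PREPARE fetch_user (int) AS
  SELECT * FROM users WHERE id = $1;

-- Each EXECUTE reuses the stored plan; only execution cost remains.
EXECUTE fetch_user(42);
```

The catch is that prepared statements are per-session, which is where a
pooler that keeps sessions alive (and can pre-load the statements) comes in.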

Enter pgbouncer and preprepare:
http://wiki.postgresql.org/wiki/PgBouncer
http://preprepare.projects.postgresql.org/README.html

Regards,
--
Dimitri Fontaine
http://2ndQuadrant.fr PostgreSQL : Expertise, Formation et Support
