Re: Running PostgreSQL as fast as possible no matter the consequences

From: Dimitri Fontaine <dimitri(at)2ndQuadrant(dot)fr>
To: "Lello\, Nick" <nick(dot)lello(at)rentrakmail(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Running PostgreSQL as fast as possible no matter the consequences
Date: 2010-11-08 15:57:14
Message-ID: m2mxpj6c6d.fsf@2ndQuadrant.fr
Lists: pgsql-performance
"Lello, Nick" <nick(dot)lello(at)rentrakmail(dot)com> writes:
> A bigger gain can probably be had if you have a tightly controlled
> suite of queries that will be run against the database and you can
> spend the time to tune each to ensure it performs no sequential scans
> (ie: Every query uses index lookups).

Given a fixed pool of queries, you can prepare them in advance so that
you usually don't pay the parsing and planning costs on every call. I've
found that planning is easily more expensive than execution when all the
data fits in RAM.
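
For a single fixed query that could look like this (a minimal sketch; the
statement name, table, and parameter are made up for illustration):

  -- parse and plan once, in the current session
  PREPARE get_order (integer) AS
    SELECT * FROM orders WHERE order_id = $1;

  -- subsequent calls skip parsing and (typically) re-planning,
  -- paying only the execution cost
  EXECUTE get_order(42);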

Enter pgbouncer and preprepare:
  http://wiki.postgresql.org/wiki/PgBouncer
  http://preprepare.projects.postgresql.org/README.html
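
A rough sketch of the pgbouncer side (the paths, database name, and
addresses are made up; the point is pool_mode = session, since prepared
statements live in the server session and the client must keep getting
the same session back):

  ; pgbouncer.ini
  [databases]
  mydb = host=127.0.0.1 port=5432 dbname=mydb

  [pgbouncer]
  listen_addr = 127.0.0.1
  listen_port = 6432
  auth_type = md5
  auth_file = /etc/pgbouncer/userlist.txt
  ; session pooling: a client keeps its server connection, and with it
  ; any statements prepared in that session
  pool_mode = session

preprepare then takes care of issuing the PREPARE statements when the
server-side session starts, so every pooled connection already has them
ready; see its README above for the exact settings.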

Regards,
-- 
Dimitri Fontaine
http://2ndQuadrant.fr     PostgreSQL : Expertise, Formation et Support
