
Optimizing sum() operations

From: "Dobes Vandermeer" <dobesv(at)gmail(dot)com>
To: pgsql-novice(at)postgresql(dot)org
Subject: Optimizing sum() operations
Date: 2008-10-03 08:51:06
Message-ID: 7324d9a20810030151n2341c5dcide9681bf74daa4cc@mail.gmail.com
Lists: pgsql-novice
I'm currently using sum() to compute historical values in reports;
basically select sum(amount) from records where date <= '...' and
date >= '...' and who = X.
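Written out in full, the query looks something like this (a sketch; the
table name, column names, and literal values are assumed from the
description above):

```sql
-- Hypothetical report query: total of "amount" for one account
-- ("who") over a date range.
SELECT sum(amount)
FROM records
WHERE who = 42
  AND date BETWEEN '2008-01-01' AND '2008-09-30';
```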

Of course, I have an index on the table for who and date, but that
still leaves potentially thousands of rows to scan.
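For this access pattern, a single composite index is usually what you
want, so the planner can do one index range scan instead of combining
two single-column indexes. A sketch, again assuming the table is named
`records`:

```sql
-- Composite index: equality on "who" first, then the range on "date".
-- Postgres can then walk just the (who, date-range) slice of the index.
CREATE INDEX records_who_date_idx ON records (who, date);
```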

First, should I be worried about the performance of this, or will
postgres sum a few thousand rows in a few milliseconds on a decent
system anyway?

Second, if this is a concern, is there a best practice for optimizing
these kinds of queries?
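One common best practice, if plain index scans turn out to be too slow,
is to pre-aggregate into a summary table and sum the much smaller set
of rows at report time. A sketch, with table and column names assumed
from the query above (the summary table would need to be maintained by
a trigger or a periodic refresh):

```sql
-- Hypothetical per-day rollup table.
CREATE TABLE daily_totals (
    who   integer NOT NULL,
    day   date    NOT NULL,
    total numeric NOT NULL,
    PRIMARY KEY (who, day)
);

-- Populate (re-run periodically, or keep current with a trigger):
INSERT INTO daily_totals (who, day, total)
SELECT who, date, sum(amount)
FROM records
GROUP BY who, date;

-- The report then scans at most one row per day instead of
-- thousands of detail rows:
SELECT sum(total)
FROM daily_totals
WHERE who = 42
  AND day BETWEEN '2008-01-01' AND '2008-09-30';
```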

Any help or advice appreciated!


Copyright © 1996-2014 The PostgreSQL Global Development Group