
Optimizing sum() operations

From: "Dobes Vandermeer" <dobesv(at)gmail(dot)com>
To: pgsql-novice(at)postgresql(dot)org
Subject: Optimizing sum() operations
Date: 2008-10-03 08:51:06
Message-ID:
Lists: pgsql-novice
I'm currently using sum() to compute historical values in reports;
basically select sum(amount) over records where date <= '...' and
date >= '...' and who = X.
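
A minimal sketch of the query described above, assuming a hypothetical
`transactions` table with `who`, `date`, and `amount` columns (the real
table and column names are not given in the post):

```sql
-- Hypothetical table; names assumed from the description above.
SELECT sum(amount)
FROM transactions
WHERE who = 42
  AND date >= '2008-01-01'
  AND date <= '2008-09-30';
```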

Of course, I have an index on the table for who and date, but that
still leaves potentially thousands of rows to scan.
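
For reference, a single composite index on both columns (equality column
first) lets the planner satisfy both predicates in one index scan; a
sketch against the same hypothetical table:

```sql
-- One index covering both predicates: the equality column (who)
-- comes first, the range column (date) second.
CREATE INDEX transactions_who_date_idx ON transactions (who, date);
```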

First, should I be worried about the performance of this, or will
postgres sum a few thousand rows in a few milliseconds on a decent
system anyway?

Second, if this is a concern, is there a best practice for optimizing
these kinds of queries?
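
One common approach to this kind of problem (a sketch, not necessarily
what the list would recommend, and again assuming the hypothetical
`transactions` table) is to pre-aggregate into a summary table so that
report queries scan one row per day instead of thousands of raw rows:

```sql
-- Hypothetical daily rollup table, refreshed periodically or kept
-- current by a trigger on the base table.
CREATE TABLE daily_totals (
    who   integer NOT NULL,
    day   date    NOT NULL,
    total numeric NOT NULL,
    PRIMARY KEY (who, day)
);

-- Initial population from the raw data.
INSERT INTO daily_totals (who, day, total)
SELECT who, date, sum(amount)
FROM transactions
GROUP BY who, date;

-- A report query then sums far fewer rows:
SELECT sum(total)
FROM daily_totals
WHERE who = 42
  AND day BETWEEN '2008-01-01' AND '2008-09-30';
```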

Any help or advice appreciated!


