Optimizing tuning and table design for large analytics DB

From: Rob W <digital_illuminati(at)yahoo(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Optimizing tuning and table design for large analytics DB
Date: 2009-05-07 15:28:47
Message-ID: 952321.37402.qm@web34607.mail.mud.yahoo.com
Lists: pgsql-general


Can anyone point me towards good articles or books that would help a PostgreSQL novice (i.e. me) learn the optimal approaches to setting up a DB for analytics?

In this particular case, I need to efficiently analyze approximately 300 million system log events (i.e. time-series data). Because it's log data, rows are only ever appended to the table, never updated. Only 90 days' worth of data will be retained, so old records need to be deleted periodically. Query performance will only be important for small subsets of the data (e.g. when analyzing a day's or week's worth of data); the rest of the reports will be run in batch mode. There will likely only be one user at a time doing ad-hoc queries.
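(For what it's worth, the workload described above — append-only inserts, time-windowed queries, and periodic deletion of data older than 90 days — is the classic case for partitioning by time range. A minimal sketch using table inheritance and CHECK constraints, the partitioning mechanism available in PostgreSQL 8.x; the table and column names here are hypothetical:

```sql
-- Parent table holds no rows itself; children hold one day each.
CREATE TABLE events (
    event_time  timestamptz NOT NULL,
    host        text        NOT NULL,
    message     text
);

-- One child table per day. The CHECK constraint lets the planner
-- skip partitions outside the queried time range when
-- constraint_exclusion is enabled in postgresql.conf.
CREATE TABLE events_2009_05_07 (
    CHECK (event_time >= '2009-05-07' AND event_time < '2009-05-08')
) INHERITS (events);

CREATE INDEX ON events_2009_05_07 (event_time);

-- Retention becomes cheap: dropping a whole partition is nearly
-- instant, versus a bulk DELETE of millions of rows followed by
-- a costly VACUUM.
DROP TABLE events_2009_02_06;
```

Queries against the parent table automatically cover all children, and with `constraint_exclusion = on` a query restricted to one day only scans that day's partition.)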

This is a follow-up to the earlier suggestions that PostgreSQL will handle the volumes of data I plan to work with, so I figured I'd give it a shot.

Rob
