In response to Juan Backson:
> I am using Postgres to store CDR data for VoIP switches. The data size quickly
> grows to a few TB.
> What I would like to do is to be able to regularly archive the oldest data so
> only the most recent 6 months of data is available.
> All that old data should be stored in a format that can be restored later,
> either into a DB table or into flat files.
> Does anyone know how should I go about doing that? Is there any existing tool
> that can already do that?
Sounds like a job for table partitioning: create, for instance, one table
per month and DROP each table once it falls outside the 6-month window.
Dropping a partition is essentially instant and frees the space immediately,
whereas DELETEing millions of rows from one huge table is slow and leaves
bloat behind.
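A minimal sketch of that scheme, using the inheritance-based partitioning
Postgres supports (all table and column names here are made up for
illustration; adapt them to your CDR schema):

```sql
-- Parent table holds no rows itself; children inherit its structure.
CREATE TABLE cdr (
    call_id    bigint,
    started_at timestamptz NOT NULL,
    duration   interval,
    caller     text,
    callee     text
);

-- One child table per month. The CHECK constraint lets the planner
-- skip irrelevant partitions via constraint exclusion.
CREATE TABLE cdr_2010_03 (
    CHECK (started_at >= '2010-03-01' AND started_at < '2010-04-01')
) INHERITS (cdr);

-- Before dropping, archive the month to a flat file with COPY;
-- it can be restored later with COPY ... FROM into a fresh table.
COPY cdr_2010_03 TO '/archive/cdr_2010_03.csv' WITH CSV;

-- Dropping the child table removes the whole month at once.
DROP TABLE cdr_2010_03;
```

Inserts need to be routed to the right child table, e.g. by inserting
directly into it or via a trigger on the parent; a nightly cron job can
handle creating next month's partition and archiving/dropping the oldest one.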
Contact: Heynitz: 035242/47150, D1: 0160/7141639 (more: -> header)
GnuPG: 0x31720C99, 1006 CCB4 A326 1D42 6431 2EB0 389D 1DC2 3172 0C99