From: "A(dot) Kretschmer" <andreas(dot)kretschmer(at)schollglas(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: best practice in archiving CDR data
Date: 2010-03-29 13:33:34
Message-ID: 20100329133334.GI21944@a-kretschmer.de
Lists: pgsql-general
In response to Juan Backson :
> Hi,
>
> I am using Postgres to store CDR data for voip switches. The data size quickly
> goes about a few TBs.
>
> What I would like to do is to be able to regularly archive the oldest data so
> only the most recent 6 months of data is available.
>
> All those old data will be stored in a format that can be retrieved back either
> into DB table or flat files.
>
> Does anyone know how should I go about doing that? Is there any existing tool
> that can already do that?
Sounds like a job for table partitioning: create, for instance, one table
per month and DROP the oldest tables after six months or so.
http://www.postgresql.org/docs/current/static/ddl-partitioning.html
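
A minimal sketch of that approach, using the inheritance-based
partitioning the linked docs describe; the `cdr` column names and the
archive path are illustrative assumptions, not something from the thread:

```sql
-- Parent table; it holds no rows itself, queries against it see all children.
CREATE TABLE cdr (
    call_id    bigint,
    call_start timestamptz NOT NULL,
    duration   interval
);

-- One child table per month, with a CHECK constraint on that month's range
-- so constraint exclusion can skip irrelevant partitions.
CREATE TABLE cdr_2010_03 (
    CHECK (call_start >= '2010-03-01' AND call_start < '2010-04-01')
) INHERITS (cdr);

-- When a month ages past six months: dump it to a flat file, then drop it.
-- COPY FROM (or pg_restore on a pg_dump of the table) brings it back later.
COPY cdr_2009_09 TO '/archive/cdr_2009_09.csv' WITH CSV;
DROP TABLE cdr_2009_09;
```

DROP TABLE is essentially instant and reclaims the space immediately,
which is the main win over DELETE + VACUUM on a multi-TB table.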
Regards, Andreas
--
Andreas Kretschmer
Contact: Heynitz: 035242/47150, D1: 0160/7141639 (more: -> header)
GnuPG: 0x31720C99, 1006 CCB4 A326 1D42 6431 2EB0 389D 1DC2 3172 0C99