Backup Large Tables

From: "Charles Ambrose" <jamjam360(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Backup Large Tables
Date: 2006-09-22 02:54:07
Message-ID: 61ca079e0609211954g572d4ef8hd6ed8bb597eab99e@mail.gmail.com
Lists: pgsql-general

Hi!

I have some fairly large database tables (an average of 3 to 4 million records each). Using the pg_dump utility to dump these tables takes forever. As an alternative, I wrote a program that fetches all the data from a table and writes it to a text file, but I was also unsuccessful with that approach.
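Roughly what that program does, for reference (a simplified sketch assuming psycopg2; the connection string, table name, and output file are just placeholders):

    import psycopg2

    conn = psycopg2.connect("dbname=mydb user=me")   # placeholder connection string
    cur = conn.cursor()
    cur.execute("SELECT * FROM big_table")           # pulls the entire table in one go
    out = open("big_table.txt", "w")
    for row in cur.fetchall():                       # the whole result set sits in memory here
        out.write("\t".join(str(col) for col in row) + "\n")
    out.close()
    conn.close()

I suspect holding the entire result set in memory is part of the problem, which is what leads me to the batching idea below.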

As I see it, I could dump the tables incrementally: select a batch of records from a table, append it to a text file, and repeat until all records in the table have been written out. With this approach I need a primary key that uniquely identifies each record, so that each pass does not fetch rows that have already been processed (rough sketch below).
The problem with this approach, though, is that my dumping utility will not be generic.
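Here is a rough sketch of what I have in mind, again assuming psycopg2; the table name big_table, the numeric primary key column id, and the batch size are placeholders:

    import psycopg2

    BATCH_SIZE = 100000                              # arbitrary batch size
    conn = psycopg2.connect("dbname=mydb user=me")   # placeholder connection string
    cur = conn.cursor()

    last_id = 0
    out = open("big_table.txt", "w")
    while True:
        cur.execute(
            "SELECT * FROM big_table WHERE id > %s ORDER BY id LIMIT %s",
            (last_id, BATCH_SIZE),
        )
        rows = cur.fetchall()
        if not rows:
            break                                    # all records have been written out
        for row in rows:
            out.write("\t".join(str(col) for col in row) + "\n")
        last_id = rows[-1][0]                        # assumes id is the first column
    out.close()
    conn.close()

I am using WHERE id > last_id rather than OFFSET so that each batch does not get slower as we move deeper into the table, but the key column (and its position in the row) has to be hard-coded for every table.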

Are there any alternatives?

Thanks in advance for any help!
