importing large files

From: "olivier(dot)scalbert(at)algosyn(dot)com" <olivier(dot)scalbert(at)algosyn(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: importing large files
Date: 2007-09-28 08:22:49
Message-ID: 1190967769.434956.11760@o80g2000hse.googlegroups.com
Lists: pgsql-general

Hello,

I need to import between 100 million and one billion records into a
table. Each record is composed of two char(16) fields. The input is a
huge CSV file. I am running on a Linux box with 4 GB of RAM.
First I create the table. Second, I 'copy from' the CSV file. Third, I
create the index on the first field.
The overall process takes several hours. The CPU seems to be the
limitation, not the memory or the I/O.
Are there any tips to improve the speed?
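To make the steps concrete, here is a minimal sketch of what I do
(the table name, column names, and file path are placeholders, not
my actual schema):

    -- Step 1: create the table (names are placeholders)
    CREATE TABLE pairs (
        key_a char(16),
        key_b char(16)
    );

    -- Step 2: bulk-load from the CSV file (path is hypothetical)
    COPY pairs FROM '/path/to/data.csv' WITH CSV;

    -- Step 3: build the index on the first field
    CREATE INDEX pairs_key_a_idx ON pairs (key_a);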

Thanks very much,

Olivier
