Re: Better way to bulk-load millions of CSV records into postgres?

From: "Joel Burton" <joel(at)joelburton(dot)com>
To: "Ron Johnson" <ron(dot)l(dot)johnson(at)cox(dot)net>, "PgSQL Novice ML" <pgsql-novice(at)postgresql(dot)org>
Subject: Re: Better way to bulk-load millions of CSV records into postgres?
Date: 2002-05-22 02:09:43
Message-ID: JGEPJNMCKODMDHGOBKDNGEPMCOAA.joel@joelburton.com
Lists: pgsql-novice

> -----Original Message-----
> From: pgsql-novice-owner(at)postgresql(dot)org
> [mailto:pgsql-novice-owner(at)postgresql(dot)org]On Behalf Of Ron Johnson
> Sent: Tuesday, May 21, 2002 4:40 PM
> To: PgSQL Novice ML
> Subject: [NOVICE] Better way to bulk-load millions of CSV records into
> postgres?
>
>
> Hi,
>
> Currently, I've got a python script using pyPgSQL that
> parses the CSV record, creates a string that is a big
> "INSERT INTO VALUES (...)" command, then execute()s it.
>
> top shows that this method uses postmaster with ~70% CPU
> utilization, and python with ~15% utilization.
>
> Still, it's only inserting ~190 recs/second. Is there a
> better way to do this, or am I constrained by the hardware?
>
> Instead of python and postmaster having to do a ton of data
> xfer over sockets, I'm wondering if there's a way to send a
> large number of csv records (4000, for example) in one big
> chunk to a stored procedure and have the engine process it
> all.

You could change your Python script to output a COPY command, which is
*much* faster than individual INSERT commands.
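
For what it's worth, here's a minimal sketch of the COPY approach in
Python. It uses the psycopg2 driver rather than the pyPgSQL mentioned in
your script, and the database name, table name, and file name are all
made-up placeholders; the table's columns are assumed to match the CSV
column order:

    import psycopg2

    # Stream the whole CSV file through one COPY instead of one INSERT
    # per row. Assumptions (illustrative only): database "mydb", a table
    # "raw_data" whose columns match the CSV column order, and an input
    # file "data.csv".
    conn = psycopg2.connect("dbname=mydb")
    cur = conn.cursor()

    with open("data.csv", "r") as f:
        # COPY ... FROM STDIN sends the data as one stream, so the
        # per-statement parse/plan/network overhead of millions of
        # INSERTs goes away.
        cur.copy_expert("COPY raw_data FROM STDIN WITH CSV", f)

    conn.commit()
    cur.close()
    conn.close()

If the rows need cleaning first, the script can keep parsing them and
write the cleaned lines into the COPY stream instead of building INSERT
statements, so you still avoid the per-row round trip.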

- J.

Joel BURTON | joel(at)joelburton(dot)com | joelburton.com | aim: wjoelburton
Knowledge Management & Technology Consultant
