Re: Inserting streamed data

From: Doug McNaught <doug(at)mcnaught(dot)org>
To: Kevin Old <kold(at)carolina(dot)rr(dot)com>
Cc: pgsql <pgsql-general(at)postgresql(dot)org>
Subject: Re: Inserting streamed data
Date: 2002-10-31 18:19:47
Message-ID: m34rb2bhdo.fsf@varsoon.wireboard.com
Lists: pgsql-general

Kevin Old <kold(at)carolina(dot)rr(dot)com> writes:

> I have data that is streamed to my server and stored in a text file. I
> need to get that data into my database as fast as possible. There are
> approximately 160,000 rows in this text file. I understand I can use
> the COPY command to insert large chunks of data from a text file, but I
> can't use it in this situation. Each record in the text file has 502
> "fields". I pull out 50 of those. I haven't found a way to manipulate
> the COPY command to pull out the values I need. So that solution would
> be out.
>
> I have a perl script that goes through the file and pulls out the 50
> fields, then inserts them into the database, but it seems to be very
> slow. I think I just need some minor performance tuning, but don't know
> which variables to set in the postgresql.conf file that would help with
> the speed of the inserts.

First: are you batching up multiple INSERTs in a transaction? If you
don't, it will be very slow indeed.
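
Something along these lines, for example (an untested sketch; DBD::Pg
is assumed, and the table name, column names, field positions,
delimiter and batch size below are all just placeholders for your
real ones):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# AutoCommit off, so each execute() is not its own transaction.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                       { AutoCommit => 0, RaiseError => 1 });

# Prepare once, execute many times.
my $sth = $dbh->prepare(
    'INSERT INTO stream_data (col1, col2, col3) VALUES (?, ?, ?)');

open my $fh, '<', 'stream.txt' or die "can't open stream.txt: $!";
my $count = 0;
while (my $line = <$fh>) {
    chomp $line;
    my @fields = split /\|/, $line;        # whatever your real delimiter is
    $sth->execute(@fields[0, 3, 7]);       # just the columns you care about
    $dbh->commit unless ++$count % 5000;   # commit every 5000 rows
}
$dbh->commit;                              # pick up the last partial batch
$dbh->disconnect;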

Second, why not have the Perl script pull out the fields you want,
paste them together and feed them to COPY? That should eliminate the
parse overhead of multiple INSERTs.
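
For example (again an untested sketch; this uses the
pg_putcopydata/pg_putcopyend COPY interface from newer DBD::Pg
releases, and the table, columns, file name and field positions are
placeholders as before):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                       { AutoCommit => 1, RaiseError => 1 });

# Start a COPY ... FROM STDIN, then stream the rows into it.
$dbh->do('COPY stream_data (col1, col2, col3) FROM STDIN');

open my $fh, '<', 'stream.txt' or die "can't open stream.txt: $!";
while (my $line = <$fh>) {
    chomp $line;
    my @fields = split /\|/, $line;        # whatever your real delimiter is
    # COPY's default text format is tab-separated, one row per line.
    # (Assumes the fields contain no tabs, newlines or backslashes.)
    $dbh->pg_putcopydata(join("\t", @fields[0, 3, 7]) . "\n");
}
$dbh->pg_putcopyend();
$dbh->disconnect;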

-Doug
