Better copy/import

From: Steven Lane <stevelcmc(at)mindspring(dot)com>
To: pgsql-admin(at)postgresql(dot)org
Subject: Better copy/import
Date: 2001-07-23 13:24:56
Message-ID: v03007803b781d5ebe403@[65.15.153.184]
Lists: pgsql-admin

Hello all:

Sorry for the bad subject line on the last version of this post.

I'm trying to load about 10M rows of data into a simple postgres table. The
data is straightforward and fairly clean, but does have glitches every few
tens of thousands of rows. My problem is that when COPY hits a bad row it
just aborts, leaving me to go back, delete or clean up the row and try
again.

Is there a way to import that simply skips the bad rows and tells me which
ones they were? Loading this much data is pretty time-consuming, especially
when I have to repeat the whole load just to find each new bad row. Is there
a better way?
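(One workaround I've been considering, in case it helps frame the question:
pre-validate the dump before handing it to COPY, splitting it into a clean
file and a reject log. A minimal sketch, assuming tab-delimited data and a
known column count -- the field count and delimiter below are assumptions
about my data, not anything COPY itself provides:)

```python
EXPECTED_FIELDS = 4  # assumed column count of the target table


def split_rows(lines, expected=EXPECTED_FIELDS, delimiter="\t"):
    """Split raw dump lines into (clean, rejected).

    rejected pairs each bad line with its 1-based line number so the
    offending rows can be inspected and fixed by hand.
    """
    clean, rejected = [], []
    for lineno, line in enumerate(lines, start=1):
        fields = line.rstrip("\n").split(delimiter)
        if len(fields) == expected:
            clean.append(line)
        else:
            rejected.append((lineno, line))
    return clean, rejected


# Example: only the clean rows would go on to COPY.
rows = ["a\tb\tc\td\n", "this row is glitched\n", "e\tf\tg\th\n"]
clean, rejected = split_rows(rows)
```

This only catches structural glitches (wrong field count); rows that break
on type or constraint errors would still need a different approach.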

-- sgl
