Re: error status 139

From: Steven Lane <stevelcmc(at)mindspring(dot)com>
To: pgsql-admin(at)postgresql(dot)org
Subject: Re: error status 139
Date: 2001-07-23 13:23:06
Message-ID: v03007802b781d595cfa4@[65.15.153.184]
Lists: pgsql-admin

Hello all:

I'm trying to load about 10M rows of data into a simple Postgres table. The
data is straightforward and fairly clean, but it does have glitches every few
tens of thousands of rows. My problem is that when COPY hits a bad row, it
aborts the entire load, leaving me to go back, delete or clean up the
offending row, and try again.

Is there a way to import the records that just skips the bad ones and tells
me which they are? Loading this much data is pretty time-consuming,
especially when I have to repeat the whole load to find each new bad row. Is
there a better way?
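Failing a built-in option, the best workaround I can think of is to script
the load in batches and fall back to row-at-a-time loads when a batch fails,
logging the bad rows as I go. A minimal sketch, assuming the Python psycopg2
driver is available (the table name, file name, and connection string below
are placeholders):

from io import StringIO
import psycopg2

BATCH_SIZE = 10000

def copy_rows(conn, rows, table):
    # COPY a list of delimited lines in one shot; raises on any bad row.
    cur = conn.cursor()
    cur.copy_from(StringIO("".join(rows)), table)
    conn.commit()

def load(path, table, dsn="dbname=mydb"):
    conn = psycopg2.connect(dsn)
    bad = open("bad_rows.txt", "w")
    batch = []

    def flush():
        if not batch:
            return
        try:
            copy_rows(conn, batch, table)   # fast path: whole batch at once
        except psycopg2.Error:
            conn.rollback()
            for row in batch:               # slow path: isolate the bad rows
                try:
                    copy_rows(conn, [row], table)
                except psycopg2.Error:
                    conn.rollback()
                    bad.write(row)          # log the offender, keep going
        del batch[:]

    for line in open(path):
        batch.append(line)
        if len(batch) >= BATCH_SIZE:
            flush()
    flush()                                 # load whatever is left over
    bad.close()
    conn.close()

load("data.txt", "mytable")

The fast path keeps COPY's speed for the clean batches, and the slow path
only pays per-row overhead around the glitches. But this feels like
something the backend ought to be able to do for me, hence the question.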

-- sgl
