
Re: how to continue after error in batch mode with psql

From: Jasen Betts <jasen(at)xnet(dot)co(dot)nz>
To: pgsql-novice(at)postgresql(dot)org
Subject: Re: how to continue after error in batch mode with psql
Date: 2010-04-29 09:18:58
Message-ID: hrbiu2$tt4$2@reversiblemaps.ath.cx
Lists: pgsql-novice
On 2010-04-28, Urs Rau <urs(dot)rau(at)uk(dot)om(dot)org> wrote:
> I have a nightly process that pulls data over into postgresql 8.3.10
> from a progress server running under 8.x. Unfortunately the progress db
> does not enforce record size/length limitations. It does not care if a
> record of 20 characters contains 21, or 100 for that matter.
>
> we have a simple pipe that takes the progress data export dump and
> imports it into postgresql
>
> su - postgres -c "cat test.csv | psql -X -q test -c \"COPY t_test FROM
> stdin WITH DELIMITER AS ',' NULL AS '?' CSV QUOTE AS '\\\"' ESCAPE AS
> '\\\"'\""
>
> If any of the fields are over-length, we do get an error message that
> tells us which row needs fixing.
>
> ERROR:  value too long for type character varying(20)
> CONTEXT:  COPY t_test, line 2, column t_comment_c: "'comment 3 that is
> too long'"
>
> But how do we get psql to run through and continue after an error and
> import as much as possible of the rest of the import data and give us
> error messages about all lines with errors?

Why not just copy the whole lot into a staging table (shaped like the
target table, but with text fields instead of length-limited varchar),
then do one select on that to find the over-length lines, and another
to insert the good data into the real target table? Something like the
sketch below.
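
A rough, untested sketch of what I mean (the staging-table name
t_test_load is made up, and I'm assuming the t_comment_c /
varchar(20) column from the error message above; adjust for the real
column list and any other length-limited columns):

  -- same shape as the target, but text instead of varchar(20)
  CREATE TABLE t_test_load (LIKE t_test);
  ALTER TABLE t_test_load ALTER COLUMN t_comment_c TYPE text;

Load it with the same pipe you already use, just pointed at the
staging table:

  su - postgres -c "cat test.csv | psql -X -q test -c \"COPY t_test_load
  FROM stdin WITH DELIMITER AS ',' NULL AS '?' CSV QUOTE AS '\\\"'
  ESCAPE AS '\\\"'\""

Then, in psql:

  -- the rows that need fixing
  SELECT * FROM t_test_load WHERE length(t_comment_c) > 20;

  -- move everything that fits into the real table
  INSERT INTO t_test
    SELECT * FROM t_test_load WHERE length(t_comment_c) <= 20;

Nothing gets rejected at COPY time because text has no length limit,
so the whole file loads and you can report all the bad rows in one go.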

