> From: "Matthew Kennedy" <mkennedy(at)hssinc(dot)com>
> > I have a ton of data in a text-delimited file from an old legacy system.
> > When uploading it into postgres, I'd do something like this:
> > COPY stuff FROM 'stuff.txt' USING DELIMITERS '|';
> > The problem is some of the rows in stuff.txt may not conform to the
> > stuff table attributes (duplicate keys, for example). The command above
> > terminates with no change to the table stuff when it encounters an error
> > in stuff.txt. Is there a way to have postgres ignore erroneous rows
> > but keep information about which rows weren't processed? I believe
> > Oracle has some support for this through an IGNORE clause.
* Adam Lang <aalang(at)rutgersinsurance(dot)com> [000829 08:29] wrote:
> Well, you could always make a table that has no constraints on duplicates
> and COPY into that one. Then run a query that inserts the data into your
> production table and handles the duplicates.
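That staging-table approach would look roughly like this; the staging table
name stuff_load and the key column id are placeholders for illustration, not
anything from the original post:

    -- staging table with the same columns as stuff, but no constraints
    SELECT * INTO stuff_load FROM stuff WHERE 1 = 0;

    -- bulk load into the unconstrained staging table instead of stuff
    COPY stuff_load FROM 'stuff.txt' USING DELIMITERS '|';

    -- report the rows whose key already exists in stuff
    -- (this is the "which rows weren't processed" information)
    SELECT * FROM stuff_load sl
     WHERE EXISTS (SELECT 1 FROM stuff s WHERE s.id = sl.id);

    -- insert everything else; rows duplicated *within* the file itself
    -- would still need a DISTINCT pass first
    INSERT INTO stuff
    SELECT * FROM stuff_load sl
     WHERE NOT EXISTS (SELECT 1 FROM stuff s WHERE s.id = sl.id);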
Actually, last I checked, COPY goes through the RULE system, so
I'm pretty sure one can set up a rule to check for violated constraints
and 'DO INSTEAD NOTHING'. :-)
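For reference, such a rule might look roughly like the following (again with a
made-up key column id); whether COPY really routes inserts through the rule
system is exactly the "last I checked" part above:

    -- silently drop any incoming row whose key already exists in stuff
    CREATE RULE stuff_ignore_dups AS
        ON INSERT TO stuff
        WHERE EXISTS (SELECT 1 FROM stuff WHERE id = new.id)
        DO INSTEAD NOTHING;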