From: Alfred Perlstein <bright(at)wintelcom(dot)net>
To: Adam Lang <aalang(at)rutgersinsurance(dot)com>
Cc: Matthew Kennedy <mkennedy(at)hssinc(dot)com>, pgsql-general(at)postgresql(dot)org
Subject: Re: Ignore when using COPY FROM
Date: 2000-08-29 15:50:41
Message-ID: 20000829085041.W18862@fw.wintelcom.net
Lists: pgsql-general
> From: "Matthew Kennedy" <mkennedy(at)hssinc(dot)com>
> > I have a ton of data in a text delimited file from an old legacy system.
> > When uploading it into postgres, I'd do something like this:
> >
> > COPY stuff FROM 'stuff.txt' USING DELIMITERS '|';
> >
> > The problem is some of the rows in stuff.txt may not conform to the
> > stuff table attributes (duplicate keys, for example). The command above
> > terminates with no change to the table stuff when it encounters an error
> > in stuff.txt. Is there a way to have postgres ignore erroneous fields
> > but keep information about which fields weren't processed. I believe
> > Oracle has some support for this through an IGNORE clause.
* Adam Lang <aalang(at)rutgersinsurance(dot)com> [000829 08:29] wrote:
> Well, you could always make a table that has no constraints on duplicates
> and COPY into that one. Then write a query that inserts the data into your
> production table while handling the duplicates.
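For reference, the staging-table approach quoted above might look like the
following sketch (the table layout and the key column `id` are hypothetical;
adjust them to match the real `stuff` table):

```sql
-- Load the raw file into an unconstrained staging table first:
CREATE TABLE stuff_raw (id integer, name text);

COPY stuff_raw FROM 'stuff.txt' USING DELIMITERS '|';

-- Then move only the rows whose keys are not already in the
-- production table, so duplicate keys are silently skipped:
INSERT INTO stuff
SELECT * FROM stuff_raw r
WHERE NOT EXISTS (SELECT 1 FROM stuff s WHERE s.id = r.id);
```

The rows left behind in `stuff_raw` after deleting the successfully copied
ones are exactly the ones that failed, which answers the "keep information
about which rows weren't processed" part of the question.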
Actually, last I checked COPY FROM goes through the RULE system, so
I'm pretty sure one can set up a rule that checks for violated constraints
and does 'DO INSTEAD NOTHING'. :-)
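A sketch of such a rule (assuming the duplicate-key column is named `id`;
the rule name is arbitrary):

```sql
-- Silently drop any INSERT whose key already exists in stuff,
-- instead of raising a duplicate-key error:
CREATE RULE ignore_dups AS
    ON INSERT TO stuff
    WHERE EXISTS (SELECT 1 FROM stuff WHERE stuff.id = NEW.id)
    DO INSTEAD NOTHING;
```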
-Alfred