From: "Tony Wasson" <ajwasson(at)gmail(dot)com>
To: "Glenn Gillen" <glenn(dot)gillen(at)gmail(dot)com>
Cc: pgsql-sql(at)postgresql(dot)org
Subject: Re: Can COPY update or skip existing records?
Date: 2008-10-01 16:33:16
Message-ID: 6d8daee30810010933r1dba9b41uda6d9e04a0ccf05f@mail.gmail.com
Lists: pgsql-sql
On Tue, Sep 30, 2008 at 5:16 AM, Glenn Gillen <glenn(dot)gillen(at)gmail(dot)com> wrote:
> Hey all,
>
> I've got a table with a unique constraint across a few fields which I
> need to regularly import a batch of data into. Is there a way to do it
> with COPY without getting conflicts on the unique constraint? I have
> no way of being certain that some of the data I'm trying to load isn't
> in the table already.
>
> Ideally I'd like it to operate like MySQL's on_duplicate_key_update
> option, but for now I'll suffice with just ignoring existing rows and
> proceeding with everything else.
I ran into a similar problem. I'm using these merge_by_key functions:
http://pgfoundry.org/projects/mbk
Here's a quick example...
CREATE TEMP TABLE foo (LIKE dst INCLUDING DEFAULTS);
COPY foo (c1, c2) FROM STDIN;
(your copy data here)
\.
SELECT * FROM merge_by_key(
    'public',                 -- table schema
    'dst',                    -- table name
    'mnew.c2 < mold.c2',      -- merge condition
    'SELECT c1, c2 FROM foo'  -- source query
);
Disclaimer: The author is a friend of mine. :-)
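If you only need the "ignore existing rows" half of the behavior and would rather avoid an extra dependency, the same staging-table idea works in plain SQL. This is just a sketch: it assumes the target table is the dst(c1, c2) from the example above, with the unique constraint on c1.

```
BEGIN;

-- Stage the incoming batch in a temp table shaped like the target.
CREATE TEMP TABLE foo (LIKE dst INCLUDING DEFAULTS);
COPY foo (c1, c2) FROM STDIN;
(your copy data here)
\.

-- Insert only rows whose key is not already present in dst.
INSERT INTO dst (c1, c2)
SELECT f.c1, f.c2
FROM foo f
WHERE NOT EXISTS (SELECT 1 FROM dst d WHERE d.c1 = f.c1);

COMMIT;
```

Note this is not safe against concurrent writers: another session can insert the same key between the NOT EXISTS check and the INSERT, so for concurrent loads you'd want to lock the table first or be prepared to retry on a unique violation.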