From: Joel Burton <jburton(at)scw(dot)org>
To: Joe Johnson <joej(at)generalsearch(dot)net>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: COPY from file to table containing unique index
Date: 2001-04-11 02:16:58
Message-ID: Pine.LNX.4.21.0104102216270.31213-100000@olympus.scw.org
Lists: pgsql-general
On Tue, 10 Apr 2001, Joe Johnson wrote:
> I have a table with over 1,000,000 records in it containing names and phone
> numbers, and one of the indexes on the table is a unique index on the phone
> number. I am trying to copy about 100,000 more records to the table from a
> text file, but I get an error on copying because of duplicate phone numbers
> in the text file, which kills the COPY command without copying anything to
> the table. Is there some way that I can get Postgres to copy the records
> from the file and just skip records that contain duplicates to the unique
> index? I found that using PHP scripts to do inserts for a file of this size
> takes MUCH longer than I'd like, so I'd like to avoid having to do it that
> way if I can. Any help is appreciated. Thanks!
There are a few options; this was discussed on the list yesterday, in the
thread 'problem with copy command'.
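One common workaround (a sketch, not spelled out in the reply above) is to COPY the file into a staging table that has no unique index, then insert only the rows whose phone numbers are not already in the target table. The table and column names below (`contacts`, `contacts_load`, `name`, `phone`) and the file path are assumptions for illustration:

```sql
-- Stage the raw file in a table with no unique constraint,
-- so COPY cannot fail on duplicate keys.
CREATE TEMP TABLE contacts_load (name text, phone text);
COPY contacts_load FROM '/path/to/newdata.txt';

-- Insert one row per new phone number: DISTINCT ON (phone) collapses
-- duplicates within the file itself, and NOT EXISTS skips numbers
-- already present in the target table.
INSERT INTO contacts (name, phone)
SELECT DISTINCT ON (phone) name, phone
FROM contacts_load l
WHERE NOT EXISTS
      (SELECT 1 FROM contacts c WHERE c.phone = l.phone);
```

Note that `SELECT DISTINCT ON` is a PostgreSQL extension, not standard SQL. (On modern PostgreSQL, 9.5 and later, `INSERT ... ON CONFLICT DO NOTHING` does the deduplication against the target table directly, but a staging table is still needed to keep COPY itself from aborting.)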
--
Joel Burton <jburton(at)scw(dot)org>
Director of Information Systems, Support Center of Washington