From:       Ron Johnson <ron(dot)l(dot)johnson(at)cox(dot)net>
To:         PgSQL General ML <pgsql-general(at)postgresql(dot)org>
Subject:    Re: populate table with large csv file
Date:       2003-09-25 16:57:57
Message-ID: 1064509077.1441.57.camel@haggis
Lists:      pgsql-general pgsql-performance
On Thu, 2003-09-25 at 11:38, Dave [Hawk-Systems] wrote:
> have the table "numbercheck"
> Attribute | Type | Modifier
> -----------+------------+----------
> svcnumber | integer | not null
> svcqual | varchar(9) |
> svcequip | char(1) |
> svctroub | varchar(6) |
> svcrate | varchar(4) |
> svcclass | char(1) |
> trailer | varchar(3) |
> Index: numbercheck_pkey
>
> also have a csv file
> 7057211380,Y,,,3,B
> 7057216800,Y,,,3,B
> 7057265038,Y,,,3,B
> 7057370261,Y,,,3,B
> 7057374613,Y,,,3,B
> 7057371832,Y,,,3,B
> 4166336554,Y,,,3,B
> 4166336863,Y,,,3,B
> 7057201148,Y,,,3,B
>
> aside from parsing the csv file through a PHP interface, what is the easiest
> way to get that csv data imported into the postgres database. thoughts?
No matter what you do, it's going to barf: svcnumber is a 32-bit
integer (maximum 2,147,483,647), and 7,057,211,380 is significantly
out of range.
Once you change svcnumber to bigint, the COPY command will easily
suck in the csv file.
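A minimal sketch of the two steps (the ALTER ... TYPE and CSV-mode COPY syntax
shown here is from later PostgreSQL releases; on the versions current at the
time you would instead recreate the table with the wider column. The file path
and the six-column mapping are assumptions based on the sample lines, which
have six comma-separated fields against the table's seven columns):

```sql
-- Widen the key column so values like 7057211380 fit
-- (bigint max is 9,223,372,036,854,775,807).
ALTER TABLE numbercheck ALTER COLUMN svcnumber TYPE bigint;

-- Load the file server-side; the explicit column list covers the six
-- fields present in the sample data, leaving "trailer" NULL.
-- '/path/to/data.csv' is a placeholder for the real file location.
COPY numbercheck (svcnumber, svcqual, svcequip, svctroub, svcrate, svcclass)
FROM '/path/to/data.csv' WITH (FORMAT csv);
```

Note that server-side COPY reads the file as the database server process, so
the file must be readable on the server host; from a client, psql's \copy
variant does the same thing over the connection.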
--
-----------------------------------------------------------------
Ron Johnson, Jr. ron(dot)l(dot)johnson(at)cox(dot)net
Jefferson, LA USA
"Python is executable pseudocode; Perl is executable line noise"