Some CSV file-import questions

From: Ron Johnson <ron(dot)l(dot)johnson(at)cox(dot)net>
To: PgSQL Novice ML <pgsql-novice(at)postgresql(dot)org>
Subject: Some CSV file-import questions
Date: 2002-05-19 11:01:26
Message-ID: 1021806087.9171.18.camel@rebel
Lists: pgsql-novice

If the csv file generated by the application that I am 
importing from contains quotes around each field, must I
write a program to strip these "field-level" quotes before
sending the file to COPY?
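(One way to avoid writing a full custom program is a small pre-processing pass that parses the quoted CSV and emits the tab-delimited text that COPY expects by default. The sketch below is illustrative, not part of any PostgreSQL tooling; `csv_to_copy_text` is a hypothetical helper name, and it assumes COPY's default text format with backslash escapes.)

```python
import csv
import io

def csv_to_copy_text(src, dst):
    """Sketch: strip field-level quotes from a CSV stream and emit
    tab-delimited text suitable for COPY ... FROM (default text format).
    Escapes backslashes and tabs so COPY reads the data back intact."""
    reader = csv.reader(src)  # the csv module handles the quoting rules
    for row in reader:
        fields = [f.replace('\\', '\\\\').replace('\t', '\\t') for f in row]
        dst.write('\t'.join(fields) + '\n')

# Illustrative run on an in-memory "file":
src = io.StringIO('"1","hello, world"\n"2","plain"\n')
dst = io.StringIO()
csv_to_copy_text(src, dst)
```

Piped through something like this, the quotes disappear and embedded commas stop being a problem, since COPY's text format delimits on tabs rather than commas.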

As we know, COPY runs as a single transaction.  It would
therefore be "unpleasant" if, say, the importing process
died 90% of the way through a 10,000,000-row table.
Is there a checkpoint mechanism that would issue a COMMIT,
for example, every 10,000 rows?  Then, if the importing
process died 90% of the way through that 10,000,000-row
table, the COPY could be restarted and would skip over the
rows already inserted.

Here is an example from the RDBMS that I currently use:
$ bulkload -load -log -commit=10000 -tran=exclusive -db=test \
     -table=foo ~/foo.csv
Then, if something happens after inserting 9,000,000 rows,
it can be restarted by:
$ bulkload -load -log -commit=10000 -skip=9000000 -db=test \
     -tran=exclusive -table=foo ~/foo.csv
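(The commit/skip logic that `bulkload` provides can be sketched in a few lines: fast-forward past already-committed rows, then hand out fixed-size batches for the caller to COPY and COMMIT one at a time. `load_in_batches` is a hypothetical name for illustration, not an existing tool.)

```python
from itertools import islice

def load_in_batches(lines, commit_every=10000, skip=0):
    """Sketch of a restartable bulk loader: skip rows committed by an
    earlier, interrupted run, then yield batches that the caller would
    COPY into the table and COMMIT one at a time."""
    it = iter(lines)
    # fast-forward past rows already loaded (the -skip=N behaviour)
    for _ in islice(it, skip):
        pass
    while True:
        batch = list(islice(it, commit_every))
        if not batch:
            break
        yield batch  # caller: COPY this batch, then COMMIT

# Illustrative run: 25 rows, restart after 5, committing every 10
rows = [f"row{i}" for i in range(25)]
batches = list(load_in_batches(rows, commit_every=10, skip=5))
```

A crash between batches loses at most `commit_every` rows of work, and the restart only needs the count of rows already committed.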

From what I've seen in the documentation and the mailing
list archives, the solution to both of these questions is
to roll my own bulk loader.

| Ron Johnson, Jr.        Home: ron(dot)l(dot)johnson(at)cox(dot)net     |
| Jefferson, LA  USA |
|                                                         |
| "I have created a government of whirled peas..."        |
|   Maharishi Mahesh Yogi, 12-May-2002,                   |
|   CNN, Larry King Live                                  |

