----- Original Message -----
From: "Andrew Hammond" <ahammond(at)ca(dot)afilias(dot)info>
To: "Deepblues" <deepblues(at)gmail(dot)com>
Sent: Sunday, February 27, 2005 9:28 PM
Subject: Re: [NOVICE] Import csv file into multiple tables in Postgres
> The brief answer is no, you can not import from a single csv file into
> multiple tables.
> If the csv file consists of two distinct sections of data, then you could
> of course split it into two csv files. If what you want to do is normalize
> existing data, then you should first import the existing data into a
> working table. Then you can manipulate it within the database.
> It is unlikely that you will need perl to do any of this.
I use perl a lot for stuff like this, but I've found that in most cases the
easiest thing to do is to load the data into a single postgresql table and
then write the sql selects and inserts that populate the multiple tables.
This has the added advantage that you keep a copy of the original data
available in case you don't carry every column over into the "working"
database. If you end up doing this a lot, you can create a separate
"loader" schema that holds all of these raw csv tables in one place, not
visible to most users, so it doesn't clutter the "working" schema.
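As a concrete sketch of the above (file path, table, and column names are all hypothetical here, just for illustration): suppose people.csv has columns (name, email, city), and you want to normalize it into a city table and a person table.

```sql
-- Keep the raw loads out of sight in their own schema.
CREATE SCHEMA loader;

-- A staging table that mirrors the csv file exactly, all text.
CREATE TABLE loader.people_raw (
    name  text,
    email text,
    city  text
);

-- Load the file. Server-side COPY reads from the server's filesystem;
-- use psql's \copy instead if the file lives on the client machine.
COPY loader.people_raw FROM '/tmp/people.csv' WITH CSV HEADER;

-- Normalize into the "working" tables with plain SQL.
INSERT INTO city (city_name)
SELECT DISTINCT city FROM loader.people_raw;

INSERT INTO person (name, email, city_id)
SELECT r.name, r.email, c.city_id
FROM loader.people_raw r
JOIN city c ON c.city_name = r.city;
```

The raw table stays around afterwards in the loader schema, so you can re-run or adjust the inserts if your normalization turns out to be wrong.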