From: Craig Ringer <craig(at)postnewspapers(dot)com(dot)au>
To: tony(at)exquisiteimages(dot)com
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Need help doing a CSV import
Date: 2010-07-14 12:16:52
Message-ID: 4C3DAAB4.1040702@postnewspapers.com.au
Lists: pgsql-general
On 14/07/2010 7:04 PM, tony(at)exquisiteimages(dot)com wrote:
> I am in the process of moving a FoxPro based system to PostgreSQL.
>
> We have several tables that have memo fields which contain carriage
> returns and line feeds that I need to preserve. I thought if I converted
> these into the appropriate \r and \n codes that they would be imported as
> carriage returns and line feeds, but instead they are stored in the
> database as \r and \n.
PostgreSQL doesn't process escapes in CSV import mode.
You can reformat the data into the non-CSV (text-format) COPY input, which WILL
process escapes. Or you can post-process the data after import to expand them
(see the sketch below).
Unfortunately PostgreSQL doesn't offer an option to process escapes when
"CSV" mode COPY is requested.
I posted a little Python script that reads CSV data and spits out
COPY-friendly output a few days ago. It should be trivially adaptable to
your needs; you'd just need to change the input dialect options. See the
archives for the script.
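
If you'd rather not dig through the archives, the idea was roughly this - a
quick sketch, not the exact script. It reads CSV on stdin, escapes each field
the way text-format COPY expects, and writes tab-separated COPY data on stdout:

import csv
import sys

def copy_escape(value):
    # Text-format COPY uses backslash escapes: double the backslash itself
    # first, then escape tab, newline and carriage return.
    return (value.replace('\\', '\\\\')
                 .replace('\t', '\\t')
                 .replace('\n', '\\n')
                 .replace('\r', '\\r'))

# Adjust the dialect options here to match your CSV export.
reader = csv.reader(sys.stdin)
for row in reader:
    sys.stdout.write('\t'.join(copy_escape(field) for field in row) + '\n')

Feed the output to a text-format (not CSV) COPY yourtable FROM STDIN, e.g. via
psql's \copy, and newlines embedded in quoted CSV fields will end up as real
newlines in the database. Note that this sketch treats every field as text and
doesn't distinguish NULL from an empty string; you'd need to add \N handling
yourself if that matters.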
--
Craig Ringer