Re: Finding Errors in .csv Input Data

From: Dimitri Fontaine <dimitri(at)2ndQuadrant(dot)fr>
To: Rich Shepard <rshepard(at)appl-ecosys(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Finding Errors in .csv Input Data
Date: 2011-03-06 20:39:12
Message-ID: m239n0gf0v.fsf@2ndQuadrant.fr
Lists: pgsql-general

Rich Shepard <rshepard(at)appl-ecosys(dot)com> writes:
> I'm sure many of you have solved this problem in the past and can offer
> solutions that will work for me. The context is a 73-column postgres table
> of data that was originally in an Access .mdb file. A colleague loaded the
> file into Access and wrote a .csv file for me to use since we have nothing
> Microsoft here. There are 110,752 rows in the file/table. After a lot of
> cleaning with emacs and sed, the copy command accepted all but 80 rows of
> data. Now I need to figure out why postgres reports them as having too many
> columns.
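
A quick way to locate the offending rows before loading is to count fields per line with Python's csv module. This is only a sketch: the file name is hypothetical, the 73-column count comes from the post above, and the csv module's quoting rules can differ slightly from COPY's, so treat mismatches as leads rather than a definitive diagnosis (unbalanced quotes or embedded delimiters are the usual culprits):

```python
import csv

EXPECTED_COLUMNS = 73  # column count mentioned in the original post


def find_bad_rows(path, expected=EXPECTED_COLUMNS):
    """Yield (line_number, field_count) for rows whose parsed
    field count differs from the expected column count."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        for row in reader:
            if len(row) != expected:
                # reader.line_num is the physical line of the last row read
                yield reader.line_num, len(row)


# Hypothetical usage against the exported file:
# for lineno, n in find_bad_rows("export.csv"):
#     print(f"line {lineno}: {n} fields")
```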

Did you try pgloader yet?

http://pgloader.projects.postgresql.org/

http://pgfoundry.org/projects/pgloader/
https://github.com/dimitri/pgloader
http://packages.debian.org/sid/pgloader

Regards,
--
Dimitri Fontaine
http://2ndQuadrant.fr PostgreSQL : Expertise, Formation et Support
