From: Christopher Browne <cbbrowne(at)acm(dot)org>
To: pgsql-sql(at)postgresql(dot)org
Subject: Re: Export tab delimited from mysql to postgres.
Date: 2004-10-12 00:46:02
Message-ID: m3d5zoyd9x.fsf@knuth.knuth.cbbrowne.com
Lists: pgsql-sql
Quoth Theo(dot)Galanakis(at)lonelyplanet(dot)com(dot)au (Theo Galanakis):
> Could you provide a example of how to do this?
>
> I actually ended up exporting the data as Insert statements,
> which strips out cf/lf within varchars. However it takes an eternity
> to import 200,000 records... 24 hours infact???? Is this normal?
I expect that this results from each INSERT running as a separate
transaction, so every row pays the cost of its own commit.
If you put a BEGIN at the start of the dump and a COMMIT at the end,
you'd doubtless see an ENORMOUS improvement.
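The change amounts to this: one explicit transaction around the whole
batch instead of one implicit commit per row. Here is a minimal sketch
of the idea in Python using SQLite as a stand-in (no live PostgreSQL
assumed; the table and column names are made up for illustration) --
with a real Postgres dump you would simply add BEGIN; as the first line
of the file and COMMIT; as the last before feeding it to psql:

```python
import sqlite3

# Autocommit mode, so transaction boundaries are exactly where we put them.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE places (id INTEGER, name TEXT)")

rows = [(i, "name %d" % i) for i in range(200)]

conn.execute("BEGIN")  # one explicit transaction for the whole batch
for r in rows:
    conn.execute("INSERT INTO places VALUES (?, ?)", r)
conn.execute("COMMIT")  # a single commit (hence a single sync) at the end

print(conn.execute("SELECT count(*) FROM places").fetchone()[0])
```

Without the BEGIN/COMMIT pair, each INSERT would be committed (and
flushed to disk) individually, which is where the hours go.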
That's not even the _big_ improvement, either. The _big_ improvement
would involve reformatting the data so that you could use the COPY
statement, which is _way_ faster than a bunch of INSERTs. Take a look
at the documentation to see the formatting that is needed:
http://techdocs.postgresql.org/techdocs/usingcopy.php
http://www.faqs.org/docs/ppbook/x5504.htm
http://www.postgresql.org/docs/7.4/static/sql-copy.html
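In COPY's default text format, fields are tab-delimited, NULL is written
as \N, and embedded tabs, newlines, carriage returns, and backslashes
are backslash-escaped -- which is exactly what lets CR/LF inside your
varchars survive the import. A small Python sketch of the conversion
(the helper names are mine, not part of any library):

```python
def copy_escape(value):
    """Render one field in PostgreSQL COPY text format: None becomes \\N;
    backslash, tab, newline, and carriage return are backslash-escaped."""
    if value is None:
        return "\\N"
    return (str(value)
            .replace("\\", "\\\\")   # escape backslash first
            .replace("\t", "\\t")
            .replace("\n", "\\n")
            .replace("\r", "\\r"))

def to_copy_line(row):
    """Join one row's fields with tabs, terminated by a newline."""
    return "\t".join(copy_escape(v) for v in row) + "\n"

# A row with an embedded newline and a NULL comes out as one clean line.
print(to_copy_line((1, "Lonely\nPlanet", None)), end="")
```

Write one such line per row to a file, then load it with
COPY tablename FROM '/path/to/file'; (or psql's \copy).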
--
output = ("cbbrowne" "@" "ntlug.org")
http://www3.sympatico.ca/cbbrowne/lsf.html
Question: How many surrealists does it take to change a light bulb?
Answer: Two, one to hold the giraffe, and the other to fill the bathtub
with brightly colored machine tools.