Quoth Theo(dot)Galanakis(at)lonelyplanet(dot)com(dot)au (Theo Galanakis):
> Could you provide an example of how to do this?
> I actually ended up exporting the data as Insert statements,
> which strips out cr/lf within varchars. However it takes an eternity
> to import 200,000 records... 24 hours in fact???? Is this normal?
I expect that this results from each INSERT being a separate
transaction. If you put a BEGIN at the start and a COMMIT at the end,
you'd doubtless see an ENORMOUS improvement.
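A minimal sketch of what that looks like (the table and column names
here are made up for illustration):

```sql
BEGIN;
-- All of the inserts now commit together as one transaction,
-- instead of paying the per-transaction overhead 200,000 times.
INSERT INTO listings (id, name) VALUES (1, 'Sydney Hostel');
INSERT INTO listings (id, name) VALUES (2, 'Melbourne B&B');
-- ... the other ~200,000 rows ...
COMMIT;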
That's not even the _big_ improvement, either. The _big_ improvement
would involve reformatting the data so that you could use the COPY
statement, which is _way_ faster than a bunch of INSERTs. Take a look
at the documentation to see the formatting that is needed:
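Roughly, COPY's default text format is one row per line with
tab-separated columns, \N for NULL, and backslash escapes (\n, \t) for
embedded newlines and tabs, so those cr/lf characters inside varchars
don't need to be stripped at all. A sketch, again with a hypothetical
table and file path:

```sql
-- The data file /tmp/listings.dat would look like (tabs between columns):
--   1	Sydney Hostel	\N
--   2	Melbourne B&B	Cheap\nand cheerful
COPY listings FROM '/tmp/listings.dat';
```

Note that the file path is read by the server process, so it must be
accessible (and readable) on the database server's filesystem.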
output = ("cbbrowne" "@" "ntlug.org")
Question: How many surrealists does it take to change a light bulb?
Answer: Two, one to hold the giraffe, and the other to fill the bathtub
with brightly colored machine tools.