From: x <xbdelacour(at)yahoo(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: COPY failure
Date: 2001-07-21 08:08:16
Message-ID: 5.1.0.14.0.20010721035801.00a84100@mail.fmaudio.net
Lists: pgsql-general
Hi.
I'm trying to run a script that executes multiple \copy commands to import some 5 GB of data. All the input files are computer generated, simple (5 numeric columns, with "\N" for NULL in some cases), use the default delimiters, and _should_ be error-free. But I keep getting error messages for _some_ of the \copy commands, e.g.:
pqReadData() -- backend closed the channel unexpectedly.
This probably means the backend terminated abnormally
before or while processing the request.
PQendcopy: resetting connection
Questions:
1. Are there any size restrictions on the input files?
2. How do I tell which file, or better yet which line, is tripping up the
system? I could cut the list in half repeatedly until I find the problem,
but that would be a huge waste of time given how long it takes to import
any of the data. I could set up more verbose logging on the backend, but
will that make a mess if my error is 2 GB into the import?
3. I'm running the Windows/cygwin version of the psql client and a Linux
backend, if that makes a difference.
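(For what it's worth, on question 2 one thing I've been considering is pre-scanning the input files before re-running the import, since that's much cheaper than re-importing. A rough sketch of what I mean, assuming the format described above: 5 tab-separated fields per line, each a number or "\N". The file and field details here are just placeholders, not my actual data.)

```python
import re

# A field is valid if it is the NULL marker "\N" or a plain number
# (assumed layout: 5 numeric columns, default tab delimiter).
FIELD = re.compile(r'^(\\N|-?\d+(\.\d+)?)$')

def bad_lines(lines):
    """Yield (line_number, line) for every line that doesn't match
    the expected 5-column numeric format."""
    for n, line in enumerate(lines, 1):
        fields = line.rstrip('\n').split('\t')
        if len(fields) != 5 or not all(FIELD.match(f) for f in fields):
            yield n, line

# Quick demonstration on made-up sample lines:
sample = [
    "1\t2\t3\t4\t5\n",
    "1\t\\N\t3\t4\t5\n",
    "1\t2\tthree\t4\t5\n",   # non-numeric field
    "1\t2\t3\t4\n",          # too few columns
]
print([n for n, _ in bad_lines(sample)])   # lines 3 and 4 are malformed
```

Running something like this over each input file (e.g. `bad_lines(open(path))`) would at least point at the offending file and line number without re-running the whole import.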
Any help would be much appreciated.
-Xavier