
Re: BUG #4527: Prepare of large multirow insert fails without error

From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: "Vincent Kessler" <vincent(dot)kessler(at)quantec-networks(dot)de>
Cc: pgsql-bugs(at)postgresql(dot)org
Subject: Re: BUG #4527: Prepare of large multirow insert fails without error
Date: 2008-11-13 15:28:47
Message-ID: 20112.1226590127@sss.pgh.pa.us
Lists: pgsql-bugs

"Vincent Kessler" <vincent(dot)kessler(at)quantec-networks(dot)de> writes:
> I am trying to do large multirow inserts using PQsendPrepare. I have not
> found a limit on the number of parameters or the size of the query string,
> so I assume memory is the limit.
> When executing PQsendPrepare with a query string of about 100 kB and about
> 10,000 parameters, the function returns after a timeout. A tcpdump shows a
> "parse" message with a length of 100 kB, but the transfer stops after
> roughly 30 kB.

With such a large statement it's unlikely that the PQsendPrepare call
would have been able to push all the data out immediately.  Since it's
intended not to block the application, it returns with some of the data
still unsent.  If you want to run in nonblocking mode, you need to call
PQflush periodically until all the data has been transmitted.
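
For reference, a minimal sketch of such a flush loop might look like the
following (untested; flush_outgoing is just an illustrative helper, and it
assumes "conn" is an open PGconn already switched to nonblocking mode with
PQsetnonblocking):

#include <sys/select.h>
#include <libpq-fe.h>

/*
 * Drain any output libpq has queued (e.g. after PQsendPrepare returned 1
 * on a nonblocking connection).  Returns 0 on success, -1 on failure.
 */
static int
flush_outgoing(PGconn *conn)
{
    int     rc = PQflush(conn);

    while (rc == 1)             /* 1 = could not send everything yet */
    {
        fd_set  wfds;
        int     sock = PQsocket(conn);

        if (sock < 0)
            return -1;

        FD_ZERO(&wfds);
        FD_SET(sock, &wfds);

        /* wait until the socket can accept more data */
        if (select(sock + 1, NULL, &wfds, NULL, NULL) < 0)
            return -1;

        rc = PQflush(conn);
    }

    return (rc == 0) ? 0 : -1;  /* -1 from PQflush means a hard failure */
}

You'd call something along these lines right after PQsendPrepare() returns 1.
A real application should also watch the socket for read-readiness and call
PQconsumeInput() when it fires, so the client doesn't deadlock if the server
starts sending while there is still unsent data on this side.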

> The server log shows:
> LOG:  incomplete message from client
> LOG:  unexpected EOF on client connection

Although this explanation doesn't say why the client apparently dropped
the connection.  I think you need to show us a complete example of what
you're doing, if the above hint isn't sufficient.

			regards, tom lane
