
Re: BUG #4527: Prepare of large multirow insert fails without error

From: Vincent Kessler <vincent(dot)kessler(at)quantec-networks(dot)de>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: pgsql-bugs(at)postgresql(dot)org
Subject: Re: BUG #4527: Prepare of large multirow insert fails without error
Date: 2008-11-13 17:03:22
Message-ID: 491C5DDA.7090909@quantec-networks.de
Thread:
Lists: pgsql-bugs
Tom Lane wrote, on 13.11.2008 16:28:
> "Vincent Kessler" <vincent(dot)kessler(at)quantec-networks(dot)de> writes:
>> I am trying to do large multirow inserts using PQsendPrepare. I have not
>> found a limit on the number of parameters or the size of the query string,
>> so I assume memory is the limit.
>> When executing PQsendPrepare with a query string of about 100 kB and about
>> 10000 parameters, the function returns after a timeout. A tcpdump shows a
>> "Parse" message with a length of 100 kB, but the transfer stops after
>> roughly 30 kB.
> 
> With such a large statement it's unlikely that the PQsendPrepare call
> would have been able to push all the data out immediately.  Since it's
> intended to not block the application, it would return with some data
> still unsent.  You need to call PQflush periodically until the data is
> all transmitted, if you want to run in nonblocking mode.
> 

Thank you very much, that was exactly the problem. Everything works 
perfectly now.

Regards,

Vincent
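For readers hitting the same symptom, the flush loop Tom describes can be sketched in libpq C roughly as follows. This is a minimal sketch, not code from the thread: it assumes `conn` is an already-connected `PGconn *`, and error handling is abbreviated.

```c
/* After PQsendPrepare() in nonblocking mode, a large Parse message may
 * be only partially written; call PQflush() until libpq reports that
 * all queued data has been sent. */
#include <sys/select.h>
#include <libpq-fe.h>

static int send_prepare_nonblocking(PGconn *conn,
                                    const char *name,
                                    const char *query,
                                    int nparams)
{
    if (PQsetnonblocking(conn, 1) != 0)
        return -1;

    /* Queue the Parse message; with a ~100 kB query only part of it
     * may actually reach the socket before this returns. */
    if (!PQsendPrepare(conn, name, query, nparams, NULL))
        return -1;

    /* PQflush returns 1 while data remains unsent, 0 when the output
     * buffer is empty, and -1 on error. */
    int rc;
    while ((rc = PQflush(conn)) == 1)
    {
        /* Wait until the socket is writable, then try again. */
        int sock = PQsocket(conn);
        fd_set wfds;
        FD_ZERO(&wfds);
        FD_SET(sock, &wfds);
        if (select(sock + 1, NULL, &wfds, NULL, NULL) < 0)
            return -1;
    }
    return rc;   /* 0 on success, -1 on error */
}
```

The same loop applies to PQsendQueryParams and the other asynchronous send functions; once PQflush returns 0 the application can go on to poll for results with PQconsumeInput and PQgetResult.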


