Big query problem

From: Tomas Berndtsson <tomas(at)nocrew(dot)org>
To: pgsql-sql(at)postgresql(dot)org
Subject: Big query problem
Date: 2002-11-27 12:46:49
Message-ID: 80d6or9o3q.fsf@junk.nocrew.org
Lists: pgsql-sql

I'm using PostgreSQL 7.2.1 and trying to run a query like this:

DELETE FROM table WHERE col1='something' AND col2 IN
('aasdoijhfoisdfsdoif','sdfsdfsdfsadfsdf', ... );

In the parentheses I have 6400 names, each about 20 characters long. I'm
using libpq from C. The query did not work, and the result was very
unexpected.

My application has several threads, each opening its own connection to
the database. The query above was run inside a transaction, followed by
a COMMIT. Running it produced no error, but it seems the query was never
executed at all. As a side effect, every other connection to the
database always got:

NOTICE: current transaction is aborted, queries ignored until end of
transaction block

when trying to run a query. I thought transactions in different
connections were independent of each other.

If I limited the number of names in the failing query to 3200, it
worked as expected.

Is there a limit in libpq on the length of a query? And if it is
exceeded, shouldn't PQexec() give an error?

Greetings,

Tomas
