Re: "stack depth limit exceeded" executing DELETE FROM

From: Mark Lewis <mark(dot)lewis(at)mir3(dot)com>
To: "Cox, Brian" <Brian(dot)Cox(at)ca(dot)com>
Cc: pgsql-jdbc(at)postgresql(dot)org
Subject: Re: "stack depth limit exceeded" executing DELETE FROM
Date: 2006-09-05 19:00:33
Message-ID: 1157482833.9657.737.camel@archimedes
Lists: pgsql-jdbc

It's not a limit on a single DELETE per se; a single DELETE can remove
millions of rows. But you can't specify that many individual entries in
an IN (...) list.

This isn't a PG-specific issue by the way; Oracle limits you to about
1000 entries IIRC, and SQL Server limits it to 512 or thereabouts. It's
just a bad approach.

You want to refactor this to use a transaction and JDBC update batches,
to gain the benefits of batching without the stack-overrun problems. If
you're concerned about speed you could benchmark to find the optimum
number of items in the "IN (...)" clause, but in our testing the optimum
for most databases was 1. For PG I think it was 2. Moral of the story:
it's hard to beat individual batched deletes.
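As a rough sketch of that approach (table name `some_table`, the `BATCH_SIZE`
constant, and the `flushCount` helper are illustrative assumptions, not part of
the driver API; only `addBatch`/`executeBatch` and the transaction handling are
the point):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchDelete {
    static final int BATCH_SIZE = 1000; // flush interval; tune by benchmarking

    // Queues one single-row DELETE per key and runs them as JDBC update
    // batches inside a single transaction.
    static void deleteByPk(Connection conn, List<Long> pks) throws SQLException {
        conn.setAutoCommit(false); // one transaction for the whole job
        try (PreparedStatement ps =
                 conn.prepareStatement("DELETE FROM some_table WHERE pk = ?")) {
            int pending = 0;
            for (long pk : pks) {
                ps.setLong(1, pk);
                ps.addBatch();                    // queue one single-row delete
                if (++pending == BATCH_SIZE) {    // flush periodically
                    ps.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) {
                ps.executeBatch();                // flush the remainder
            }
            conn.commit();
        } catch (SQLException e) {
            conn.rollback();
            throw e;
        }
    }

    // Pure helper: number of executeBatch() calls needed for n keys.
    static int flushCount(int n) {
        return (n + BATCH_SIZE - 1) / BATCH_SIZE;
    }
}
```

With 50000 keys this issues 50 round trips of 1000 queued deletes each rather
than one enormous IN list.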

An alternative approach would be to add these doomed PKs to a temporary
table and then do "DELETE FROM myTable WHERE pk IN (SELECT victimId FROM
temp_doomed_table)".
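In PostgreSQL that could look something like this (a sketch using the names
above; the ON COMMIT DROP clause is an optional convenience that cleans the
temp table up when the transaction ends):

```sql
-- Create the scratch table for the victim keys.
CREATE TEMP TABLE temp_doomed_table (victimId bigint) ON COMMIT DROP;

-- Populate it (e.g. via batched JDBC INSERTs), then delete in one statement:
DELETE FROM myTable WHERE pk IN (SELECT victimId FROM temp_doomed_table);
```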

-- Mark

On Tue, 2006-09-05 at 14:18 -0400, Cox, Brian wrote:
> The following fails with a "stack depth limit exceeded":
>
> DELETE FROM some-table WHERE pk IN (pk1, pk2, ..., pkN)
>
> where pk = primary key and N is large, may be > 50000.
>
> Is there some limit as to the number of rows that can
> be deleted in a single DELETE?
>
> Thanks,
> Brian
>
>
