Re: Delete query takes exorbitant amount of time

From: Stephan Szabo <sszabo(at)megazone(dot)bigpanda(dot)com>
To: Karim Nassar <karim(dot)nassar(at)acm(dot)org>
Cc: Simon Riggs <simon(at)2ndquadrant(dot)com>, Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>, Christopher Kings-Lynne <chriskl(at)familyhealth(dot)com(dot)au>, pgsql-performance(at)postgresql(dot)org
Subject: Re: Delete query takes exorbitant amount of time
Date: 2005-03-26 23:18:16
Message-ID: 20050326151658.D82144@megazone.bigpanda.com
Lists: pgsql-performance

On Sat, 26 Mar 2005, Karim Nassar wrote:

> On Sat, 2005-03-26 at 07:55 -0800, Stephan Szabo wrote:
> > That seems like it should be okay, hmm, what does something like:
> >
> > PREPARE test(int) AS SELECT 1 from measurement where
> > id_int_sensor_meas_type = $1 FOR UPDATE;
> > EXPLAIN ANALYZE EXECUTE TEST(1);
> >
> > give you as the plan?
>
> QUERY PLAN
> -----------------------------------------------------------------------------------------------------------------------
> Seq Scan on measurement (cost=0.00..164559.16 rows=509478 width=6)
> (actual time=11608.402..11608.402 rows=0 loops=1)
> Filter: (id_int_sensor_meas_type = $1)
> Total runtime: 11608.441 ms
> (3 rows)

Hmm, has measurement been analyzed recently? You might want to see whether
raising the statistics target on measurement.id_int_sensor_meas_type and
reanalyzing brings the estimated row count down from ~500k.
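
Something along these lines should do it (a minimal sketch; the target
value of 200 is just an illustrative assumption, the default is much
lower, so pick whatever seems reasonable for your data):

ALTER TABLE measurement
    ALTER COLUMN id_int_sensor_meas_type SET STATISTICS 200;
ANALYZE measurement;

Then re-run the prepared EXPLAIN ANALYZE EXECUTE above and see whether
the estimated rows move closer to the actual count.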
