Re: Huge overestimation in rows expected results in bad plan

From: Andy Colson <andy(at)squeakycode(dot)net>
To: bricklen <bricklen(at)gmail(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Huge overestimation in rows expected results in bad plan
Date: 2010-11-09 22:48:04
Message-ID: 4CD9CFA4.7030600@squeakycode.net
Lists: pgsql-performance

On 11/9/2010 3:26 PM, bricklen wrote:
> Hi,
>
> I have a query that is getting a pretty bad plan due to a massively
> incorrect count of expected rows. All tables in the query were vacuum
> analyzed right before the query was tested. Disabling nested loops
> gives a significantly faster result (4s vs 292s).
> Any thoughts on what I can change to make the planner generate a better plan?
>
>
> explain analyze
> select c.id, c.transactionid, c.clickgenerated, c.confirmed,
> c.rejected, cr.rejectedreason
> from conversion c
> inner join conversionrejected cr on cr.idconversion = c.id
> where date = '2010-11-06'
> and idaction = 12906
> and idaffiliate = 198338
> order by transactionid;
>
>

> -> Seq Scan on conversionrejected cr (cost=0.00..191921.82
> rows=11012682 width=31) (actual time=0.003..1515.816 rows=11012682
> loops=72)
> Total runtime: 292668.992 ms
>
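
For reference, the nested-loop test you mention is usually done per-session, something along these lines (just a sketch using your quoted query; enable_nestloop only affects the current session until it is reset):

  -- discourage nested-loop joins for this session only
  set enable_nestloop = off;

  -- re-run the problem query and compare the plan
  explain analyze
  select c.id, c.transactionid, c.clickgenerated, c.confirmed,
         c.rejected, cr.rejectedreason
  from conversion c
  inner join conversionrejected cr on cr.idconversion = c.id
  where date = '2010-11-06'
    and idaction = 12906
    and idaffiliate = 198338
  order by transactionid;

  -- restore the planner default
  reset enable_nestloop;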

Looks like the table stats are OK (the estimated and actual row counts on conversionrejected match), but it's doing a sequential scan of that table on every pass of the nested loop. Are you missing an index on idconversion?
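
If so, something along these lines might be worth a try (just a sketch; the table and column names are taken from your query, and the index name is one I made up, so adjust as needed):

  -- index the join column so the inner side of the loop can use an index scan
  create index conversionrejected_idconversion_idx
      on conversionrejected (idconversion);
  -- or CREATE INDEX CONCURRENTLY ... to avoid blocking writes while it builds

  -- refresh statistics afterwards
  analyze conversionrejected;

With an index on that column the planner can probe conversionrejected once per outer row instead of repeating the full seq scan (72 times, per your plan).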

Also:

http://explain.depesz.com/

is magic.

-Andy
