Nested loops row estimates always too high

From: "Carlo Stonebanks" <stonec(dot)register(at)sympatico(dot)ca>
To: <pgsql-performance(at)postgresql(dot)org>
Subject: Nested loops row estimates always too high
Date: 2007-09-18 20:07:33
Message-ID: F0670FC4734843F5A489C0737CD3046C@serenity
Lists: pgsql-performance

I am noticing that my queries are spending a lot of time in nested loops.
The table/index row estimates are not bad, but the nested loop row estimates
can be off by a factor of 50 (in the plan below, for example, the second
nested loop is estimated at 1584 rows but returns only 35), and they are
always too high.

If this happens consistently, is it an indication of a general
configuration problem?

Carlo

select
    pp.provider_id,
    pp.provider_practice_id,
    nearby.distance
from mdx_core.provider_practice as pp
join mdx_core.facility as f
    on f.facility_id = pp.facility_id
join (select * from mdx_core.zips_in_mile_range('08820', 10) where zip > '') as nearby
    on f.default_country_code = 'US'
    and f.default_postal_code = nearby.zip
    and pp.facility_address_id is NULL
union
select
    pp.provider_id,
    pp.provider_practice_id,
    nearby.distance
from mdx_core.provider_practice as pp
join mdx_core.facility_address as fa
    on fa.facility_address_id = pp.facility_address_id
join mdx_core.address as a
    on a.address_id = fa.address_id
join (select * from mdx_core.zips_in_mile_range('08820', 10) where zip > '') as nearby
    on a.country_code = 'US'
    and a.postal_code = nearby.zip
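
(As an aside: since the planner has no statistics for the output of
zips_in_mile_range(), one workaround that is often suggested is to
materialize the function result into a temp table and ANALYZE it before
joining. The sketch below is untested here; the temp table name and the
returned columns are assumed from the query above.)

-- Sketch only: materialize the function output so the planner has real
-- row counts and column statistics instead of a default guess.
create temp table nearby as
    select * from mdx_core.zips_in_mile_range('08820', 10) where zip > '';
analyze nearby;
-- ...then join against the "nearby" temp table in place of the inline
-- function scan in both branches of the union.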

Unique  (cost=67605.91..67653.18 rows=4727 width=16) (actual time=8634.618..8637.918 rows=907 loops=1)
  ->  Sort  (cost=67605.91..67617.73 rows=4727 width=16) (actual time=8634.615..8635.651 rows=907 loops=1)
        Sort Key: provider_id, provider_practice_id, distance
        ->  Append  (cost=0.00..67317.41 rows=4727 width=16) (actual time=176.056..8632.429 rows=907 loops=1)
              ->  Nested Loop  (cost=0.00..38947.07 rows=3143 width=16) (actual time=176.054..7867.962 rows=872 loops=1)
                    ->  Nested Loop  (cost=0.00..11520.79 rows=8121 width=12) (actual time=169.372..3041.010 rows=907 loops=1)
                          ->  Function Scan on zips_in_mile_range  (cost=0.00..15.00 rows=333 width=40) (actual time=151.479..151.671 rows=66 loops=1)
                                Filter: (zip > ''::text)
                          ->  Index Scan using facility_country_postal_code_idx on facility f  (cost=0.00..34.25 rows=24 width=15) (actual time=4.969..43.740 rows=14 loops=66)
                                Index Cond: ((f.default_country_code = 'US'::bpchar) AND ((f.default_postal_code)::text = zips_in_mile_range.zip))
                    ->  Index Scan using provider_practice_facility_idx on provider_practice pp  (cost=0.00..3.36 rows=1 width=12) (actual time=4.915..5.316 rows=1 loops=907)
                          Index Cond: (f.facility_id = pp.facility_id)
                          Filter: (facility_address_id IS NULL)
              ->  Nested Loop  (cost=0.00..28323.07 rows=1584 width=16) (actual time=170.310..762.472 rows=35 loops=1)
                    ->  Nested Loop  (cost=0.00..7791.77 rows=1579 width=12) (actual time=170.289..612.579 rows=36 loops=1)
                          ->  Nested Loop  (cost=0.00..2595.96 rows=712 width=12) (actual time=167.017..354.261 rows=29 loops=1)
                                ->  Function Scan on zips_in_mile_range  (cost=0.00..15.00 rows=333 width=40) (actual time=150.188..150.312 rows=66 loops=1)
                                      Filter: (zip > ''::text)
                                ->  Index Scan using address_country_postal_code_address_idx on address a  (cost=0.00..7.73 rows=2 width=17) (actual time=2.483..3.086 rows=0 loops=66)
                                      Index Cond: ((a.country_code = 'US'::bpchar) AND ((a.postal_code)::text = zips_in_mile_range.zip))
                          ->  Index Scan using facility_address_address_idx on facility_address fa  (cost=0.00..7.15 rows=12 width=8) (actual time=7.652..8.901 rows=1 loops=29)
                                Index Cond: (a.address_id = fa.address_id)
                    ->  Index Scan using provider_practice_facility_address_idx on provider_practice pp  (cost=0.00..12.80 rows=16 width=12) (actual time=4.156..4.158 rows=1 loops=36)
                          Index Cond: (fa.facility_address_id = pp.facility_address_id)

Total runtime: 8639.066 ms
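
(For reference, the worst single estimate above is the Function Scan on
zips_in_mile_range: 333 rows estimated vs. 66 actual, since the planner has
no statistics for a set-returning function and falls back on a default
guess, and that error is multiplied up through the nested loops on top of
it. Below is a sketch of the statistics adjustments that usually get
suggested; it is untested here, and the function's argument types are
assumed from the call and may not match the real signature.)

-- Give the planner better statistics on the postal-code join columns,
-- then refresh the stats:
alter table mdx_core.facility alter column default_postal_code set statistics 100;
alter table mdx_core.address alter column postal_code set statistics 100;
analyze mdx_core.facility;
analyze mdx_core.address;

-- On PostgreSQL 8.3 or later, the function's default row estimate can also
-- be pinned closer to reality (argument types assumed for illustration):
-- alter function mdx_core.zips_in_mile_range(varchar, integer) rows 70;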
