Re: Random Page Cost and Planner

From: David Jarvis <thangalin(at)gmail(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Subject: Re: Random Page Cost and Planner
Date: 2010-05-26 20:21:19
Message-ID: AANLkTimgupRIdKLjErH7sf6qr8TNePYDjPn-0wyfcnYB@mail.gmail.com
Lists: pgsql-performance

I was told to try OVERLAPS instead of checking years. The query is now:

SELECT
  extract(YEAR FROM m.taken) AS year,
  avg(m.amount) AS amount
FROM
  climate.city c,
  climate.station s,
  climate.station_category sc,
  climate.measurement m
WHERE
  c.id = 5148 AND
  earth_distance(
    ll_to_earth(c.latitude_decimal, c.longitude_decimal),
    ll_to_earth(s.latitude_decimal, s.longitude_decimal)) / 1000 <= 30 AND
  s.elevation BETWEEN 0 AND 3000 AND
  s.applicable = TRUE AND
  sc.station_id = s.id AND
  sc.category_id = 7 AND
  (sc.taken_start, sc.taken_end) OVERLAPS ('1900-01-01'::date, '2009-12-31'::date) AND
  m.station_id = s.id AND
  m.taken BETWEEN sc.taken_start AND sc.taken_end AND
  m.category_id = sc.category_id
GROUP BY
  extract(YEAR FROM m.taken)
ORDER BY
  extract(YEAR FROM m.taken)
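
(As I understand the semantics, OVERLAPS treats each period as a half-open
interval, start inclusive and end exclusive, so two ranges that merely touch
at an endpoint do not match. A quick sanity check:

  SELECT (DATE '1900-01-01', DATE '1950-01-01')
         OVERLAPS (DATE '1950-01-01', DATE '2009-12-31');  -- false: only the endpoint is shared
  SELECT (DATE '1900-01-01', DATE '1950-01-02')
         OVERLAPS (DATE '1950-01-01', DATE '2009-12-31');  -- true: one day overlaps
)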

25 seconds from cold, no full table scan:

http://explain.depesz.com/s/VV5

Much better than 85 seconds, but still an order of magnitude too slow.

I was thinking of changing the *station_category* table to use the
measurement table's primary key instead of keying off dates, as converting
the dates for comparison strikes me as a bit of overhead. Also, I can
remove the "/ 1000" by changing the Earth's radius to kilometres (from
metres), but a constant division shouldn't be significant.
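
For the radius change, a rough sketch, assuming the stock contrib/earthdistance
setup where the cube-based functions take the radius from the earth() function
(metres by default); redefining it in kilometres should make earth_distance()
return km directly, though any functional indexes built on ll_to_earth() would
need to be rebuilt afterwards:

  -- Sketch only: override the radius used by ll_to_earth()/earth_distance()
  CREATE OR REPLACE FUNCTION earth() RETURNS float8
    LANGUAGE sql IMMUTABLE
    AS 'SELECT 6378.168::float8';  -- kilometres instead of the default 6378168 metres

The distance predicate would then read "earth_distance(...) <= 30" with no
division.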

I really appreciate all your patience and help over the last sixteen days
trying to optimize this database and these queries.

Dave
