Re: Random Page Cost and Planner

From: David Jarvis <thangalin(at)gmail(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Subject: Re: Random Page Cost and Planner
Date: 2010-05-26 20:21:19
Lists: pgsql-performance
I was told to try OVERLAPS instead of checking years. The query is now:

  SELECT
    extract(YEAR FROM m.taken) AS year,
    avg(m.amount) AS amount
  FROM c,
    climate.station s,
    climate.station_category sc,
    climate.measurement m
  WHERE c.id = 5148 AND
    earth_distance(
      ll_to_earth(c.latitude_decimal, c.longitude_decimal),
      ll_to_earth(s.latitude_decimal, s.longitude_decimal)) / 1000 <= 30 AND
    s.elevation BETWEEN 0 AND 3000 AND
    s.applicable = TRUE AND
    sc.station_id = s.id AND
    sc.category_id = 7 AND
    *(sc.taken_start, sc.taken_end) OVERLAPS
      ('1900-01-01'::date, '2009-12-31'::date) AND*
    m.station_id = s.id AND
    m.taken BETWEEN sc.taken_start AND sc.taken_end AND
    m.category_id = sc.category_id
  GROUP BY
    extract(YEAR FROM m.taken)
  ORDER BY
    extract(YEAR FROM m.taken)
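
One general point about OVERLAPS, separate from this query: PostgreSQL treats each (start, end) pair as a half-open interval, so two ranges that merely touch at an endpoint do not count as overlapping. A quick illustration (the dates here are made up for the example):

  -- OVERLAPS treats (start, end) as the half-open interval [start, end),
  -- so periods that only share an endpoint do not overlap.
  SELECT (DATE '1900-01-01', DATE '2009-12-31')
    OVERLAPS (DATE '2009-12-31', DATE '2010-12-31');  -- false

  SELECT (DATE '1900-01-01', DATE '2009-12-31')
    OVERLAPS (DATE '2009-12-30', DATE '2010-12-31');  -- true

If taken_end is meant to be inclusive, that boundary behaviour is worth keeping in mind when comparing OVERLAPS against the BETWEEN-style checks it replaced.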

25 seconds from cold, with no full table scan.

Much better than 85 seconds, but still an order of magnitude too slow.

I was thinking of changing the *station_category* table to use the
measurement table's primary key instead of keying off dates, as converting
the dates for comparison strikes me as a bit of overhead. Also, I can
remove the "/ 1000" by changing the Earth's radius to kilometres (from
metres), but a constant division shouldn't be significant.
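
On the "/ 1000" point: with the contrib earthdistance module, the cube-based functions all scale by the earth() function, which by default returns the Earth's radius in metres. A possible sketch (assuming the cube-based earth_distance()/ll_to_earth() functions are in use) is to redefine earth() to return kilometres, so the division disappears:

  -- Sketch: redefine earthdistance's earth() to report kilometres.
  -- The module's default returns the radius in metres (6378168); with
  -- this in place, ll_to_earth() and earth_distance() use the same
  -- scale, earth_distance(...) yields kilometres directly, and the
  -- "/ 1000" in the query can be dropped.
  CREATE OR REPLACE FUNCTION earth() RETURNS float8
    LANGUAGE sql IMMUTABLE
    AS 'SELECT 6378.168::float8';

Since ll_to_earth() and earth_distance() both go through earth(), redefining it keeps the stored cube coordinates and the distance results consistent, but any data already converted with the metre-based earth() would need recomputing.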

I really appreciate all your patience and help over the last sixteen days
trying to optimize this database and these queries.
