From: David Jarvis <thangalin(at)gmail(dot)com>
To: Alexey Klyukin <alexk(at)commandprompt(dot)com>
Cc: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>, Kevin Grittner <Kevin(dot)Grittner(at)wicourts(dot)gov>, pgsql-performance(at)postgresql(dot)org
Subject: Re: Random Page Cost and Planner
Date: 2010-05-26 16:30:19
Message-ID: AANLkTil_n4-X5OBy4DxDFkwLQ-ywxTIVGmq4W9UNQ_80@mail.gmail.com
Lists: pgsql-performance
Hi, Alexey.
> Is it necessary to get the data as far as 1900 all the time? Maybe there is
> a possibility to aggregate results from the past years if they are constant.
This I have done. I created another table (station_category) that records,
for each station and category, the dates when that station started and
stopped taking measurements (derived from the data in the measurement
table). For example:
station_id; category_id; taken_start; taken_end
1;4;"1984-07-01";"1996-11-30"
1;5;"1984-07-01";"1996-11-30"
1;6;"1984-07-01";"1996-11-10"
1;7;"1984-07-01";"1996-10-31"
This means that station 1 has data for categories 4 through 7. The
measurement table returns 3865 rows for station 1 and category 7 (the query
uses an index and took 7 seconds from a cold cache):
station_id; taken; amount
1;"1984-07-01";0.00
1;"1984-07-02";0.00
1;"1984-07-03";0.00
1;"1984-07-04";0.00
The station_category table is basically another index.
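For reference, a minimal sketch of how such a table could be derived from the measurement data. The column names (station_id, category_id, taken) follow the samples above; the exact DDL here is an assumption, not the statement actually used:

```sql
-- Hypothetical sketch: derive per-station, per-category date ranges
-- from the measurement table (column names assumed from the samples above).
CREATE TABLE station_category AS
SELECT station_id,
       category_id,
       MIN(taken) AS taken_start,
       MAX(taken) AS taken_end
FROM   measurement
GROUP  BY station_id, category_id;
```

The planner can then restrict date-range scans against measurement using the much smaller station_category table.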
Would explicitly sorting the measurement table (273M rows) by station then
by date help?
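(One way to express such an explicit physical ordering in PostgreSQL would be CLUSTER; this is a sketch under assumed column and index names, not something I have run against the 273M-row table:)

```sql
-- Hypothetical: physically rewrite measurement in (station_id, taken) order
-- so that rows for one station's date range are contiguous on disk,
-- reducing random page reads for range scans.
CREATE INDEX measurement_station_taken_idx
    ON measurement (station_id, taken);
CLUSTER measurement USING measurement_station_taken_idx;
ANALYZE measurement;
```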
Dave