| From: | Ivan Voras <ivoras(at)freebsd(dot)org> |
|---|---|
| To: | pgsql-general(at)postgresql(dot)org |
| Subject: | Re: SELECTing every Nth record for better performance |
| Date: | 2009-12-04 10:15:23 |
| Message-ID: | hfanfh$909$1@ger.gmane.org |
| Lists: | pgsql-general |
A. Kretschmer wrote:
> In response to Tom :
>> I have a big table that is used for datalogging. I'm designing a
>> graphing interface that will visualise the data. When the user is
>> looking at a small date range I want the database to be queried for all
>> records, but when the user is 'zoomed out', looking at an overview, I
>> want to run a query that skips every nth record and returns a manageable
>> dataset that still gives a correct overview of the data without
>> slowing the programme down. Is there an easy way to do this that I
>> have overlooked? I looked at:
>
>
> Do you have 8.4? If yes:
>
> test=# create table data as select s as s from generate_Series(1,1000) s;
> SELECT
>
> test=*# select s from (select *, row_number() over (order by s) from
> data) foo where row_number % 3 = 0 limit 10;
Won't this still read in the entire table and only then filter the
records out?
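
One way to avoid scanning everything would be to sample on the row's own
key instead of a computed row number, so a partial index can be used. A
rough, untested sketch -- the column names (id, ts, value) and the fixed
1-in-10 ratio are made up for illustration:

  -- Hypothetical table: data(id serial primary key, ts timestamptz, value float8)
  -- Partial index covering only every 10th row; the planner can use it
  -- whenever the query repeats the same "id % 10 = 0" predicate.
  CREATE INDEX data_ts_sampled_idx ON data (ts) WHERE id % 10 = 0;

  SELECT ts, value
  FROM data
  WHERE id % 10 = 0                      -- must match the index predicate
    AND ts >= '2009-11-01' AND ts < '2009-12-01'
  ORDER BY ts;

Of course that bakes one sampling ratio into the index, so it is not a
drop-in replacement for an arbitrary "every Nth row" parameter.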