From: "jcfischer" <jcfischer(dot)lists(at)gmail(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Subject: nested query on last n rows of huge table
Date: 2006-02-24 14:13:31
Message-ID: 1140790410.923941.261250@p10g2000cwp.googlegroups.com
Lists: pgsql-performance
Hi list

I'm fairly new to Postgres, so bear with me. Googling and searching the
list, I didn't find anything that resembled my problem.
I have a large table with ca. 10 million inserts per day (fairly simple
data: a timestamp, a couple of IDs and a varchar message).
I run a query every couple of minutes that looks at the new entries
since the last run and retrieves them for further processing (using a
WHERE eventtime > '2006-02-24 14:00:00' to limit it to the most recent
entries).
These queries run around 40-50 seconds (largely due to some LIKE '%msg%'
conditions thrown in for good measure). Postgres performs a seq table
scan on those queries :-(
My idea is to limit the search to only the last n entries, because I
found that a

SELECT * FROM table ORDER BY eventtime DESC LIMIT 1000

is very fast. Because the inserts are in chronological order, I can
store the sequential id of the highest row from the last query and
subtract that from the current high row count to determine that number.
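A minimal sketch of that bookkeeping idea, assuming the table has a serial column `id` (an assumption; the column name isn't given). Note this variant filters directly on the stored id rather than computing a LIMIT count from the subtraction:

```sql
-- :last_seen_id is the highest id saved after the previous run
-- (an assumed placeholder; how it is stored is up to the application).
SELECT *
FROM events
WHERE id > :last_seen_id
  AND message LIKE '%some pattern%';
```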
Is there a way to limit the expensive query to only those last 1000 (or
whatever) results?
I have tried to nest SELECTs, but my SQL-fu is too limited to get
anything through the SQL processor :-)
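One way the nesting could be written (a sketch, using the same assumed `events`/`message` names as above): the inner query grabs only the newest rows, which is fast if `eventtime` is indexed, and the outer query applies the expensive LIKE to just that small set:

```sql
-- Inner query: cheap, index-assisted fetch of the newest 1000 rows.
-- Outer query: expensive LIKE restricted to those 1000 rows only.
SELECT *
FROM (
    SELECT *
    FROM events
    ORDER BY eventtime DESC
    LIMIT 1000
) AS recent
WHERE message LIKE '%some pattern%';
```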
thanks
Jens-Christian Fischer