From: Joshua Tolley <eggyknap(at)gmail(dot)com>
To: Scara Maccai <m_lists(at)yahoo(dot)it>
Cc: pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: using explain to get query expected time
Date: 2009-05-25 15:17:05
Message-ID: 20090525151705.GE22115@eddie
Lists: pgsql-general
On Mon, May 25, 2009 at 12:10:21AM -0700, Scara Maccai wrote:
> is there any chance to get the "Planner Cost Constants" right enough to get a "good" estimate, in seconds, of how long a query is supposed to run?
> The "rowcount" estimates are always good (there is no skewed data at all in the db; values are distributed pretty uniformly)
The most straightforward way I can think of would be to make sure your planner
constants (seq_page_cost, etc.) reflect time in meaningful units. That would be
very hard to do, and would probably still fail for reasons I can't predict.
Anyway, the quick answer is no, 'cause it's really hard.
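For what it's worth, the naive version of the idea is a crude linear calibration: run EXPLAIN ANALYZE on a handful of representative queries, record the planner's total cost next to the actual runtime, and fit a single cost-to-milliseconds scale factor. This is only a sketch of that idea, not anything PostgreSQL supports directly; the function names and the sample numbers below are made up for illustration.

```python
def fit_cost_to_ms(samples):
    """Least-squares fit through the origin: runtime_ms ~= k * planner_cost.

    samples is a list of (planner_total_cost, measured_runtime_ms) pairs,
    e.g. gathered by hand from EXPLAIN ANALYZE output.
    """
    num = sum(cost * ms for cost, ms in samples)
    den = sum(cost * cost for cost, _ in samples)
    return num / den

def predict_ms(k, planner_cost):
    """Predict runtime in ms for a query given its EXPLAIN total cost."""
    return k * planner_cost

# Hypothetical calibration data: (total cost from EXPLAIN, actual ms).
samples = [(1000.0, 120.0), (5000.0, 610.0), (20000.0, 2400.0)]
k = fit_cost_to_ms(samples)
print(f"ms per cost unit: {k:.4f}")
print(f"predicted runtime for cost 10000: {predict_ms(k, 10000.0):.0f} ms")
```

In practice this falls apart quickly, which is the point of the answer above: cost units mix sequential I/O, random I/O, and CPU work in fixed ratios, and the real ratio between them varies with caching, so no single scale factor holds across queries.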
- Josh / eggyknap