| From: | Rod Taylor <rbt(at)rbt(dot)ca> |
|---|---|
| To: | Bruce Momjian <pgman(at)candle(dot)pha(dot)pa(dot)us> |
| Cc: | PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org> |
| Subject: | Re: Script to compute random page cost |
| Date: | 2002-09-09 12:00:45 |
| Message-ID: | 1031572846.15580.36.camel@jester |
| Lists: | pgsql-hackers |
On Mon, 2002-09-09 at 02:13, Bruce Momjian wrote:
>
> OK, turns out that the loop for sequential scan ran fewer times and was
> skewing the numbers. I have a new version at:
>
> ftp://candle.pha.pa.us/pub/postgresql/randcost
>
> I get _much_ lower numbers now for random_page_cost.
The current script now pulls far more data for the sequential scan than
for the random scan.
Random is pulling a single page (count=1 for dd) with every loop.
Sequential does the same number of loops, but pulls count > 1 in each.
In effect, sequential is random with more data load -- which explains
all of the 0.9's.
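Something along these lines would compare like with like, with both tests pulling the same total number of pages. This is only a rough sketch, not the randcost script at the URL above; it assumes bash, an 8k block size, and a test file named "testfile" that is much larger than RAM so the cache doesn't swallow the difference:

```sh
#!/bin/bash
# Sketch of a like-for-like timing: read PAGES pages sequentially in one
# pass, then read the same number of pages one at a time at random offsets.

BLCKSZ=8192
PAGES=1000                                        # total pages read by each test
FILEPAGES=$(( $(wc -c < testfile) / BLCKSZ ))     # pages in the test file

# Sequential: one dd pulling all PAGES pages contiguously.
time dd if=testfile of=/dev/null bs=$BLCKSZ count=$PAGES 2>/dev/null

# Random: PAGES separate dd calls, each pulling a single page (count=1)
# at a crudely chosen random offset.
time (
  i=0
  while [ $i -lt $PAGES ]; do
    SKIP=$(( (RANDOM * RANDOM) % FILEPAGES ))
    dd if=testfile of=/dev/null bs=$BLCKSZ count=1 skip=$SKIP 2>/dev/null
    i=$(( i + 1 ))
  done
)
```

Since both tests read the same number of pages, the ratio of the two elapsed times is then roughly the random_page_cost estimate.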
Rod Taylor