ALTER TABLE foo ALTER COLUMN bar SET STATISTICS n;
I wonder what the implications of using this statement are.
I know that with, say, n=100, ANALYZE will take more time,
pg_statistic will be bigger, and the planner will take longer,
but on the other hand it will make better decisions... Etc, etc.
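To make it concrete, here is a minimal sketch of the workflow I mean
(foo/bar are just placeholder names, and I'm assuming the default
target of 10 as the starting point):

  -- Raise the per-column statistics target; it takes effect
  -- at the next ANALYZE of the table.
  ALTER TABLE foo ALTER COLUMN bar SET STATISTICS 100;
  ANALYZE foo;

  -- See what the planner now has to work with: with a target of 100,
  -- most_common_vals/most_common_freqs can hold up to 100 entries
  -- and histogram_bounds up to 101 boundary values.
  SELECT null_frac, n_distinct, most_common_vals,
         most_common_freqs, histogram_bounds
    FROM pg_stats
   WHERE tablename = 'foo' AND attname = 'bar';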
I wonder, however, when it is most useful to bump it up.
Please tell me what you think:
Is bumping up statistics only useful for indexed columns?
When is it most useful/beneficial to bump them up (see the
sketch after this list):
1) huge table with a huge number of distinct values (_almost_ unique).
2) huge table with relatively equally distributed values
(like each value appears in, say, 30-50 rows).
3) huge table with unequally distributed values (some
values appear in 1-5 rows, others in 1000-5000 rows).
4) huge table with a small number of distinct values (around
~100, equally or unequally distributed).
5) boolean column.
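For reference, here is the sketch I mentioned: how I'd check which of
these cases a column falls into, again with foo/bar as placeholders:

  -- n_distinct > 0 is an absolute count; n_distinct < 0 is a fraction
  -- of the row count (-1 means the column is unique, i.e. case 1).
  -- A steep drop-off in most_common_freqs suggests skew, i.e. case 3.
  SELECT n_distinct, most_common_vals, most_common_freqs
    FROM pg_stats
   WHERE tablename = 'foo' AND attname = 'bar';

  -- Ground truth for comparison (slow on a huge table):
  SELECT bar, count(*) AS cnt
    FROM foo
   GROUP BY bar
   ORDER BY cnt DESC
   LIMIT 20;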
I think SET STATISTICS 100 is very useful for the case of
unequally distributed values, but I wonder about the other
cases. And as a side note -- what are the reasonable bounds
for the statistics target (between 10 and 100?)
What are the runtime implications of setting the statistics
target too high -- how much can it affect queries?
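I suppose one way to measure that is with \timing in psql: time
EXPLAIN (which is mostly planning) and compare the row estimates
at both extremes of the target (42 is just an example value):

  \timing
  ALTER TABLE foo ALTER COLUMN bar SET STATISTICS 10;
  ANALYZE foo;
  EXPLAIN SELECT * FROM foo WHERE bar = 42;

  ALTER TABLE foo ALTER COLUMN bar SET STATISTICS 1000;  -- 1000 is the maximum
  ANALYZE foo;
  EXPLAIN SELECT * FROM foo WHERE bar = 42;
  -- Larger target: estimates should be closer to reality, but planning
  -- has to read more pg_statistic entries for every query on foo.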
And finally -- how do other RDBMSs and RDBMS-like systems
deal with this issue? :)