From: Shelby Cain <alyandon(at)yahoo(dot)com>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Memory usage during vacuum
Date: 2004-03-25 18:23:28
Message-ID: 20040325182328.43493.qmail@web41603.mail.yahoo.com
Lists: pgsql-general
Currently my default_statistics_target is 300 (yes, very large, I
know), but overriding it with a value of 1 and re-running vacuum
analyze on the same large table results in no change that I can see
in maximum memory consumption during the process. It should be noted
that I see this behavior with only one table. All other tables, even
relatively large ones, only produce a backend that grows to consume
about 30 megabytes.
Here is some sample output:
c1scain=# set default_statistics_target=300;
SET
c1scain=# vacuum verbose analyze inventory_txns;
INFO: vacuuming "public.inventory_txns"
INFO: index "idx_inventory_txns_txndate" now contains
957655 row versions in 5770 pages
DETAIL: 1686 index pages have been deleted, 1686 are
currently reusable.
CPU 0.51s/0.21u sec elapsed 5.20 sec.
INFO: index "idx_inventory_txns_locitemnbr" now
contains 957655 row versions in 4752 pages
DETAIL: 0 index pages have been deleted, 0 are
currently reusable.
CPU 0.53s/0.09u sec elapsed 2.73 sec.
INFO: "inventory_txns": found 0 removable, 957655
nonremovable row versions in 39549 pages
DETAIL: 0 dead row versions cannot be removed yet.
There were 15914 unused item pointers.
0 pages are entirely empty.
CPU 2.16s/0.60u sec elapsed 15.76 sec.
INFO: analyzing "public.inventory_txns"
INFO: "inventory_txns": 39549 pages, 150000 rows
sampled, 944729 estimated total rows
I notice that PostgreSQL decided to sample 150000 rows.
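If I understand the sampling heuristic correctly (an assumption on my
part: ANALYZE samples 300 rows per unit of the largest per-column
statistics target, as in std_typanalyze in the backend), then 150000
sampled rows would imply an effective target of 500 for some column,
not my default of 300:

```sql
-- Assumption: sample size = 300 * (largest per-column statistics target).
SELECT 300 * 300 AS rows_for_target_300,  -- 90000, what I would expect
       300 * 500 AS rows_for_target_500;  -- 150000, what I actually see
```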
Now, when I try the following:
c1scain=# set default_statistics_target=1;
SET
c1scain=# vacuum verbose analyze inventory_txns;
INFO: vacuuming "public.inventory_txns"
INFO: index "idx_inventory_txns_txndate" now contains
957655 row versions in 5770 pages
DETAIL: 1686 index pages have been deleted, 1686 are
currently reusable.
CPU 0.37s/0.15u sec elapsed 5.29 sec.
INFO: index "idx_inventory_txns_locitemnbr" now
contains 957655 row versions in 4752 pages
DETAIL: 0 index pages have been deleted, 0 are
currently reusable.
CPU 0.54s/0.09u sec elapsed 2.93 sec.
INFO: "inventory_txns": found 0 removable, 957655
nonremovable row versions in 39549 pages
DETAIL: 0 dead row versions cannot be removed yet.
There were 15914 unused item pointers.
0 pages are entirely empty.
CPU 2.21s/0.51u sec elapsed 16.37 sec.
INFO: analyzing "public.inventory_txns"
INFO: "inventory_txns": 39549 pages, 150000 rows
sampled, 944729 estimated total rows
VACUUM
It still decided to sample 150000 rows. Am I missing
something obvious here? Shouldn't fewer rows be
sampled when I set the collection target to 1?
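One thing worth checking, assuming that a per-column target set via
ALTER TABLE ... SET STATISTICS overrides default_statistics_target
(which would explain why changing the default has no effect), is
whether any column on inventory_txns carries an explicit target:

```sql
-- A per-column target (pg_attribute.attstattarget) other than -1
-- overrides default_statistics_target for that column.
SELECT attname, attstattarget
  FROM pg_attribute
 WHERE attrelid = 'inventory_txns'::regclass
   AND attnum > 0
   AND attstattarget <> -1;
```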
Regards,
Shelby Cain
--- Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> wrote:
> ... the issue is obviously ANALYZE and not VACUUM at all. What
> statistics targets are you using? It's hard to believe ANALYZE
> would eat that much space unless it's being asked for a really
> large target.