From: Adrian Calvin <acexec(at)yahoo(dot)com>
To: pgsql-hackers(at)postgresql(dot)org
Subject: Question regarding effects of Vacuum, Vacuum Analyze, and Reindex
Date: 2002-11-19 03:02:40
Message-ID: 20021119030240.23442.qmail@web13008.mail.yahoo.com
Lists: pgsql-hackers pgsql-performance
To whom it may concern,
I am a Java developer using Postgres as a DB server. My development team and I have a product comprising about 50 tables, with roughly 10,000 records in the largest table. We have searched books and the web for concrete answers to several problems that are plaguing us; now we turn to the source.
Issue #1: Massive deletion of records.
Q - If many (e.g. hundreds of) records are deliberately deleted, those records are only flagged for later removal. What is the best sequence of operations to optimize the database afterwards? Is it VACUUM, REINDEX, then VACUUM ANALYZE?
Some of what I have read suggests that running a VACUUM without a REINDEX can leave an index invalid (i.e., with entries pointing to records that no longer match the index criteria).
This would in turn suggest that a subsequent VACUUM ANALYZE would produce incorrect statistics.
Any help regarding the best maintenance policy and the ramifications of mass deletions, vacuuming, and re-indexing would be most appreciated. Thanks.
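For reference, the sequence being asked about can be sketched as the following SQL commands. The table name `orders` and the DELETE predicate are purely illustrative, and this is only the ordering described in the question, not a confirmed best practice:

```sql
-- Hypothetical mass deletion; table and predicate are made up for illustration.
DELETE FROM orders WHERE created < '2002-01-01';

-- 1. VACUUM marks the space held by the dead tuples as reusable.
VACUUM orders;

-- 2. REINDEX rebuilds the table's indexes, discarding bloated/stale entries.
REINDEX TABLE orders;

-- 3. VACUUM ANALYZE refreshes planner statistics for the shrunken table.
VACUUM ANALYZE orders;
```

Whether step 2 is actually required between steps 1 and 3 is exactly the uncertainty raised above.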