Re: [HACKERS] Question regarding effects of Vacuum, Vacuum

From: Rod Taylor <rbt(at)rbt(dot)ca>
To: Adrian Calvin <acexec(at)yahoo(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: [HACKERS] Question regarding effects of Vacuum, Vacuum
Date: 2002-11-19 13:46:39
Message-ID: 1037713599.83738.34.camel@jester
Lists: pgsql-hackers pgsql-performance

Moving thread to pgsql-performance.

On Mon, 2002-11-18 at 22:02, Adrian Calvin wrote:
> Q - If many (e.g. hundreds of) records are deleted (purposely), those
> records get flagged for later removal. What is the best sequence of
> operations to optimize the database afterwards? Is it Vacuum,
> Re-index, then Vacuum Analyze?

Just run a regular vacuum once for the above. If you modify 10%+ of the
table (via single or multiple updates, deletes, or inserts), then a
vacuum analyze will be useful.
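
A minimal sketch of both commands, assuming a hypothetical table named
"mytable":

    VACUUM mytable;           -- reclaim space left by deleted rows
    VACUUM ANALYZE mytable;   -- also refresh the planner's statistics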

Re-index when you have changed the table's contents a few times over
(e.g. you have deleted or updated 30k entries in a table that holds 10k
entries at any given time).
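
For example, again with hypothetical names:

    REINDEX TABLE mytable;        -- rebuild every index on the table
    REINDEX INDEX mytable_pkey;   -- or rebuild a single index by name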

General maintenance for a dataset of that size will probably simply be a
nightly vacuum, weekly vacuum analyze, and annual reindex or dump /
restore (upgrades).
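
As a rough illustration, that schedule could be driven from cron; the
database name and times below are made up:

    # nightly plain vacuum at 02:00
    0 2 * * *  vacuumdb mydb
    # weekly vacuum analyze early Sunday morning
    0 3 * * 0  vacuumdb --analyze mydb
    # reindexing (or dump / restore around an upgrade) is rare enough
    # to run by hand, e.g.:  psql -c "REINDEX TABLE mytable" mydb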

--
Rod Taylor <rbt(at)rbt(dot)ca>
