Question regarding effects of Vacuum, Vacuum Analyze, and Reindex
To whom it may concern,
I am a Java developer using Postgres as a DB server. My development team and I have a product comprising about 50 tables, with about 10,000 records in the largest table. We have looked in books and on the web for concrete answers to several problems that are plaguing us. Now we look to the source.
Issue #1: Massive deletion of records.
Q - If many (e.g. hundreds of) records are deliberately deleted, those records get flagged for later removal. What is the best sequence of operations to optimize the database afterwards? Is it Vacuum, Re-index, then Vacuum Analyze?
Some of what I have read suggests that doing a vacuum without a re-index can leave a given index invalid (i.e., with entries pointing to records that no longer match the index criteria).
This would then suggest that doing a Vacuum Analyze would produce incorrect statistics.
Any help regarding the best maintenance policy and the ramifications of mass deletions, vacuuming, and re-indexing would be most appreciated. Thanks
Moving thread to pgsql-performance.
On Mon, 2002-11-18 at 22:02, Adrian Calvin wrote:
> Q - If many (eg hundreds) records are deleted (purposely), those
> records get flagged for later removal. What is the best sequence of
> operations to optimize the database afterwards? Is it Vacuum,
> Re-index, then do a Vacuum Analyze.
Just run a regular vacuum once for the above. If you modify 10% or more of the
table (via one or many updates, deletes, or inserts), then a vacuum
analyze will be useful.
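Concretely, the two commands above might look like this (the table name
"orders" is hypothetical, stands in for any heavily-deleted table):

```sql
-- Reclaim the space held by deleted rows in one table:
VACUUM orders;

-- After modifying 10%+ of the table, also refresh the planner's
-- statistics in the same pass:
VACUUM ANALYZE orders;
```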
Re-index when you have churned the table's contents several times over (e.g.,
have deleted or updated 30k entries in a table that holds 10k entries at any
given time).
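When that point is reached, a sketch of the rebuild (again using the
hypothetical table name "orders"):

```sql
-- Rebuild all indexes on the table after repeated heavy churn,
-- shrinking bloated index files back down:
REINDEX TABLE orders;
```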
General maintenance for a dataset of that size will probably simply be a
nightly vacuum, weekly vacuum analyze, and annual reindex or dump /
restore (upgrades).
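A minimal crontab sketch of that schedule, assuming the vacuumdb client
utility is on the PATH and a hypothetical database named "mydb":

```
# Nightly plain vacuum at 02:00
0 2 * * *  vacuumdb mydb
# Weekly vacuum analyze, Sundays at 03:00
0 3 * * 0  vacuumdb --analyze mydb
```

The annual reindex or dump / restore is best done by hand alongside an
upgrade rather than from cron.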
--
Rod Taylor <rbt@rbt.ca>