vacuumdb error: tuple concurrently updated

Started by Harald Krake over 23 years ago · 3 messages · bugs
#1Harald Krake
harald@krake.de

while running a "vacuumdb -z" I got the following error message:

ERROR: simple_heap_update: tuple concurrently updated
vacuumdb: vacuum jartifice failed

Is this something to worry about? Especially data corruption?
(postgres 7.3 / linux)

regards,
harald.

#2Tom Lane
tgl@sss.pgh.pa.us
In reply to: Harald Krake (#1)
Re: vacuumdb error: tuple concurrently updated

Harald Krake <harald@krake.de> writes:

while running a "vacuumdb -z" I got the following error message:
ERROR: simple_heap_update: tuple concurrently updated
vacuumdb: vacuum jartifice failed

Is it likely that someone else was doing the same thing in another
session? This failure is known to occur if two backends concurrently
ANALYZE the same table --- the second one to try to update the
pg_statistic row fails because of the concurrent write. It's quite
harmless, except for preventing the rest of the VACUUM command from
completing.
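
One way to rule out overlapping runs entirely is to serialize them through a lock file. This is a sketch, not something proposed in the thread: it assumes a Linux host with flock(1) from util-linux, and the database name is taken from the error message above.

```shell
#!/bin/sh
# Sketch (not from the thread): serialize vacuumdb runs with flock(1)
# so two invocations can never ANALYZE the same tables concurrently.
# Assumes util-linux flock; database name taken from the error above.
LOCKFILE=/tmp/vacuumdb-jartifice.lock

# -n: give up immediately instead of queueing behind another run
if ! flock -n "$LOCKFILE" -c 'vacuumdb -z jartifice'; then
    echo "another vacuumdb run holds the lock; skipping" >&2
fi
```

Whichever copy starts second simply skips its turn instead of tripping over the concurrent pg_statistic update.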

regards, tom lane

#3Harald Krake
harald@krake.de
In reply to: Tom Lane (#2)
Re: vacuumdb error: tuple concurrently updated

On Thursday 26 December 2002 06:09 pm, Tom Lane wrote:

Is it likely that someone else was doing the same thing in another
session?

no ANALYZE in parallel but another client constantly running transactions
(over 6 tables with approx. 20 - 100 updates/inserts per transaction).
The postgres client generates a permanent cpu load of 80-90% for days.
To keep the transaction rate at a reasonable level we run a "vacuumdb -z"
via a shell script in a while-loop with a "sleep 3600" in between.
I suspect vacuumdb returns to the shell only after the ANALYZE has finished,
so it should be impossible for two ANALYZEs to run at the same time (?)
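
The hourly loop described above can equally be written as a cron entry, which survives logouts and avoids keeping a shell running for days (a sketch; the database name is taken from the error message in #1):

```
# run "vacuumdb -z" at the top of every hour (crontab syntax)
0 * * * *  vacuumdb -z jartifice
```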

As long as it's harmless, we can live with that.

Btw.: is there a tool to check the consistency of a postgres database?

regards,
harald.