Table Corruption

Started by mike focosi · over 25 years ago · 3 messages · general
#1 mike focosi
mike@goldendome.com

I have several PHP scripts that allow my clients to add stories to a
database. I've noticed that once in a while, when they attempt to
insert/update more than 8k into a table, the table becomes corrupted.
When a select is done on the table, the database hangs. When this happens to
one table in one database, it brings down the whole PostgreSQL server
backend. Not good.

I've created a database with the same schema/data and have only been able to
recreate the problem once; I can't seem to do it every time. PostgreSQL usually
gives the "PQsendQuery() -- query is too long..." error and just doesn't
follow through with the query, which is fine. I can deal with limiting my
clients to stories of less than 8k. But sometimes the table gets corrupted by
a query that shouldn't be executed (i.e., stopped at the too-long warning).
Has anybody ever had a problem like the one I described? I've yet to
upgrade our database servers to 7+ (we're running 6.4), so I'm hoping maybe that
would solve the problem. But I'd like to know what the exact problem is.
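
Since the server only reports the limit after the fact, one workaround is to reject oversized stories on the client side before the INSERT is ever sent. A minimal sketch in Python (the thread's scripts are PHP; the 8192-byte page size and the overhead allowance here are assumptions, not measured values):

```python
# Client-side guard against the ~8k row limit discussed in this thread.
# BLOCK_SIZE and ROW_OVERHEAD are assumptions: the real ceiling depends on
# the server's compiled page size and the width of the other columns.

BLOCK_SIZE = 8192     # default PostgreSQL page size (assumed)
ROW_OVERHEAD = 512    # rough allowance for tuple header + other columns (assumed)

def story_fits(story: str) -> bool:
    """True if the story's encoded length leaves headroom under the page size."""
    return len(story.encode("utf-8")) <= BLOCK_SIZE - ROW_OVERHEAD
```

A submission script would call `story_fits()` and refuse the story with a friendly message, rather than letting the query run into the server's limit.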

thanks
-mike

mike focosi
senior applications developer
golden dome media
http://www.goldendomemedia.com
http://www.surfmichiana.com
http://www.wndu.com

#2 Tom Lane
tgl@sss.pgh.pa.us
In reply to: mike focosi (#1)
Re: Table Corruption

"mike focosi" <mike@goldendome.com> writes:

I have several PHP scripts that allow my clients to add stories to a
database. I've noticed that once in a while, when they attempt to
insert/update more than 8k into a table, the table becomes corrupted.

Has anybody ever had a problem like the one I described? I've yet to
upgrade our database servers to 7+ (we're running 6.4), so I'm hoping maybe that
would solve the problem. But I'd like to know what the exact problem is.

Try 7.0.* --- it's much more careful about checking row-size limits.
6.4 is ancient history ;-)

regards, tom lane

#3 mike focosi
mike@goldendome.com
In reply to: Tom Lane (#2)
Re: Table Corruption

Postgres has an 8k row limit. This can be changed to up to 32k by
recompiling.

Allegedly Postgres 7.1 will deal with this.

I'm aware of the 8k limit and am hoping 7.1 gets here soon.

What's bugging me is that the queries that should not be committed, after
Postgres warns about the 8k limit, are somehow corrupting the table. If the
query is causing an error, how is a record still updated/inserted? In other
words, if the query fails, why does it still change my table? Or is it
partially finishing the query off, like filling the row up to 8k (as much as
it'll take)?
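
The behaviour being asked for, that either the whole row goes in or nothing does, can be modelled with a toy all-or-nothing insert. This is an illustrative sketch only (the "table" is just a Python list, and MAX_ROW_BYTES is an assumed limit), not the 6.4 backend's actual code path; the point is that the size check must happen before any bytes touch the table:

```python
# Toy model of all-or-nothing insert semantics. The "table" is a plain list
# and MAX_ROW_BYTES is an assumed limit; this only illustrates the behaviour
# one would expect, not how the server is implemented.

MAX_ROW_BYTES = 8192

class RowTooLarge(Exception):
    pass

def insert_story(table, story):
    """Reject oversized rows before touching the table, so a failed
    insert can never leave a partial row behind."""
    if len(story.encode("utf-8")) > MAX_ROW_BYTES:
        raise RowTooLarge("row exceeds %d bytes" % MAX_ROW_BYTES)
    table.append(story)  # only reached when the row fits entirely
```

With this ordering, an oversized story raises before the append, and the table is exactly as it was; the corruption described above suggests the old backend did some of its writing before the size check caught up.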

thanks
-mike

I have several PHP scripts that allow my clients to add stories to a
database. I've noticed that once in a while, when they attempt to
insert/update more than 8k into a table, the table becomes corrupted.
When a select is done on the table, the database hangs. When this happens to
one table in one database, it brings down the whole PostgreSQL server
backend. Not good.

I've created a database with the same schema/data and have only been able to
recreate the problem once; I can't seem to do it every time. PostgreSQL usually
gives the "PQsendQuery() -- query is too long..." error and just doesn't
follow through with the query, which is fine. I can deal with limiting my
clients to stories of less than 8k. But sometimes the table gets corrupted by
a query that shouldn't be executed (i.e., stopped at the too-long warning).
Has anybody ever had a problem like the one I described? I've yet to
upgrade our database servers to 7+ (we're running 6.4), so I'm hoping maybe that
would solve the problem. But I'd like to know what the exact problem is.