Overflow of attmissingval is not handled gracefully

Started by Tom Lane almost 4 years ago · 3 messages
#1 Tom Lane
tgl@sss.pgh.pa.us

Consider this admittedly-rather-contrived example:

regression=# create table foo(f1 int);
CREATE TABLE
regression=# alter table foo add column bar text default repeat('xyzzy', 1000000);
ERROR: row is too big: size 57416, maximum size 8160

Since the table contains no rows at all, this is a surprising
failure. The reason for it of course is that pg_attribute
has no TOAST table, so it can't store indefinitely large
attmissingval fields.

I think the simplest answer, and likely the only feasible one for
the back branches, is to disable the attmissingval optimization
if the proposed value is "too large". Not sure exactly where the
threshold for that ought to be, but maybe BLCKSZ/8 could be a
starting offer.

regards, tom lane

#2 Andrew Dunstan
andrew@dunslane.net
In reply to: Tom Lane (#1)
Re: Overflow of attmissingval is not handled gracefully

On 2/28/22 18:21, Tom Lane wrote:

Consider this admittedly-rather-contrived example:

regression=# create table foo(f1 int);
CREATE TABLE
regression=# alter table foo add column bar text default repeat('xyzzy', 1000000);
ERROR: row is too big: size 57416, maximum size 8160

Since the table contains no rows at all, this is a surprising
failure. The reason for it of course is that pg_attribute
has no TOAST table, so it can't store indefinitely large
attmissingval fields.

I think the simplest answer, and likely the only feasible one for
the back branches, is to disable the attmissingval optimization
if the proposed value is "too large". Not sure exactly where the
threshold for that ought to be, but maybe BLCKSZ/8 could be a
starting offer.

WFM. After all, it's taken several years for this to surface. Is it
based on actual field experience?

cheers

andrew

--
Andrew Dunstan
EDB: https://www.enterprisedb.com

#3 Tom Lane
tgl@sss.pgh.pa.us
In reply to: Andrew Dunstan (#2)
Re: Overflow of attmissingval is not handled gracefully

Andrew Dunstan <andrew@dunslane.net> writes:

On 2/28/22 18:21, Tom Lane wrote:

regression=# create table foo(f1 int);
CREATE TABLE
regression=# alter table foo add column bar text default repeat('xyzzy', 1000000);
ERROR: row is too big: size 57416, maximum size 8160

I think the simplest answer, and likely the only feasible one for
the back branches, is to disable the attmissingval optimization
if the proposed value is "too large". Not sure exactly where the
threshold for that ought to be, but maybe BLCKSZ/8 could be a
starting offer.

WFM. After all, it's taken several years for this to surface. Is it
based on actual field experience?

No, it was an experiment that occurred to me while thinking about
the nearby proposal to add a TOAST table to pg_attribute [1].
If we do that, this restriction could be dropped. But I agree that
there's hardly any practical use-case for such default values,
so I wouldn't mind living with the de-optimization either.

regards, tom lane

[1]: /messages/by-id/1643112264.186902312@f325.i.mail.ru