ERROR: cannot pass more than 100 arguments to a function

Started by Ján Máté · about 6 years ago · 3 messages · bugs
#1 Ján Máté
jan.mate@inf-it.com

Hi PostgreSQL,

I noticed an interesting problem with the default maximum number of function arguments, especially in relation to JSON processing.

I understand that there is a need to limit the maximum number of function arguments, but the current limit (100) is simply too restrictive for tables with a large number of columns (according to my findings, the column limit is >250, depending on column types).

A simple example:

- I have a table with 70 columns

- I want to select the simplest JSON output from that table:

SELECT
  json_build_object(
    'a', "col1",
    'b', "col2",
    'c', "col3",
    ...
  )
FROM "schema"."table";

- with the 100-argument limit I can select only 50 columns: 50 × (JSON key name + column name) = 100 arguments

- if the column limit in the table is >250, why can't I select at least 250 columns using the built-in json_build_object(...) function?

Please consider increasing the default limit from 100 to at least 500 (2× the maximum number of columns) for json_build_object(...) and other similar JSON functions.

Kind regards,

JM

#2 Tom Lane
tgl@sss.pgh.pa.us
In reply to: Ján Máté (#1)
Re: ERROR: cannot pass more than 100 arguments to a function

Ján Máté <jan.mate@inf-it.com> writes:

> I understand that there is a need to limit the maximum number of
> function arguments, but the current limit (100) is simply too
> restrictive for tables with a large number of columns (according to my
> findings, the column limit is >250, depending on column types).

I'd suggest using one of the array- or aggregation-based approaches,
rather than insisting on writing it out with some hundreds of distinct
arguments. Any specific function-argument-count limit we might pick
would be too small for somebody, but those other methods can scale
much further. For example,

SELECT json_object(
  array['a', 'col1',
        'b', 'col2',
        'c', 'col3']);
                json_object
--------------------------------------------
 {"a" : "col1", "b" : "col2", "c" : "col3"}
(1 row)
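For serializing actual column values rather than string literals, the whole-row functions sidestep the argument limit entirely; a minimal sketch (not from the original message), assuming the poster's "schema"."table" with columns "col1" and so on:

    -- Serialize every column of each row as one JSON object, passing the
    -- whole row as a single argument (no per-column argument count).
    SELECT to_jsonb(t) FROM "schema"."table" AS t;

    -- Keys can then be renamed on the resulting jsonb value, e.g. "col1" -> "a":
    SELECT (to_jsonb(t) - 'col1') || jsonb_build_object('a', t."col1")
    FROM "schema"."table" AS t;

This scales with the table's own column limit rather than the function-argument limit, since the row is one argument regardless of how many columns it contains.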

regards, tom lane

#3 raf
raf@raf.org
In reply to: Ján Máté (#1)
Re: ERROR: cannot pass more than 100 arguments to a function

Ján Máté wrote:

> Hi PostgreSQL,
>
> I noticed an interesting problem with the default maximum number of
> function arguments, especially in relation to JSON processing.
>
> I understand that there is a need to limit the maximum number of
> function arguments, but the current limit (100) is simply too
> restrictive for tables with a large number of columns (according to my
> findings, the column limit is >250, depending on column types).
>
> A simple example:
>
> - I have a table with 70 columns
>
> - I want to select the simplest JSON output from that table:
>
> SELECT
>   json_build_object(
>     'a', "col1",
>     'b', "col2",
>     'c', "col3",
>     ...
>   )
> FROM "schema"."table";
>
> - with the 100-argument limit I can select only 50 columns: 50 ×
> (JSON key name + column name) = 100 arguments
>
> - if the column limit in the table is >250, why can't I select at
> least 250 columns using the built-in json_build_object(...) function?
>
> Please consider increasing the default limit from 100 to at least 500
> (2× the maximum number of columns) for json_build_object(...) and
> other similar JSON functions.
>
> Kind regards,
> JM

If you need that many arguments to represent the columns in a table,
then you can use a single argument whose type is the table's row type, e.g.

create table t (...);
create or replace function f(a t) ...;
select * from f(row(...));

If you need that many arguments, and they don't represent a table,
you can create an empty table just for its type definition and then
do the above.
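Spelled out with hypothetical names (table t, function f), that pattern looks like:

    -- Illustrative only: table and function names are made up.
    CREATE TABLE t (id integer, name text);

    -- One composite-typed parameter stands in for any number of columns.
    CREATE OR REPLACE FUNCTION f(a t) RETURNS json AS $$
      SELECT row_to_json(a);
    $$ LANGUAGE sql;

    -- Pass a whole row as a single argument:
    SELECT f(t) FROM t;
    -- or build the row inline and cast it to the table's row type:
    SELECT f(ROW(1, 'alice')::t);

However wide the table gets, f still takes exactly one argument.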

cheers,
raf