abusing plpgsql array variables
If I want to pass in a text[] argument to a plpgsql function, at what
array size am I asking for problems? 100? 10,000? 100,000?
What severity of problems might I encounter? Bad performance? Postgres
refusing to run my query? A crashed backend?
On Wed, Sep 10, 2008 at 4:03 PM, Ben <bench@silentmedia.com> wrote:
If I want to pass in a text[] argument to a plpgsql function, at what array
size am I asking for problems? 100? 10,000? 100,000?
What severity of problems might I encounter? Bad performance? Postgres
refusing to run my query? A crashed backend?
see for yourself:
postgres=# select length(array_to_string(array(select generate_series(1,100000)), '|'));
length
--------
588894
(1 row)
Time: 107.283 ms
(array operations can get inefficient when the arrays are large, but it's mainly cpu-bound stuff).
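passing an array that size into a plpgsql function is just as painless.
a trivial sketch (count_elems is just a made-up name):

CREATE OR REPLACE FUNCTION count_elems(vals text[]) RETURNS int AS $$
BEGIN
  -- array_upper gives the last index of dimension 1 (NULL if empty)
  RETURN array_upper(vals, 1);
END;
$$ LANGUAGE plpgsql;

postgres=# select count_elems(array(select generate_series(1,100000)::text));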
merlin
If I want to pass in a text[] argument to a plpgsql function, at what array
size am I asking for problems? 100? 10,000? 100,000?
What severity of problems might I encounter? Bad performance? Postgres
refusing to run my query? A crashed backend?
Yeah, like you I was pretty worried about how it would handle larger
arrays. But I was surprised to find that it did a super job of handling
even large arrays.
One warning though. If you are going to filter a table based on values
in a large array, don't do something like:
WHERE foo = ANY (some_large_array)
Instead, explode it using a set-returning function and join it like a table:
JOIN explode(some_large_array) e ON ...
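Note that explode() isn't a built-in; a minimal plpgsql version would look
something like this (8.4's new unnest() will do the same job):

CREATE OR REPLACE FUNCTION explode(arr text[]) RETURNS SETOF text AS $$
BEGIN
    -- walk the array once, emitting one row per element; the coalesces
    -- make an empty or NULL array yield zero rows instead of an error
    FOR i IN coalesce(array_lower(arr, 1), 1) ..
             coalesce(array_upper(arr, 1), 0) LOOP
        RETURN NEXT arr[i];
    END LOOP;
    RETURN;
END;
$$ LANGUAGE plpgsql IMMUTABLE;

-- then filter with a join (some_table/foo stand in for your own schema):
SELECT t.*
FROM some_table t
JOIN explode(some_large_array) AS e(val) ON t.foo = e.val;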
Artacus <artacus@comcast.net> writes:
If I want to pass in a text[] argument to a plpgsql function, at what
array size am I asking for problems? 100? 10,000? 100,000?
Yeah, like you I was pretty worried about how it would handle larger
arrays. But I was surprised to find that it did a super job of handling
even large arrays.
Note that this depends quite a lot on what datatype the array elements
are. Fixed-width types like int or float should be fast.
Variable-width types like text or numeric, not so much.
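You can see the effect by subscripting every element in a loop; with int[]
each arr[i] fetch is a constant-time offset, while with text[] each
subscript has to rescan the array from the front, so a sketch like this
(scan_all is just an illustrative name) goes quadratic as the array grows:

CREATE OR REPLACE FUNCTION scan_all(arr text[]) RETURNS int AS $$
DECLARE
    n int := 0;
BEGIN
    FOR i IN coalesce(array_lower(arr, 1), 1) ..
             coalesce(array_upper(arr, 1), 0) LOOP
        -- on a variable-width element type, each arr[i] walks the array
        -- from the start; the same loop over int[] stays linear
        IF arr[i] IS NOT NULL THEN
            n := n + 1;
        END IF;
    END LOOP;
    RETURN n;
END;
$$ LANGUAGE plpgsql;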
regards, tom lane