BUG #15083: [54000] ERROR: total size of jsonb array elements exceeds the maximum of 268435455 bytes
The following bug has been logged on the website:
Bug reference: 15083
Logged by: Vinston Pandiyan
Email address: vinston.pandiyan@gmail.com
PostgreSQL version: 10.0
Operating system: CentOS
Description:
Hi All,
When I try to update rows in a json array (of type jsonb), I get the
following error:
[54000] ERROR: total size of jsonb array elements exceeds the maximum of 268435455 bytes
Is there any way to fix this or to increase the jsonb size limit?
Thanks
On Friday, February 23, 2018, PG Bug reporting form <noreply@postgresql.org>
wrote:
Is there any way to fix this or to increase the jsonb size limit?
Duplicate of 15079
Is there a way to fix this? I want to keep 500k rows in a json or jsonb field.
Please help me with this: is it possible to do in Postgres?
On Fri, Feb 23, 2018 at 12:42 PM, David G. Johnston <
david.g.johnston@gmail.com> wrote:
Duplicate of 15079
On Fri, Feb 23, 2018 at 11:45 AM, Vinston Pandiyan <
vinston.pandiyan@gmail.com> wrote:
Is there a way to fix this? I want to keep 500k rows in a json or jsonb field.
Please help me with this: is it possible to do in Postgres?
For the typical user, no, there is no way to get rid of the limitation; you will need to decide how you want to go about avoiding it.
David J.
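As a note on avoiding the limit: the 268435455-byte (2^28 - 1) cap applies to a single jsonb value, so one common approach is to split a large array across multiple rows and reassemble it on read. A minimal client-side sketch in Python; the chunk size and any table layout are illustrative assumptions, not something from this thread:

```python
import json

# Stay well under PostgreSQL's per-value jsonb cap of
# 268435455 bytes (2^28 - 1); this margin is an arbitrary choice.
MAX_CHUNK_BYTES = 64 * 1024 * 1024

def chunk_array(items, max_bytes=MAX_CHUNK_BYTES):
    """Split a list into sub-lists whose compact JSON text stays under max_bytes.

    A single item larger than max_bytes still becomes its own chunk.
    """
    chunks, current, size = [], [], 2  # 2 bytes for the surrounding "[]"
    for item in items:
        encoded = json.dumps(item, separators=(",", ":"))
        if current and size + len(encoded) + 1 > max_bytes:
            chunks.append(current)
            current, size = [], 2
        current.append(item)
        size += len(encoded) + 1  # +1 for the separating comma
    if current:
        chunks.append(current)
    return chunks
```

Each chunk can then be inserted as its own row (for example in a hypothetical table keyed by a document id plus a part number) and the full array reassembled with an ORDER BY on the part number.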
Can you suggest any workaround for this? Or is it possible to compress the
data before storing it?
On Fri, Feb 23, 2018 at 1:48 PM, David G. Johnston <
david.g.johnston@gmail.com> wrote:
For the typical user, no, there is no way to get rid of the limitation; you will need to decide how you want to go about avoiding it.
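On the compression question: jsonb values are already compressed on disk by TOAST, but the 268435455-byte limit is enforced on the uncompressed jsonb representation when the value is built, so compression inside the type does not raise it. What can work, at the cost of losing jsonb operators and indexing, is compressing the JSON text on the client and storing it in a bytea column. A minimal sketch in Python; the storage layout is an assumption, not something suggested in this thread:

```python
import gzip
import json

def pack(obj):
    """Serialize a document to compressed bytes suitable for a bytea column."""
    return gzip.compress(json.dumps(obj).encode("utf-8"))

def unpack(blob):
    """Restore the original document from the compressed bytes."""
    return json.loads(gzip.decompress(blob).decode("utf-8"))
```

The bytea column then holds opaque compressed bytes, so querying inside the document requires decompressing on the client first.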