User demand, and an idea, for C-code conversions from JSON arrays to PostgreSQL arrays
All,
Over in the "JSON[B] arrays are second-class citizens" thread [1] I made
the observation that the only way to get a PostgreSQL array from a JSON
array is via the "elements->cast->array_agg" chain. For JSON arrays that
are homogeneous in nature, the capability to go "directly" from JSON to
json[], text[], bigint[], etc. seems like it would have some value.
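For reference, the chain in question looks something like this today (bigint
here is just an illustrative target type):

-- unnest the JSON array as text, cast each element, then re-aggregate
SELECT array_agg(elem::bigint) AS result
FROM jsonb_array_elements_text('[1, 2, 3]'::jsonb) AS t(elem);
-- result: {1,2,3}, i.e. a bigint[]

It works, but it is a lot of machinery for what is conceptually a single
conversion.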
A couple of community members have chimed in so far. Maybe this ends up
on a ToDo somewhere, but for the moment I figured I'd float the idea on its
own thread since the other one is really about a different (though somewhat
related) topic.
In the spirit of the json_populate_record-like functions, something with the
following signature seems usable:
as_array(anyarray, jsonb) : anyarray [2]
so that actual calls look like:
SELECT as_array(null::text[], '["a", "b", "c"]'::jsonb)
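For anyone wanting the behavior today, here is a rough user-level sketch in
PL/pgSQL (illustrative only; the names and the reliance on a text round trip
are mine, not a proposed implementation, and it only handles arrays of
scalar values):

CREATE FUNCTION as_array(tmpl anyarray, j jsonb)
RETURNS anyarray
LANGUAGE plpgsql
AS $$
DECLARE
  result tmpl%TYPE;  -- resolves to the caller's array type, e.g. text[]
BEGIN
  -- unnest as text, re-aggregate, then cast the whole array to the target type
  EXECUTE format(
    'SELECT array_agg(elem)::%s FROM jsonb_array_elements_text($1) AS t(elem)',
    pg_typeof(tmpl))
  INTO result
  USING j;
  RETURN result;
END
$$;

SELECT as_array(null::text[], '["a", "b", "c"]'::jsonb);  -- {a,b,c}
SELECT as_array(null::bigint[], '[1, 2, 3]'::jsonb);      -- {1,2,3}

A proper in-core version presumably wouldn't need the dynamic SQL or the
round trip through text.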
For better or worse, while every table gets a corresponding composite type,
explicitly created array types cannot be treated as row types in this
situation:
SELECT jsonb_populate_record(null::text[], '["a", "b", "c"]'::jsonb)
ERROR: first argument of jsonb_populate_record must be a row type
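For contrast, a genuine row type (for example a table's automatically created
composite type) is accepted; the table name here is purely illustrative:

CREATE TABLE kv (a text, b int);
SELECT * FROM jsonb_populate_record(null::kv, '{"a": "x", "b": 1}'::jsonb);
--  a | b
-- ---+---
--  x | 1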
Loosening the restriction and allowing jsonb_populate_record to fulfill the
role of the above-described "as_array" function would be something to
consider, but it is likely neither possible nor worth the effort to make
happen.
David J.
[1]: /messages/by-id/CADkLM=fSC+otuBmzoJT6Riyksue3HpHgu2=Mofcv=fd0derhGg@mail.gmail.com
[2]: ROTQ[3] - whose idea was it to put the type first and the data second?
[3]: Rhetorical Off-Topic Question