Bulk data insertion

Started by Jonathan Daugherty, over 21 years ago · 2 messages · general
#1 Jonathan Daugherty
jdaugherty@commandprompt.com

Hello,

I have a PL/PgSQL function that I need to call with some ARRAY
parameters. These array values are very large -- typically thousands of
elements. Each element is a 4-element array. This function is called
to do some sanity checking on the array data and use the individual
elements to do inserts where appropriate.

The problem is that I don't want to spend a lot of time and memory
building such a query (in C). I would like to know if there is a way to
take this huge chunk of data and get it into the database in a less
memory-intensive way. I suppose I could use COPY to put the data into a
table with triggers that would do the checks on the data, but it seems
inelegant and I'd like to know if there's a better way.
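For the client side, a sketch of how the data could be streamed without building one giant query string, using libpq's COPY API (`PQputCopyData`/`PQputCopyEnd`). The staging table name and the `next_row` callback are hypothetical; error handling is minimal:

```c
#include <stdio.h>
#include <string.h>
#include <libpq-fe.h>

/* Stream rows into a staging table via COPY, one small text buffer at
 * a time, instead of assembling a huge INSERT or function call.
 * next_row() fills buf with one tab-separated line ("1\t2\t3\t4\n")
 * and returns 0 when no rows remain; it is an assumed helper. */
int copy_rows(PGconn *conn, int (*next_row)(char *buf, size_t len))
{
    PGresult *res = PQexec(conn, "COPY staging FROM STDIN");
    if (PQresultStatus(res) != PGRES_COPY_IN) {
        fprintf(stderr, "COPY failed: %s", PQerrorMessage(conn));
        PQclear(res);
        return -1;
    }
    PQclear(res);

    char line[256];
    while (next_row(line, sizeof line)) {
        if (PQputCopyData(conn, line, strlen(line)) != 1)
            return -1;              /* connection-level failure */
    }
    if (PQputCopyEnd(conn, NULL) != 1)
        return -1;

    /* Collect the COPY command's final status. */
    res = PQgetResult(conn);
    int ok = (PQresultStatus(res) == PGRES_COMMAND_OK) ? 0 : -1;
    PQclear(res);
    return ok;
}
```

Memory use stays bounded by the line buffer rather than growing with the number of array elements.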

Thoughts? Thanks for your time.

--
Jonathan Daugherty
Command Prompt, Inc. - http://www.commandprompt.com/
PostgreSQL Replication & Support Services, (503) 667-4564

#2 Tom Lane
tgl@sss.pgh.pa.us
In reply to: Jonathan Daugherty (#1)
Re: Bulk data insertion

Jonathan Daugherty <jdaugherty@commandprompt.com> writes:

> The problem is that I don't want to spend a lot of time and memory
> building such a query (in C). I would like to know if there is a way to
> take this huge chunk of data and get it into the database in a less
> memory-intensive way. I suppose I could use COPY to put the data into a
> table with triggers that would do the checks on the data, but it seems
> inelegant and I'd like to know if there's a better way.

Actually I'd say that is the elegant way. SQL is fundamentally a
set-oriented (table-oriented) language, and forcing it to do things in
an array fashion is just misusing the tool.
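A minimal sketch of that set-oriented approach: COPY into a staging table whose trigger sanity-checks each row and forwards it to the real table. All table, column, and function names here are hypothetical, as is the check itself:

```sql
-- Staging table matching the 4-element rows.
CREATE TABLE staging (a integer, b integer, c integer, d integer);

-- Check each incoming row and route the good ones to the real table.
CREATE FUNCTION staging_check() RETURNS trigger AS $$
BEGIN
    IF NEW.a IS NULL OR NEW.b < 0 THEN   -- example sanity check
        RAISE EXCEPTION 'bad row: (%, %, %, %)',
            NEW.a, NEW.b, NEW.c, NEW.d;
    END IF;
    INSERT INTO target_table VALUES (NEW.a, NEW.b, NEW.c, NEW.d);
    RETURN NULL;   -- suppress the insert into staging itself
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER staging_insert BEFORE INSERT ON staging
    FOR EACH ROW EXECUTE PROCEDURE staging_check();
```

The client then simply runs `COPY staging FROM STDIN` and streams the rows; because the BEFORE trigger returns NULL, the staging table stays empty and never needs cleanup.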

regards, tom lane