duplicate key value violates unique constraint "tableName_pk"
Hi,
I am using COPY FROM to bulk-load data into a PostgreSQL database from a file.
While inserting the rows, if a primary key already exists,
the whole transaction is aborted.
Postgres cannot handle constraint violations during COPY, as I found at
http://wiki.postgresql.org/wiki/COPY#Caveats_with_implementation .
Is there any way to bulk-insert data using COPY FROM while
REPLACE-ing or IGNORE-ing duplicates?
Thanks & Regards
Saravanan.
On Wed, Nov 7, 2012 at 4:51 PM, Saravanan Nagarajan <n.saravanan86@gmail.com>
wrote:
Did you look at http://pgbulkload.projects.pgfoundry.org/pg_bulkload.html ?
It advertises the features you are looking for. I would also expect
pg_bulkload to perform as well as, if not better than, the COPY command.
Thanks,
Pavan
On 11/07/2012 07:21 PM, Saravanan Nagarajan wrote:
Typically you COPY into a temporary or UNLOGGED table, then merge the
data into the target table using appropriate `UPDATE ... FROM ... WHERE`
and `INSERT ... SELECT ... WHERE` statements.
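The staging-then-merge approach above can be sketched roughly as follows. The table and column names (`target`, `staging`, `id`, `val`) and the file path are hypothetical placeholders; this assumes a target table with primary key `id` and that the staging data itself contains no duplicate keys:

```sql
-- Stage the file into an unlogged table with the same shape as the target.
CREATE UNLOGGED TABLE staging (LIKE target INCLUDING DEFAULTS);
COPY staging FROM '/path/to/data.csv' WITH (FORMAT csv);

BEGIN;
-- REPLACE semantics: overwrite rows whose key already exists in the target.
UPDATE target t
SET    val = s.val
FROM   staging s
WHERE  t.id = s.id;

-- IGNORE-style insert for the rest: only keys not already present.
INSERT INTO target (id, val)
SELECT s.id, s.val
FROM   staging s
WHERE  NOT EXISTS (SELECT 1 FROM target t WHERE t.id = s.id);
COMMIT;

DROP TABLE staging;
```

If the input file may contain duplicate keys within itself, deduplicate the staging rows first (e.g. with `SELECT DISTINCT ON (id) ...`) before merging, or the UPDATE/INSERT pair can still hit the unique constraint.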
--
Craig Ringer