bulk insert unique constraint

Started by Janek Sendrowski over 12 years ago · 5 messages · general
#1Janek Sendrowski
janek12@web.de

Hi,
 
I want to insert data into my table, but only the rows that don't violate the unique constraint on my id. I'm using a 64-bit hash for it.
If I do one INSERT statement that inserts many rows, it doesn't insert anything when a single row violates the unique constraint.
Is there a faster way than using multiple INSERT statements?
 
Janek

--
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general

#2Andy Colson
andy@squeakycode.net
In reply to: Janek Sendrowski (#1)
Re: bulk insert unique constraint

On 12/30/2013 8:14 AM, Janek Sendrowski wrote:

Hi,

I want to insert data into my table, but only the rows that don't violate the unique constraint on my id. I'm using a 64-bit hash for it.
If I do one INSERT statement that inserts many rows, it doesn't insert anything when a single row violates the unique constraint.
Is there a faster way than using multiple INSERT statements?

Janek

You could:

create table junk (like main);
copy junk from ...;
create index junkpk on junk(uid);
-- _not_ a unique index, optional
analyze junk;
select uid, count(*) from junk group by uid having count(*) > 1;
-- do something about the dups
insert into main select * from junk;
drop table junk;
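A fuller version of that recipe, as a sketch only — the table definition, file path, and CSV format are assumptions, not from the thread:

```sql
-- Assumed target table (hypothetical): main(uid bigint unique, data text)
create table junk (like main);

-- Load the raw file into the staging table (path and format are assumptions)
copy junk from '/tmp/data.csv' with (format csv);

create index junkpk on junk(uid);  -- not unique, just to speed up the joins
analyze junk;

-- Remove duplicates inside the staging table, keeping one row per uid
-- (ctid is the row's physical location and serves as a tie-breaker)
delete from junk a
using junk b
where a.uid = b.uid
  and a.ctid > b.ctid;

-- Insert only rows whose uid is not already in main
insert into main
select j.* from junk j
where not exists (select 1 from main m where m.uid = j.uid);

drop table junk;
```

The anti-join on the final insert keeps the unique constraint on main from aborting the whole statement when some uids are already present.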

-Andy


#3Janek Sendrowski
janek12@web.de
In reply to: Janek Sendrowski (#1)
Re: bulk insert unique constraint

Hi Salah Jubeh,
 
My files don't have this format. Would it make sense to create such a file with PHP and import the data from it?
Or is it also possible to get certain rows from the file with a regexp?
 
Janek


#4salah jubeh
s_jubeh@yahoo.com
In reply to: Janek Sendrowski (#3)
Re: bulk insert unique constraint

Hello Janek,

which inserts many rows it doesn't do anything if one row violates the unique constraint.

You are inserting your data in a single transaction; that is the default behaviour.

To work around it, do not insert the data like

INSERT INTO <TABLE> VALUES (),(),...;

but like

INSERT INTO <TABLE> VALUES ();
INSERT INTO <TABLE> VALUES ();

Also make sure each statement runs in a separate transaction.
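For example (table name and values are hypothetical), with each statement in its own transaction a duplicate only loses that one row:

```sql
-- In autocommit mode each statement is its own transaction:
INSERT INTO main VALUES (1, 'a');  -- committed
INSERT INTO main VALUES (1, 'b');  -- fails on the unique uid; (1,'a') stays
INSERT INTO main VALUES (2, 'c');  -- committed
```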

Is there a faster way than using multiple insert statements?

have a look at COPY
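A minimal COPY sketch, assuming a table main(uid, data) and a CSV file (both hypothetical):

```sql
COPY main (uid, data) FROM '/tmp/data.csv' WITH (FORMAT csv);
```

Note that COPY, like a multi-row INSERT, aborts entirely on the first constraint violation, which is why it is usually combined with a staging table as in the other reply.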

Regards

 

My files don't have this format. Would it make sense to create such a file with PHP and import the data from it?
Or is it also possible to get certain rows from the file with a regexp?

Sorry for not forwarding my reply earlier to the mailing list.

An easier option would be the one suggested by Andy Colson elsewhere in this thread. Here is another option:

1. Disable the unique constraint: ALTER TABLE ... DROP CONSTRAINT
2. Remove the duplicates; have a look here: http://wiki.postgresql.org/wiki/Deleting_duplicates
3. Re-enable the unique constraint.
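A sketch of those three steps — the table, column, and constraint names are assumptions, and the DELETE follows the ctid-based recipe from the wiki page linked above:

```sql
-- 1. Drop the unique constraint (constraint name is hypothetical)
ALTER TABLE main DROP CONSTRAINT main_uid_key;

-- 2. Bulk-load the data, then delete duplicates, keeping one row per uid
--    (ctid identifies a row's physical location and breaks ties)
DELETE FROM main a
USING main b
WHERE a.uid = b.uid
  AND a.ctid > b.ctid;

-- 3. Put the constraint back
ALTER TABLE main ADD CONSTRAINT main_uid_key UNIQUE (uid);
```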

Regards

On Monday, December 30, 2013 4:47 PM, Janek Sendrowski <janek12@web.de> wrote:

Hi Salah Jubeh,
 
My files don't have this format. Would it make sense to create such a file with PHP and import the data from it?
Or is it also possible to get certain rows from the file with a regexp?

 
Janek


#5Janek Sendrowski
janek12@web.de
In reply to: salah jubeh (#4)
Re: bulk insert unique constraint

Hi Salah Jubeh,
 
I'm not in a hurry ;)
Thanks for your answer. That's what I've been searching for.
 
Janek Sendrowski
