Ignore when using COPY FROM
I have a ton of data in a text delimited file from an old legacy system.
When uploading it into postgres, I'd do something like this:
COPY stuff FROM 'stuff.txt' USING DELIMITERS '|';
The problem is some of the rows in stuff.txt may not conform to the
stuff table attributes (duplicate keys, for example). The command above
terminates with no change to the table stuff when it encounters an error
in stuff.txt. Is there a way to have postgres ignore erroneous rows
but keep information about which rows weren't processed? I believe
Oracle has some support for this through an IGNORE clause.
Well, you could always make a table that has no constraints on duplicates
and COPY into that one. Then write a query that inserts the data into your
production table and handles the duplicates.
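A sketch of that staging-table approach, assuming the table is "stuff" with a primary-key column "id" (both names are guesses; substitute your own schema):

```sql
-- Staging table with the same columns as "stuff" but no constraints.
CREATE TABLE stuff_load AS SELECT * FROM stuff WHERE false;

-- The whole file loads, since nothing can be rejected here.
COPY stuff_load FROM 'stuff.txt' USING DELIMITERS '|';

-- Inspect the rows that would have aborted the direct COPY:
SELECT * FROM stuff_load sl
 WHERE EXISTS (SELECT 1 FROM stuff s WHERE s.id = sl.id);

-- Then move only the non-conflicting rows into production.
INSERT INTO stuff
SELECT * FROM stuff_load sl
 WHERE NOT EXISTS (SELECT 1 FROM stuff s WHERE s.id = sl.id);
```

Note that duplicate keys *within* the file itself still need separate handling (e.g. a GROUP BY on the key in the final SELECT), since the NOT EXISTS test only compares against rows already in production.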
Adam Lang
Systems Engineer
Rutgers Casualty Insurance Company
----- Original Message -----
From: "Matthew Kennedy" <mkennedy@hssinc.com>
To: <pgsql-general@postgresql.org>
Sent: Tuesday, August 29, 2000 10:15 AM
Subject: [GENERAL] Ignore when using COPY FROM
* Adam Lang <aalang@rutgersinsurance.com> [000829 08:29] wrote:
Well, you could always make a table that has no constraints on duplicates
and COPY TO that one. Then, make a query that inserts the data into your
production table that handles the duplicates.
Actually, last I checked, COPY FROM actually goes through the RULE system,
so I'm pretty sure one can set up a rule to check for violated constraints
and 'INSTEAD DO NOTHING'. :-)
-Alfred
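A sketch of the rule Alfred describes, again assuming table "stuff" with key column "id" (hypothetical names). Whether COPY actually fires rules depends on the server version, as Alfred's "last I checked" hedges:

```sql
-- Any INSERT whose key already exists in "stuff" is silently dropped,
-- so a load that would have aborted on a duplicate skips that row instead.
CREATE RULE stuff_ignore_dups AS
    ON INSERT TO stuff
    WHERE EXISTS (SELECT 1 FROM stuff WHERE id = NEW.id)
    DO INSTEAD NOTHING;
```

Unlike the staging-table approach, this silently discards the rejects, so it doesn't by itself answer the "keep information about which rows weren't processed" part of the question.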
Yes, I have the same problem. This functionality is "almost" standard
and is found on other RDBMSs. How do new features get added to the TODO
list? Where can I view the current TODO list?
Fredrick
Added to TODO:
* Allow COPY to specify column names
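For illustration, the column-list form that TODO item asks for would look something like this (not available in the 7.0-era server discussed here; column names are assumptions):

```sql
-- Columns in the file are matched to the listed columns, in order;
-- unlisted columns would get their defaults.
COPY stuff (id, name, price) FROM 'stuff.txt' USING DELIMITERS '|';
```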
--
Bruce Momjian | http://candle.pha.pa.us
pgman@candle.pha.pa.us | (610) 853-3000
+ If your life is a hard drive, | 830 Blythe Avenue
+ Christ can be your backup. | Drexel Hill, Pennsylvania 19026