postgres import

Started by Antonios Katsikadamos, over 19 years ago, 9 messages, general
#1Antonios Katsikadamos
antonioskatsikadamos@yahoo.com

Hi all,

I have a little problem: I have an .sql file (a db dump) and I want to import it into PostgreSQL on Linux.

Does anyone know how I can do it?

Thanks a lot, mates


#2Albe Laurenz
all@adv.magwien.gv.at
In reply to: Antonios Katsikadamos (#1)
Re: postgres import

> I have a little problem: I have an .sql file (a db dump)
> and I want to import it into PostgreSQL on Linux.
>
> Does anyone know how I can do it?

You feed it to the command line interface psql.

Example:
psql -h host -p port -d database -U user <dump.sql

Yours,
Laurenz Albe

#3Tomi NA
hefest@gmail.com
In reply to: Albe Laurenz (#2)
Re: postgres import

2006/10/31, Albe Laurenz <all@adv.magwien.gv.at>:
>> I have a little problem: I have an .sql file (a db dump)
>> and I want to import it into PostgreSQL on Linux.
>>
>> Does anyone know how I can do it?
>
> You feed it to the command line interface psql.
>
> Example:
> psql -h host -p port -d database -U user <dump.sql

It's a good enough solution in most cases, but when the row count
starts to skyrocket, it simply doesn't seem to cut it (at least I
couldn't make it work). To load 1.5M rows (~230MB of INSERT statements),
I used gvim (wonderful tool!) to transform the INSERT statements into
a CSV file and then used an ETL tool (Kettle, another wonderful tool)
to import the data into the database.
This could probably have been done much more easily: I'd welcome a
helpful hint so I know for next time. :)
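
The gvim substitutions could equally be scripted. A rough sed sketch, assuming single-row INSERTs whose values contain no embedded commas, quotes, or parentheses (table and file names are made up):

```shell
# Strip the SQL wrapper from each INSERT line, leaving one CSV row per line.
# Only safe for values without embedded commas, quotes, or parentheses.
sed -e "s/^INSERT INTO [a-z_]* VALUES (//" \
    -e "s/);$//" \
    -e "s/'//g" \
    -e "s/, /,/g" dump.sql > data.csv
```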

t.n.a.

#4Ron Johnson
ron.l.johnson@cox.net
In reply to: Tomi NA (#3)
Text manipulation tools (was Re: postgres import)


On 10/31/06 17:17, Tomi NA wrote:
> 2006/10/31, Albe Laurenz <all@adv.magwien.gv.at>:
>
> [snip]
>
> It's a good enough solution in most cases, but when the row count
> starts to skyrocket, it simply doesn't seem to cut it (at least I
> couldn't make it work). To load 1.5M rows (~230MB of INSERT statements),
> I used gvim (wonderful tool!) to transform the INSERT statements into
> a CSV file and then used an ETL tool (Kettle, another wonderful tool)
> to import the data into the database.
> This could probably have been done much more easily: I'd welcome a
> helpful hint so I know for next time. :)

Unix is chock full of streaming text manipulation tools. In this
case, awk, Perl or Python would work well.
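
For instance, a rough awk sketch of that INSERT-to-CSV transformation (hypothetical names; assumes single-row INSERTs with no embedded commas or quotes in the values):

```shell
# Stream INSERT lines and print the bare values as CSV rows.
awk -v q="'" '/^INSERT/ {
    sub(/^INSERT INTO [a-z_]+ VALUES \(/, "")  # strip the SQL wrapper
    sub(/\);$/, "")                            # drop the trailing ");"
    gsub(q, "")                                # drop the single quotes
    gsub(/, /, ",")                            # tighten the separators
    print
}' dump.sql > data.csv
```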

--
Ron Johnson, Jr.
Jefferson LA USA


#5Tom Lane
tgl@sss.pgh.pa.us
In reply to: Tomi NA (#3)
Re: postgres import

"Tomi NA" <hefest@gmail.com> writes:
> 2006/10/31, Albe Laurenz <all@adv.magwien.gv.at>:
>> psql -h host -p port -d database -U user <dump.sql
>
> It's a good enough solution in most cases, but when the row count
> starts to skyrocket, it simply doesn't seem to cut it (at least I
> couldn't make it work).

It certainly should work. We've seen some platforms where libreadline
seems to be unable to tell the difference between input from a terminal
and input from a file, and performs a boatload of processing that would
be useful for interactive input but is just overhead here. If that's
your problem, try this form instead:

psql -h host -p port -d database -U user -f dump.sql
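
For long loads it can also help to stop at the first error and run the whole file as one transaction, if your psql is new enough to support these flags (same placeholder connection parameters as above):

```shell
# Abort on the first error instead of ploughing on, and make the
# whole load atomic (all rows in, or none):
psql -h host -p port -d database -U user \
     --single-transaction -v ON_ERROR_STOP=1 -f dump.sql
```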

regards, tom lane

#6Alban Hertroys
alban@magproductions.nl
In reply to: Tomi NA (#3)
Re: postgres import

Tomi NA wrote:
> 2006/10/31, Albe Laurenz <all@adv.magwien.gv.at>:
>> You feed it to the command line interface psql.
>>
>> Example:
>> psql -h host -p port -d database -U user <dump.sql
>
> It's a good enough solution in most cases, but when the row count
> starts to skyrocket, it simply doesn't seem to cut it (at least I
> couldn't make it work). To load 1.5M rows (~230MB of INSERT statements),

INSERT statements? You dumped with the -d flag, didn't you? Otherwise
you'd have seen COPY statements instead, which are much faster (and of
which usually far fewer are necessary).

--
Alban Hertroys
alban@magproductions.nl

magproductions b.v.

T: ++31(0)534346874
F: ++31(0)534346876
M:
I: www.magproductions.nl
A: Postbus 416
7500 AK Enschede

// Integrate Your World //

#7Tomi NA
hefest@gmail.com
In reply to: Alban Hertroys (#6)
Re: postgres import

2006/11/1, Alban Hertroys <alban@magproductions.nl>:
> Tomi NA wrote:
>> 2006/10/31, Albe Laurenz <all@adv.magwien.gv.at>:
>>> You feed it to the command line interface psql.
>>>
>>> Example:
>>> psql -h host -p port -d database -U user <dump.sql
>>
>> It's a good enough solution in most cases, but when the row count
>> starts to skyrocket, it simply doesn't seem to cut it (at least I
>> couldn't make it work). To load 1.5M rows (~230MB of INSERT statements),
>
> INSERT statements? You dumped with the -d flag, didn't you? Otherwise
> you'd have seen COPY statements instead, which are much faster (and of
> which usually far fewer are necessary).

No I didn't, actually. :) The data was never in the database in the
first place: it was generated from a different source. True, it was
generated as a CSV file which I converted into INSERT statements, but
conversion between the two is not a problem (given 1.5GB of RAM).

t.n.a.

#8Albe Laurenz
all@adv.magwien.gv.at
In reply to: Tomi NA (#7)
Re: postgres import

>>>> psql -h host -p port -d database -U user <dump.sql
>>>
>>> It's a good enough solution in most cases, but when the row count
>>> starts to skyrocket, it simply doesn't seem to cut it (at least I
>>> couldn't make it work).
>>
>> INSERT statements? You dumped with the -d flag, didn't you?
>
> No I didn't, actually. :) The data was never in the database in the
> first place: it was generated from a different source. True, it was
> generated as a CSV file which I converted into INSERT statements, but
> conversion between the two is not a problem (given 1.5GB of RAM).

Then the best way is to convert it back to a CSV and use the COPY
statement to load it into the table (or \copy from psql).
You don't need any third party tools for that; it's all in PostgreSQL.
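
For example, from the shell (table and file names invented):

```shell
# \copy runs the COPY on the client side, so data.csv only needs to be
# readable on the machine where psql runs, not on the server:
psql -h host -p port -d database -U user \
     -c "\copy mytable FROM 'data.csv' WITH CSV"
```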

Yours,
Laurenz Albe

#9Tomi NA
hefest@gmail.com
In reply to: Albe Laurenz (#8)
Re: postgres import

2006/11/2, Albe Laurenz <all@adv.magwien.gv.at>:

>>>>> psql -h host -p port -d database -U user <dump.sql
>>>>
>>>> It's a good enough solution in most cases, but when the row count
>>>> starts to skyrocket, it simply doesn't seem to cut it (at least I
>>>> couldn't make it work).
>>>
>>> INSERT statements? You dumped with the -d flag, didn't you?
>>
>> No I didn't, actually. :) The data was never in the database in the
>> first place: it was generated from a different source. True, it was
>> generated as a CSV file which I converted into INSERT statements, but
>> conversion between the two is not a problem (given 1.5GB of RAM).
>
> Then the best way is to convert it back to a CSV and use the COPY
> statement to load it into the table (or \copy from psql).
> You don't need any third party tools for that; it's all in PostgreSQL.

I had a problem with COPY, but I can't remember what exactly... come to
think of it, it could probably have done the job... assuming I define
the primary key as DEFAULT nextval('id'), as I had no id in the rows I
was importing... nice to have alternatives. Thanks for the suggestion.
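
Something like this sketch is what I had in mind (all names hypothetical): a sequence-backed default for the id, then COPY only the data columns:

```shell
psql -d database <<'SQL'
CREATE TABLE mytable (
    id   serial PRIMARY KEY,  -- shorthand for a sequence-backed DEFAULT nextval(...)
    col1 text,
    col2 integer
);
-- COPY only the data columns; id fills in from the sequence
\copy mytable(col1, col2) FROM 'data.csv' WITH CSV
SQL
```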

t.n.a.