dump of 700 GB database

Started by karsten vennemann · about 16 years ago · 2 messages · general
#1 karsten vennemann
karsten@terragis.net

I need to write a 700 GB database out to a dump in order to clean out a lot of dead records, on an Ubuntu server running PostgreSQL 8.3.8. What is the proper procedure to succeed with this? Last time the dump stopped at 3.8 GB, I guess. Should I combine the -Fc option of pg_dump with the split command?
I thought something like

    pg_dump -Fc test | split -b 1000m - testdb.dump

might work?
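If that works, I assume the restore side would then be something along these lines (the target database name newdb is just an example):

    # reassemble the split pieces in order and feed the custom-format
    # dump to pg_restore on stdin
    cat testdb.dump* | pg_restore -d newdb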
Karsten

Terra GIS LTD
Seattle, WA, USA

#2 John R Pierce
pierce@hogranch.com
In reply to: karsten vennemann (#1)
Re: dump of 700 GB database

karsten vennemann wrote:

> I need to write a 700 GB database out to a dump in order to clean out
> a lot of dead records, on an Ubuntu server running PostgreSQL 8.3.8.
> What is the proper procedure to succeed with this? Last time the dump
> stopped at 3.8 GB, I guess. Should I combine the -Fc option of pg_dump
> with the split command?

A vacuum should clean out the dead tuples; after that, clustering any
large tables that are still bloated will sort them out without needing
too much temporary space.
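For example, something like this (the table and index names here are
made up; substitute whatever is actually bloated):

    # reclaim dead tuples database-wide first
    psql -d test -c "VACUUM;"

    # rewrite a bloated table in index order; note that CLUSTER takes an
    # exclusive lock and needs free disk space for the rewritten copy
    psql -d test -c "CLUSTER big_table USING big_table_pkey;"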