Best pg_dump practices

Started by MT · almost 23 years ago · 3 messages · general
#1 MT
m_tessier@sympatico.ca

Hi,

Now that I've painstakingly entered my data into the db, I'd like to perform regular backups using pg_dump, like so:

pg_dump -c -f dumpfile.sql dbname

This will give me the data in its original, pristine form. Note that using pg_dump this way means that the data gets dumped as COPY statements too. Is there a way to dump only the db objects (i.e. tables, sequences, etc.) and exclude the data?

Then, as the db is used, I would perform daily backups, automated with cron.

pg_dump -a -f daily_dumpfile.sql dbname
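
A minimal crontab entry for that daily run might look like this (the schedule, path, and passwordless connection are just example assumptions):

```
# m  h  dom mon dow  command
30   2  *   *   *    pg_dump -a -f /var/backups/daily_dumpfile.sql dbname
```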

I would then tar and gzip the daily_dumpfile.sql and upload it to a backup server.
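
For a single dump file, gzip alone is enough (tar only buys you anything when bundling several files), and the compress/retrieve round trip is lossless. A quick sketch, using a stand-in file rather than a real dump:

```shell
# Work in a scratch directory
cd "$(mktemp -d)"

# Stand-in for a real daily dump file
printf 'COPY mytable FROM stdin;\n' > daily_dumpfile.sql

# Compress before uploading (replaces the file with daily_dumpfile.sql.gz)
gzip daily_dumpfile.sql

# After retrieving it from the backup server, decompress
gunzip daily_dumpfile.sql.gz

# The restored file is byte-for-byte identical to the original
RESTORED=$(cat daily_dumpfile.sql)
echo "$RESTORED"
```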

Now, if the database should suddenly crash, I would retrieve the dumpfiles, unpack them and

psql dbname
\i dumpfile.sql
This would create the db in its original form.

\i daily_dumpfile.sql
This would bring the reconstituted db up to date.
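
If the restore is scripted rather than typed interactively, psql's -f option does the same thing as \i, so the two steps above could be run non-interactively (a sketch; it assumes the dump files are already retrieved and unpacked, and the server is running):

```
# Recreate the db objects and original data (dumpfile.sql was made with -c,
# so it drops and recreates everything)
psql dbname -f dumpfile.sql

# Replay the daily data-only dump on top
psql dbname -f daily_dumpfile.sql
```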

Then again, I'm not sure if this works. Furthermore, maybe someone could recommend a better way to perform this task.

--
Thanks,

Mark

#2 Dmitri Bichko
dbichko@genpathpharma.com
In reply to: MT (#1)
Re: Best pg_dump practices

> pg_dump -c -f dumpfile.sql dbname
>
> This will give me the data in its original, pristine form. Note that
> using pg_dump this way means that the data gets dumped as copy too. Is
> there a way to dump only the db objects (ie. tables, sequences, etc)
> and exclude the data.

The -s (--schema-only) flag dumps only the schema, not the data... At
least according to pg_dump --help
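
So, for the split the first message asks about, something along these lines should do it (file names here are just examples):

```
# One-time dump of the db objects only -- tables, sequences, etc., no data
pg_dump -s -c -f schema_dumpfile.sql dbname

# Daily data-only dump, as already planned
pg_dump -a -f daily_dumpfile.sql dbname
```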

Dmitri

#3 Berend Tober
btober@seaworthysys.com
In reply to: Dmitri Bichko (#2)
Re: Best pg_dump practices

>> pg_dump -c -f dumpfile.sql dbname
>>
>> This will give me the data in its original, pristine form. Note that
>> using pg_dump this way means that the data gets dumped as copy too. Is
>> there a way to dump only the db objects (ie. tables, sequences, etc)
>> and exclude the data.
>
> The -s (--schema-only) flag dumps only the schema, not the data... At
> least according to pg_dump --help

It does in fact work as the manual describes. I've used it. What I usually
do for back-ups, though (although you wouldn't want to do this for a huge
database or with sensitive data), is

#!/bin/bash
NAIL=/usr/local/bin/nail
OUTPUT_FILE=dbname_`date +%Y%m%d`
OUTPUT_PATH=/tmp/

# Dump schema and data for backup
pg_dump -U username dbname > ${OUTPUT_PATH}${OUTPUT_FILE}.sql

# Compress for mailing
gzip ${OUTPUT_PATH}${OUTPUT_FILE}.sql

# Send it off-site
echo | ${NAIL} -a ${OUTPUT_PATH}${OUTPUT_FILE}.sql.gz -s ${OUTPUT_FILE} \
    btober_at_computer_dot_org

OUTPUT_FILE=globals_`date +%Y%m%d`
OUTPUT_PATH=/tmp/

# Dump user and group names for backup
pg_dumpall -g -U postgres > ${OUTPUT_PATH}${OUTPUT_FILE}.sql

# Send it off-site
echo | ${NAIL} -a ${OUTPUT_PATH}${OUTPUT_FILE}.sql -s ${OUTPUT_FILE} \
    btober_at_computer_dot_org

~Berend Tober