Shell script to Backup/Dump Database
Hi,
I'm looking for a shell script to backup/dump my PostgreSQL database twice a day.
I will need to do a full dump (schema + data), a schema dump (schema only), and
a data dump (data only).
Is there something already done on this subject? I'm very poor at shell scripting;
are there some scripts on the web ready to use?
Best Regards,
On 12 May 2010 16:30, Andre Lopes <lopes80andre@gmail.com> wrote:
Yes, several of us discussed something like this recently. Are you looking
to back up all databases on the cluster or just an individual database?
If it's a cluster, you could have a shell script like this:
#!/bin/bash
backup_dir=/tmp/$(date +%Y%m%d)
mkdir -p "$backup_dir"
pg_dumpall -U postgres | gzip > "$backup_dir"/FULL.sql.gz
pg_dumpall -s -U postgres | gzip > "$backup_dir"/SCHEMA.sql.gz
pg_dumpall -a -U postgres | gzip > "$backup_dir"/DATA.sql.gz
If you want a single database, try this:
#!/bin/bash
backup_dir=/tmp/$(date +%Y%m%d)
mkdir -p "$backup_dir"
pg_dump -U postgres my_database | gzip > "$backup_dir"/my_database_FULL.sql.gz
pg_dump -s -U postgres my_database | gzip > "$backup_dir"/my_database_SCHEMA.sql.gz
pg_dump -a -U postgres my_database | gzip > "$backup_dir"/my_database_DATA.sql.gz
If you want each individual database, try this:
#!/bin/bash
backup_dir=/tmp/$(date +%Y%m%d)
mkdir -p "$backup_dir"
query="select datname from pg_database where not datistemplate and datallowconn;"
for db in $(psql -U postgres -At -c "$query" postgres)
do
    pg_dump -U postgres "$db" | gzip > "$backup_dir"/"$db"_FULL.sql.gz
    pg_dump -s -U postgres "$db" | gzip > "$backup_dir"/"$db"_SCHEMA.sql.gz
    pg_dump -a -U postgres "$db" | gzip > "$backup_dir"/"$db"_DATA.sql.gz
done
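For reference, restoring one of these plain-SQL dumps later is just a matter of feeding it back through psql. A sketch (the date and database name are examples; the target database must already exist, since pg_dump without -C doesn't create it):

```
gunzip -c /tmp/20100512/my_database_FULL.sql.gz | psql -U postgres my_database
```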
Obviously you can change the output location to something other than /tmp if
you're not going to transfer the dumps anywhere; just make sure the path exists.
If you are copying them away immediately afterwards, make sure you delete the
local copies or they'll build up.
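A small housekeeping sketch along those lines (it assumes the YYYYMMDD directory naming used above; the BACKUP_ROOT default and the 7-day retention are illustrative, so adjust to taste):

```shell
#!/bin/bash
# Prune dated backup directories older than 7 days.
# BACKUP_ROOT defaults to /tmp; directories are assumed to be
# named YYYYMMDD as in the backup scripts above.
BACKUP_ROOT=${BACKUP_ROOT:-/tmp}
find "$BACKUP_ROOT" -maxdepth 1 -type d \
    -name '20[0-9][0-9][0-1][0-9][0-3][0-9]' -mtime +7 -exec rm -r {} +
```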
If you want to schedule it, bung it in a shell script like
backup_databases.sh and stick it in the root crontab.
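An entry along these lines in root's crontab would run it twice a day (the times and the script path are just examples; note that a bare % is special to cron, so if you ever put a date command directly into a crontab line its format string needs escaping as \%, whereas inside a script file no escaping is needed):

```
# m h dom mon dow command
0 6,18 * * * /usr/local/bin/backup_databases.sh
```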
Regards
Thom
Thanks for the reply.
I will test it today.
Are there any good practices I should follow? For example, I have this
database on shared hosting; should I add a step that sends the dumps by FTP
to my PC once a week? Please give me some more ideas for avoiding data loss
in case of disaster.
Best Regards,
On Wed, May 12, 2010 at 4:45 PM, Thom Brown <thom@linux.com> wrote:
On 12 May 2010 17:02, Andre Lopes <lopes80andre@gmail.com> wrote:
Can you afford to lose 6 days' worth of data? If not, get it to send backups
to another place on a daily basis. You can always add an scp step to the
script to copy the dumps to another Unix box, bearing in mind you'll need to add
the ssh key of the user on your shared host to ~/.ssh/authorized_keys at your
destination. Or just use FTP like you mentioned.
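A sketch of such an scp step, tacked onto the end of backup_databases.sh (the user, host, and destination path here are made up, and key-based ssh is assumed to be set up already):

```
backup_dir=/tmp/$(date +%Y%m%d)
scp -r "$backup_dir" backup@backuphost.example.com:/var/backups/postgres/
```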
Thom