How can I run a Cron job to make backups of a MySQL server by SSH
Solution 1
You can use this handy one-liner, run from the local side of the connection:
ssh user@remoteserver "mysqldump -h myhost -u myuser -pmypass mydb | gzip" > /path/to/my/dump.sql.gz
In short, the output generated by your mysqldump command is piped to gzip on the remote side of the connection and written to stdout, which is then redirected to /path/to/my/dump.sql.gz on your NAS. Only the compressed data is sent over the network.
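Since cron cannot answer interactive password prompts, this assumes key-based SSH authentication is already set up from the NAS to the server. A nightly crontab entry on the NAS could then look like the following sketch; the schedule, user names, and destination path are placeholders for your environment:

```shell
# Run at 02:30 every night. Note: % is special in crontab and must be escaped as \%
30 2 * * * ssh user@remoteserver "mysqldump -h myhost -u myuser -pmypass mydb | gzip" > /share/backups/mydb-$(date +\%F).sql.gz
```

Date-stamping the file name keeps each night's dump instead of overwriting a single file.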
You can expand this slightly by running the following in a script (this is equivalent to what John has specified in his answer):
ssh user@remoteserver "mysqldump -h myhost -u myuser -pmypass mydb | gzip > /tmp/dump.sql.gz"
scp user@remoteserver:/tmp/dump.sql.gz /path/to/my/dump.sql.gz
ssh user@remoteserver "rm -f /tmp/dump.sql.gz"
This is a slightly longer-winded approach: it dumps and compresses the entire database first, then copies it over the network via scp, and finally removes the remote copy.
Solution 2
OK, I'll share the personal method I use to take a daily, archived backup of all MySQL databases.
- Create a file mysqlbackup.sh in /bin (or any other place you like) with the following code:
#!/bin/bash
# modify the following to suit your environment
export DAYS="3"
export DB_BACKUP="/backup/"
export DB_USER="root"
export DB_PASSWD="<your root password>"
# title and version
echo ""
echo "mySQL_backup"
echo "----------------------"
echo "* Deleting OLD backups ..."
# this will delete files older than DAYS
find $DB_BACKUP -mtime +$DAYS -exec rm -rf {} \;
echo "* Creating new backup..."
# the next command takes a backup compressed with bzip2, saved in DB_BACKUP
mysqldump --all-databases | bzip2 > $DB_BACKUP/mysql-`date +%Y-%m-%d-%H:%M`.bz2
echo "----------------------"
echo "Done"
exit 0
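The age-based pruning in that script relies on find -mtime. This standalone sketch shows the same predicate deleting only archives older than the retention window; the temporary directory and file names are made up for illustration, and GNU touch's -d option is assumed:

```shell
#!/bin/sh
# Demonstrate the age-based pruning used above, in a throwaway directory.
DAYS=3
TMP=$(mktemp -d)

touch -d "10 days ago" "$TMP/mysql-old.bz2"   # backdated past the retention window
touch "$TMP/mysql-new.bz2"                    # fresh archive, should survive

# Same predicate as the backup script; -type f avoids matching the directory itself
find "$TMP" -type f -mtime +"$DAYS" -exec rm -f {} \;

ls "$TMP"    # only mysql-new.bz2 is left
rm -rf "$TMP"
```

One caveat worth knowing: without -type f, the original script's find can also match the backup directory itself.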
- Save that file.
- Mark the script executable using chmod +x mysqlbackup.sh
- Set up the script in cron for a daily run: 0 0 * * * /bin/mysqlbackup.sh will create a backup at midnight each day.
- Rsync /backup to your NAS daily using cron. For setting up rsync you can use the links below:
http://www.thegeekstuff.com/2011/07/rsync-over-ssh-without-password/
https://blogs.oracle.com/jkini/entry/how_to_scp_scp_and
OR
- You can modify the original script to include the rsync of your folder as well; in that case you don't need to set up a second cron job.
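For the sync step, an rsync-over-SSH cron entry might look like the following sketch; the NAS hostname, user, and destination path are placeholders, and passwordless SSH keys are assumed to be in place:

```shell
# On the database server: push /backup to the NAS at 01:00, after the dump completes
0 1 * * * rsync -az -e ssh /backup/ backupuser@mynas:/share/mysql-backups/
```

The trailing slash on /backup/ copies the directory's contents rather than creating a nested backup directory on the destination.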
Best of luck....
Pluc
Updated on September 18, 2022
Comments
-
Pluc over 1 year
I have a production server running CentOS with a MySQL database. At home, I have a QNAP NAS (which runs a small embedded Linux). I want to set up a cron job on my NAS to back up the data from my CentOS production server. The problem is that I don't want to transfer the data uncompressed (the SQL dump will end up around 5 GB). I want to SSH onto the server, run the SQL dump, compress the result, and download it.
Is this possible? What would be the most efficient way?
-
Kl4m over 10 years: To save a lot of bandwidth, use "gzip --rsyncable" to create the file, then use rsync instead of scp to transfer.
-
ObiwanKeTobi over 10 years: @Pluc - not quite. The first example would gzip the output of mysqldump before it hits the network. Note the position of the gzip command, i.e. before the closing quote of the SSH command. Anything within the double quotes is executed remotely. @Kl4m: --rsyncable does not have any effect if you're transferring the dump once. It only affects subsequent transfers.
-
Pluc over 10 years: @CraigWatson Ok, thank you. Could you just change "-D" to "-d" in your answer? It is case sensitive :)
-
ObiwanKeTobi over 10 years: My mistake, not sure where I got -D from. The correct flag is -B to select multiple databases, or you can just specify a single database as the last argument with no flag.
-
Pluc over 10 years: @CraigWatson Yes, but please fix your answer to avoid confusion for future readers.
-
ObiwanKeTobi over 10 years: I edited my answer a couple of seconds after posting my comment - my SSH command now uses the 'no flag' approach.