Hourly database backup


Solution 1

Write a small script like:

#!/bin/bash
#
# Do the Backup
#
CURTIME=$(date "+%Y%m%d-%H%M")

mysqldump --user=<dbusername> --password=<dbpassword> --all-databases | lzma -c -9 -z >/backup/db-${CURTIME}.dump.lzma

and put it in your crontab. (If you run it hourly, it is even easier: just drop the script into /etc/cron.hourly.)
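If you prefer an explicit schedule over /etc/cron.hourly, a crontab entry works too. The path below is an assumed location for the script above, not something from the original:

```shell
# m  h  dom mon dow  command
0  *  *  *  *  /usr/local/bin/db-backup.sh
```

The `0 * * * *` fields mean "at minute 0 of every hour"; adjust the minute if you want backups offset from other hourly jobs.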

Solution 2

Here is the bash script I have running on my webserver. It has served me well for some time now.

It includes a part at the end to remove old backups. You can specify the number of files you would like to keep in the variables section. You have to uncomment that part of the script to get it to run.

#!/bin/sh

#################################################################
#  Define your variables here:
#################################################################

FILESTOKEEP=7
BACKUP_DIR=/home/user/backups
BMYSQL_USER=mysql_user
BMYSQL_PWD=mypassword

DATE=$(date +"%m-%d-%Y")_$(date +"%T")

BMYSQL_HOST=localhost
BMYSQL_DBNAME=--all-databases
BMYSQL_DBFILENAME=MYSQL_BACKUP_$DATE


#################################################################
#  Make sure output directory exists.
#################################################################

        if [ ! -d "$BACKUP_DIR" ]; then
            mkdir -p "$BACKUP_DIR"
        fi


#################################################################
#  Create backup
#################################################################

        mysqldump --host="$BMYSQL_HOST" --user="$BMYSQL_USER" --password="$BMYSQL_PWD" $BMYSQL_DBNAME | gzip > "$BACKUP_DIR/$BMYSQL_DBFILENAME.gz"

#################################################################
#  Remove old backups 
#  - this will list files according to date (DESC)
#  - skip the first few files (FILESTOKEEP)
#  - remove all files past that
#  NOTE: Make sure not to save the backups into any directory
#  where there are files other than these backups.
#
#  Uncomment when you are confident in rest of setup
#################################################################

#      cd $BACKUP_DIR
#      ls -t1 | tail -n +$(($FILESTOKEEP+1)) | xargs -r rm
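Before uncommenting the removal section, you can exercise the same pipeline safely in a scratch directory. This is a dry run only; FILESTOKEEP and the dummy filenames here are illustrative:

```shell
#!/bin/sh
# Dry run of the retention pipeline in a throwaway directory.
FILESTOKEEP=3
SCRATCH=$(mktemp -d)
cd "$SCRATCH" || exit 1

# Create five dummy "backups" with distinct modification times,
# so that ls -t has a well-defined newest-first order.
for i in 1 2 3 4 5; do
    touch -t "2022010${i}0000" "MYSQL_BACKUP_$i.gz"
done

# Same pipeline as the commented-out section above:
# list newest first, skip the first FILESTOKEEP, delete the rest.
ls -t1 | tail -n +$((FILESTOKEEP+1)) | xargs -r rm

ls -1 | wc -l   # prints 3: only the newest three remain
```

The `-r` flag tells GNU xargs to do nothing when the list is empty, so `rm` is never run without arguments.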

I like saving my scripts in a directory under my user home. This may not be standard practice, but it makes them easy to find and edit later (read: a couple of years later). I would save this script as:

/home/user/scripts/hourly_backup_mysql.sh

To run it via cron every two hours, open a terminal and type (as root):

crontab -e

In there, enter a new line like this:

0 0-23/2 * * * /home/user/scripts/hourly_backup_mysql.sh
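Most crons (Vixie/ISC cron included) also accept the shorter step syntax, which is equivalent to the `0-23/2` range above:

```shell
0 */2 * * * /home/user/scripts/hourly_backup_mysql.sh
```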

(If you are unfamiliar with vim: press 'i' to enter Insert mode, go to the end of the file, and press Enter for a new line. Paste the line and edit it as needed. When you are finished, press ESC to leave Insert mode, then type ':wq' to write the file and quit.)

This will now run every 2 hours, and if you uncommented the removal section, it will also delete old backups.

Cheers!

Solution 3

Per-table database dump, gzipped, then rsynced somewhere else:

#!/usr/bin/ruby
require 'mysql'

rsyncTargets = [
  ["files1", "/path/to/backup/dir/"],
  ["files2", "/path/to/backup/dir/"],
  ["files3", "/path/to/backup/dir/"]
]

tempDir = "/mnt/extra-space"
dumpUser = "root"

# Clear the staging area before dumping.
`rm -r #{tempDir}/*`

con = Mysql.real_connect('localhost',dumpUser,'','')

con.query('show databases').each do |db|
  print " Dumping - " + db[0] + "\n"
  Dir.mkdir("#{tempDir}/#{db[0]}")
  con.query("use #{db[0]}")
  con.query('show tables').each do |table|
    `mysqldump -u #{dumpUser} #{db[0]} #{table[0]} | gzip > #{tempDir}/#{db[0]}/#{table[0]}.sql.gz`
  end
end
con.close()

rsyncTargets.each do |server|
  `rsync -av --delete #{tempDir}/ rsync://#{server[0]}:#{server[1]}`
end

Taken from: http://techhelplist.com/index.php/tech-tutorials/42-databases/80-ruby-script-for-mysql-dump-gzip-cron-jobs

Author: Fcoder

Updated on September 18, 2022