How to take a MySQL database backup and put it in Amazon S3 every night using crontab?
Solution 1
Should be pretty straightforward:
- backup your database using mysqldump
mysqldump -u [uname] -p[pass] myfleet | gzip -9 > myfleet.sql.gz
- upload your dump file to S3 using a command-line client (e.g. s3cmd, http://s3tools.org/s3cmd):
s3cmd put myfleet.sql.gz s3://<bucketname>/myfleet.sql.gz
Just add this to your crontab (you might want to use some kind of date or numbering scheme for the dump files, in case you want to keep several versions).
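The two steps above can be combined into a single crontab entry. A minimal sketch, assuming the dump runs nightly at 00:30 and using placeholder credentials, paths, and the asker's bucket name (`fleetBucket`) — substitute your own values:

```shell
# m  h  dom mon dow  command
# Run at 00:30 every night; date suffix keeps one dump per day.
# NOTE: % is special in crontab and must be escaped as \%.
30 0 * * * mysqldump -u dbuser -pdbpass myfleet | gzip -9 > /var/backups/myfleet-$(date +\%F).sql.gz && s3cmd put /var/backups/myfleet-$(date +\%F).sql.gz s3://fleetBucket/myfleet-$(date +\%F).sql.gz
```

Install it with `crontab -e` as the user that has MySQL and s3cmd credentials configured. The `\%` escaping is required because an unescaped `%` in a crontab command field is treated as a newline.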
Solution 2
You can also use STDOUT and the AWS CLI tool to pipe the output of your mysqldump straight to S3:
mysqldump -h [db_hostname] -u [db_user] -p[db_passwd] [databasename] | aws s3 cp - s3://[s3_bucketname]/[mysqldump_filename]
For example:
mysqldump -h localhost -u db_user -ppassword test-database | aws s3 cp - s3://database-mysqldump-bucket/test-database-dump.sql
The mysqldump command outputs to STDOUT by default. Using - as the source argument for aws s3 cp tells the AWS CLI tool to read its input from STDIN.
Solution 3
mysqldump --host=$HOST --user=$USER --password=$PASSWORD $DB_NAME --routines --single-transaction | gzip -9 | aws s3 cp - s3://bucket/database/filename.sql.gz
This stores the compressed dump directly in S3 without writing a local file first.
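For use from cron, the one-liner above can be wrapped in a small script. This is a sketch under assumptions: the host, credentials, database, and bucket names below are all placeholders, and it presumes the AWS CLI is already configured for the cron user:

```shell
#!/bin/sh
# Nightly MySQL-to-S3 backup sketch; all names below are placeholders.
set -e  # abort the script if a command fails

HOST=db.example.com
USER=dbuser
PASSWORD=secret          # better: read credentials from ~/.my.cnf instead
DB_NAME=myfleet
BUCKET=s3://fleetBucket/database

# Stream the dump through gzip straight to S3; nothing is written locally.
mysqldump --host="$HOST" --user="$USER" --password="$PASSWORD" \
  --routines --single-transaction "$DB_NAME" \
  | gzip -9 \
  | aws s3 cp - "$BUCKET/$DB_NAME-$(date +%F).sql.gz"
```

Save it as, say, `/usr/local/bin/db-backup.sh`, make it executable, and call it from a crontab line such as `30 0 * * * /usr/local/bin/db-backup.sh` (no `%` escaping needed since the date call lives inside the script).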
Solution 4
If the source DB is on AWS and is Aurora MySQL, you can back up directly to S3 with a command like
SELECT * FROM employees INTO OUTFILE S3 's3-us-west-2://aurora-select-into-s3-pdx/sample_employee_data';
See https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.SaveIntoS3.html for details.
maaz
Updated on June 09, 2022
Comments
-
maaz almost 2 years
I have a server at Rackspace and I already run a cron job every midnight to process something (an account-related operation that sends me an email every midnight). My application is in Groovy on Grails. Now I want to take a backup of the MySQL database (called myfleet) every midnight and put that file in Amazon S3. How can I do that? Do I need to write a Java or Groovy file to process it, or can it be done from the Linux box itself? I already have an Amazon S3 account (the bucket name is fleetBucket).
-
maaz almost 13 years Thank you Frank Schmitt... I did not get it, can you expand on it a bit? What does this snippet mean?
-
maaz almost 13 years Yes, I got it. I have gone through s3tools.org/s3cmd and made the required setup. Will just adding s3cmd put myfleet.sql.gz s3://<bucketname>/myfleet.sql.gz to the crontab work? That is the command, right? And I need to specify that the script runs every night, right?
-
Frank Schmitt almost 13 years gzip -9 compresses the output of mysqldump (9 is the compression level).
-
maaz almost 13 years Thanks. Can you give the whole cron job script for running this command at midnight and putting the dump in S3?
-
Federico almost 5 years Note that while this might not generate a file on the machine running the client, the data does still flow through it, which matters for non-trivial database sizes. See my answer for a truly direct way.
-
nimai over 4 years The Aurora constraint is a big one. Also, note that this doesn't dump the database and is therefore not really what I would call a backup: it only exports the data from one table. My view of a database "backup" is that it should save the data plus the schema, so that you can replicate it in a new database in case of a disaster.