Back Up Your Database
Backing up your database doesn't have to be complicated, but it is necessary. See how to set up a basic script that backs up your database and moves it to the cloud.
Creating a regular backup of your database is a task that can be accomplished very easily, yet it is quite often neglected until the minute you face your first disaster. Backup solutions range from complex, engineered pipelines down to something as simple as copy and paste, and how much complexity you need depends on your application. So what would be a minimal solution that is still viable and stable?
The simplest solution, in my opinion, should contain these elements:
- Create a dump of the database.
- Copy the dump to another location (different from the DB server).
Now, depending on your RDBMS and operating system, you can take different approaches to implementing these two tasks. On a Linux system with MySQL or PostgreSQL, my approach would be to write a shell script that performs both. One such script might look like this:
```shell
#!/bin/bash
DATE=`date +%Y%m%d-%s`
FILE=/path_to_bckp_folder/$DATE-blog.sql
mysqldump db_name > $FILE
```
The DATE variable holds the current timestamp, which is used to construct the name of the backup file. The FILE variable holds the complete path of the backup file. Finally, the mysqldump command dumps the database named db_name into that file.
So far, this code completes the first task: backing up the database. The next task is to copy the dump somewhere else so we don't have the database and its backup on the same disk. That "somewhere" could be another server, another disk, or cloud storage. My preferred choice is cloud storage such as AWS S3. With the AWS CLI installed, copying the backup file to an S3 bucket is a single line:
```shell
aws s3 cp $FILE s3://bucket_path
```
The complete file would look like:
```shell
#!/bin/bash
DATE=`date +%Y%m%d-%s`
FILE=/path_to_bckp_folder/$DATE-blog.sql
mysqldump db_name > $FILE
aws s3 cp $FILE s3://blog-db-bckp
```
I know a lot of people might not agree that this is a good solution, but in my opinion, it is the minimum code that does the work. What you need to do next is add this script to the /etc/cron.daily folder, and it will execute daily and do the backup for you (please make sure that permissions are set correctly so it can be executed).
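That installation step can be sketched as follows. Note the assumptions: CRON_DIR defaults to a scratch path here so the sketch is safe to run as-is; on a real server you would point it at /etc/cron.daily and run the commands as root.

```shell
# Install the backup script for daily execution. CRON_DIR defaults to a
# scratch path for safety; use /etc/cron.daily on a real server.
CRON_DIR=${CRON_DIR:-/tmp/cron.daily}
mkdir -p "$CRON_DIR"

# run-parts, which executes cron.daily jobs on many distributions, skips
# file names containing dots, so install as "db-backup", not "db-backup.sh".
cat > "$CRON_DIR/db-backup" <<'EOF'
#!/bin/bash
DATE=`date +%Y%m%d-%s`
FILE=/path_to_bckp_folder/$DATE-blog.sql
mysqldump db_name > $FILE
aws s3 cp $FILE s3://blog-db-bckp
EOF

# Without the executable bit, cron silently skips the script.
chmod 755 "$CRON_DIR/db-backup"
```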
As a bonus, I would also add a "done" notification to the script, which could send an email, post to a Slack channel, or use whatever notification suits you, so you know that your script has executed.
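A minimal sketch of such a notification, using curl against a Slack incoming webhook. SLACK_WEBHOOK_URL is a placeholder you would create in Slack first; email or any other channel works just as well.

```shell
# Send a short "done" message once the backup has finished.
# SLACK_WEBHOOK_URL is a placeholder for a real Slack incoming-webhook URL.
notify() {
  local message="$1"
  if [ -n "$SLACK_WEBHOOK_URL" ]; then
    curl -s -X POST -H 'Content-Type: application/json' \
      -d "{\"text\": \"$message\"}" "$SLACK_WEBHOOK_URL"
  else
    # Fall back to stdout when no webhook is configured.
    echo "$message"
  fi
}

notify "Database backup finished: $FILE uploaded to S3"
```

Calling notify at the end of the backup script keeps the notification in the same cron run, so a missing message also tells you the job never completed.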
Published at DZone with permission of Arian Celina, DZone MVB. See the original article here.
Opinions expressed by DZone contributors are their own.