MySQL / WordPress Database Backup

While my server image gets backed up daily by Slicehost, and my web files get backed up daily by me, there's a potential gap in my WordPress database backup strategy. I back it up whenever I think of it, but since it's a manual process it gets done less and less frequently. If my database gets corrupted, my only option is to restore the last server image unless I happened to do a WordPress backup recently. While that's pretty quick, it's also a bit drastic.

So it's time to remedy the situation. The WordPress Codex has instructions for various ways to back up a WordPress database; I decided to go with a variation of "Using Straight MySQL Commands". It's the easiest and most reliable way for me to do the backup.

I'll be setting up the backup to run daily via a cron job; then, as part of my regularly scheduled file backup, it gets copied down to my Mac so a copy lives off the server. Once it's on my Mac, the versions are managed by my backup software in case I need to go back a day or two.

I'm running MySQL 5 on Ubuntu 8.10 Server. Here's how I set it up:

Enable Cron

I haven't been running cron for my user, and I wanted the SQL backup to run under my user ID. So the first thing I had to do was enable user-level cron for my ID. I'll do this by creating a cron.allow file and adding my ID to it.

sudo nano /etc/cron.allow

I type my ID on the first line and save the file.
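For example, assuming the Ubuntu user name is demo (matching the home directory used throughout this post), the entire contents of /etc/cron.allow would simply be:

demo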

Set Up the MySQL Logon

Rather than specify the user name and password on the command line, I'll set up a configuration file to handle all the MySQL logons. While the user name and password will be in clear text, I'll secure the file so only my ID can access it.

sudo nano /home/demo/.my.cnf

In these examples "demo" is my home directory; substitute your own. Note the leading '.' in the file name. I add the following lines to the file. Substitute your ID/password as appropriate. Note that this is the MySQL ID and password, not your Ubuntu ID.

[client]
user=root
password=idpassword

Save the file.
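Since the file holds the MySQL password in clear text, it's worth restricting it so only your ID can read it. A minimal way to do that, assuming the same path as above:

chmod 600 /home/demo/.my.cnf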

Create the Bash Script and Supporting Directories

I’ll create a directory to hold all my backups and then a bash script to run the backup.

mkdir /home/demo/backup

Then I'll lock down the permissions on the directory so only my user and group can read it; everyone else is shut out.

chmod 750 /home/demo/backup
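To double-check that the permissions took effect, ls -ld shows the directory's mode; the output should start with drwxr-x--- and list your ID as the owner:

ls -ld /home/demo/backup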

Now I’ll create the bash script file.

Open the editor and create the file: nano /home/demo/mysqlbackup.sh

Type the following lines into the file (the second line wraps in this display but is typed all on one line):
#!/bin/sh

mysqldump --add-drop-table -h localhost --all-databases | bzip2 -c >/home/demo/backup/mysqlbackup.sql.bz2

chmod 750 /home/demo/backup/mysqlbackup.sql.bz2

The first line just says the file is a shell script.

The second line does all the real work. It runs the application mysqldump. The --add-drop-table parameter was recommended by WordPress. It adds a DROP TABLE statement before each table in the backup, so during the import (restore) any table that already exists gets dropped automatically and you don't have to delete it yourself.
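For reference, a restore is essentially the backup pipeline in reverse. A rough sketch, assuming the same file location and that the .my.cnf file above supplies the credentials:

bunzip2 -c /home/demo/backup/mysqlbackup.sql.bz2 | mysql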

I'm running everything on the server so the host parameter is localhost, and --all-databases dumps every database to one file, as the name implies.

Everything is piped to bzip2 so that the output file is compressed. The output file is specified after the >.

It's worth noting that compressing the file is CPU intensive, and the server will briefly hit 100% CPU during the backup. My backup takes a couple of seconds and results in a 4MB file. It's 27MB uncompressed, and the CPU rarely tops 30% when doing an uncompressed backup. But since I copy the file to my PC, the compression is well worth the one- or two-second hit.

The chmod line sets permissions on the backup file so that only my user and group can read it. The file contains things like passwords, and I'm paranoid, so I take the additional step of setting the permissions explicitly. The default profile for my ID creates files that are readable by the world; even though the directory is blocked to others, it's no real effort to take the additional step.
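If you're curious what permissions new files get by default, the shell's umask builtin prints the current mask; the common 0022 default is what produces world-readable files:

umask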

Test the script by running: /home/demo/mysqlbackup.sh
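If the shell refuses with a "Permission denied" error, the script just needs its execute bit set before it can be run directly (cron will need that too). One way to do it:

chmod 700 /home/demo/mysqlbackup.sh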

The backup should run and create the file in the backup directory. Once it does, all that's left is to schedule the cron job.

Schedule the Cron Job

Run: crontab -e

This will open the crontab file in your default editor (nano in my case). You can find more info on the crontab format in the Ubuntu documentation. I added the following line to my crontab file:

0 5 * * * /home/demo/mysqlbackup.sh

This will run the backup at 5am every morning. (My server is set to UTC.) When you save the file it will be checked for errors, and if none are found you're ready to go.
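To confirm the entry was saved, crontab -l lists the jobs scheduled for the current user:

crontab -l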

I'd previously written about how I schedule website backups with Transmit. While that article is almost two years old, I still follow the same process today. After being copied to my Mac, the file is backed up by my regular backup software, which handles keeping a few recent versions of the file around.

Now I can sleep better tonight knowing I have a backup of my MySQL databases.