As the year draws to a close it’s a good time to review my backup strategy and do a few test restores. My last backup review was in November of 2010. My overall philosophy hasn’t changed from last year – a file needs to exist in three places, two of them geographically separated. In addition, the backup has to be automatic since I’m lazy and forgetful. As for testing the restores it’s fairly simple. I restore some files from the oldest backups, some from the newest backups and some in between and compare them to the live files. I don’t restore everything and don’t do any full system restores. Now on to the strategy and tools used.
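A spot check like this is easy to script. Here's a minimal sketch in Python of the idea (the directory layout and the choice of SHA-256 are my assumptions, not part of any of the tools mentioned below):

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(restored_dir: Path, live_dir: Path) -> list[Path]:
    """Return relative paths whose restored copy differs from (or is missing
    from) the live files."""
    mismatches = []
    for restored in restored_dir.rglob("*"):
        if restored.is_file():
            rel = restored.relative_to(restored_dir)
            live = live_dir / rel
            if not live.is_file() or sha256(restored) != sha256(live):
                mismatches.append(rel)
    return mismatches
```

Restore a sample of files to a scratch directory, point the function at it and the live share, and an empty result means the sample checks out.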
All my data is kept on my Windows Home Server, so my backup strategy is centered around it. And yeah, many would probably consider it a bit overboard, since I go way above the three-file rule for the important stuff.
Windows Home Server Backups
Cloudberry Backup for Windows Home Server
I’ve been using the Cloudberry WHS add-in for most of the year and it’s served me well. I have 7 backup plans configured, backing up to 6 destinations. In total I’m currently backing up a tad under 6 TB of data on the Windows Home Server, all of it going to at least two different locations.
The important stuff gets backed up every hour to Amazon S3. This currently totals 13.6 GB of data, which includes versions going back a month. I also keep deleted files for 10 days before they’re purged. I compress and encrypt the data that goes to S3: I compress because I pay for bandwidth/space, and I encrypt since it’s in the cloud. This means I need to use Cloudberry to do any restores. Cloudberry does support server-side encryption in Amazon S3, but I do the encryption with my own key within the Cloudberry software before the data leaves my PC. Cloudberry just added real-time data protection (a.k.a. continuous backup), which I’m currently testing, but I haven’t used it enough to decide whether it will replace the hourly schedule.
My Windows Home Server has four data drives and four matching data backup drives, so there’s a Cloudberry backup plan for each data drive/backup drive pair. I use the drive-level backup option (rather than share-level) to select the files for backup. For each data drive I select the “Server Folders” directory on the drive and back up everything it contains. This avoids the recycle bin, shadow copies and other non-data files. These run once a night, about an hour apart, and everything gets backed up. Since all the drives are local SATA or eSATA the backups are pretty quick. The initial backups did take a while, especially since the MicroServers aren’t speed demons.
I have a second server in the house that’s currently running Windows Server 2008. The hardware was my original WHS 2011 hardware, but since I haven’t repurposed it yet it serves as a backup destination. I power it up every weekend and manually trigger the backups. There are two backup plans set up, but they go to the same file share on the server: one plan for video and another for everything else. The two plans are a holdover from the days when I was backing up a lot of video. The “Everything Else” backup typically takes longer since it goes through hundreds of thousands of files and backs up tens of thousands, although the total data is relatively small.
I don’t compress or encrypt any of the local backups. I can go directly to the file system and pull out the files if I need them although I do need the software if I want to more easily find versions or maintain the file structure.
I recently began using CrashPlan for additional offsite backup. It’s an economical way to store a lot of data offsite. It doesn’t officially support WHS, but it’s been working well. I’ve had a couple of occasions where it stopped doing backups and I had to cycle the service on the server, but the test restores have been fine. Now that the initial backups are done I’ve limited the backups to between 2 AM and 7 AM every day and throttled the bandwidth. I currently back up everything except video, which comes to 288 GB stored at CrashPlan. The problem with that much data is my 250 GB bandwidth cap. If I ever need to do a full restore I’ll either need to wait two months or have them send me the data on a drive.
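The two-month figure is just the stored data divided by the monthly cap; a quick sanity check (ignoring all other household traffic, which would stretch it out further):

```python
import math

backup_size_gb = 288   # data currently stored at CrashPlan
monthly_cap_gb = 250   # ISP bandwidth cap

# A full restore can pull down at most one cap's worth of data per month.
months_to_restore = math.ceil(backup_size_gb / monthly_cap_gb)  # 2
```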
Windows Home Server Native Backup
I have an external drive attached and use the WHS native backup to save the files needed for an OS recovery. My testing doesn’t include an OS restore, although it did work way back when I needed it. Worst case is an OS and add-in re-install, then restore the data.
As part of my offsite backups I have two 2 TB drives that I rotate offsite. Every week I bring one to my office desk drawer and bring the other home. The drive at home is attached to my Windows 7 PC, and every night a batch file runs Robocopy to update the drive with everything that’s not a video file. This drive is encrypted (TrueCrypt) so I don’t have to worry if it’s stolen from my desk.
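The nightly job itself is just a Robocopy mirror with video files excluded. As an illustration of the same idea, here’s a rough Python equivalent (the extension list is my assumption, and unlike Robocopy this naive version re-copies unchanged files instead of skipping them):

```python
import shutil
from pathlib import Path

# Which extensions count as "video" is an assumption for illustration;
# the real batch file has its own Robocopy exclusion list.
VIDEO_EXTS = {".mkv", ".avi", ".mp4", ".wmv", ".iso"}

def mirror_non_video(src: Path, dst: Path) -> int:
    """Copy every non-video file from src to dst, preserving the directory
    tree; return the number of files copied."""
    copied = 0
    for f in src.rglob("*"):
        if f.is_file() and f.suffix.lower() not in VIDEO_EXTS:
            target = dst / f.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves timestamps, like Robocopy
            copied += 1
    return copied
```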
There’s no offsite backup for my videos. I have two backups in the house along with the original source disks but if the worst happens to the house there’s nothing off site. I figure I’d have more to worry about than the videos and that is what insurance is for. I used to keep copies of the ripped videos on drives offsite but that became a hassle to keep updated as the number of drives grew. Especially since I tended to use old, retired 1 TB or smaller drives.
I need the software (CrashPlan or Cloudberry) to do a restore. Picking a file straight out of the local backups can work in a pinch, but that would be an exception. In the past I kept the same shares on another server and did a share-to-share Robocopy every night; in theory this made it easy to quickly switch to the new server. Now that I have several MicroServers, my approach to hardware failure is to already have a duplicate of everything in the house, so I can swap hardware around and restore any lost files. Actually, I don’t have duplicates of some of the hardware used for backups, but since there’s redundancy across the backups I figure I can lose updates to one while I wait for a new part. Of course, the failure will come at the worst time.
Windows PC & Virtual Machine Backups
This is simple. I use Windows Home Server 2011 backup to back up my Windows 7 PCs and virtual machines. As I said, data is on the Windows Home Server itself, so there’s not really much to back up. To keep backups small I exclude the virtual machine disk files. For testing I just went in and pulled a couple of files out of the backup; I didn’t do a full restore.
In 2010 I used Jungle Disk to do some offsite Windows backups, but I no longer use it. I don’t do any backups directly from a Windows PC to any offsite destination; everything goes to the WHS.
Mac Mini Backups
For my Mac Mini (desktop) I use SuperDuper to clone the hard drive every night. This gives me a disk I can boot from should my Mini’s drive fail. I also use Time Machine to back things up. Time Machine is a holdover from when I kept local data on the Mac; it’s useful should I need to recover an old configuration file, so I keep it.
I also run Arq Backup on the Mini to back up files to Amazon S3. I back up my Application Support folder along with my Documents folder (which is mostly empty). I have Arq Backup limited to $1/mth in Amazon charges, which works out to just over 10 GB, although it’s still only using 2.9 GB.
In 2010 I used Jungle Disk to do my offsite Mac backups, but I didn’t like the direction the software was taking. Plus, it didn’t support Amazon Reduced Redundancy Storage, which would have increased my costs.
MacBook Air Backups
I have a Seagate Portable drive I attach when I’m home and do a Time Machine backup to it. I’ll also pack it if I’ll be traveling for a few days, but it typically stays home for short trips. I do use the encryption feature of Time Machine in case the drive gets lost.
I also use Arq Backup on the Air, and it’s also limited to $1/mth (or 10.75 GB). Because it’s a laptop it does have more files locally, at least at times; there’s currently 3.9 GB stored on Amazon S3. I limit the backups to data files and a few configuration files. This is useful when I’m traveling since it keeps copies of the files far from the laptop.
Amazon S3 pricing isn’t the most straightforward because there are charges for bandwidth and other operations in addition to space used. My total Amazon S3 charge for November was $3.52, which is pretty typical, although down a bit from September and October when it was $4. The November charge was $1.81 for storage and $1.71 for those other charges. I use the Amazon Reduced Redundancy Storage option to keep costs down. I’m storing a total of 19 GB with Amazon S3. Amazon does offer a Free Usage Tier, which isn’t included in my numbers since it’s only good for a year.
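For anyone checking the math, the bill breaks down simply (the effective per-GB rate below is derived from my own numbers, not quoted from Amazon’s price list):

```python
storage_charge = 1.81   # 19 GB at Reduced Redundancy Storage rates
other_charges = 1.71    # requests, bandwidth, and other operations

november_total = round(storage_charge + other_charges, 2)  # 3.52

# Effective storage cost works out to roughly $0.10 per GB per month.
effective_rate = storage_charge / 19
```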
My Windows Home Server backup software of choice is Cloudberry since it’s so flexible. It has gained new features since I started using it, and while I don’t use them all, it’s nice to see the software get continual care. CrashPlan is my choice when there’s a requirement to store a lot of files in the cloud. CrashPlan is a bit less flexible when it comes to local sources, but it’s the better choice if you want more offsite flexibility.
Arq Backup is my choice for Mac backups and replaced Jungle Disk. In addition to a feature set I prefer, Arq Backup does an excellent job of handling the file attributes and file bundles that are common on the Mac platform.
Are you sure your backups work? You do backups, right?