
Backup Review

As the year draws to a close it’s a good time to review my backup strategy and do a few test restores. My last backup review was in November of 2010. My overall philosophy hasn’t changed from last year – a file needs to exist in three places, two of them geographically separated. In addition, the backup has to be automatic since I’m lazy and forgetful. As for testing the restores, it’s fairly simple: I restore some files from the oldest backups, some from the newest and some in between, and compare them to the live files. I don’t restore everything and don’t do any full system restores. Now on to the strategy and tools used.

All my data is kept on my Windows Home Server, so my backup strategy is centered around it. And yes, many would probably consider it a bit overboard since I go way above the three-copy rule for the important stuff.

Windows Home Server Backups

Cloudberry Backup for Windows Home Server

I’ve been using the Cloudberry WHS add-in for most of the year and it’s served me well. I have 7 backup plans configured and they back up to 6 destinations. In total, I’m currently backing up a tad under 6 TB of data on the Windows Home Server with all of that 6 TB going to at least two different locations.

The important stuff gets backed up every hour to Amazon S3. This currently totals 13.6 GB of data, which includes versions going back a month. I also keep deleted files for 10 days before they’re purged. I compress and encrypt the data that goes to S3: I compress because I pay for bandwidth and space, and I encrypt since it’s in the cloud. This means I need to use Cloudberry to do any restores. Cloudberry does support server-side encryption in Amazon S3, but I do the encryption using my own key within the Cloudberry software before the data leaves my PC. Cloudberry just added real-time data protection (a.k.a. continuous backup), which I’m currently testing, though I haven’t used it enough to decide whether it will replace the hourly schedule.
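Cloudberry handles the client-side encryption internally, so all of the above is just configuration in its UI. For the curious, here’s a rough Python sketch of the same compress-and-encrypt-before-upload idea, assuming the boto3 and cryptography packages; the bucket name, key handling and file path are placeholders, not what Cloudberry actually does under the hood.

```python
# A minimal sketch of compressing and encrypting a file locally before an S3
# upload. This is an illustration of the concept, not Cloudberry's code.
# Assumes AWS credentials are configured and boto3/cryptography are installed.
import zlib
import boto3
from cryptography.fernet import Fernet

KEY = Fernet.generate_key()   # in practice this would be a key I keep and never lose
BUCKET = "my-backup-bucket"   # hypothetical bucket name

def backup_file(path: str) -> None:
    # Read the file, compress it, encrypt it locally, then push the ciphertext
    # to S3 so only encrypted data ever leaves the house.
    with open(path, "rb") as f:
        ciphertext = Fernet(KEY).encrypt(zlib.compress(f.read()))
    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=path.lstrip("/"),
        Body=ciphertext,
        StorageClass="REDUCED_REDUNDANCY",  # the cheaper storage class mentioned later
    )

backup_file("/ServerFolders/Documents/important.docx")
```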

My Windows Home Server has four data drives and four data backup drives, so there’s a Cloudberry backup plan for each data drive/backup drive pair. I use the drive level backup option (rather than share level) to select the files for backup. For each data drive I select the “Server Folders” directory on the drive and back up everything it contains, which avoids the recycle bin, shadow copies and other non-data files. These run once a night, about an hour apart, and everything gets backed up. Since all the drives are local SATA or eSATA the backups are pretty quick. The initial backups did take a while, especially since the MicroServers aren’t speed demons.

I have a second server in the house that’s currently running Windows Server 2008. The hardware was my original WHS 2011 hardware, but since I haven’t repurposed it yet it runs Windows Server 2008 and serves as a backup destination. I power it up every weekend and manually trigger the backups. There are two backup plans set up, but they go to the same file share on the server: one plan for video and another for everything else. Having two plans is a holdover from the days when I was backing up a lot of video. The “everything else” backup typically takes longer since it goes through hundreds of thousands of files and backs up tens of thousands, although the total data is relatively small.

I don’t compress or encrypt any of the local backups. I can go directly to the file system and pull out the files if I need them although I do need the software if I want to more easily find versions or maintain the file structure.

CrashPlan Backup

I recently began using CrashPlan for additional offsite backup. It’s an economical way to store a lot of data offsite. It doesn’t officially support WHS but it’s been working well. I’ve had a couple of occasions where it stopped doing backups and I had to cycle the service on the server, but the test restores have been fine. Now that the initial backups are done I’ve limited the backups to between 2 AM and 7 AM every day and I’ve throttled the bandwidth. I currently back up everything except video, which comes to 288 GB in CrashPlan. The problem with that amount of data is my 250 GB monthly bandwidth cap, so if I ever need to do a full restore I’ll either need to wait two months or have them send me the data on a drive.

Windows Home Server Native Backup

I have an external drive attached and use the WHS native backup to save the files needed for an OS recovery. My testing doesn’t include an OS restore, although it did work way back when I needed it. Worst case is an OS and add-in reinstall, then a data restore.

Sneakernet Offsite

As part of my offsite backups I have two 2 TB drives that I rotate offsite. Every week I bring one to my office desk drawer and bring the other home. The drive at home is attached to my Windows 7 PC and every night a batch file runs Robocopy to update the drive with everything that’s not a video file. The drive is encrypted (TrueCrypt) so I don’t have to worry if it’s stolen from my desk.
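The batch file itself is nothing fancy, but as an illustration here’s a minimal Python sketch of that nightly job, assuming Robocopy is available (it ships with Windows 7); the paths and the list of video extensions are examples, not the actual script.

```python
# A sketch of the nightly mirror job: copy everything except video files to
# the rotating offsite drive. Paths and extensions are placeholders.
import subprocess

SOURCE = r"\\homeserver\ServerFolders"   # hypothetical source share
DEST = r"E:\OffsiteMirror"               # the TrueCrypt-mounted rotating drive
VIDEO_EXTS = ["*.mkv", "*.mp4", "*.avi", "*.m4v", "*.wtv"]

# /MIR mirrors the tree (copies new/changed files and removes deleted ones),
# /XF skips files matching the video patterns, /R and /W keep retries short.
cmd = ["robocopy", SOURCE, DEST, "/MIR", "/R:1", "/W:5", "/XF", *VIDEO_EXTS]
result = subprocess.run(cmd)

# Robocopy exit codes 0-7 mean success; 8 or higher means something failed.
if result.returncode >= 8:
    raise SystemExit(f"Robocopy reported errors (exit code {result.returncode})")
```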

Backup Shortcomings

There’s no offsite backup for my videos. I have two backups in the house along with the original source disks, but if the worst happens to the house there’s nothing offsite. I figure I’d have more to worry about than the videos, and that’s what insurance is for. I used to keep copies of the ripped videos on drives offsite, but that became a hassle to keep updated as the number of drives grew, especially since I tended to use old, retired 1 TB or smaller drives.

I need the software (CrashPlan or Cloudberry) to do a restore. Being able to pick a file from the local backups can work in a pinch, but that would be an exception. In the past I kept the same shares on another server and did a share-to-share Robocopy every night; in theory this made it easy to quickly switch to the new server. Now that I have several MicroServers, my approach to hardware failure is to already have a duplicate of anything in the house, so I can swap hardware around and restore any lost files. Actually, I don’t have duplicates of some of the hardware used for backups, but since there’s redundancy across the backups I figure I can lose updates to one while I wait for a new part. Of course, the failure will come at the worst time.

Windows PC & Virtual Machine Backups

This is simple. I use Windows Home Server 2011 backup to back up my Windows 7 PCs and virtual machines. As I said, data is on the Windows Home Server itself so there’s not really much data to back up. To keep backups small I exclude the virtual machine disk files from backup. For testing I just went in and pulled a couple of files out of the backup; I didn’t do a full restore.

In 2010 I used Jungle Disk to do some offsite Windows backups, but I no longer use it. I don’t do any backups directly from a Windows PC to any offsite destination; everything goes to WHS.

Mac Mini Backups

For my Mac Mini (desktop) I use SuperDuper to clone the hard drive every night. This gives me a disk I can boot from should my Mini’s drive fail. I also use Time Machine to back things up. Time Machine is a holdover from when I kept local data on the Mac; it’s useful should I need to recover an old configuration file, so I keep it.

I also run Arq Backup on the Mini to back up files to Amazon S3. I back up my Application Support folder along with my Documents folder (which is mostly empty). I have Arq Backup limited to $1/month in Amazon charges, which works out to a bit over 10 GB, although it’s still only using 2.9 GB.

In 2010 I used Jungle Disk to do my offsite Mac backups, but I didn’t like the direction the software was taking. Plus, it didn’t support Amazon Reduced Redundancy Storage, which would have increased my costs.

MacBook Air Backups

I have a Seagate Portable drive I attach when I’m home and do a Time Machine backup to it. I’ll also pack it if I’ll be traveling for a few days, but it typically stays home for short trips. I do use the encryption feature of Time Machine in case the drive gets lost.

I also use Arq Backup on the Air, and it’s also limited to $1/month (or 10.75 GB). Because it’s a laptop it does have more files locally, at least at times; there’s currently 3.9 GB stored on Amazon S3. I limit the backups to data files and a few configuration files. This is useful when I’m traveling since it moves the files far from the laptop.

Amazon S3

Amazon S3 pricing isn’t the most straightforward because there are charges for bandwidth and other operations in addition to space used. My total Amazon S3 charge for November was $3.52, which is pretty standard, although down a bit from September and October when it was $4. The November charge was $1.81 for storage and $1.71 for those other charges. I use the Amazon Reduced Redundancy Storage option to keep costs down. I’m saving a total of 19 GB with Amazon S3. Amazon does offer a Free Usage Tier, which isn’t included in my prices since it’s only good for a year.
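As a sanity check on those numbers, the storage portion of the bill lines up with a Reduced Redundancy rate of roughly $0.093 per GB-month; that rate is my assumption based on 2011-era pricing, not something pulled from the invoice.

```python
# Back-of-the-envelope check of the November S3 bill. The $0.093/GB-month
# Reduced Redundancy rate is an assumption, not taken from the actual invoice.
RRS_RATE = 0.093      # USD per GB-month (assumed)
stored_gb = 19        # total data kept in S3
other_charges = 1.71  # requests + bandwidth, from the bill

storage_cost = stored_gb * RRS_RATE
print(f"Storage estimate: ${storage_cost:.2f}")                  # ~$1.77 vs the $1.81 billed
print(f"Total estimate:   ${storage_cost + other_charges:.2f}")

# The same rate explains the Arq budget: $1/month buys about 1 / 0.093 = 10.75 GB.
print(f"$1 budget is roughly {1 / RRS_RATE:.2f} GB")
```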

Summary

My Windows Home Server backup software of choice is Cloudberry since it’s so flexible. It’s gotten new features since I started using it, and while I don’t use them all it’s nice to see the software getting continual care. CrashPlan is my choice when there’s a requirement to store a lot of files in the cloud; it’s a bit less flexible when it comes to local sources, but it’s the better choice if you want more offsite flexibility.

Arq Backup is my choice for Mac backups and replaced Jungle Disk. In addition to a feature set I prefer, Arq Backup does an excellent job of handling the file attributes and file bundles that are common on the Mac platform.

Are you sure your backups work? You do backups, right?

 


The OS Quest Trail Log #66: Slow Month Edition

It was another slow month on the quest as real life tended to intrude and some best-laid plans went bad.

Home Cloud

I had been working on setting up remote access to multiple servers through pfSense and thought I had things worked out. I’d even posted the introductory article. Then two things happened. First, an IP address change for my cable modem didn’t make its way into DNS. My previous testing had forced an address change by restarting the modem and changing its spoofed MAC address; in other words, a hardware reset and configuration change. This time the IP address change was detected, but it failed to make its way to the DNS records. So I made some config changes and will have to wait until Comcast changes my IP again to see if they help.
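While I wait, a quick way to spot the problem when it happens again is to compare the current WAN address against what DNS returns for the hostname. Below is a small Python sketch of that check; the hostname and the IP-echo service are examples, and it only detects a stale record; fixing it is still the job of pfSense’s dynamic DNS client.

```python
# Sanity check after an IP change: does DNS still match the real WAN address?
# The hostname and the IP-echo service URL are placeholders.
import socket
import urllib.request

HOSTNAME = "home.example.com"  # hypothetical dynamic DNS name

public_ip = urllib.request.urlopen("https://api.ipify.org").read().decode().strip()
dns_ip = socket.gethostbyname(HOSTNAME)

if public_ip == dns_ip:
    print(f"DNS is current: {HOSTNAME} -> {dns_ip}")
else:
    print(f"Stale record: DNS has {dns_ip} but the WAN address is {public_ip}")
```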

The second item was a comment by Jared that turned on a light bulb. He mentioned using layer 7 for the routing, which is something pfSense can’t do. But I also have Untangle and had used it as a router in the past. The light bulb went off because Untangle works at layer 7, so it should be able to route based upon the destination address. So I’ll be looking at switching back to Untangle if it can do this without having to do port mapping, which would greatly simplify things.

This is one of those times I wish I had built the router on a VM so I could just fire up different virtual machines for testing. But the MicroServers are the next best thing since I can just swap out hard drives for my testing purposes and not lose the old configuration.

Further complicating things was the death of my version 1 Windows Home Server. It wasn’t unexpected, and in fact the server had already been replaced, just not stripped for parts. The problem presents as a bad hard drive, but if history repeats itself it will be another bad SATA port on the motherboard. It’s not worth fixing, so it’s time to yank the drives and reuse them. I’ll build another WHS v1 as a virtual machine for my testing purposes.

CrashPlan Backup Status

CrashPlan ran into its first hiccups this month. There was a day-long network outage back on Nov 14th. In my case CrashPlan said it couldn’t connect long after they posted that the issue was resolved. I went in and manually told it to connect and it immediately started backing up again.

I had a second issue where at exactly 1 AM (my time) the backup stopped and CrashPlan wouldn’t connect. This time I could connect to my account over the web, so it wasn’t the same type of problem as before. A quick search of the CrashPlan website revealed an old technote on this problem, with the solution being to restart the CrashPlan server or the entire PC. I opted for a server reboot and that resolved the problem.

Since those outages I’ve also noticed that the top upload speed I see is generally slower. In the past I would frequently see it nearing its 2 Mbps upload ceiling (which I configured), whereas now it hovers around 1 Mbps. There could be any of a dozen other things affecting this speed, but I do see speeds greater than 2 Mbps up when I test other transfers (like a file to my web server).

I haven’t soured on CrashPlan. It’s a low-cost service at $42/yr (after a discount) for unlimited backup. Test restores worked fine after these outages, so it does appear this was a network problem and not a problem affecting data.

As for what’s backed up, I’ve been hindered more by Comcast’s data caps than CrashPlan’s capacity. I’ve backed up 178.4 GB consisting of 231,297 files. At this point I’m trying to decide what else I want to back up. There’s no point in backing up my movies; they’re so large it could take me years to back them up and stay below my cap, and if I ever had to restore them doing so online would also take years and I probably wouldn’t want to pay to have the hard drives shipped. Any sort of backup to a friend’s computer would have the same data cap issue, so while that’s a nice feature the CrashPlan online solution seems more reliable, despite recent problems.

Holiday Tech Deals

I pretty much avoided the Black Friday/Cyber Monday deals. I didn’t see much that I wanted, or anything I knew was a good deal (as opposed to the merchant just promoting it as one) and could actually use. The one exception was a NewEgg deal for the HP MicroServer. At $250 it was a good deal, and while I don’t need a sixth for my collection it was tempting. By the time I talked myself into taking a look they were sold out.

I did buy some discounted iTunes gift cards from Apple and Best Buy. I use them instead of a credit card both for safety and as a way to budget my expenses in an environment where it’s much too easy to buy things.

I do suspect we’ll see additional deals between now and Christmas so I’ll keep checking. Anyone see a good deal they’d recommend?

Domain Price Increases

If you own any domains be aware that the registry fee Verisign charges for .com domains will go up 51 cents (5%) and .net domains will go up 46 cents (10%) on January 15th. The increase is not retroactive so you can extend your registration at the current prices before that time. Whether your registrar increases their prices and by how much is up to them and can vary. I’m sure some will bump their prices by the percentage rather than the actual increase. You can register .com and .net domains for up to 10 years into the future and I’ve done that for this domain along with a couple others I know I’ll want to keep.

The Month Ahead

With the December holidays things are likely to be busy in the non-tech parts of life but I do have some vacation days during December which may make up for that lost time. I’ll be giving Untangle another try as a router to see if it can better handle the remote access. Beyond that we’ll see what pops up and catches my attention.

 


The OSQuest Trail Log #65: October Blizzard Edition

And Mother Nature keeps right on attacking. Not content to wait until winter officially starts, she decided to hit Connecticut with some nice, heavy snow clinging to all the picturesque October foliage, eventually bringing much of that foliage, and the limbs it was on, crashing to the ground and taking power lines along for good measure. I lost power on Saturday and just got it back Thursday, with cable/internet following on Friday. So I went through gadget withdrawal for a few days. The picture above is from Saturday afternoon after a couple hours of snow and before the trees started coming down. Luckily the ones around me missed cars and buildings. While not everyone was so lucky, I was pretty surprised by how many downed trees managed to find open areas rather than other nearby targets. But on to the tech…

The highlight of the month for me was my first podcast as a guest on the Home Server Show podcast.

New Software

I installed CrashPlan backup on Windows Home Server, taking advantage of a discount offering unlimited online backup for $42/year instead of the normal $49. The backups have been working well, although I’d been hoping to get a bunch up there in October since I had plenty of space left in my cap this month; the power outage ruined that. It’s uploaded just over 54 GB, with about 15 GB in the queue that started uploading once the internet came alive. But it’ll be later in November before I add much more. I want to avoid having to throttle myself by using too much early on. I figure I can do about 100 GB a month and stay under my cap, but I want to play it safe.

I moved from using Untangle as both a router and unified threat manager (UTM) to using pfSense as a router while leaving Untangle as the UTM. I’ve been happy with the results and was just beginning to dig into some of the features during the snow-shortened weekend, more poking around than R&D. I also plan to do some testing to see if a caching proxy will reduce the bandwidth I use; I figure it needs to cache software downloads and patches in order to make a dent. (The Untangle cache didn’t actually serve much when I tested it.)

Updated Software

It seems like everything I use was upgraded. But the highlights were…

I put Lion on my desktop Mac Mini, only to find my fear-induced upgrade delay was unwarranted. Everything worked fine, with only a minor Synergy frustration due to Lion’s new behavior where mouse movement doesn’t wake it. I’ve no plans to put Lion on anything besides this Mini and my Air; the other Macs have no reason to upgrade, or the upgrade would remove features I use.

iOS is also updated to iOS 5, of course. Despite some frustrations I managed to get both my iPhone 4 and iPad 2 upgraded. I’ve experienced shorter battery life for sure, although nowhere near as bad as some complaints I read. I tend to keep things turned off and I hadn’t enabled much of iCloud. Oddly, I saw the worst performance on the days I was home and had a wireless connection: with typical usage the battery drained far faster than when I was in the office with wireless on but no network to connect to. It’s no scientific test, but by 5 PM at home the battery would be around 20% while in the office it was usually above 50%. I just read Apple has an update in beta that’s supposed to help.

Not really software, but Google Reader saw an update. I use Google Reader on my computers (with the account also used by iPad apps). The timing was bad: the update came as I was grabbing some battery-powered 3G access during the blackout, so the last thing I wanted was change. I’ve been trying to avoid hating it just because it’s different. I didn’t use any of the discontinued features, so no complaints there. But I used to find it easy to blow through a bunch of articles and star the ones I want to read later, and that’s become a problem. Ignoring the performance issues (very uneven scrolling), the star isn’t in the same place in every post; it’s now at the end of the title rather than the first thing. And speaking of titles, while the new design has a clean look, the article titles blend right in. Sure, the star at the bottom of the post is always at the very left, but it’s hard to find and I star based on the title. I’ll be looking for a new desktop reader and will use my iPad more to check through the feeds.

Google+

While I still maintain my status as the last human not on Facebook, I finally broke down and joined Google+ when they enabled it for Google Apps users. It wasn’t much later that I lost power so I don’t have much to say about it yet. I did find the Google+ iOS app doesn’t like Google Apps users and tells me to go get an invite when I log on, but the web interface is fine from the iPhone.

iCloud

iCloud was making news and I moved my MobileMe account to iCloud. I wasn’t a big MobileMe user, having been burned by Apple’s cloud services in the past. I think my problem with iCloud (beyond not trusting Apple to keep things running) will be that it requires users to dive into the deep end, accept the way it works, and not expect a lot of options. I gave Photo Stream a try; the problem was my camera fumbling uploaded more bad pictures than intended ones. Not really an iCloud problem, but still a problem. I have no doubt it will improve over time and I’ll be drawn into iCloud.

Web Work

I spent some time on the plumbing of the website. It seems like I have a bunch of minor issues that I can’t seem to get to (or keep putting off). At least I was able to tackle a logrotate configuration change. I also changed my caching plugin back to WP-Supercache. I always liked the plugin but stopped using it after it broke due to an upgrade; it’s working again and I’m using it. I did make an errant mouse click and enabled compression, which didn’t work (possibly because I have compression enabled in Apache). Unfortunately it went unnoticed in my testing and I didn’t notice a problem until my views went way down.

My change to WP-Supercache seemed to cause another problem which went unnoticed until recently. It doesn’t seem to have been rampant, but it was frequent. I don’t quite understand the problem completely, but I don’t think it was a WP-Supercache bug. In short, my AdSense ads would often display a “Page Not Found” error in the frame for the ad. I set up the ads to only display to new visitors: view a few pages over the course of a couple months and the ads are supposed to go away. I think WP-Supercache would sometimes cache the “don’t display” page, which would cause a problem when a new “display the ad” visitor arrived and the code ran to display the ad.
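To illustrate what I think was happening (this is just a toy sketch, not WP-Supercache’s actual code), a cache that keys pages only by URL will happily store whichever per-visitor variant it renders first and then serve it to everyone:

```python
# Toy illustration of the suspected failure mode: the cache key ignores visitor
# state, so the "no ad" variant rendered for a returning visitor gets served to
# new visitors who should have gotten the ad version.
cache = {}

def render_page(url: str, is_new_visitor: bool) -> str:
    ad_html = "<div id='ad'>...AdSense code...</div>" if is_new_visitor else ""
    return f"<html><body>{ad_html}<p>article text</p></body></html>"

def serve(url: str, is_new_visitor: bool) -> str:
    if url not in cache:                  # keyed by URL only
        cache[url] = render_page(url, is_new_visitor)
    return cache[url]

# A returning visitor primes the cache with the "no ad" variant...
serve("/some-post/", is_new_visitor=False)
# ...so a brand-new visitor never gets the ad markup at all.
print("ad" in serve("/some-post/", is_new_visitor=True))  # False
```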

Home Networking

I had been hoping I had pfSense and dynamic DNS set up to handle IP address changes so I could remotely access multiple home servers using my own domain. Well, when I got my internet back today Comcast gave me a new IP address, as I’d hoped. But alas, no update to DNS, so it’s time to do some more research. I’ll be tackling that this weekend. I’m hoping I just have a pfSense setting wrong. [Update: Got this working so hopefully a write-up soon.]

The Month Ahead

The only thing I really want to get done is getting those home servers set up with Dynamic DNS and pfSense. [Update: Rather easy fix so hopefully a write-up soon] After that I’ll see what catches my interest. I’m also hoping this past week isn’t a sample of what’s to come this winter.