End of an Era

My Windows Home Server era lasted six and a half years but finally came to an end this month.

July 2014 brought the end of an era that began in January of 2008: I shut down my Windows Home Server. Except for a brief two-month fling with an Ubuntu home server, I've had a Windows Home Server running for the last six and a half years. There's nothing replacing it, although an existing Synology NAS takes over some duties.

My Windows Home Server started with two small drives in an HP Windows Home Server version 1 box. It grew to a home-built box with over 20 TB of disk by the time WHS 2011 was released. Eventually it began to shrink, and by the time I shut it down it was an HP MicroServer with four 3 TB drives plus an OS drive. My needs continued to shrink, and even this was more than I needed.

By far most of my drive space was used by video files. These, along with files being archived, were all that was on my Windows Home Server. All my non-video data had been moved to my Synology NAS.

The growth of streaming and cloud services meant my local video library rarely grew. Even in the rare cases where I bought a video, all else being equal, I’d prefer a cloud purchase and not have to worry about local storage. My Blu-Ray purchases for the past year could be counted on one hand.

I rarely accessed the WHS files, yet the server was running 24 hours a day, seven days a week. So as I was looking to downsize and save electricity, this was an obvious first choice.

So I cleaned up the files on my Synology DS1511+ NAS, which I use for backups and file storage, and copied my video library to the available space. I had so many duplicate files and backups that I was also able to free up another five 3 TB drives that were in an expansion unit and still have room for the WHS files.

I copied the Windows Home Server files to the Synology 1511+, then copied them to a few of the freed-up drives to be put in storage as a backup. The Synology 1511+ now just gets fired up every weekend to refresh backups and verify the drives still spin.

I moved a couple of the 3 TB drives to my Synology 212+ NAS, which serves as the main storage for what I consider my active data. The extra space will be used for Time Machine backups and future needs.

Windows Home Server will be supported into 2016, so there was no rush to replace it. Still, time has moved on, and my Synology NAS is now better suited to my needs, which no longer include keeping terabytes of files always available.

Windows Home Server Is Terminal

While not unexpected, Microsoft made it official: Windows Home Server joins the Zune, Kin and others in the Microsoft product graveyard. But that doesn't mean it's dead yet.

It will be available as an OEM DVD (such as from Newegg) through 2013. Plus, mainstream support doesn't end until April 16, 2016, so we'll have security fixes through then at least. I assume we'll also get general bug fixes if they're bad enough. OEMs can install it on devices through 2025, but that seems more bizarre than realistic.

Microsoft’s Plan? From the Windows Server 2012 Essentials FAQ (PDF Link):

Q: Will there be a next version of Windows Home Server?
A: No. Windows Home Server has seen its greatest success in small office/home office (SOHO) environments and among the technology enthusiast community. For this reason, Microsoft is combining the features that were previously only found in Windows Home Server, such as support for DLNA-compliant devices and media streaming, into Windows Server 2012 Essentials and focusing our efforts into making Windows Server 2012 Essentials the ideal first server operating system for both small business and home use—offering an intuitive administration experience, elastic and resilient storage features with Storage Spaces, and robust data protection for the server and client computers.

Unless they discount the $425 retail price of the license I don't see a lot of homes using Windows Server 2012 Essentials as a home server. (Except for enthusiasts who have a TechNet subscription.)

I'll be running my Windows Home Server 2011 until something better suited to me comes along. My Windows Home Server doesn't know it's terminal, so it will keep chugging along. Technology constantly changes, as do my storage needs. April 2016 is the earliest I'd be forced off WHS; I suspect it will seem like old tech long before that and I'll move to something more appropriate for the times. That is, assuming I still want a central storage box. I'm already heavily invested in Synology NASes, which I love, so they certainly have an edge. But if they could replace my WHS they would have done so already.

Considering or running WHS? Does Microsoft’s announcement change anything for you?

Synology to Windows Home Server Using iSCSI

I've been exploring the capabilities of the Synology NAS products with a Synology DiskStation 212j. This time around I gave it a spin as an iSCSI target for Windows Home Server 2011. The goal is to have the Synology NAS accessed by WHS 2011 as a local drive. No additional software is needed; it's all built in.

Image of Synology DiskStation 212j
There are links at the end for more information about iSCSI, but for my purposes here it can be thought of as a way to present a network-connected drive to the operating system as if it were a local drive. The Synology NAS will be addressed by WHS 2011 as a local drive, with no additional software needed; iSCSI support is built into both the Synology software and Windows Home Server.

This was configured using the Synology DiskStation Manager 4 beta software, although the DiskStation Manager 3 software is set up the same way based on the information at the Synology website.

iSCSI Target Types

The Synology DiskStation software supports three different configuration types as an iSCSI LUN:

Regular Files – this configures the target on an already created file volume. This allows flexibility in allocating space. It can be increased anytime, as long as there’s space available on the volume.

Block Level (Single LUN on RAID) – this configures the target on available disks. There can’t be anything else on the disks used and they will be completely allocated. This provides the best performance (according to Synology). The disks can be configured for RAID.

Block Level (Multiple LUNs on RAID) – this configures the target on available disk space. Space already allocated to volumes can’t be used, but the disk(s) can be shared with file volumes.

Configuring iSCSI

The Synology website has good instructions on configuring iSCSI with their software so I won’t repeat it here. But for my simple requirements I was able to run through the wizard and accept the defaults. I didn’t set up any advanced options. When configuring a “Regular Files” LUN the size defaults to 1 GB so I did increase that to a more useful size.

Configuring iSCSI on Windows Home Server 2011 was a bit different than documented by Synology so I’ll run through it here. The configuration is the same for Windows 7 and Windows Storage Server 2008 R2 Essentials. I suspect Windows Server 2008 R2 is also the same along with the other related software such as Small Business Server 2008.

This needs to be done on the server itself so a Remote Desktop connection is needed (assuming the server is headless). Go to Control Panel and select “Set up iSCSI Initiator”. Then answer “Yes” to the prompt to start the iSCSI service.

iSCSI Control Panel iSCSI Service notice

The iSCSI properties dialog will appear. Select the Discovery tab, then click the "Discover Portal" button and enter the IP address (or DNS name) of the Synology NAS. Once the info is entered you should see the iSCSI target from the Synology NAS, although it will still be listed as inactive. To establish the connection, click the "Connect" button. In a strange twist of terminology, leave the default "Add this connection to the list of Favorite Targets" checked in order to make the connection persistent.

iSCSI Discovery Properties dialog Discovered targets list Favorite Connections prompt

At this point the connection is established and the status will change to "Connected". Once the connection is established you'll need to switch over to the "Disk Management" section of the Computer Management console.

iSCSI properties after connection  Computer Management

When you click on “Disk Management” you’ll be prompted to initialize the disk. If the disk will be larger than 2 TB select “GPT” as the partition table type. Right-click on the newly added disk and select “New Simple Volume” from the context menu. Run through the wizard and when the wizard is done, so are you.

Initialize disk prompt  Create volume menu selection  Drive after formatting

Now the disk can be used like any other local disk.
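For the command-line inclined, the same setup can be scripted. This is just a sketch under my assumptions – the portal IP, target IQN, disk number and drive letter are all placeholders for your own values – using the iscsicli.exe client that ships with Windows plus a diskpart script:

    rem Make sure the initiator service is running, then register the Synology portal
    sc config msiscsi start= auto
    net start msiscsi
    iscsicli QAddTargetPortal 192.168.1.50

    rem List the targets the portal offers, then log in to the one you created
    iscsicli ListTargets
    iscsicli QLoginTarget iqn.2000-01.com.synology:DiskStation.Target-1

    rem Initialize and format the new disk (GPT for volumes over 2 TB)
    (
    echo select disk 1
    echo online disk
    echo convert gpt
    echo create partition primary
    echo format fs=ntfs quick
    echo assign letter=I
    ) > initlun.txt
    diskpart /s initlun.txt

Note that a quick login with QLoginTarget isn't persistent across reboots; I'd still use the GUI's "Favorite Targets" option (or iscsicli PersistentLoginTarget, which has a much hairier syntax) to make the connection stick.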

Benchmarks

Performance isn’t a reason for doing iSCSI, at least not with a home network and a low-end Synology DS212j. It’s going to be slower than a local SATA drive, but since I can, I did some benchmarks.

This is Windows Home Server 2011 running on an HP MicroServer with relatively slow Western Digital 1TB Green Drives. It’s a Gigabit network using the MicroServer’s onboard NIC. When running the benchmarks I kept network traffic to a minimum, no streaming video or file copies, but I didn’t turn any devices off, so there was the normal background network traffic. Everything is connected to the same switch.

The DS212j had two 7200 RPM drives in it: one a Western Digital Caviar Black, the other a Hitachi HDT721010SLA360. Both are on Synology's compatibility list.

The first benchmarks show the local drives, the second shows a “Regular Files” iSCSI target.

Local Drive benchmarks  iSCSI Regular Files benchmarks

I also set up each type of Block Level LUN and benchmarked them. The first is the Single LUN setup which should be the best performer, the second is a Multi LUN setup.

iSCSI single LUN benchmarks  iSCSI Multi LUN connection benchmarks

Wrapping Up

Being able to use the Synology boxes as an iSCSI target is a nice feature. Since it's accessed over the network it's not going to outperform a local drive unless you've got a data-center-class network to run it over. And because there's no file locking, iSCSI doesn't allow multiple PCs to access the same LUN (except with cluster-aware software), so it's not a suitable replacement for a file share.

The more I explore the Synology software the more I’m considering one of their larger models. While I don’t see any immediate need to swap out anything I use for an iSCSI connected Synology NAS, I do think that an investment in a Synology DiskStation would eventually be used as an iSCSI connected drive somewhere in the future.

Additional Links:

Wikipedia article about iSCSI

Synology iSCSI Best Practices

Synology iSCSI – How to Use

Apple Software On WHS Shares

I run a mixed Windows/Mac home and all my data resides on my Windows Home Server no matter whether it’s Windows or Mac. This means my iPhoto library, iTunes library, Aperture library are all on my Windows Home Server. I recently noticed that these libraries were saving deleted files forever. Here’s why.

Trashes folder on a WHS share

The libraries are a directory structure that OS X understands and may present to the user as a single file. For example, the iPhoto library displays as a single file in OS X unless "show package contents" is selected; even though my iPhoto library is on a WHS share, OS X displays it to me as a single file bundle. As long as the files remain within the library structure all is well. The trouble comes with libraries that maintain their own internal trash bin (e.g. iPhoto and Aperture, maybe more), which try to move files to the OS X trash bin when you empty the library's trash bin.

I recently noticed that when I emptied the trash in iPhoto, it moved the files to a ".Trashes" folder on my WHS share (note the leading dot – see the first graphic, click it to enlarge). Well, actually I noticed this huge .Trashes folder and then found it came from iPhoto and Aperture. If this were an OS X drive running on OS X, the folder would be part of the trash bin and get emptied when I emptied the trash. Once I checked, Aperture worked the same way. On the WHS share the files live forever; even OS X didn't see the folder as part of the trash.

The .Trashes folder can be deleted just like any other folder without causing a problem; the next time you empty a library's trash it will be recreated. To see the folder you need to enable viewing hidden files and folders (click for full size for the Windows 7 setting below):

Show Hidden Folders Option
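If you'd rather not change the Explorer setting, a command prompt will show it too – dir's /a switch includes hidden entries (the share path here is just an example):

    dir /a \\server\Photos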

I also found that iTunes saved replaced apps to the .Trashes folder. Luckily it doesn’t save replaced or deleted podcasts. If it did I’d probably have run out of disk space. iTunes doesn’t seem to save anything I delete on my own, only the apps it replaced.

It’s only my apps that maintain their own library structure that have this issue. Deleting regular files on my WHS from OS X deletes them immediately.

I guess there's a price to pay for trying to get Microsoft and Apple to play together, but it's a small price since it's easily fixed with a scheduled task to delete the directory (sketched below).
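As a sketch of what I mean – the share path is illustrative, not my actual layout – a weekly task created with schtasks can quietly remove the folder:

    rem Weekly task (Sundays at 3 AM) that deletes the .Trashes folder from the share
    schtasks /Create /TN "Purge Trashes" /SC WEEKLY /D SUN /ST 03:00 /RU SYSTEM ^
        /TR "cmd /c rd /s /q D:\shares\Photos\.Trashes"

One task per share that holds a library, and the problem takes care of itself.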

Acer Aspire Windows Home Server AH342-U2T2H

The Acer Aspire AH342 Home Server is one of the few Windows Home Servers still available for retail sale in the US. It's a WHS v1 server, and Newegg has had it at clearance prices recently. Since it's old software it's not a simple out-of-the-box experience, so I had one pass through my hands recently. I updated WHS v1 on the server and got things going. I also had a chance to do some quick benchmarking and hardware testing.

Acer Aspire AH342 Home Server
The Acer Aspire Windows Home Server seems to be one of the few Windows Home Servers that can still be purchased in the US. Just before Christmas Newegg had it on sale for $290. After Christmas it went back up to $350 but then dropped further to $260 (its list price is $449). Since it includes Windows Home Server v1, and not the latest version, I suspect we'll see more discounting as Acer tries to clear out its stock. Hopefully they'll have a WHS 2011 version and stay in the market. I took a look at the Acer Aspire AH342-U2T2H.

Windows Home Server v1 will end-of-life in January 2013, so any WHS v1 purchase needs to take that into account. It's not like the server will turn into a pumpkin at that time, but Microsoft will stop providing updates. That's after the Windows 8 release date, so hopefully Microsoft will release new connector software if it's needed for WHS. If you're going to use the server for remote access, meaning it's accessible from the internet, the lack of security updates after 2012 would be a concern. If the server will only be accessed by computers in the home it's less of a concern.

The hardware should support Windows Home Server 2011 if you want to install it later. There’s no onboard video so you’ll either need to install a PCIe x1 video card or do a blind unattended install. The server comes with 2GB of RAM and the specs say that 2GB is the max so that could be an issue depending on what add-ins you install. The Atom D510 CPU is 64-bit so can run WHS 2011.

This server was purchased to provide backup and central storage for a few PCs, basically a low cost NAS. There’s only one drive so to use folder duplication a second drive would have to be added. Because hard drive prices haven’t returned to pre-flood pricing I’m contributing one of my slightly used 2 TB drives for use in the server.

Initial Setup

Because the WHS software delivered with the server is quite old I couldn't use it for setup, since I have Windows 7 clients. If I had Vista or XP clients I could have installed the bundled software and then upgraded. Since I only had Windows 7 I followed these steps:

  1. Unpacked, plugged in and powered on the server. While it was doing its initial setup I went to step 2.
  2. Downloaded the latest connector software from Microsoft and burned it to a CD.
  3. Once the LEDs stopped blinking I was ready to move on. The quick start guide said all the blue LEDs would be on solid, which is a bit confusing: the panel LEDs include a network LED that blinks with network activity and a hard drive LED that blinks with drive activity. The status LED was blue and red while the drive lights were blue and purple. I moved on once things seemed to settle down.
  4. I popped the connector CD into a Windows 7 PC and ran it. The screenshots for the installation are below. Click for a larger picture.
    Connector installation screenshots
  5. After logging onto the Windows Home Server my next step was to remove the McAfee anti-virus software. I don't use AV on my own WHS, and if the owner wanted AV, McAfee would be my last choice. As it is, the included license is limited to 60 days, so removing it wasn't a problem for the server owner. The version pre-installed won't work once WHS is updated, although there might be an update from McAfee (I didn't bother to inquire). I uninstalled McAfee through Add/Remove Programs after RDP'ing into the server; it can't be removed through the add-in manager.
  6. While still RDP’d into the server I ran Windows update and installed all the available updates.

At this point the Acer Aspire is a basic Windows Home Server v1 box with the latest updates.

Hardware & Features

The server comes with one 2 TB Western Digital Green drive (WD20EADS). I'd prefer a small system drive since I don't like to share the OS drive with data, but in this case it's not much of a concern since I don't expect heavy usage. To take advantage of folder duplication I'll be adding a second drive, which is also a WD20EADS. For testing purposes I added two more drives.

The server also has a nice compact form factor and will look good on a shelf. There's an eSATA port and several USB ports (all USB 2). The front USB port has a one-button copy feature I'll talk about later.

It’s also surprisingly quiet. I’ve got four drives installed and I’m doing a file copy. Even sitting next to the server I have to strain to hear the fan and the drives are silent.

There's some multimedia software that will probably go unused; I didn't have time to test it. The console has tabs for "iTunes Server" and "Digital Media Server", and Firefly Media Server is installed. The server did show up as a "Media Server" for my LG Blu-Ray player and I was able to stream a video from it.

The Lights Out add-in is also included, although it's an old version (v0.8) so it needed to be upgraded. The add-in was licensed with an OEM license, but after the upgrade the license reverted to the trial version. Once the trial is over the license will revert to a community edition license which, according to this, has all the features of v0.8 plus a few more. The upgrade was done like installing any other add-in. I didn't need to uninstall the original add-in, although doing so probably would have been a good idea.

The one-button USB copy is interesting, but I'd prefer it didn't try to think so much. I tested with a drive full of DVD rips. It copied the drive to the public share as expected, but then it copied about 50 of the .BUP and .IFO files to the video directory and renamed them to avoid duplicates. Those files are useless on their own, and the rip directories are broken with them missing. It was also interesting that other files with the same names were left alone. If you already have files in an organized directory structure this feature may change the structure, so you may want to skip it and do a regular copy.

The expansion slot allows a video card to be added should one be needed. But it’s a PCI Express x1 slot which isn’t common among video cards. I’d be more inclined to look for a USB 3 expansion card to add some external drives. It will need to be a low-profile card.

I wish Acer would drop the McAfee AV add-in, which I view as nothing but crapware. Even if it worked, it's still only a 60-day license. The Lights Out add-in is outdated, but at least it's a full license; still, the included add-in and its license provide no benefit once the latest version is installed.

I attached a Lian-Li EX-503 external enclosure via the eSATA port. The server could see four of the five drives in the enclosure, so the eSATA port can handle a port multiplier, but only up to four drives. There were also four drives in the server bays. I didn't do any benchmarking or other testing beyond verifying that the drives could be seen.

Power Consumption

I did some quick power measurements using a Kill-A-Watt power meter. The server was plugged into the Kill-A-Watt, which was plugged into the UPS outlet. I started with all 4 drive bays populated. There were three Western Digital 2 TB EADS drives, including the one that shipped with the server as the OS drive. The fourth drive was a Hitachi Deskstar 7K2000 drive (2 TB, 7200 RPM).

With all four drives the power usage was between 52 and 56 watts. The 52 watts was with the server idle, at least as far as access goes; some background processes may have been running, although CPU usage remained low. The 56 watts came during file copies or drive-removal processing, although it mostly stayed at 55 watts under load.

I removed the Hitachi drive and usage dropped to 44 to 46 watts with occasional and brief drops below 44 watts. When folder duplication was active the power usage was 46 watts.

With two WD20EADS drives installed the power usage was 36 watts while idle and 37 watts while processing a client backup. During folder duplication, when both drives would be active, the power usage was 37 watts.

With just the original drive delivered with the server the power usage was 29 watts while idle.

Drive benchmarks

The benchmarks below are the screenshots of the ATTO benchmark results. ATTO was run locally on the server (double-click for full size).

ATTO Benchmark for Drive C:  ATTO Benchmark for Drive D:

There’s not much of a difference between C: and D: since they are the same physical drive.

The screenshot below shows the results of a robocopy from my Windows 7 PC to a server share with duplication enabled.

RobocopyResults_Win7ToAspire

The reported speed for the file transfer was about 2 GB per minute. If my math is right, at 8 bits per byte and 60 seconds per minute that's about 271 Mbps. Converting the results to MB/s shows a speed of 33.94 MB/s, which is significantly slower than the ATTO results run directly on the server, but includes all the server and network overhead. Additional tests produced similar results.
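For reference, the test copies were done with a plain robocopy command along these lines (paths are placeholders); the speed figures come from the bytes/sec and MegaBytes/min lines robocopy prints in its job summary:

    rem Copy the test files to a duplicated share and log the summary
    robocopy C:\TestVideos \\SERVER\Videos /E /NP /LOG:whs-copytest.log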

The screenshot below shows the results of a robocopy from the Aspire AH342 to my PC. The copy was started after the server completed drive balancing and wasn’t doing anything else.

Results of RoboCopy from Aspire H342 to Win7 PC

Assuming my math is again correct this is 231 Mbps and 28.93 MB/s.

The file copies were done with mostly video files so the average file size was pretty large and there wasn’t a lot of overhead opening a lot of files.

Summary

The price is certainly the big attraction, although if you're going to add three hard drives to max it out, the cost goes up considerably at today's prices. But if you have the drives, or can wait for the flood-induced prices to drop, it's worth it. Personally I think a second drive should be added to enable folder duplication or to do backups, so figure that into the cost.

Returning to Windows Home Server v1 was both nostalgic and a reminder of the frustrations WHS v1 brought. Removing a drive brings down the server while it’s processed which can be time consuming (hours). That’s not something most people will do as a regular activity so it’s not too much of a concern. There was also the occasional slowdown as some process ran (backup cleanup, drive balancing). After using WHS 2011 for about a year WHS V1 just looked and felt old.

I was impressed with the Acer Aspire AH342 Home Server. It will make a good NAS for sharing files and PC backups, which is why it was bought. It's not a product someone can buy off the shelf and expect to get running unless they're familiar with WHS or have only Windows XP and Vista machines, but once the software's age-related issues are worked out it performs well. Plus I like the nice small cube form factor, and it's quiet enough to be out in the open and on all the time.

Cloudberry Continuous Data Protection

I enabled CDP for my “hourly” backup soon after installing the update. It didn’t work exactly as I expected but the differences didn’t affect the actual backups. I also did some additional stress testing to check out performance and this is what I found.

Cloudberry recently added Continuous Data Protection (CDP) to their backup software, including Cloudberry Backup for Windows Home Server 2011. This seemed like something I could use to replace my hourly backup, so I decided to do some testing and switched the hourly backup to use a CDP schedule instead. I use the hourly backup to move my most important files offsite (to Amazon S3) soon after they're created. I don't use RAID, so if I lose a disk the data needs to be restored from backup; the hourly backup was my solution and the new CDP option seems like a good fit.


Using CDP

The CloudBerry blog post said…

Under the hood the changes are captured instantly but the data is uploaded to Cloud storage every 10 minutes.

But I found that backups occur, or are checked for, every minute. According to the logs they're uploaded immediately. The blog post also mentions this is configurable, although I've yet to find where that can be set in the WHS add-in.

Using CDP takes some getting used to because it changes the way the add-in reports its status. It would be less of a problem for someone not used to a regular schedule, or less concerned with checking the status regularly.

The screenshot below shows the status screen for my “hourly” backup several days after CDP was enabled.

Cloudberry CDP status screen

Some items of note:

  • The job status is always "running". The status message uses the term "instant backup" when waiting for files to be backed up.
  • The "files uploaded" count only shows the status for the last "instant backup". If there was nothing to do then the files uploaded is 0. Checking the history shows that file uploads and purges are taking place as required. So while disconcerting, it's only a cosmetic problem.
  • Since the backup never ends, there are no email updates for success or errors. I used email to let me know if there was a backup error, but the emails only go out at the end of a job, so even if there are errors (such as an open file) I don't get an email. In fact, trying to set up an email status report for a CDP backup resulted in an error, and the error read more like a bug than a message saying the feature was unavailable.
  • Rather than a 10 minute interval, a 60 second countdown begins when an "instant backup" is completed. Any waiting files are then backed up and uploaded to Amazon S3.
  • If I stop the backup by clicking "Stop Backup" it doesn't restart. Rebooting the server does restart the CDP backup jobs.
  • Error handling is inconsistent. In my testing the backup would typically ignore errors created by open/locked files. These were valid errors, and when the files were closed they would be backed up, so it was good the job kept running. But there was one instance where a file was moved after being flagged for backup but before it was backed up. This was a valid condition (an iTunes podcast download, which downloads to a temp directory and is then moved). The backup job recorded this as an error but then stopped any additional processing. Since CDP backup plans don't seem to restart on their own, this is a problem.

Stress Testing

There wasn't any noticeable impact from changing my hourly backup to use CDP. The HP MicroServers are relatively low-powered and not capable of doing many intense tasks at once, and the only add-in I run is the Cloudberry Backup add-in, so I was a bit concerned it would impact streaming or other activity. There's no noticeable load on the server while it's waiting/looking for updated files. When there are files to back up, the load isn't any more than the hourly backup, and in theory may be less since it spreads the backups out over the hour rather than doing them all at once. Most of the files in this backup plan get updated overnight through automated jobs (website backups, etc.) while the rest of the changes are data-file changes. Still, I decided to do some load testing.

I copied 121,000 files totaling 60 GB to the same drive I would be streaming a video from, and also copied that set of test files to a second drive. As a control, I watched a streaming video while the files were being copied. I RDP'd into the server to do the copies, so they were all local drive-to-drive copies. The streaming worked for a while, then became slightly annoying, and eventually became unwatchable. At that point there were two file copies going on: one copying from a directory on the streaming drive to a second directory on the same drive, and another copying from a second drive to the streaming drive.

I have 7 backup plans; a full description can be found in my recent backup review, but for purposes of this test I set all of them to use CDP. Three backup plans matched my test files, so they began backing up while the other four just watched for files. Each of the test drives had a backup plan dedicated to it, doing local backups to eSATA drives so the backup wouldn't be hindered by network or other limitations – each drive would back up as quickly as the data could be read and written. The third plan included all four drives in the server and backed up to a NAS, so it would be reading from both test drives, but only one at a time.

Like the file copies, my video stream started off fine and ran for a while, but then became annoying as it would frequently stop and need to catch up. So no worse than a comparable file copy, although still too annoying to be acceptable (subjective, but I doubt anyone would be happy). Not surprising, since a backup is not much more than a file copy.

Once the backups were done and the backup plans were just watching I didn’t have a problem streaming and reading files off the server.  Deleting the test files and then letting Cloudberry update their status (I save deleted files for several days so they weren’t actually purged) didn’t affect streaming.

Summary

The good news is that my testing showed CDP didn't add any significant overhead above the actual file copies. The bad news is my server isn't designed to handle a lot of simultaneous activity or file copies. Because of the way I have the shares and drives set up, and the way I use the server, I may not notice an impact even with CDP set on all plans. Two of the plans go to destinations that aren't always online, so CDP isn't a good option for them, and the other plans rarely have simultaneous changes. Still, CDP is far from a universal solution for me.

I've left CDP enabled for what was my hourly backup to Amazon S3, but I've returned all the other backup plans to their previous schedules. A lot of the time there's no need for immediate backup, and I'd rather wait until all updates are made or a set of files is fully processed. Because of what I send to Amazon S3 I'm less likely to have issues, and it's been fine since being enabled. I do feel I need to monitor it more than I did the hourly backup, if only to make sure it's still running, and that may be enough to send me back to an hourly schedule if I don't become more comfortable with CDP's reliability.

[Update Dec 30, 2011]: I was able to configure email notifications for one CDP plan and it did send a notification when that plan ended with a failure. Unfortunately the CDP plans don’t restart on their own when an error is encountered so I’ve gone back to an hourly schedule for critical backups.

pfSense + 1 Public IP = Home Cloud

My goal is to be able to access several of my home servers from outside my home. A dynamic IP address from my ISP and the limitations of NAT add some complications, but pfSense will help sort things out. This is a summary of what I hope to accomplish in my Home Server Farm series and an overview of my plan to get there.

Home Cloud Graphic
[Update: As mentioned in Trail Log #66 I’ve rethought this project and will be looking at alternatives.]

Now that I've been running pfSense for a problem-free month it's time to start using it for more than cool charts and graphs. My first goal is to make multiple servers available from the internet. I've got Windows Home Server v1 and Windows Home Server 2011 servers running and ready to go. Once those are going I'll want to add my development web server to the mix so I can do development and testing from outside the home. I've spent some time testing various options and I've settled on a solution that I think will work. At least all the individual pieces work; time to see if they fit together.

The main obstacle for me is that I have one public IP which needs to address the various internal servers, and those internal servers run the same services on the same ports. The nature of NAT port forwarding is that all traffic coming into the WAN connection on a port gets forwarded to the same computer; I can't parse port 80 (http/web) traffic and decide where it needs to go. This is the major obstacle. Another minor issue is that my public IP is dynamic and can change whenever Comcast wants to change it. (Although when I want it to change it's surprisingly hard to do.)

Another requirement is that I use my own domain, and not just a subdomain of some DDNS provider.

One problem I have, with no real solution, is that my home servers may not be accessible from sites behind a proxy server or firewall, such as the office I work in at my day job. The proxy server there will only pass ports 80 and 443 out of the office. So what I'll end up doing is picking my main server and setting it up to be accessed using ports 80 and 443 as normal. The other servers won't be accessible from my office. (A home VPN connection will be a future project.)

I’ll get into the specific configuration details in later articles but I’ve decided on the following approach:

  1. I’ll be using DNS-O-Matic to handle the dynamic DNS updates. This is a free service from the OpenDNS people, although an account is required.
  2. My DNS provider is DNS Made Easy. I’ve used them for a few years and they’re reasonably priced and reliable. They do support Dynamic DNS updates so I’ll use them.
  3. I'll use pfSense of course. Rather than change the ports my servers use, I'll map a unique incoming port to the standard port used by the appropriate server. For example, traffic coming in to my WAN on port 8081 will go to port 80 on my server 1, while incoming traffic on port 8082 will go to port 80 on my server 2. I'll have to remember which port redirects to which server, but there are no configuration changes needed on the servers (a sample mapping is sketched below). I'll be using pfSense 2, but pfSense 1.3 may work too as it seems to have all the features I use.
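To make the port mapping concrete, here's the sort of table I'll be working from in step 1 below – the ports and internal addresses are made up for illustration:

    WAN port 80   -> 192.168.1.10:80    main server (reachable from behind the office proxy)
    WAN port 443  -> 192.168.1.10:443   main server, remote access
    WAN port 8081 -> 192.168.1.11:80    WHS v1 web site
    WAN port 8082 -> 192.168.1.12:80    WHS 2011 web site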

The basic steps I’ll be taking are:

  1. Map out what services I want to use, what port I want to use to access them externally, and what server and port they run on in my house.
  2. Setup pfSense so it can find the servers and add some aliases so I don’t get confused or have to remember IP addresses.
  3. Configure dynamic DNS so my DNS provider learns about the new IP address when I get it from my ISP.
  4. Add port forwarding and firewall rules to handle the port forwarding mapped out in step 1.
  5. Test and fix my mistakes.

I had wanted to handle this from within pfSense, but my DNS provider (DNS Made Easy) isn't directly supported and the RFC 2136 method won't work either. I'm not willing to use a different DNS service. I did find references to adding code to pfSense for DNS Made Easy support, but decided against it to avoid forgetting about the change and having it overwritten by a pfSense update. I also didn't want to worry about an update breaking the code. While a third-party service is one more thing that can break, it seemed the least problematic.
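For anyone wondering what the DNS-O-Matic side looks like: as I understand it, the service accepts updates through the same HTTP API the old DynDNS clients speak, so most DynDNS-compatible update clients can feed it. A hand-rolled update is roughly the URL below, sent with your DNS-O-Matic username and password as HTTP basic auth – the IP is a placeholder, and the special hostname all.dnsomatic.com tells it to push the new address to every service configured on the account:

    https://updates.dnsomatic.com/nic/update?hostname=all.dnsomatic.com&myip=203.0.113.7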

I also looked at changing the ports used by Windows Home Server. While I did find some write-ups on how to do this for version 1, there were caveats. WHS 2011 seemed to be more problematic, as changing ports would break other features, and my own brief test to change the port on WHS 2011 was a failure. Keeping the default ports on the servers and remapping them with pfSense seems to be a clean solution. I will need to remember to include the port in the URL, but other than that it's pretty basic and worked in my testing. There might be some features that won't be accessible, but I haven't found them yet.

Since I have only one public IP address and I'm using the port to map to the correct server, I don't really need to set each server up in DNS. I could use one name and then pick the server via the port, but I'll use names anyway as it will make changes easier and help me keep things straight. It will also make life easier if I ever get more public IPs.

Finally, I'll be testing over my cell network so I can access the servers externally. Testing from within the home isn't useful and adds its own set of problems. I won't be breaking access from within my house, but it won't be a good way to test external access; pfSense has some security settings that kick in if it detects a private IP address as the source on the WAN port.

Now it's time to start putting it together. I'll use this post as a central repository with links to my other articles and resources on this topic, so you can check back here to see everything on the topic I'll call "Home Cloud". I'll be starting off by setting up two Windows Home Servers, a version 1 server and a 2011 server.

The place to start is with my pfSense 2.0 installation back in early October.

CrashPlan Update – Week 2

I've been running CrashPlan on Windows Home Server 2011 for a couple of weeks, give or take an extended power outage. Overall I'm impressed and I like it. This summarizes my impressions so far. While WHS 2011 isn't officially supported, the software works well; there's just no add-in.

Well, not exactly week 2 due to a 5-day power outage, but it feels like two weeks and it's time for an update. I installed CrashPlan on Windows Home Server 2011 and have uploaded the first 70 GB to their online backup service. Upload speed has been good. I generally limited its bandwidth usage, and it did a good job of staying near the limit without going over. When I opened it up it was likely limited more by my connection than by any throttling from CrashPlan. So no complaints there.

CrashPlan can also back up to another PC, a friend's PC (running CrashPlan) or a locally attached folder. I don't think I'll use anything other than their online storage. I like Cloudberry Backup better for backing up to other computers (on my network) and to local drives, since Cloudberry will back up to a share without needing any software installed on the other PC. Backing up to a friend's computer with CrashPlan would require that computer to be online and to have CrashPlan installed; I'd still be using my bandwidth (and theirs) without getting much more reliability than cloud storage. One benefit of backing up to a friend's computer is the ability to seed the backup with a hard drive, and then get that hard drive back for a restore if needed, at no cost. This would avoid the bandwidth of the first backup or of a complete restore, without the cost and lost time of shipping drives to CrashPlan. So these are definitely good features, just not ones I'm likely to use, at least not yet.

The idea of having PCs I support back up to my server is intriguing, but my bandwidth caps make me leery of becoming a data center.

CrashPlan rescans the drives to verify backup selections at 3 AM every day (configurable). This stops the backup for a short while, but then the backup starts again with the refreshed file list. In my case this would refresh changes, putting already-backed-up files ahead of previously selected files that had never been backed up. I kind of liked this since it meant backed-up files were kept relatively fresh; on the downside it takes longer to get at least one copy of every file up there. It's only my observation that it seemed to refresh previously backed-up files – it may not have been 100% consistent. For me the scan takes about 10 minutes for 230,000 files totaling about 70 GB.

The test restores worked fine. I was able to restore while files were still being backed up. With over 200,000 files backed up at the time, the files I selected were quickly restored to my desktop. The restore messages were a bit confusing which is my only complaint. The screenshot below is typical when a restore is finished:

CrashPlan Restore message

It says it's unable to restore, yet the restore is already done. The rest of the restore options are pretty intuitive. The default options restore to the desktop and don't overwrite any files, which are pretty safe selections. Although in the case of a server I generally avoid filling up personal directories like the desktop, since they're on the C: drive, which is usually smaller than the data drives. Can't really complain since this is desktop backup software; I just have to remember that large restores go to a drive with the space.

The backup also runs fine whether or not I'm logged on to the server (such as through RDP), without needing any hacks or workarounds.

CrashPlan does have an iOS app but it doesn’t support people like me who insist on our own encryption keys, so I haven’t tried that out.

I haven't had any noticeable performance hit while doing the backup. I generally limit the backup to uploading at 500 kbps when I'm home. This is about 1/4 of my rated upstream bandwidth, and about 1/3 of what I usually see my upstream running at under load during peak usage times (like after dinner, when the entire neighborhood jumps on). There hasn't been any noticeable impact on streaming or file access when the backup runs. I also didn't have any streaming issues when the nightly file scan ran.

I’ll be holding off adding any more files to my CrashPlan backup for a couple weeks. I figure I have about 100 GB of my Comcast cap that I can use for these backups in a normal month but want to wait awhile to make sure it’s a normal month. I’m already backing up the directories that typically change, so there will still be backups and I can see how CrashPlan handles versioning and deleted files.

The only negative is fairly obvious. Since CrashPlan doesn't officially support Windows Home Server there's no add-in; it's necessary to remote desktop into the server (assuming it's headless) and run the client. But that's a relatively minor downside. I'm hesitant to trust my backups to software that isn't officially supported for the way I use it, but I haven't read about any problems or encountered any myself. I'm confident enough that I turned off some offsite backups to S3 and will trust those to CrashPlan. Not everything – the critical stuff still goes to Amazon S3 too, but it's relatively small.

[Update Dec 3rd: The latest CrashPlan update is included in Trail Log #66. A few hiccups but going well.]

CrashPlan on Windows Home Server 2011

CrashPlan was running a special offer: $42 for a year of unlimited backup for one computer. I decided to give it a try; even though Windows Home Server isn't an officially supported operating system, the software has been talked up pretty well in the WHS community. So here's my installation experience.

CrashPlan recently ran a discount offer for their one-computer, unlimited backup plan, so I decided to give it a try. [While the email I received said the promotion would end mid-October, when I checked today it was still active. The URL is: www.crashplan.com/mobilize.]

I've been using CrashPlan to back up my parents' PC and it's been working well. My main reason for using it on their PC was the ability to back up to a local disk in addition to online storage. (It can also back up to other PCs over the internet, but that wasn't a factor in the choice.)

CrashPlan Installation

Install CrashPlan on the Windows Home Server

CrashPlan setup wizard

It is the same software no matter which CrashPlan subscription plan you have – start by installing the software as a trial. Download the Windows 64-bit version for Windows Home Server 2011, RDP (Remote Desktop) into the server, and run the installer locally. I accepted the defaults for the entire wizard. Nice and simple. At the end of the installation CrashPlan will start and you'll either create a new account or link to an existing account.

Create New Account

Crashplan Account Creation

When the setup wizard completes, CrashPlan starts. I want to create a new account for this testing; since this is a one-computer subscription there's no reason to add it to the CrashPlan account I already have for my parents. I enter the information to create the account.

Setup Encryption Key

CrashPlan security screen

One of my requirements is that the backups be encrypted using my own encryption key, one that is not available to the backup provider. So I went into Settings and selected "Replace With Your Own Data Key (Advanced)" so I could enter my own key. The CrashPlan docs indicate this encryption key will also be used for any additional computers I add to the account.

I click the passphrase option and enter a 63-character passphrase.

Acknowledge The Risk

CrashPlan encryption key warning

Using my own encryption key brings some warnings.

Default Backup Settings

CrashPlan backup configuration screen

The default settings aren't very appropriate for a server, so I'll be changing them. I also want to select what gets backed up in relatively small groups, to avoid bumping up against my bandwidth cap. I click the "Change" button under Files so I can deselect the Administrator's home directory and add my first group of files to back up.

Select Files to Back Up

File selection screen

I select the files I want to back up. They total about 12.5 GB.

Start The Backup

Start the CrashPlan backup

Click the "Start Backup" button to, well, start backing up. The initial estimate was that it would take just under 4 days, although this was soon cut in half. The backup continues even if I shut down the GUI and log off the user.
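That estimate is easy to sanity-check against the default 300 kbps upload limit (covered below): 12.5 GB is roughly 100 gigabits, or 100,000,000 kilobits, and 100,000,000 ÷ 300 kbps is about 333,000 seconds – a bit under 4 days, so the initial figure is basically the bandwidth cap doing the math.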

Adjust Bandwidth Limit

CrashPlan bandwidth settings

By default the outbound bandwidth was limited to 300 kbps. I'm in no particular hurry to get the backup done and I don't want to impact my other internet activity, including other backups. So while this is already well under my upload bandwidth, I lowered it to 100 kbps to avoid impacting performance while I'm home. At night and when I'm out at work I'll bump it back to 300 kbps. At 100 kbps CrashPlan estimates 4.5 days to upload the 12.5 GB.

Conclusion

While CrashPlan isn't officially supported under Windows Home Server 2011, and I'm leery of using it because of that, CrashPlan is generally reviewed positively so I'm going to give it a shot. Initially I'll back up some files that don't already get backed up to the cloud. They're relatively large files (music, video, archived software) that don't change a lot. With a data cap from my ISP it's not feasible to store terabytes of data offsite; even if I had the bandwidth I'd hit the cap. CrashPlan does offer the ability to get a hard disk in the mail as a restore solution. It's a bit pricey, but something I'd only need to pay for in a pinch.

So I’ll be doing some testing to see how CrashPlan works with Windows Home Server 2011. Anybody already using CrashPlan with WHS 2011?

OS Quest Trail Log #61: Long Weekend Edition

Here in the States we have a nice 3-day weekend thanks to the Independence Day holiday. I took advantage of the time to move this site back to its original, but now upgraded, web server. In addition to the website redesign, this past month also saw network and Windows Home Server changes.

The site had been running on a second server for about a month while I did the upgrades. I ended up going with an entire site redesign rather than just tweaking the old design to work with the changes, mainly because those "tweaks" were looking like more work than expected, so I took the opportunity to change things up. Since the new design has been active for less than a day I'll hold off on any details until the dust settles. If you notice any problems or have any suggestions feel free to leave a comment. But it wasn't all about the website in June…

Windows Home Server 2011

On the Windows Home Server front I ended up doing a Windows Home Server 2011 restore rather than rebuilding the RAID array. That went well. Even so, I ended up moving my Windows Home Server to a new HP MicroServer. I've been happy with the performance, although there was a bit of a rough spot after the migration. These performance issues seemed to be unrelated to the WHS itself, though; they were network issues. Things do slow down a bit if the system drive is heavily used, but that's a rare occasion. I have the system drive in the optical drive bay, and that SATA port has a reputation for being slow. I've been hesitant to try the third-party BIOS that's supposed to improve that port's performance, but I may give it a shot.

Network Upgrades

I’ve been upgrading my network over the past month. I swapped out my daisy chain of switches and replaced them with a single D-Link DGS-1024D switch. As part of that change I also swapped out some older or unnecessarily long cables. If nothing else it makes things a lot neater and organized.

The other issue I had ended up being traced to the NIC in my Windows 7 desktop. Many of my problems went away when I switched back to the motherboard NIC. That's my second Intel NIC to go bad this year, and I'm beginning to sour on the Intel name. Until now I've gone with an Intel NIC in every PC as a standard reflex when building one. No more.

The biggest piece of the home network upgrades was the implementation of a software router/UTM (UTM = unified threat manager). I've been playing around with various options. While I liked pfSense as a router, it didn't play well with my DSL, so I looked elsewhere. I started with ClearOS, thinking it provided a nice router & UTM along with typical file & web server functions, but it didn't last long, suffering from stability and performance issues once I installed it, so I moved on to Untangle. So far Untangle is working fine and seems like a keeper.

This about wraps up my network upgrades. I’ll look for a new NIC for my desktop so I’m not tied to the onboard NIC but it’s not a priority. I’d like to implement pfSense but that’s more for fun than out of need since Untangle is working fine.

Website Upgrades & Changes

While there have been obvious changes to the look of The OS Quest, there have also been some changes under the hood. As mentioned in my server OS review, the site is now running on Ubuntu 10.04 LTS. This upgrade is basically what forced the other changes, as the Ubuntu 9 it was running on reached its end-of-life. Since it was an OS upgrade I went with a temporary second server rather than risk an in-place upgrade.

I also switched the WordPress theme to WP-Clear. My old theme had become out of date and there wasn't a direct (a.k.a. easy) upgrade, so I took the opportunity to make some changes. While this theme is less flexible than my old one, it was significantly easier to set up. Part of the problem is that the old theme had so many options I could play with it forever, and the upgrade broke just enough things to require some significant work. At least with WP-Clear there's a much more manageable set of options. I also didn't have to revisit and update too many old posts.

Update July 5th: I’ve reverted back to my previous theme (mostly). There were some things I didn’t like with WP-Clear so before going too far I rolled back and will re-evaluate my WordPress theme selection.

I also moved the commenting system to Intense Debate, but I'm still allowing "guest" comments and not requiring a logon. At least not yet; spam has been annoying, so I might require a logon in the future. My testing shows it's as easy to get out of Intense Debate as it is to get in, so I'm not locked in.

Home Media

My living room finally entered the 21st century when I replaced my tube TV with a VIZIO XVT323SV LED TV and a Blu-Ray player. I've yet to link either of these directly to my home server, but it's on the list for the future. At this point I've been copying files to a USB drive and playing them from that, and it works well. Batch files and some manual attention keep the USB stick up to date with the shows and movies I want to watch (a sketch of the sort of batch file I use follows).
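The batch file amounts to little more than a robocopy mirror from a staging folder to the stick – this is the general shape of it, with example paths and drive letter rather than my actual layout:

    rem Mirror the "watch next" staging folder to the USB stick, pruning anything deleted
    robocopy \\SERVER\Videos\WatchNext E:\Videos /MIR /R:2 /W:5 /NP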

On Tap

I'll probably spend most of this month with WordPress and server software. The site redesign has left me with some things I want to look at and try out. I also want to make some changes to the content structure of this site, although exactly what changes won't be decided until I do some testing. Over the next couple of days I'll reach the point of no return on this site change and will shut down my old server. This post is the first new content since the site redesign. (As I type this I still have the option to switch back to my old site with no more than a DNS change and still have all my content as it was.)

On the computer side of things it'll be mostly cleaning up. My Windows Home Server version 1 hasn't been powered up in a couple of weeks, so it's safe to say it can be retired for parts. I'll pull the 2 TB hard drives and reuse them, but the memory/CPU/motherboard is probably obsolete (it won't run Windows Home Server 2011 or any other 64-bit OS). There's always the Linux/NAS option, but I have no real need for that these days. I'm sure I'll find a use for the case though.

So enjoy the holiday barbeque and fireworks if you're in the U.S.