OS Quest Trail Log #59: Spring Is Here Edition

Winter didn’t leave without a fight, dropping a coating of snow on the first full day of Spring and staying unseasonably cold since then. Between working the day job and getting outside to shake off the cabin fever now that the weather is better, things have been slow on the quest. The list of things I want to do is growing rather than shrinking. Still, there has been progress.

Windows Home Server

I resurrected Windows Home Server and flattened my Ubuntu Home Server box. I’m not unhappy with Windows Home Server 2011 but performance hasn’t been up to the levels I saw with Ubuntu. It’s not so much that peak performance is lower, it’s that the performance is inconsistent. Granted, it is pre-release software and I have had RAID issues, so I can’t point fingers with certainty. On the software side I have looked at a couple Windows Home Server 2011 Add-Ins for backup.

Microsoft has said that Windows Home Server 2011 has been finalized and will start appearing in early April. Once that happens I’ll rebuild the server with the final release and get a better idea about performance.

Still, don’t be shocked if I wake up one Saturday and decide to switch back to Ubuntu.

iPad 2 – Not For Me, At Least Not Yet

Apple’s iPad 2 was released and I admit to being shocked by all the attention it received. I have the iPad 1, use it daily, and love it. But I’m at a loss to understand why the new version seems to exceed the first one’s popularity by so much.

Basically the new version is slimmer and lighter, faster, and adds cameras. The first two are the features I’d be most interested in, but I’d have to see and touch it to decide if it’s worth it. That’s one reason I’ve avoided stopping in the Apple store: to avoid the impulse to buy. Whether it’s easier to hold for extended periods is hard to tell just by reading the specs. As for speed, it’s likely to be more noticeable with apps that are still on the drawing board. I have a couple of apps that are slow (Bento), but from what I’ve seen apps may or may not benefit from two cores, so it’s hard to tell.


Naturally, there have been some frustrations over the past month.

I’m a fan of Windows Live Mesh to sync files between my computers, including Macs. Maybe it’s more coincidence than cause, but Mesh has stopped working on my two OS X Leopard Macs while it continues to work on my two Snow Leopard Macs. The Leopard Macs are older machines that have continued to work fine for what little they’re used for, so I never upgraded them. Maybe it’s time.

Also in the category of minor annoyances: I use a hot swap drive bay on my Windows 7 machine so I can easily copy files to various external drives. Last week the machine decided that whenever I swapped drives it would drop the DVD drive until I rebooted. It’s hard to say when this started because until recently the DVD drive received little use, so maybe some update triggered it. Since then the problem has become less frequent.

News & Links Of Interest

I’m not a fan of Drobo, finding my own Drobo serviceable but relatively slow and finicky at times. Ars Technica has run the first part of what will be a two-part review of the Drobo FS. This is shaping up to be the most comprehensive Drobo review I’ve seen. My own experience makes it extremely unlikely I’ll buy a replacement no matter what they say, but if you’re considering a Drobo it’s worth a read. I can’t argue with the statement that Drobo rivals the now-obsolete drive pool in Windows Home Server for ease of use. If ease of use tops your requirements then ignore me, check out the Ars review, and consider Drobo.

It’s probably more the irony that gets it in the news, but The Register is reporting cross-site scripting vulnerabilities in security software vendor McAfee’s website.

Amazon launched their “Cloud Locker” music service, beating Apple and Google to the punch, but managed to do it by skipping any agreement with the music labels. It seems to have potential; unfortunately most of my music is in formats their player doesn’t support (Apple Lossless and Ogg Vorbis). Oh well, I can still upload the files for safekeeping, and I do have some MP3 and regular AAC files. The music player seems to be the only way to upload files in bulk, so only supported formats can go up that way, but I suspect there will be a workaround or two in a matter of days. Five GB of free space is a tempting target and it does seem to be a nice differentiator for Amazon.

Apple’s hardware-free WWDC has brought forth rumors that the iPhone 5 will be “delayed” (not really sure how something that isn’t announced can be delayed). If true, it makes my purchase of a Verizon iPhone 4, instead of waiting, seem downright smart. I can’t claim I expected a delay (I didn’t, but decided to buy anyway, planning to skip the 5), but I don’t mind looking smart.

There was a story that Samsung installed a rootkit on their laptops, à la the Sony BMG rootkit, but direct from the manufacturer this time. If true it’s incredibly stupid and would make me avoid Samsung products. Why do manufacturers think they have free rein to surreptitiously collect information about their users? The rootkit is a key logger capable of collecting IDs and passwords. What could go wrong?

So this wraps up what has been the third light month this year on the quest.

WHS 2011 Backup Add-Ins: Cloudberry & KeepVault

As part of my Windows Home Server 2011 evaluation I’ve been looking at two backup add-ins: Cloudberry Online Backup for Windows Home Server and KeepVault. I’ve been using KeepVault with my Windows Home Server v1 for just under a year. I’d heard about Cloudberry off and on but never looked into it, probably turned off by the name.

Cloudberry was back on my mind as I heard they had a WHS 2011 add-in available in beta, and I already had the KeepVault for Windows Home Server 2011 beta. I found both can happily co-exist on the same server, so I’ve been checking both out. Cloudberry has a 15-day full-feature trial, which I’ve been using. KeepVault also has a trial period, but I’m already a subscriber and use the basic plan. There are differences between the two and each has its strengths and weaknesses.

Backup Destinations

One area with significant differences is the backup destinations:


KeepVault:

  • KeepVault’s own cloud storage, which I’ve been using
  • A local disk physically attached to the server


Cloudberry:

  • Amazon S3, including the lower-cost Reduced Redundancy Storage (RRS), which I use
  • A network share on the same network (I’ve backed up to my WHS v1 machine)
  • A local hard drive attached to the server
  • Additional cloud services: Microsoft Azure, Mezeo, Dunkel, and Walrus

My Pick: Hands down, Cloudberry is the winner for flexibility. I’ve been able to back up to my local WHS v1 over my local LAN with Cloudberry while KeepVault would require an attached drive for a backup. KeepVault would be limited to the size of the attached hard drive and would need a different job for each attached drive. Cloudberry has one job that backs up to the 10TB drive pool.


Pricing

This is where there are significant differences.


KeepVault:

  • Software & Storage: A subscription fee based on storage; there’s no per-PC or software charge. Costs range from $46/yr for 40 GB on up, with the per-GB price dropping slightly as storage increases (for example, 130 GB is $139/yr). The yearly cost is 10% off the monthly subscription and is what I used for my comparisons, so you can go month to month at a higher cost. For comparison purposes I’m considering $46/yr for 40 GB.


Cloudberry:

  • Software: Cloudberry only provides the software; the backup services (such as Amazon S3) are not provided by Cloudberry. Software is priced per PC or server; the WHS add-in is $30. They also use a subscription (aka maintenance) model for upgrades at a cost of $6 per year (20% of the software price), so every year I’d pay $6 to get another year of upgrades.
  • Storage: I use Amazon S3 Reduced Redundancy Storage. RRS is a lower-cost option which, as the name implies, isn’t as well protected: regular S3 storage can survive the loss of 2 Amazon data centers, while RRS can only survive the loss of 1. That’s more than suitable for my backups since by nature they’re already redundant. Using Amazon’s pricing calculator I came up with $8.27 for the first month, with 40 GB transferred in, and then $5.77 a month after that, assuming only 2 GB of replacement data is sent and I stay at 40 GB. If I drop down to just 10 GB it’s $2.98 for the first month, with 10 GB of data transferred in, and then $1.41/month after that.
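For anyone who wants to redo that math with their own numbers, the back-of-the-envelope model is just storage charge plus transfer-in charge. The per-GB rates below are assumptions for illustration (they won’t match Amazon’s calculator exactly), so treat this as a sketch and check current pricing:

```shell
# Rough S3 cost model: monthly storage charge plus data-transfer-in charge.
# Rates are assumed for illustration -- use Amazon's calculator for real ones.
rrs_per_gb=0.093        # assumed $/GB-month, Reduced Redundancy Storage
xfer_in_per_gb=0.10     # assumed $/GB transferred in
stored_gb=40
first_month_in_gb=40    # the initial full upload
monthly_in_gb=2         # ongoing replacement data

first=$(awk "BEGIN { printf \"%.2f\", $stored_gb*$rrs_per_gb + $first_month_in_gb*$xfer_in_per_gb }")
ongoing=$(awk "BEGIN { printf \"%.2f\", $stored_gb*$rrs_per_gb + $monthly_in_gb*$xfer_in_per_gb }")
echo "First month: \$$first, then: \$$ongoing/month"
```

With these assumed rates the big first-month number comes almost entirely from the one-time transfer-in charge of the initial upload, which is why ongoing months are so much cheaper.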

My Pick: Cloudberry, as it will probably cost me less since I’ll be well under 40 GB, although it will probably be a push for the first year due to the software cost. Even if I go to 40 GB the pricing is comparable, as I was liberal in my data transfer estimates after the first month. But for those who want consistency and expect to be near the top of their subscription limit, KeepVault may be the better choice. Whether the Amazon S3 “pay for what you use” pricing model is a strength or a weakness depends on your usage: if you send a lot of data in a month the transfer charges can add up, as shown by my first-month costs.

Additional Features


I looked at the KeepVault basic subscription. The considerably more expensive Pro subscription ($163/yr for 40 GB) does include additional features, although it’s unclear to me if the WHS add-in supports them all.

KeepVault:

  • 128-bit encryption


Cloudberry:

  • Multiple encryption options, and each job can have its own encryption key.
  • Multiple backup jobs (“plans” in Cloudberry-speak), each with its own configuration options.
  • Files deleted on the PC can be deleted from the backup storage. This is optional and the time to wait before deletion is configurable. You’re also warned about upcoming deletes and can choose to save the files.
  • Backups can get granular, down to the directory or even file level. You can also select backups at the disk level.
  • Backup selections or exclusions can be set by file type, folders can be skipped, file backups can be limited to files modified after a set date, and more.
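As a rough illustration of what those selection rules amount to, here’s the same kind of filter (file type plus modified-after date) expressed with plain GNU find. The directory, file names, and cutoff date are made up for the demo:

```shell
# Build a throwaway directory with files on either side of a cutoff date,
# then select only .mkv files modified after 2011-01-01 -- the same sort of
# type + date rule described above. (GNU find and touch assumed.)
dir=$(mktemp -d)
touch -d '2010-06-01' "$dir/old.mkv"
touch -d '2011-02-01' "$dir/new.mkv"
touch -d '2011-02-01' "$dir/notes.txt"

picked=$(find "$dir" -name '*.mkv' -newermt '2011-01-01' -printf '%f\n')
echo "$picked"   # only new.mkv passes both the type and date filters
rm -r "$dir"
```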

My Pick: Again, a hands-down choice for Cloudberry, the Swiss Army knife of Windows Home Server backup add-ins.


Looks like I’ll be switching my online backup over to Cloudberry and Amazon S3. With Cloudberry’s local backup abilities and lower or comparable costs, it’s not really even a contest.

Browser Usage On TheOSQuest

With Internet Explorer 9 just released and Firefox 4 about to be unleashed, I thought I’d take a look at browser usage on the OS Quest. This is as reported by Google Analytics; my own site access isn’t included in the totals.

Firefox leads the pack, with 31.85% of my site visits over the last 30 days. Almost all these visitors were using a flavor of Firefox 3.6, but 2.63% of Firefox visitors were already on Firefox 4.

Internet Explorer is in second place with 23.52% of visitors. 2.24% of IE visitors have ignored Microsoft’s pleas to get off IE 6, though that’s considerably less than the current estimate that 12.0% of the world is still on IE 6 (yikes!). Meanwhile, 9.67% of visitors using IE already have IE 9. I find that surprising since I don’t believe Microsoft has done any pushes yet and I haven’t seen much about it beyond the usual tech suspects. That puts it only 1 percentage point behind IE 7 visitors. The bulk, 77.39% of IE users, are on IE 8.
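Note that these IE version numbers are percentages of IE visitors, not of all visitors; converting between the two is one multiplication, sketched here with the numbers above:

```shell
# A browser version's share "within IE" times IE's overall share gives its
# share of all visitors. Numbers taken from the stats above.
ie_share=23.52     # % of all visitors using Internet Explorer
ie6_within=2.24    # % of IE visitors still on IE 6

ie6_overall=$(awk "BEGIN { printf \"%.2f\", $ie_share * $ie6_within / 100 }")
echo "IE 6 share of all visitors: ${ie6_overall}%"
```

So the IE 6 holdouts are only about half a percent of all my visitors.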

Chrome brings in 21.41% of my visitors. Being a Chrome fan myself, I’ve been pleased to see that number grow over time. The versions are well distributed among various sub-versions of Chrome 8, 9, and 10. I find the spread of versions interesting since Google does a silent auto-update that’s not easy to disable (well, not easy to find how to disable). About a quarter of Chrome visitors are on OS X.

Safari has dropped to 4th place, falling just behind Chrome with 20.47% of the share. What’s shocking is that 1.4% of the Safari users are running Safari on Windows. About 13% of the Safari users were on various i-devices.

Chrome passed Safari in October 2010 and stayed there. Since October 2010 only Chrome and Safari have increased their share, while both IE and Firefox have been used by fewer visitors, with Firefox losing almost 5 percentage points. IE only lost a point.

As for operating systems: in October 2010, 64% of visitors arrived using Windows while 23.81% were on a Mac. Linux was in third place at 9.8% and the iPad was 4th with 1.27%; nothing else topped 1%. In the last 30 days Windows has dropped a bit to 62.1% while Macs have popped up over 4 points to 28.04%. Linux plummeted to 5.67%. The iPad grew to 1.99% while the iPhone broke 1% and sits at 1.12%.

One final observation: IE is barely in first place among Windows users at 23.45%, while Firefox is at 21.5% and Chrome has 14.72% of Windows users. So at least among my visitors, the majority of Windows people are seeking out replacements for Internet Explorer. Good work! Although IE 9 is getting positive reviews and buzz (I have yet to install it myself), so maybe that trend will change.

As for my own browser usage: I had been a long-time Firefox user, but a few months ago I switched to Chrome for its speed (both performance and a UI I find faster to use) along with the ease of syncing bookmarks and configuration among PCs. Plus, incognito mode was great for running multiple mail accounts. But I recently moved back to Firefox for many sites, as Chrome causes a conflict with TextExpander for me on the Mac and has problems with a few sites I use.

While I can’t muster any excitement for IE 9, I’ll be giving Firefox 4 a spin when it’s released in a couple of days. Because of my Chrome issues, Firefox 4 has a chance of breaking my Chrome habit.

Anybody switched browsers recently? Anyone considering switching due to one of the new releases?

Windows Home Server 2011 Resurrected

I previously declared Windows Home Server v2 (aka Vail) dead to me when Microsoft announced the removal of Drive Extender from the product. Now, with the passing of time and the availability of the release candidate, I’ve resurrected Windows Home Server 2011. Part of the reason for my change of heart is the way I’m now storing my video library. It’s taking considerable space, and rather than going with redundancy plus a backup I’m thinking of staying with just a backup. I’m less concerned about the downtime than about the time that would be needed to re-rip and re-encode the library. So I’ll be looking at redundancy (such as RAID) to provide better reliability for the files I consider critical, and I’ll stick with just backups for less critical files such as my video library. While the way WHS now manages disks seems cumbersome, it may actually make managing my backups easier since the video library doesn’t change once the video is there.

Initially I planned to install the WHS 2011 Release Candidate on a test box to give it a spin, and I did that. But after some poking around on the test server I gave in to my impulsive nature, flattened my Ubuntu Home Server, and installed the WHS RC1 software. I’m using the exact same hardware.

I decided to fully commit to the new drive extender-less Windows Home Server philosophy and not try to shoehorn in my old way of doing things. So my initial configuration is:

  • The OS will be installed on the 320 GB WD notebook drives, which I’ll mirror using the motherboard RAID.
  • The 3Ware 4-port SATA RAID controller will be configured for RAID 5 in hardware and will provide a single volume with 5.5 TB of usable space (RAID 5 with four 2 TB drives).
  • The remaining six SATA ports each have a 2 TB drive attached, with no RAID.

My video library is the largest single consumer of disk space, requiring nearly 10 TB, so it’s going to have to be split apart. On a related note, I’ve decided this is just too big, so I’ll be re-encoding the videos to smaller files, though that will take time. At least I don’t have to consider adding even more disk space, at least in the short term.

So my critical files will go on the RAID array to provide redundancy should a single drive fail. This way a poorly timed drive failure won’t keep me from being able to work with the files. I’ll rely simply on backups for my video library since I can live without it until a restore is done. I’ll be splitting my video library into many shares since a share can’t span drives. This has the added benefit of limiting the impact of a single drive failure to just that share.

Installation & Configuration

The installation was straightforward. I hooked up a monitor and installed from DVD. I configured the two 320 GB drives as a mirror using the motherboard RAID on the Gigabyte SATA ports (re-branded JMicron). My standard practice is to do any Windows installation with just the OS drive connected, and that’s what I did here. An unanticipated benefit of the modular power supply was the ease of disabling drives by just disconnecting them from the PSU rather than having to pull cables from each drive. Once the OS was installed I connected the remaining drives. I’ve ended up with 9 drive letters (logical drives). The system drive is divided into two partitions (60 GB for the OS and the rest as a D: drive) and I’m not using D: for anything; I did try copying files to D: (just once) and it had a noticeable impact on performance. Then there’s the RAID array and the six 2 TB drives, which are now independent.

Everything except the video library goes on the RAID array, with room to spare. Space constraints put some of my video library on the RAID array too, but most of it is split across the other drives. What I ended up doing is splitting the directory structure of my video library into separate shares. I split my library by genre and topic but had to get even more granular to keep shares under 2 TB. All totaled, I have 44 shares on the server. This sounds extreme but hasn’t been a problem in practice. I’ve never been one to map drives, so browsing, typing, or scripting the share name isn’t a change for me. I’m actually finding it a little easier to manage than having to browse through a directory structure.
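The carve-up itself is essentially first-fit packing: walk the genre directories in order and start a new share whenever the running total would pass the cap. A hedged sketch with made-up directory names and sizes (on a real box the input would come from something like `du -sb */`):

```shell
# Greedy first-fit grouping: each input line is "size_in_bytes name"; start a
# new share bucket whenever adding a directory would exceed the cap.
limit=2000000000000   # ~2 TB cap per share, matching the per-drive ceiling

shares=$(awk -v limit="$limit" '
  BEGIN { bucket = 1 }
  {
    if (used + $1 > limit) { bucket++; used = 0 }   # start a new share
    used += $1
    printf "share%d %s\n", bucket, $2
  }' <<'EOF'
1500000000000 SciFi
600000000000 Documentaries
900000000000 Comedy
EOF
)
echo "$shares"
```

First-fit isn’t optimal packing, but for a few dozen directories it keeps related genres together in share order, which matters more here than squeezing out every byte.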

I had played around with software mirroring in WHS 2011 RC1 and it didn’t give me a lot of confidence. In some tests it wouldn’t RAID the entire drive, only the system partition (which would probably be OK if there were a reason, but sometimes it would and sometimes it wouldn’t for the same drive and partitions). But the motherboard RAID has been even worse. Even after a clean shutdown the array comes up as needing a rebuild on every reboot. It also frequently boots with a “conflict” error, requiring me to pick a drive as the master to rebuild from. This is also after clean shutdowns (every shutdown has been clean) and led to a problem I’ll talk about later. I’m still running the motherboard RAID for the OS but pretty much consider it unreliable. The only reason I haven’t removed it is that I don’t want to be forced into a re-install or rebuild just yet. My plan was always to do a clean install of the final release, so I’m hoping things last until then. I’ve bitten the bullet and spent some money on a hardware RAID controller. It’s still in the box, so I’ll wait until it’s in use before I write about it.


As I mentioned, the OS RAID has caused me problems. At one point the system failed to boot with a conflict error from the mirror and required me to pick a master to rebuild from. I picked the wrong one, and when the system had booted I found that most of the shares I had created were gone, as were some other OS changes. Since shares are simply directories on the drive in this version, I could just recreate the shares and point them at the directories. Tedious and frustrating to have to repeat work, but no files were lost.

Two add-ins were also broken as they kept their data on the system drive. Rather than try to recover I simply uninstalled the add-ins, deleted their data, and re-installed them.

I had been remiss: this conflict problem happened before I set up a backup for the server OS drive, so a restore wasn’t an option. When I did try to set up a backup it consistently failed. It’s possible the corruption affected the OS, so I can’t say for sure what the backup problem is. At this point, if push comes to shove, I’ll flatten the OS and re-install. I haven’t bothered trying to get any sort of OS backup working (the data is backed up) since it’s only a matter of time before I do a re-install. The backup did work on my test server, so there’s nothing inherently wrong with the backup feature.

Initial Observations

I like that restoring the shares after OS corruption was easy and problem-free. Having the separate drives does have its benefits. While I have a RAID array for maximum uptime of critical files, I no longer consider it important for my video library. I’m also wondering if I need RAID at all; it’s not like all my hardware is redundant. I will keep the OS on a mirror (but a hardware mirror) since that seems prudent. But I’ll be thinking about which would be the bigger hassle: losing a drive and having to wait for “critical” files to be restored, or losing a RAID controller and having to reconfigure the drives and do a longer restore. To be truly safe I’d have to keep a spare controller of the same model and firmware on the shelf to allow a quick replacement, and I can’t say my needs warrant that cost.

The release candidate has been a solid performer. The problems I’ve had were caused by hardware, not software. I haven’t done any benchmarks, but it doesn’t feel as peppy as my Ubuntu Home Server. I’ve been hitting the server pretty hard with file copies and backups running, and file copies to and from the server have been about 30% slower even when nothing else seems to be happening. On the other hand, performance stays the same when multiple copies or streams are running. I can’t say whether the decrease is the server or something else going on with my PC or network, so I’ll withhold final judgment.

The wizard to move shares sounds gimmicky but is actually pretty nice. I had to use it to move all the original shares off the system drive after the initial installation. I’ve also used it to move existing shares around to re-arrange disk usage. Naturally it takes longer when there are files, since the files are copied. The share is unavailable during the move, but all other shares stay up and running.

For all the complaining I and others did about the loss of Drive Extender, it wasn’t perfect. I will miss the ability to pool drives, but folder duplication wasn’t the most efficient use of disk space, and the disk migrator could be a resource hog when it ran. So rather than dwell on the losses I’ll try to find the strengths of WHS 2011. I don’t want the complication of software RAID all around. While it was more than reliable and viable with Ubuntu, it seems brittle under Windows. Maybe my impression is tainted by long-distant fights with early software RAID in Windows, but I can’t bring myself to trust it.

I expected to see more integration with Windows 7 libraries (there really isn’t any in my setup), so I’ll have to do more reading and research in that area. I had hoped that, despite all my shares, I could integrate them into my video library on Windows 7, but I’ve been unsuccessful so far.

I do miss Ubuntu, especially when I see those file copies going slower under WHS than under Ubuntu. My OS drive issues do provide an excuse for the less-than-stellar performance. Performance isn’t bad, and if I hadn’t run Ubuntu I’d probably say it was fine. It’s entirely possible I’ll wake up one morning and decide to switch back.

TonidoPlug: Formatting an Attached Hard Drive

The TonidoPlug is a wall-wart type device that runs a Linux server but doesn’t come with any external hard drives. The $99 device has some interesting potential, so I got one to take a look. I just received it today, so its exact capabilities remain to be seen. But in setting it up I came across my first problem: attaching an external drive.

Now, this isn’t exactly difficult. The TonidoPlug has a USB port and any external USB drive can be attached to it. In fact, I successfully attached an NTFS-formatted USB drive. But I didn’t want to use that drive. Since I plan to experiment, I wanted to use a toaster-style USB bay so I can easily swap drives, and I also wanted to use some older drives I had on the shelf gathering dust. All of these would need to be reformatted. I didn’t really want NTFS or FAT32 as the file system, so formatting on my PC was out. Of the supported file systems, ext3 seemed the most reliable choice. I didn’t need to swap drives with other machines, and if I was ever forced to use a drive in another computer there are programs I could use to read it from Windows or OS X. With the decision to go ext3, the problem became how to format the drive. The only Linux computer I could easily attach a hard drive to was the TonidoPlug itself. So here are the steps I followed to format the drive.

I’ve had the device for only a few hours and am far from an expert, so these procedures may not be perfect but they worked for me. To keep this short I’ll assume you can use terminal (on OS X) or Putty (on Windows) to SSH into the TonidoPlug.

  1. Set up the TonidoPlug. In my case I attached an NTFS-formatted drive to get used to the plug, but this could be done before any drive is attached.
  2. Attach the drive to be formatted. In my case the drive did not have a valid file system, so it was not mounted. If the drive does mount, use the admin console (or the command line after SSH) to unmount it.
  3. Go into the admin panel, verify the disk is recognized, and make a note of the device name.
  4. In my case the drive wasn’t mounted and was seen as device /dev/sda. I’ll need this later.
  5. Connect to the TonidoPlug using SSH.
  6. Create the partition by running fdisk /dev/sda. When prompted, I selected a primary partition that uses the entire drive and a type code of 83 (Linux).
  7. Now that the partition is created, add the file system by running mkfs.ext3 /dev/sda1. This takes a few moments to run.
  8. Mount the drive manually just to make sure: mount /dev/sda1 /media/disk1part1. The drive shows as mounted in the admin console.
  9. As a final test, reboot the TonidoPlug and verify the drive is available.
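Steps 6 through 8 can be collected into a single script. This is a sketch, not a polished tool: /dev/sda is just what my admin panel reported and will vary, and fdisk and mkfs.ext3 destroy whatever is on the disk, so the script only prints the commands unless you explicitly set DO_IT=yes.

```shell
#!/bin/sh
# Steps 6-8 above as one script. DEV is the device my admin panel reported --
# verify yours first, since partitioning and mkfs wipe the disk. By default
# this is a dry run that just prints what it would do.
DEV=${DEV:-/dev/sda}
MNT=${MNT:-/media/disk1part1}

run() {
  if [ "$DO_IT" = "yes" ]; then "$@"; else echo "would run: $*"; fi
}

# One primary partition covering the disk, type 83 (Linux). The printf feeds
# fdisk the same answers I typed interactively: n, p, 1, two defaults, t, 83, w.
run sh -c "printf 'n\np\n1\n\n\nt\n83\nw\n' | fdisk $DEV"
run mkfs.ext3 "${DEV}1"          # create the ext3 file system
run mkdir -p "$MNT"
run mount "${DEV}1" "$MNT"       # mount to verify before rebooting
```

Running it as-is prints the four commands; once the device name checks out, rerun with DO_IT=yes to actually format and mount the drive.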

Now the drive is available like any other drive and I can start finding out what this thing is capable of.