Synology to Windows Home Server Using iSCSI

Image of Synology DiskStation DS212j

I’ve been exploring the capabilities of the Synology NAS products using the Synology DiskStation DS212j. This time around I gave it a spin as an iSCSI target for Windows Home Server 2011. There are links at the end for more information about iSCSI, but for my purposes here it can be thought of as a way to present a network-connected drive to the operating system as if it were a local drive. The Synology NAS will be addressed by WHS 2011 as a local drive. No additional software is needed; it’s all built into the Synology software and Windows Home Server.

This was configured using the Synology DiskStation Manager 4 beta software, although the DSM 3 software is set up the same way based on the information on the Synology website.

iSCSI Target Types

The Synology DiskStation software supports three different configuration types as an iSCSI LUN:

Regular Files – this configures the target on an already created file volume. This allows flexibility in allocating space. It can be increased anytime, as long as there’s space available on the volume.

Block Level (Single LUN on RAID) – this configures the target on available disks. There can’t be anything else on the disks used and they will be completely allocated. This provides the best performance (according to Synology). The disks can be configured for RAID.

Block Level (Multiple LUNs on RAID) – this configures the target on available disk space. Space already allocated to volumes can’t be used, but the disk(s) can be shared with file volumes.

Configuring iSCSI

The Synology website has good instructions on configuring iSCSI with their software so I won’t repeat them here. For my simple requirements I was able to run through the wizard and accept the defaults. I didn’t set up any advanced options. When configuring a “Regular Files” LUN the size defaults to 1 GB, so I did increase that to a more useful size.

Configuring iSCSI on Windows Home Server 2011 was a bit different than documented by Synology so I’ll run through it here. The configuration is the same for Windows 7 and Windows Storage Server 2008 R2 Essentials. I suspect Windows Server 2008 R2 is also the same along with the other related software such as Small Business Server 2008.

This needs to be done on the server itself so a Remote Desktop connection is needed (assuming the server is headless). Go to Control Panel and select “Set up iSCSI Initiator”. Then answer “Yes” to the prompt to start the iSCSI service.

iSCSI Control Panel iSCSI Service notice

The iSCSI properties dialog will appear. Select the Discovery tab, then click the “Discover Portal” button and enter the IP address (or DNS name) of the Synology NAS. Once the info is entered you should see the iSCSI target on the Synology NAS, although it will still be listed as inactive. To establish the connection click the “Connect” button. In a strange twist of terminology, you want to leave the default “Add this connection to the list of Favorite Targets” checked in order to make the connection persistent.

iSCSI Discovery Properties dialog Discovered targets list Favorite Connections prompt

At this point the connection is established and the status will change to “Connected”. Once the connection is established you’ll need to switch over to the “Disk Management” section of the Computer Management console.

iSCSI properties after connection  Computer Management

When you click on “Disk Management” you’ll be prompted to initialize the disk. If the disk will be larger than 2 TB select “GPT” as the partition table type. Right-click on the newly added disk and select “New Simple Volume” from the context menu. Run through the wizard and when the wizard is done, so are you.
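
If you’re wondering where the 2 TB cutoff comes from, it’s the addressing limit of an MBR partition table: 32-bit sector counts at 512 bytes per sector. The short Python calculation below is just arithmetic to show the math; it isn’t tied to any particular tool.

    # MBR partition tables store sector counts in 32-bit fields; with the
    # traditional 512-byte sector that limits a partition to 2 TiB, which is
    # why a larger iSCSI LUN needs to be initialized as GPT.
    SECTOR_SIZE = 512
    MAX_SECTORS = 2 ** 32

    limit_bytes = SECTOR_SIZE * MAX_SECTORS
    print("MBR limit: %.2f TiB" % (limit_bytes / 2.0 ** 40))   # 2.00 TiB
    print("MBR limit: %.2f TB" % (limit_bytes / 10.0 ** 12))   # about 2.20 TB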

Initialize disk prompt  Create volume menu selection  Drive after formatting

Now the disk can be used like any other local disk.

Benchmarks

Performance isn’t a reason for doing iSCSI, at least not with a home network and a low-end Synology DS212j. It’s going to be slower than a local SATA drive, but since I can, I did some benchmarks.

This is Windows Home Server 2011 running on an HP MicroServer with relatively slow Western Digital 1TB Green Drives. It’s a Gigabit network using the MicroServer’s onboard NIC. When running the benchmarks I kept network traffic to a minimum, no streaming video or file copies, but I didn’t turn any devices off, so there was the normal background network traffic. Everything is connected to the same switch.

The DS212j had two 7200 RPM drives in it: a Western Digital Caviar Black and a Hitachi HDT721010SLA360. Both are on Synology’s compatibility list.

The first benchmark shows the local drives; the second shows a “Regular Files” iSCSI target.

Local Drive benchmarks  iSCSI Regular Files benchmarks

I also set up each type of Block Level LUN and benchmarked them. The first is the Single LUN setup, which should be the best performer; the second is a Multiple LUN setup.

iSCSI single LUN benchmarks  iSCSI Multi LUN connection benchmarks

Wrapping Up

Being able to use the Synology boxes as an iSCSI target is a nice feature. Since it’s accessed over the network it’s not going to outperform a local drive unless you’ve got a data-center-class network to run it over. iSCSI doesn’t allow multiple PCs to access the same LUN (except with cluster-aware software) since there’s no file locking, so it’s not a suitable replacement for a file share.

The more I explore the Synology software the more I’m considering one of their larger models. While I don’t see any immediate need to swap out anything I use for an iSCSI-connected Synology NAS, I do think that if I invested in a Synology DiskStation it would eventually end up serving as an iSCSI-connected drive somewhere down the road.

Additional Links:

Wikipedia article about iSCSI

Synology iSCSI Best Practices

Synology iSCSI – How to Use

Synology Data Replicator 3 – Windows Backup

Image of Synology DiskStation DS212j

I wrote about using the Synology as a Time Machine backup destination in my previous article. This one is about using the Synology Data Replicator 3 (DR3) software to back up my Windows PCs. Synology has a fairly large list of supported third-party backup applications, but DR3 is bundled with the Synology NAS so I’ll give it a try.

I’m running Synology DiskStation Manager 4 beta (DSM4) on the Synology DS212j. There isn’t a new Data Replicator software version for the DSM 4 beta so I’m using the version that was on the DSM 3 DVD that shipped with my NAS. I also checked the Synology website and it’s the latest version. For testing I have my Windows 7 Professional (64-bit) PC and a Windows 7 Home Premium (32-bit) virtual machine.

The software installation is straightforward and uncomplicated so I won’t post screenshots. The only issue I had was that the DVD menu (spawned by autorun) didn’t have the privileges necessary to run the install, and rather than generate an error or other message it just ignored the click. Running the install directly presented the expected UAC prompt and all was well. The software is in the WindowData Replicator 3 directory on the DVD.

When I started DR3 the first time, Windows 7 prompted me to let Data Replicator through the Windows firewall. Between this pop-up and the program itself there was a jumble of dialog boxes, one of which said DR3 would have a problem because opening up the firewall failed. I cleared that warning, dug through the windows to find the firewall prompt and OK’d it. Everything ran fine after that.

The main program screen is shown below (click for full size):

Data Replicator 3 main screen

Clicking the “Select” button runs through a series of dialog boxes to select the target Synology server and then the shared folder on that server. I selected my home folder on the server although you can choose any share the ID has access to.

Once the folder is selected I’d suggest going into options before doing any backup. The options are shown below:

Synology Data Replicator 3 options screen

The screenshot shows the default options. I decided to enable 3 file versions and 30 restore points. I also enabled deleting the backed up files when the local file is deleted.

The restore point option is similar to Apple’s Time Machine. It’s a point in time that has a copy of all backed-up files as they existed at that time. According to the docs these are not unique copies in each restore point; only one copy of each file is kept. This linking is well hidden but appears to be true. File properties through both Windows and Synology’s own File Station software show unique files, and in fact show size totals for the backup directory tree as if they were unique files. But when disk space used is viewed through Storage Manager it’s obvious there’s only one copy per file. File Station and Windows show over 27 GB of files in my backup folder, but there’s less than 10 GB of space used on the entire disk (and that 10 GB covers more than the backups).
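
If you want to check the single-copy behavior yourself, one way is to compare the apparent size of the backup tree against the size of the unique files. This is a rough sketch that assumes DR3 is using hard links (which I haven’t confirmed) and that Python is available on the DiskStation over SSH; the backup path is a placeholder.

    import os

    BACKUP_ROOT = "/volume1/homes/backup"  # placeholder; use your actual backup folder

    seen = {}          # (device, inode) -> file size, counted once per unique file
    apparent = 0       # total size counting every path separately

    for dirpath, dirnames, filenames in os.walk(BACKUP_ROOT):
        for name in filenames:
            st = os.stat(os.path.join(dirpath, name))
            apparent += st.st_size
            seen[(st.st_dev, st.st_ino)] = st.st_size

    actual = sum(seen.values())
    print("Apparent size: %.1f GB" % (apparent / 1e9))
    print("Size of unique files: %.1f GB" % (actual / 1e9))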

File Structure On Synology NAS

Each PC and user combination gets a unique directory name that contains the backups for that user/PC combination. For example, my two PCs are:

Data Replicator 3 directory structure

The backups, snapshots and versions are in sub-directories of those directories.

Backups

The first screenshot at the top shows the main screen where the files to be backed up can be selected. It’s pretty standard file selection stuff. Even though some mail can be backed up, it’s mail that resides in files on the PC. I didn’t test any mail backup since I don’t use the supported apps.

Backups can be done three ways:

Immediate – the backup runs when you click the button

Sync – the files will be monitored and any changes will be replicated. You’ll be prompted to do an immediate incremental backup when you select sync; this is to catch any changes made while files weren’t being monitored. (A rough sketch of this kind of change monitoring is shown after the list.)

Schedule – Like the name says. Schedule a daily, weekly or monthly backup
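
DR3 is closed software, so I can’t show how its sync mode actually works, but the general monitor-and-replicate idea is easy to illustrate. The standard-library Python sketch below polls a folder for new or modified files and copies them to a destination; the paths are made up for illustration and this is not DR3’s implementation.

    import os
    import shutil
    import time

    SOURCE = r"C:\Users\Ray\Documents"        # hypothetical folder to watch
    DEST = r"\\diskstation\home\sync-backup"  # hypothetical destination share

    def snapshot(root):
        """Map each file path under root to its last-modified time."""
        state = {}
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                state[path] = os.path.getmtime(path)
        return state

    previous = snapshot(SOURCE)
    while True:
        time.sleep(30)  # poll every 30 seconds
        current = snapshot(SOURCE)
        for path, mtime in current.items():
            if previous.get(path) != mtime:  # new or changed file
                target = os.path.join(DEST, os.path.relpath(path, SOURCE))
                os.makedirs(os.path.dirname(target), exist_ok=True)
                shutil.copy2(path, target)   # copy and preserve timestamps
        previous = current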

Restores

Restores are wizard-based and can be done by restore point. Any in-progress backups, including sync monitoring, must be stopped before doing a restore. You can also simply browse the backed-up files and pull out the one you want.

Using DR3 and Impressions

Data Replicator 3 isn’t the slickest interface out there, nor the quickest, nor the most feature-rich. But as a file-based backup program it’s not bad. The strongest features are the immediate sync and file versions.

DR3 does have some annoyances. Assuming syncing is enabled and set to start when Windows boots, there will be a prompt to do a backup, and then the backup progress will be on screen and can’t be closed until the backup is done. Turning off the consistency check avoids this, but at the risk of missing changes unless a manual or scheduled backup is done. This consistency check can take a while for what seems like little data.

Cancelling an in-progress backup causes the next backup to do a cleanup as it removes a temporary folder. This also takes a while.

The restore points only seem to occur for the incremental backup. My DR3 restore points are all from when I restarted DR3. Maybe when I run it longer and leave it undisturbed it will create a restore point, but I doubt it, and if it does it’s undocumented.

The default backup selections cover the standard locations for data files. If you save data in non-standard locations you’ll have to select them manually. The same goes if you want to back up programs. There’s no concept of file sets – such as all files of a certain type anywhere on the disk.

I save data on my Windows Home Server and have very little on my PC, so I have little need for the sync and version features. My existing Windows Home Server backup provides a bare-metal restore along with file versions, so I’ll stick with that even though it’s limited to once a day (or manual backups).

Considering Synology isn’t in the business of making backup software, I expected the typical half-hearted bundled software that exists only so a feature checkbox can be ticked. Instead I found Synology’s Data Replicator 3 to be a good (not great) software package that can do the job of protecting data files.

Time Machine Backups To A Synology NAS

Image of Synology DiskStation DS212j

I recently installed a Synology DS212j NAS and one of the first things I tested was using the Synology for Time Machine backups. Setting it up was easy and so far it’s been working fine. I set things up initially using Synology DiskStation Manager (DSM) 3.2, although the screenshots below are from the DSM 4 beta. The upgrade from DSM 3.2 to the DSM 4 beta didn’t require any changes.

Apple’s Time Machine will continue to fill up a disk as long as there’s data to be backed up and space to put it. Only when it runs out of space will it delete the oldest backups. While it is optional, my first step was to create two volumes on the DS212j: one for Time Machine and one for everything else. Dedicating a disk volume to Time Machine is not required, but I wanted a way to limit the space used by Time Machine. Because it made more sense in my mind I used Volume 1 for everything except Time Machine and dedicated Volume 2 to a Time Machine share. Since I was setting up a new NAS I simply started fresh with two volumes. A user’s disk usage across an entire volume can also be limited using a quota, which would include Time Machine usage, so that would be another way to go, but it wasn’t my choice. The screenshot below shows my volume configuration (click for full size).

Synology Volume Manager screenshot

I probably would have been better off starting with a smaller volume, leaving some free space, and expanding if I needed the space. This is because shrinking the volumes isn’t possible and I may not need all that space for Time Machine. But I can also put other files on that volume. Plus, I suspect I’ll be rebuilding this test box a few times.

Once I have a place for the Time Machine share it’s time to create it. This is done through the “Shared Folder” selection in Control Panel.

Screenshot of the Synology Control Panel

Then just fill in the information for the share. You can call the share anything you want and the description is optional.

Setup of the Time Machine share

Encryption and hiding the share are optional and I don’t use them myself. While Time Machine can encrypt local backups it won’t encrypt network backups, so you may want to enable encryption on the share. Click OK to create the share. Then select the new share and click the “Privileges Setup” button.

The Synology Share Screen

Select the user(s) you want to have access to the Time Machine share. You can use the admin account if that’s what you want, but I create an ID for each person accessing the Synology NAS. The same ID can be used from multiple computers.

Screenshot of share permissions screen

Now it’s just a matter of going to the Macs and selecting the share as the destination. It will automatically appear as a possible destination, just select it and go.

 Time Machine Drive Selection

I’ve been running Time Machine backups from two Macs, both running the latest version of OS X Lion.

Sending Synology System Email Using GMail or Google Apps Mail

I recently added a Synology DS212j NAS to the home data center and wanted to be able to send the Synology system emails through my Google Apps account. This should also work for GMail but I’ve only used Google Apps. These screenshots are from the Synology DiskStation Manager 4 beta (DSM4) but the setup is the same in DSM 3.2.

Optionally, set up a user in Google Apps to use for sending emails. I have a dedicated account for sending these sorts of system emails, but you can use any account. You’ll need the email address and password.

Open the Synology DiskStation Manager, go to Control Panel (in DSM, not your PC), open “Notifications” and select the “E-mail” tab. The screen, with sample values, is shown below (double-click for full size).

Synology Email Notification configuration


  1. This is the SMTP server and port and is the same for both GMail and Google Apps email. The server is smtp.gmail.com and the port is 587. Some users may find that port 25 needs to be used, although I’ve found 587 always works.
  2. The username and password are the logon information for your Google email account and can be a GMail or a Google Apps account.
  3. The third section configures where the notifications are sent. Two addresses can be specified. Since I use this email account for notifications on multiple computers I specify Synology 212j as the subject prefix so I know which computer sent the email. Using the same email as your logon for the primary email is often recommended, but I haven’t had any issues using a different destination email in any situation.
  4. Click the “Send a test email” to make sure everything is working.

If you’re on DSM 4 you can click the Advanced tab to select which notification emails get sent. By default, every possible notification will go through email.
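
If you want to verify the SMTP settings outside of DSM, a few lines of Python run from any machine will exercise the same server and port. This is just a sketch; the account, password and destination address below are placeholders.

    import smtplib
    from email.message import EmailMessage

    USER = "notifier@example.com"      # placeholder GMail/Google Apps account
    PASSWORD = "app-password-here"     # placeholder password
    TO = "me@example.com"              # placeholder destination address

    msg = EmailMessage()
    msg["Subject"] = "Synology 212j test notification"
    msg["From"] = USER
    msg["To"] = TO
    msg.set_content("Test message using the same SMTP settings as DSM.")

    # Same server and port as the DSM notification settings: smtp.gmail.com:587
    with smtplib.SMTP("smtp.gmail.com", 587) as server:
        server.starttls()              # GMail requires TLS on port 587
        server.login(USER, PASSWORD)
        server.send_message(msg)
    print("Sent")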


First 24 Hours: Synology DS212j

Image of Synology DiskStation DS212j

The Synology DS212j NAS is at the low end of the Synology DiskStation “Personal and Home Office” product line, at least among the models that support RAID. There are less expensive one-bay models. Synology has an interesting product line that is more like a home server than a NAS, thanks to the bundled DiskStation Manager (DSM) software. I decided to give it a look and this is my initial impression, having spent about a day working with it.

Disk Configuration

The DS212j is sold diskless, can handle two internal SATA drives and has two USB ports for external devices. I used two Western Digital WD10EACS 1TB drives which are on Synology’s compatibility list. I had wanted to try two different-sized hard drives in order to try out Synology Hybrid RAID (SHR), but the spare 3 TB drive I have isn’t on their compatibility list, so the SHR test will have to wait until I free up a 2 TB drive or I’m familiar enough with the DiskStation to know whether a problem might be HDD compatibility. For now I’ll stick with approved drives. Synology Hybrid RAID is a Drobo-like technology that provides data redundancy using different-sized drives. The DS212j can also be configured using RAID 0, RAID 1 or JBOD. I configured the DS212j to use SHR even though the drives are the same size. I plan to pop in a larger disk once I free one up and see how Synology handles this.

The default installation configured one volume, using both drives, with SHR. I decided I wanted to test the DiskStation as a Time Machine destination so I reconfigured the drives as two volumes. The first is 332 GB and will be my working volume for everything except Time Machine. The second is 600 GB and dedicated as a Time Machine destination. (A 1 TB drive has only 932 GB once formatted, and SHR effectively mirrors the two drives.) After creating the Time Machine volume I created a share on it which I then dedicated to Time Machine. The Time Machine share can’t be used for anything else and only one Time Machine share can be created. The Time Machine share can be used by multiple Macs (I’m currently testing with two).
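
The 932 GB figure is just the usual decimal-versus-binary gigabyte difference; a quick bit of Python shows where it comes from.

    # A "1 TB" drive holds 10^12 bytes; DSM reports capacity in binary gigabytes.
    drive_bytes = 1 * 10 ** 12
    print("%.0f GB (binary)" % (drive_bytes / 2.0 ** 30))  # about 931 GB, roughly the 932 GB DSM shows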

DiskStation Manager 4 Beta

Synology just released the beta for their next DSM version, DSM 4 Beta. Since this is my first Synology box I set up the drives and tested Time Machine and a couple shares using the DSM 3.2 software. I mainly wanted to be sure everything was working before I installed the beta, but once I was comfortable my DS212j was healthy I upgraded to DSM 4 Beta. Downgrading to 3.2 isn’t possible but since this was a new box there wasn’t any risk for me. A search of the Synology forums showed Synology betas are usually pretty stable, and while there were issues mentioned in the forums, none seemed like they would brick my box. So my work since then has been with the DSM 4 beta (DSM 4.0-2166).

First Impressions

I’ve yet to dive deep into anything besides the Time Machine backups, but my overall impression of the Synology software and hardware is overwhelmingly positive. The hardware seems solidly built. Plus, I like manufacturers that do the little things like include extra screws. The DS212j needs 8 hard drive screws; they provide 10. It needs two screws for the case; they provide three. While the case is plastic, it is solidly put together. The fan is quiet so no complaints there.

I like the cross-platform support. At least on paper, Windows, Mac and Linux clients get almost equal billing. The DSM 4 Beta cloud client is Windows-only at this time, but Mac support is promised by the final release. Of course, the pessimist in me is skeptical of the promise until I try it. I actually did the install and configuration from my Mac, which is promising. Data Replicator (for backing up PCs) doesn’t have a Linux or Mac version, although a case could be made that Time Machine support negates the need for it and rsync could be used for Linux. The Download Director doesn’t have a Linux version or a version for OS X after 10.6, so this does appear to be the single cross-platform gap.

I haven’t done any real benchmarking, plus the WD drives in the box are not built for speed. Still, file copies between my Windows 7 desktop and the Synology box are about 30% slower than copies to my HP MicroServer running Windows Home Server 2011. This was with the DSM 4 Beta Firmware which may have affected performance. But at this point, speed isn’t a selling point.

Time Machine backups and restores are working fine with the DS212j as the backup destination. I’ve never been a fan of Time Machine over the network. Time Machine backups have always seemed rather brittle to me and backing up over the network seemed to add one more complication. But having said that, it’s been fine for the first day.

DSM 4 doesn’t start the standard packages automatically the way DSM 3.2 does, so the applications that were standard in 3.2 need to be started manually; I wasn’t using any of them at the time I upgraded. Packages include two audio servers, iTunes Server and Audio Station. Media Server is a DLNA server and Photo Station is for sharing photos. Download Station allows downloading files such as torrent files. Surveillance Station allows control of wireless cameras. There’s also a selection of 12 add-on packages that include WordPress, Email Server, and Cloud Station among others.

It wasn’t obvious from the description, but the forums indicate that the “Backup and Restore” package in DSM can back up to Amazon S3, so that could be the backup solution for my critical files. I’ll take a look to see how it compares to Cloudberry on my Windows Home Server and see if it has the features I want.

I look forward to trying out the various applications and seeing where the Synology DS212j fits in my home data center. I’m a little afraid I’ll really like it and have to buy a larger model to get the disk space I’d need. Despite being called a NAS, my first impression is that the Synology DiskStations are a viable contender as a home server.

SOPA and PIPA and Blackouts

If you’re reading this on Jan 18th you know I’m not blacking out the site. While I agree SOPA and PIPA are bad bills, the only reason I see to black out a site is to raise awareness, and blacking out this site won’t do that. I suspect everybody who comes here will have heard of this, if not actually visited a blacked-out site.

If not, and you want to learn more, here are some links with more information.

  • A little old, but the Verge has a relatively short but informative article about SOPA. The DNS provisions appear dead (for now), but the rest still applies, although those DNS provisions are still in the bill posted on the Library of Congress’ Thomas site.
  • The Reddit blog has a good breakdown of the SOPA and PIPA provisions
  • Wikipedia has a long write-up about SOPA and does a good job of advocating the reasons SOPA is bad law. Just don’t expect to be able to read it on the 18th.

Rumor has it that the Google homepage will have SOPA related links on the 18th.

Even if SOPA and PIPA fail to pass this year, I have no doubt they will return with a new name. So even if the latest news makes them appear dead, it’s not a permanent condition.

Move A Bento Database to a New Computer

I haven’t written a post in a while, so I was looking through my analytics to see what searches brought people here. I found “How to move bento to a new computer” and figured that would be a good topic. I’ve written a few Bento-related articles but none dealt with this directly.

Being a Mac program it’s not unusual that the inner workings are hidden from the user. But moving the file is pretty simple.

Move to a New Mac

The operative word here is “move” which means we don’t need Bento on the old computer.

Pre-Move Checklist:

  1. Install Bento on the new computer. Make sure it’s the exact same version as on the original computer. From the menu you can select Bento –> About Bento to get the version.
  2. Be sure that Bento on the new computer doesn’t have any data. We’ll do a backup but the data won’t be merged.

The Move:

By default Bento saves the database to [UserHome]/Library/Application Support/Bento where [UserHome] is your home directory. This is also shown as ~/Library where the tilde indicates the home directory.

OS X Lion hides the library folder by default. To open the folder on Lion start Finder and hold the “Option” key while selecting the “Go” menu. The ~/Library folder will be opened. Macworld  has 18 other ways of opening the Library so you can pick your favorite.

  1. On the new computer browse to ~/Library/Application Support/Bento in Finder and rename bento.bentodb to bento.bentodb.backup.
  2. Copy ~/Library/Application Support/Bento/bento.bentodb from the old computer to ~/Library/Application Support/Bento on the new computer. You can do this any way you’re comfortable with, such as connecting over the network or using a USB drive and sneakernet.
  3. Start Bento on the new computer. You’re done.

The bento.bentodb file is actually an OS X package, which is a collection of files with the right attributes so OS X presents it to us as a single file unless we select “Show Package Contents”. If you use a Windows file system in an interim step the file will appear as a directory. Be sure to copy the entire directory and do not change any contents.
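
For anyone more comfortable scripting the move, the Python sketch below does the rename-then-copy steps from the list above. It assumes the old database is reachable from the new Mac (the OLD_DB path is a placeholder) and that Bento isn’t running.

    import os
    import shutil

    BENTO_DIR = os.path.expanduser("~/Library/Application Support/Bento")
    NEW_DB = os.path.join(BENTO_DIR, "bento.bentodb")
    OLD_DB = "/Volumes/OldMac/Library/Application Support/Bento/bento.bentodb"  # placeholder

    # Step 1: rename the empty database created by the fresh Bento install.
    if os.path.exists(NEW_DB):
        shutil.move(NEW_DB, NEW_DB + ".backup")

    # Step 2: copy the whole package (it's a directory under the hood).
    shutil.copytree(OLD_DB, NEW_DB)
    print("Copied database; start Bento to verify.")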

Bonus Tip – Open A Different Bento Database

You can run multiple copies of Bento or share the same Bento database from multiple computers. I save my Bento database on Windows Home Server and access the same file from all my PCs. Actually I also have multiple databases on the Windows Home Server. One word of warning – Bento is not a multi-user database so be careful not to open the file from two PCs at the same time.

Start Bento while holding down the option key. The following dialog will appear and you can select the database you want to open. I select the “Show this dialog” option so I don’t have to hold the option key and the dialog always appears.

Bento File Open dialog

Select the database you want to open and any other options you want.

Syncing Bento

I no longer sync Bento but at one time I did use Dropbox to Sync Bento Databases between Macs.

MicroServer Sale At Amazon

Picture of Amazon listing

Amazon has the HP MicroServer N40L on sale for $199.99 + shipping. It’s not sold directly by Amazon but through onSale, so there’s no free shipping. At this price I was really trying to find an excuse to buy another. The best I could come up with is that my current HP MicroServers are the older N36L model. Unfortunately (or fortunately) that wasn’t enough since my current servers work just fine and logic beat emotion.

Apple Software On WHS Shares

Trashes folder on a WHS share

I run a mixed Windows/Mac home and all my data resides on my Windows Home Server, no matter whether it comes from Windows or Mac. This means my iPhoto, iTunes, and Aperture libraries are all on my Windows Home Server. I recently noticed that these libraries were saving deleted files forever.

The libraries are a directory structure that OS X understands and may present to the user as a single file. For example, iPhoto displays as a single file in OS X unless “show package contents” is selected. Even though my iPhoto library is on a WHS share OS X displays it to me as a single file bundle. As long as the files remain within the library structure all is well. Libraries that maintain their own internal trash bin (i.e. iPhoto and Aperture, maybe more) end up trying to move the files to the OS X trash bin when you empty the library’s trash bin.

I recently noticed that when I emptied the trash in iPhoto it moved the files to a “.Trashes” folder on my WHS share (note the leading dot). See the first graphic to see what I mean (click it to enlarge). Well, actually I noticed this huge .Trashes folder and then found it came from iPhoto and Aperture. If this were an OS X drive running on OS X it would be part of the trash bin and get emptied when I emptied the trash. Once I checked, Aperture worked the same way. On the WHS share the files live forever; even OS X didn’t see the folder as part of the trash.

The .Trashes folder could be deleted just like any other folder without causing a problem. The next time you empty a library’s trash it will be recreated. To see the folder you need to enable viewing hidden files and folders (the Windows 7 setting is shown below; click for full size):


Show Hidden Folders Option

I also found that iTunes saved replaced apps to the .Trashes folder. Luckily it doesn’t save replaced or deleted podcasts. If it did I’d probably have run out of disk space. iTunes doesn’t seem to save anything I delete on my own, only the apps it replaced.

It’s only my apps that maintain their own library structure that have this issue. Deleting regular files on my WHS from OS X deletes them immediately.

I guess there is a price to pay for trying to get Microsoft and Apple to play together. But this is a small price since it’s easily fixed with a scheduled task to delete the directory.
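
The scheduled task itself can be anything that removes the folder. Here’s a minimal Python version as a sketch; the share path is a placeholder for wherever your libraries live on the WHS.

    import os
    import shutil

    # Placeholder path to the WHS share holding the iPhoto/iTunes/Aperture libraries.
    SHARE = r"\\server\Photos"
    trash = os.path.join(SHARE, ".Trashes")

    if os.path.isdir(trash):
        shutil.rmtree(trash)           # the folder is recreated the next time a
        print("Removed", trash)        # library's trash is emptied
    else:
        print("No .Trashes folder found")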

Acer Aspire Windows Home Server AH342-U2T2H

Acer Aspire AH342 Home Server

The Acer Aspire Windows Home Server seems to be one of the few Windows Home Servers that can still be purchased in the US. Just before Christmas Newegg had it on sale for $290. After Christmas it went back up to $350 but then dropped further to $260 (its list price is $449). Since it includes Windows Home Server v1, and not the latest version, I suspect we’ll see more discounting as Acer tries to clear out its stock. Hopefully they’ll have a WHS 2011 version and stay in the market. I took a look at the Acer Aspire AH342-U2T2H.

Windows Home Server v1 will end-of-life in January 2013 so any WHS v1 purchase needs to take that into account. It’s not like the server will turn into a pumpkin at that time, but Microsoft will stop providing updates. This will be after the Windows 8 release date so hopefully Microsoft would release new connector software if it’s needed for WHS. If you’re going to be using the server for remote access, meaning it’s accessible from the internet, the lack of security updates after 2012 would be a concern. If the server is going to only be accessed by computers in the home then it’s less of a concern.

The hardware should support Windows Home Server 2011 if you want to install it later. There’s no onboard video so you’ll either need to install a PCIe x1 video card or do a blind unattended install. The server comes with 2GB of RAM and the specs say that 2GB is the max so that could be an issue depending on what add-ins you install. The Atom D510 CPU is 64-bit so can run WHS 2011.

This server was purchased to provide backup and central storage for a few PCs, basically a low cost NAS. There’s only one drive so to use folder duplication a second drive would have to be added. Because hard drive prices haven’t returned to pre-flood pricing I’m contributing one of my slightly used 2 TB drives for use in the server.

Initial Setup

Because the WHS software delivered with the server is quite old I couldn’t use it for setup since I have Windows 7 clients. If I had Vista or XP clients I could have installed the bundled software and then upgraded. Since I only had Windows 7 I followed these steps:

  1. Unpacked, plugged in and powered on the server. While it was doing its initial setup I went to step 2.
  2. Downloaded the latest connector software from Microsoft and burned it to a CD.
  3. Once the LEDs stopped blinking I was ready to move on. The quick start guide said all the blue LEDs would be on solid, which is a bit confusing. The panel LEDs include a network LED which blinks for network activity and a hard drive light which blinks for activity. The status LED was blue and red while the drive lights were blue and purple. I moved on once things seemed to settle down.
  4. I popped the connector CD into a Windows 7 PC and ran it. The screenshots for the installation are below. Click for a larger picture.
    Connector installation screenshots
  5. After logging onto the Windows Home Server my next step was to remove the McAfee anti-virus software. I don’t use AV on my own WHS, and if the owner wanted AV, McAfee would be my last choice. As it is, the included license is limited to 60 days so removing it wasn’t a problem for the server owner. The version pre-installed won’t work once WHS is updated, although there might be an update from McAfee (I didn’t bother to inquire). I uninstalled McAfee through Add/Remove Programs after RDP’ing into the server. It can’t be removed through the add-in manager.
  6. While still RDP’d into the server I ran Windows update and installed all the available updates.

At this point the Acer Aspire is a basic Windows Home Server v1 box with the latest updates.

Hardware & Features

The server comes with one 2TB Western Digital Green Drive (WD20EADS). I’d prefer a small system drive since I don’t like to share the OS drive with data, but in this case it’s not much of a concern since I don’t expect heavy usage. To take advantage of folder duplication I’ll be adding a second drive, which is also a WD20EADS. For testing purposes I added two more drives.

The server also has a nice compact form factor and will look good on a shelf. There’s also an eSATA port and several USB ports (all USB 2). The front USB port has a one-button copy feature I’ll talk about later.

It’s also surprisingly quiet. I’ve got four drives installed and I’m doing a file copy. Even sitting next to the server I have to strain to hear the fan and the drives are silent.

There’s some multimedia software that will probably go unused, and I don’t have time to test it. The console has tabs for “iTunes Server” and “Digital Media Server” and Firefly Media Server is installed. The server did show up as a “Media Server” for my LG Blu-Ray player and I was able to stream a video from the server.

The Lights Out add-in is also included, although it is an old version (v0.8) so it needed to be upgraded. The add-in was licensed with an OEM license, but after the upgrade the license reverted to the trial version. Once the trial is over the license will revert to a community edition license which, according to this, has all the features of v0.8 plus a few more. The upgrade was done like installing any other add-in. I didn’t need to uninstall the original add-in, although doing so probably would have been a good idea.

The one-button USB copy is interesting, but I’d prefer it didn’t try to think so much. I tested with a drive full of DVD rips. It copied the drive to the public share as expected, but then it copied about 50 of the .BUP and .IFO files to the video directory and renamed them to avoid duplicates. They’re pretty useless on their own, and the rip directories are broken since those files are missing from them. It was also interesting that other files with the same names were left alone. So if you already have files in an organized directory structure this feature may change that structure; you may want to skip it and do a regular copy.

The expansion slot allows a video card to be added should one be needed. But it’s a PCI Express x1 slot which isn’t common among video cards. I’d be more inclined to look for a USB 3 expansion card to add some external drives. It will need to be a low-profile card.

I wish Acer would drop the McAfee AV add-in, which I view as nothing but crapware. Even if it worked, it’s still only a 60-day license. The Lights Out add-in is outdated, but at least it was a full license. The included add-in and its license don’t provide any benefit once the latest version is installed.

I attached a Lian-Li EX-503 external enclosure via the eSATA port. The server could see four out of the five drives in the enclosure, so the eSATA port can handle a port multiplier but only up to four drives. There were also four drives in the server bays. I didn’t do any benchmarking or other testing beyond verifying that the drives could be seen.

Power Consumption

I did some quick power measurements using a Kill-a-watt power meter. The server was plugged into the Kill-a-Watt which was plugged into the UPS outlet. I started with all 4 drive bays populated. There were three Western Digital 2 TB EADS drives including the one that shipped with the server as the OS drive. The fourth drive was a Hitachi  Deskstar 7K200 drive (2 TB, 7200 RPM).

With all four drives the power usage was between 52 and 56 watts. The 52 watts was when the server was idle, at least as far as access goes. Some background processes may be running although CPU usage did remain low. The 56 watts was during file copies or drive removal processing although it mostly stayed at 55 watts under load.

I removed the Hitachi drive and usage dropped to 44 to 46 watts with occasional and brief drops below 44 watts. When folder duplication was active the power usage was 46 watts.

With two WD20EADS drives installed the power usage was 36 watts while idle and 37 watts while processing a client backup. During folder duplication, when both drives would be active, the power usage was 37 watts.

With just the original drive delivered with the server the power usage was 29 watts while idle.
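
Putting the idle numbers side by side, each additional drive costs roughly 7 to 8 watts at idle. A few lines of Python just make the deltas explicit; the wattages are the measurements above.

    # Idle wattage measured with the Kill-a-Watt at each drive count.
    idle_watts = {1: 29, 2: 36, 3: 44, 4: 52}

    for count in range(2, 5):
        delta = idle_watts[count] - idle_watts[count - 1]
        print("Drive %d adds about %d watts at idle" % (count, delta))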

Drive benchmarks

The benchmarks below are the screenshots of the ATTO benchmark results. ATTO was run locally on the server (double-click for full size).

ATTO Benchmark for Drive C:  ATTO Benchmark for Drive D:

There’s not much of a difference between C: and D: since they are the same physical drive.

The screenshot below shows the results of a robocopy from my Windows 7 PC to a server share with duplication enabled.

RobocopyResults_Win7ToAspire

The reported speed for the file transfer was about 2 GB per minute. If my math is right, at 8 bits per byte and 60 seconds per minute that’s about 271 Mbps. Converting the results to MB/s shows a speed of 33.94 MB/s, which is significantly slower than the ATTO results run directly on the server, but includes all the server and network overhead. Additional tests produced similar results.

The screenshot below shows the results of a robocopy from the Aspire AH342 to my PC. The copy was started after the server completed drive balancing and wasn’t doing anything else.

Results of RoboCopy from Aspire H342 to Win7 PC

Assuming my math is again correct this is 231 Mbps and 28.93 MB/s.
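
Since both paragraphs are just unit conversions, here’s a short Python helper that reproduces the numbers; the MB/s figures come from the robocopy results above.

    def summarize(megabytes_per_second):
        """Convert a MB/s figure to Mbps and GB per minute."""
        mbps = megabytes_per_second * 8                    # 8 bits per byte
        gb_per_minute = megabytes_per_second * 60 / 1024.0
        print("%.2f MB/s = %.1f Mbps = %.2f GB/min"
              % (megabytes_per_second, mbps, gb_per_minute))

    summarize(33.94)   # Windows 7 PC -> Aspire share: ~271 Mbps, ~2 GB/min
    summarize(28.93)   # Aspire -> Windows 7 PC: ~231 Mbps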

The file copies were done with mostly video files so the average file size was pretty large and there wasn’t a lot of overhead opening a lot of files.

Summary

The price is certainly the big attraction although if you’re going to add three hard drives to max it out the cost will go up considerably at today’s prices. But if you have the drives or can wait for the flood-induced prices to drop it’s worth it. Personally I think a second drive should be added in order to enable folder duplication or to do backups so that will increase the cost.

Returning to Windows Home Server v1 was both nostalgic and a reminder of the frustrations WHS v1 brought. Removing a drive brings down the server while it’s processed which can be time consuming (hours). That’s not something most people will do as a regular activity so it’s not too much of a concern. There was also the occasional slowdown as some process ran (backup cleanup, drive balancing). After using WHS 2011 for about a year WHS V1 just looked and felt old.

I was impressed with the Acer Aspire AH342 Home Server. It will make a good NAS for sharing files and PC backups, which is why it was bought. But it’s not a product someone can buy off the shelf and expect to get running unless they’re familiar with WHS or have only Windows XP and Vista machines. Once the software’s age-related issues were worked out it performed well. Plus I like the nice small cube form factor and it’s quiet. It can be out in the open and on all the time.