Home Cloud: Part 1 – Planning the Home Cloud

[Update: As mentioned in Trail Log #66 I’ve rethought this project and will be looking at alternatives.]

In the Home Cloud introduction I set out my goals and a broad outline of my plan. Now it’s time to get into the details. First I’ll plan out the servers that will be part of the home cloud and lay the groundwork.

I’ll be using the domain run.co for my home cloud. This domain is already registered but still unused. Its first use will be these servers.

I’ll be setting up three servers:

Server      IP              Server Port   WAN Port   OS
OSQWHS01    192.168.1.101   80            8081       WHS v1
OSQWHS01    192.168.1.101   443           4431       WHS v1
OSQWHS02    192.168.1.104   80            80         WHS 2011
OSQWHS02    192.168.1.104   443           443        WHS 2011
OSQTBS01    192.168.1.105   80            8082       WHS 2011
OSQTBS01    192.168.1.105   443           4432       WHS 2011

The server port is the port used on the physical server. Port 80 is the default for the web and port 443 is the default for HTTPS, which secures the connection.

The WAN port is the port that will be monitored on the pfSense WAN connection. Traffic that comes in on the listed WAN port will be forwarded to the corresponding port on the server. So for OSQWHS01, any traffic to the WAN on port 8081 will be forwarded to port 80 on the server. Any WAN port can be used provided it’s not used for anything else.
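The WAN-to-server mapping described above can be sketched as a simple lookup table. This is illustrative only (pfSense does this inside its NAT rules, not in code); the server names and IPs come from the table in this post.

```python
# Illustrative sketch of the NAT port-forward rules described above.
# Keys are WAN ports; values are (internal IP, server port) pairs.
PORT_FORWARDS = {
    8081: ("192.168.1.101", 80),   # OSQWHS01 http
    4431: ("192.168.1.101", 443),  # OSQWHS01 https
    80:   ("192.168.1.104", 80),   # OSQWHS02 http (office-friendly)
    443:  ("192.168.1.104", 443),  # OSQWHS02 https
    8082: ("192.168.1.105", 80),   # OSQTBS01 http
    4432: ("192.168.1.105", 443),  # OSQTBS01 https
}

def forward(wan_port):
    """Return the (internal IP, port) a WAN port maps to, or None if unused."""
    return PORT_FORWARDS.get(wan_port)

print(forward(8081))  # ('192.168.1.101', 80)
```

The key point the sketch captures: each WAN port is unique, so pfSense never has to inspect the traffic itself to pick a destination.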

As I mentioned in the introduction, if I’m in the office I can only get to my servers through the standard web ports due to the proxy server in my office. I’ll be using the server OSQWHS02 for that access.

A nice short article this time. As the home cloud grows I’ll add updates here. But for now we’ll start with Windows Home Servers. The three servers give me a nice assortment to test. I’ll start configuring pfSense next.

pfSense + 1 Public IP = Home Cloud

Home Cloud Graphic
[Update: As mentioned in Trail Log #66 I’ve rethought this project and will be looking at alternatives.]

Now that I’ve been running pfSense for a problem-free month it’s time to start using it for more than cool charts and graphs. My first goal is to make multiple servers available from the internet. I’ve got Windows Home Server v1 and Windows Home Server 2011 servers running and ready to go. Once those are going I’ll want to add my development web server to the mix so I can do development and testing from outside the home. I’ve spent some time testing various options and I’ve settled on a solution that I think will work. At least all the individual pieces work; time to see if they fit together.

The main obstacle for me is that I have one public IP which needs to address the various internal servers. Those internal servers run the same services on the same ports. The nature of NAT port forwarding is that all traffic coming into the WAN connection on a given port gets forwarded to the same computer. I can’t inspect port 80 (http/web) traffic and decide where it needs to go. This is the major obstacle. A minor issue is that my public IP is dynamic and can change whenever Comcast wants to change it. (Although when I want it to change it’s surprisingly hard to do.)

Another requirement is that I use my own domain, and not just a subdomain of some DDNS provider.

One problem I have, with no real solution, is that my home servers may not be accessible from sites behind a proxy server or firewall, such as the office I work in for my day job. The proxy server will only pass ports 80 and 443 out of the office. So what I’ll end up doing is picking my main server and setting it up to be accessed using ports 80 and 443 as normal. The other servers won’t be accessible from my office. (A home VPN connection will be a future project.)

I’ll get into the specific configuration details in later articles but I’ve decided on the following approach:

  1. I’ll be using DNS-O-Matic to handle the dynamic DNS updates. This is a free service from the OpenDNS people, although an account is required.
  2. My DNS provider is DNS Made Easy. I’ve used them for a few years and they’re reasonably priced and reliable. They do support Dynamic DNS updates so I’ll use them.
  3. I’ll use pfSense of course. Rather than change the ports my servers use, I’ll map a unique incoming port to the standard port used by the appropriate server. For example, traffic coming in to my WAN on port 8081 will go to port 80 on my server 1. Incoming traffic on port 8082 will go to port 80 on my server 2. I’ll have to remember which port redirects to which server, but there are no configuration changes needed on the servers. I’ll be using pfSense 2, but pfSense 1.3 may work too as it seems to have all the features I use.

The basic steps I’ll be taking are:

  1. Map out what services I want to use, what port I want to use to access them externally, and what server and port they run on in my house.
  2. Setup pfSense so it can find the servers and add some aliases so I don’t get confused or have to remember IP addresses.
  3. Configure dynamic DNS so my DNS provider learns about the new IP address when I get it from my ISP.
  4. Add port forwarding and firewall rules to handle the port forwarding mapped out in step 1.
  5. Test and fix my mistakes.
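The dynamic DNS piece in step 3 boils down to one HTTP call when the public IP changes. A minimal sketch, assuming DNS-O-Matic’s dyndns2-style update endpoint at updates.dnsomatic.com (the hostname and IP below are placeholders; credentials would go in HTTP Basic auth on the actual request):

```python
# Sketch of the dyndns2-style update URL that DNS-O-Matic accepts.
# In practice pfSense (or a client like ddclient) builds and sends this
# itself when it notices a new WAN IP.
from urllib.parse import urlencode

def build_update_url(hostname, new_ip):
    """Build a DNS-O-Matic dynamic DNS update URL.

    hostname: the record to update (placeholder example below)
    new_ip:   the new public IP reported by the router
    """
    query = urlencode({"hostname": hostname, "myip": new_ip})
    return f"https://updates.dnsomatic.com/nic/update?{query}"

print(build_update_url("osqwhs02.run.co", "203.0.113.10"))
```

DNS-O-Matic then fans the update out to DNS Made Easy, which is what makes the indirect route work even though pfSense doesn’t support DNS Made Easy directly.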

I had wanted to handle this from within pfSense but my DNS provider (DNS Made Easy) isn’t directly supported and the RFC 2136 method won’t work either. I’m not willing to use a different DNS service. I did find references to add code to pfSense in order to add DNS Made Easy support. I decided against this to avoid forgetting about it and overwriting the code in a pfSense update. I also didn’t want to worry about a change breaking the code. While a third party service is one more thing that can break, it seemed the least problematic.

I also looked at changing the ports used by Windows Home Server. While I did find some write-ups on how to do this for version 1, there were caveats. WHS 2011 seemed to be more problematic, and changing ports would break other features. My own brief test to change the port on WHS 2011 was a failure. Keeping the default ports on the servers and remapping them with pfSense seems to be a clean solution. I will need to remember to include the port in the URL, but other than that it’s pretty basic and worked in my testing. There might be some features that won’t be accessible, but I haven’t found them yet.

Since I have only one public IP address and I’m using the port to map to the correct server, I don’t really need to set each server up in DNS. I could use one name and then pick the server via the port. But I’ll use names anyway as it will make changes easier and help me keep things straight. It will also make life easier if I get more public IPs.

Finally, I’ll be testing over my cell network so that I’m accessing the servers externally. Testing from within the home isn’t useful and adds its own set of problems. I won’t be breaking access from within my house, but it won’t be a good way to test external access. pfSense has some security settings that kick in if it detects a private IP address as the source on the WAN port.

Now it’s time to start putting it together. I’ll use this post as a central repository with links to my other articles and resources on this topic, so you can check back here to see everything on the topic I’ll call “Home Cloud”. I’ll be starting off by setting up two Windows Home Servers, a version 1 server and a 2011 server.

The place to start is with my pfSense 2.0 installation back in early October.

CrashPlan Update – Week 2

Well, not exactly week 2 due to a 5 day power outage, but it feels like two weeks and it’s time for an update. I installed CrashPlan on Windows Home Server 2011 and have uploaded the first 70 GB to their online backup service. Upload speed has been good. I generally limited its bandwidth usage and it’s done a good job of staying near the limit while not going over. When I opened it up it was likely affected more by my connection’s limitations than any throttling by CrashPlan. So no complaints there.

They can also back up to another PC, a friend’s PC (running CrashPlan) or a locally attached folder. I don’t think I’ll use anything other than their online storage. I like Cloudberry Backup better for backing up to other computers (on my network) and to local drives. Cloudberry will back up to a share and not need any software installed on the PC. Backing up to a friend’s computer with CrashPlan would require that computer to be online and for them to have CrashPlan installed. I’d still be using my bandwidth (and theirs) but not get much more reliability than cloud storage. One benefit of backing up to a friend’s computer is the ability to seed the backup with a hard drive and then get that hard drive back for a restore if needed, at no cost. This avoids using bandwidth for the first backup or a complete restore, without the cost and turnaround time of shipping drives to CrashPlan. So these are definitely good features, just not ones I’m likely to use, at least not yet.

The idea of having PCs I support back up to my server is intriguing, but my bandwidth caps make me leery of becoming a data center.

CrashPlan rescans the drives to verify backup selections at 3 AM every day (configurable). This stops the backup for a short while, but then the backup starts again with the refreshed file list. In my case this would pick up changes and put already backed-up files ahead of previously selected files that had never been backed up. I kind of liked this since it meant backed-up files were kept relatively fresh. On the downside, it takes longer to get at least one copy of all files up there. It’s only my observation that it seemed to refresh previously backed-up files; it may not have been 100% consistent. For me the scan takes about 10 minutes for 230,000 files totaling about 70 GB.

The test restores worked fine. I was able to restore while files were still being backed up. With over 200,000 files backed up at the time, the files I selected were quickly restored to my desktop. The restore messages were a bit confusing which is my only complaint. The screenshot below is typical when a restore is finished:

CrashPlan Restore message

It says it’s unable to restore, yet the restore is already done. The rest of the restore options are pretty intuitive. The default options restore to the desktop and don’t overwrite any files, which are pretty safe selections. Although in the case of a server I generally avoid filling up personal directories like the desktop, since they are on the C: drive, which is usually smaller than any other drive. Can’t really complain since this is desktop backup software; just have to remember that large restores go to a drive with the space.

The backup also runs fine whether or not I’m logged on to the server (such as through RDP), without needing any hacks or workarounds.

CrashPlan does have an iOS app but it doesn’t support people like me who insist on our own encryption keys, so I haven’t tried that out.

I haven’t had any noticeable performance hit while doing the backup. I generally limit the backup to uploading at 500 kbps when I’m home. This is about 1/4 of my rated upstream bandwidth, and about 1/3 of what I usually see my upstream running at when under load during peak net usage times (like after dinner when the entire neighborhood jumps on). There hasn’t been any noticeable impact on streaming or file access when the backup runs. I also didn’t have any streaming issues when the nightly file scan ran.
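As a back-of-the-envelope check on those throttling numbers, here’s the arithmetic for how long the initial ~70 GB upload takes if it ran constantly at the 500 kbps cap (a rough upper bound; real uploads run unthrottled at night and slower under load):

```python
# Rough upload-time estimate: GB uploaded at a kbps cap, expressed in days.
def upload_days(gigabytes, kbps):
    bits = gigabytes * 8 * 10**9        # decimal GB -> bits
    seconds = bits / (kbps * 1000)      # kbps -> bits per second
    return seconds / 86400              # seconds -> days

print(round(upload_days(70, 500)))  # about 13 days at a constant 500 kbps
```

That lines up with the first big chunk of the backup taking a couple of weeks of wall-clock time.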

I’ll be holding off adding any more files to my CrashPlan backup for a couple weeks. I figure I have about 100 GB of my Comcast cap that I can use for these backups in a normal month but want to wait awhile to make sure it’s a normal month. I’m already backing up the directories that typically change, so there will still be backups and I can see how CrashPlan handles versioning and deleted files.

The only negative is fairly obvious. Since CrashPlan doesn’t officially support Windows Home Server there’s no add-in. It’s necessary to remote desktop into the server (assuming it’s headless) and run the client. But that’s a relatively minor downside. I’m hesitant to trust my backups to software that isn’t officially supported for the way I use it, but I haven’t read about any problems or encountered any myself. I’m confident enough that I turned off some offsite backups to S3 and will trust those to CrashPlan. Not everything; the critical stuff goes to Amazon S3 too, but it’s relatively small.

Update Dec 3rd: The latest CrashPlan update is included in Trail Log #66. A few hiccups but going well.

Changing Time

Alarm Clock

I’m sure anyone who’s affected already knows the clocks fell back an hour this morning. Like past years, the tech that makes our lives easier wasn’t quite up to the task, but there seemed to be fewer problems this year. The only real DST bug I ran into was on my iPhone. It fell back two hours but, surprisingly, the alarm went off at the right time. When the alarm went off the clock said 6 AM; it was actually 7 AM, and the alarm was set for 7 AM. That did make for a confusing first minute. The clock was fine after a sync with iTunes. The phone is a Verizon iPhone 4 running iOS 5. My iPad, which also runs iOS 5, was problem-free.

The backup service on my Windows Home Server 2011 box decided to stop around midnight. It may not be time related, since it seems to have stopped shortly after midnight. The backups that usually happen just after midnight never happened. Clicking the resolve issue link in the alert solved that problem and the backups soon happened. A second WHS 2011 server (a recently installed test box) had no problems.

But a problem with SuperDuper! was probably related to the time change. Its nightly image had been running so well for so long I forgot it was there. But this morning it displayed an error. It had been updating my backup image when the time changed and this seemed to cause a problem. Rerunning it in the morning was fine.

Other than that everything else was fine. Computers changed as they should, and I remembered how to do all the manual clock changes, so I didn’t have to go hunting for manuals this time.

The OSQuest Trail Log #65: October Blizzard Edition

Picture of trees covered with October snow

And Mother Nature keeps right on attacking. Not content to wait until winter officially starts, Mother Nature decided to hit Connecticut with some nice, heavy snow clinging to all the picturesque October foliage, eventually bringing much of that foliage, and the limbs it was on, crashing to the ground and taking power lines down for good measure. I lost power on Saturday and just got it back Thursday, with cable/internet following on Friday. So I went through gadget withdrawal for a few days. The picture above is from Saturday afternoon after a couple hours of snow and before the trees started coming down. Luckily the ones around me missed cars and buildings. While not everyone was so lucky, I was pretty surprised by how many downed trees managed to find open areas rather than other nearby targets. But on to the tech…

The highlight of the month for me was my first podcast as a guest on the Home Server Show podcast.

New Software

I installed CrashPlan backup on Windows Home Server, taking advantage of a discount offering unlimited online backup for $42/year instead of the normal $49. The backup’s been working well, although I’d been hoping to get a bunch up there in October since I had plenty of space left in my cap this month; the power outage ruined that. It’s uploaded just over 54 GB, with about 15 GB in the queue that started uploading once the internet came alive. But it’ll be later in November before I add much more. I want to avoid having to throttle myself by using too much early on. I figure I can do about 100 GB a month and stay under my cap, but want to play it safe.

I moved from using Untangle as both a router and unified threat manager (UTM) to using pfSense as a router while leaving Untangle as the UTM. I’ve been happy with the results and was just beginning to dig into some of the features during the snow-shortened weekend. I’ve started digging into pfSense a bit, more poking around than R&D. I also plan to do some testing to see if a caching proxy will reduce the bandwidth I use. I figure I need to make sure it will cache software and patches in order to make a dent in the bandwidth I use. (The cache in Untangle didn’t actually serve much from its cache when I tested it.)

Updated Software

It seems like everything I use was upgraded. But the highlights were…

I put Lion on my desktop Mac Mini only to find my fear-imposed upgrade delay was unwarranted. Everything worked fine, with only a minor Synergy frustration due to Lion’s new behavior where mouse movement doesn’t wake it. I’ve no plans to put Lion on anything besides this Mini and my Air. The other Macs have no reason to upgrade, or the upgrade would remove features I use.

iOS is also updated to iOS 5 of course. Despite some frustrations I managed to get both my iPhone 4 and iPad 2 upgraded. I’ve experienced shorter battery life for sure, although nowhere near as bad as some complaints I read. I tend to keep things turned off and I hadn’t enabled much of iCloud. I saw the worst performance on the days I was home and had a wireless connection. Despite typical usage the battery drained far faster than when I was in the office with wireless on but no network to connect to. No scientific test, just typical usage each day. By 5 PM at home the battery would be around 20%, while it was usually above 50% in the office. But I just read Apple has an update in beta that’s supposed to help.

Not really software, but Google Reader saw an update. I use Google Reader on my computers (with the account being used by iPad apps). The timing was bad; the update came as I was grabbing some battery-powered 3G access during the blackout, so the last thing I wanted was change. I’ve been trying to avoid hating it just because it’s different. I didn’t use any of the discontinued features, so no complaints there. I used to find it easy to blow through a bunch of articles and star the ones I want to read later. That’s become a problem. Ignoring the performance problems (very uneven scrolling), the star isn’t in the same place in every post. It’s now at the end of the title rather than the first thing. And speaking of the titles, while it has a clean look the article titles blend right in. Sure, the star at the bottom of the post is always at the very left, but it’s hard to find and I star based on title. I’ll be looking for a new desktop reader and use my iPad more to check through the feeds.

Google+

While I still maintain my status as the last human not on Facebook, I finally broke down and joined Google+ when they enabled it for Google Apps users. It wasn’t much later that I lost power so I don’t have much to say about it yet. I did find the Google+ iOS app doesn’t like Google Apps users and tells me to go get an invite when I log on, but the web interface is fine from the iPhone.

iCloud

iCloud was making news and I moved my MobileMe account to iCloud. I wasn’t a big MobileMe user, having been burned by Apple’s cloud services in the past. I think my problem with iCloud (beyond not trusting Apple to keep things running) will be that it requires users to dive into the deep end, accept the way it works, and not expect a lot of options. I gave Photo Stream a try. The problem was my camera fumbling uploaded more bad pictures than intended ones. Not really an iCloud problem, but still a problem. I have no doubt it will improve over time and I’ll be drawn into iCloud.

Web Work

I spent some time with the plumbing of the website. It seems like I have a bunch of minor issues that I can’t seem to get to (or keep putting off). At least I was able to tackle a logrotate configuration change. I also changed my caching plugin back to WP-Supercache. I always liked the plugin but stopped using it after it broke due to an upgrade. It’s working again and I’m using it. I did make an errant mouse click and enabled compression, which didn’t work (possibly because I have compression enabled in Apache). Unfortunately it went unnoticed in my testing and I didn’t notice a problem until my views went way down.

My change to WP-Supercache seemed to cause another problem which went unnoticed until recently. It doesn’t seem to have been rampant, but it was frequent. I don’t quite understand the problem completely, but I don’t think it was a WP-Supercache bug. In short, my AdSense ads would often display a “Page Not Found” error in the frame for the ad. I set up the ads to only display to new visitors; view a few pages over the course of a couple months and the ads are supposed to go away. I think WP-Supercache would sometimes cache the “don’t display” page, which would cause a problem when a new “display the ad” visitor arrived and the code ran to display the ad.

Home Networking

I had been hoping I had pfSense and dynamic DNS set up to handle IP address changes so I could remotely access multiple home servers using my own domain. Well, when I got my internet back today Comcast gave me a new IP address as I hoped. But alas, no update to DNS. So it’s time to do some more research. I’ll be tackling that this weekend. I’m hoping I just have a pfSense setting wrong. [Update: Got this working so hopefully a write-up soon.]

The Month Ahead

The only thing I really want to get done is getting those home servers set up with Dynamic DNS and pfSense. [Update: Rather easy fix so hopefully a write-up soon] After that I’ll see what catches my interest. I’m also hoping this past week isn’t a sample of what’s to come this winter.

Powerless Days Are Over – Still No ISP

Snowy Trees

Being a Connecticut Yankee I was feeling the effects of the freak October snow that clung to all that picturesque New England foliage until the branches came crashing down, bringing the power lines along. I lost power Saturday night and the gadget withdrawal began to take effect almost immediately, especially since I didn’t know how long I’d have to rely on batteries, but it was obvious it would be awhile.

Luckily by Sunday night I got word that the office had power so I’d be able to start charging things up. Still, it was too hectic in the office to do much leisure surfing or post writing and too damn cold at night to do much except crawl under the covers to keep warm.

The power’s back today but cable is still out so I’m relying on iPhone tethering and Verizon to get connected. Some things I learned already:

  • Windows Live Mesh is great for syncing files between PCs, but it apparently needs the internet even for PC-to-PC syncs. All my PCs see each other, but Live Mesh says they’re all up to date when they aren’t.
  • The iPhone isn’t smart enough to use 3G when its wi-fi network has no internet connection. I had to turn off wi-fi to avoid all those server not found errors, since the phone saw my home wireless network.
  • My Mac is too stupid to route local or internet traffic properly when tethered via USB to my iPhone (with internet access but no access to home PCs) and connected via wi-fi to my home network (with no internet access). Probably not really a Mac-specific failing; I’ll have to play around with some routing settings sometime.
  • CrashPlan gets lonely. Both my own CrashPlan account and my parents’ account have begun to send emails saying there have been no backups in several days. Along those lines, I received an email from CrashPlan promoting gift subscriptions. They mentioned a Black Friday sale the 25th–28th for existing customers. (I suspect new customers would get similar deals.) Their store is currently offering 10% off regular and 20% off gift subscriptions, but it may be worth the wait. No special links or coupons; the current deals are in the CrashPlan store.
  • UPSs are good. And after the PC is safely shut down, the remaining juice can recharge phones, iPads and Kindles.

I was pretty well into the October Trail Log when the lights went out, so I should have that out this weekend.

I’ve also been working to set up pfSense with Dynamic DNS to be able to access multiple servers (or PCs) in the home from the internet. I seemed to have it set up properly moments before the power went out. This one may take a little longer since it’s more involved and I’ll need full-blown internet access for the home network while I’m still testing. On the bright side, the extended downtime will probably cause Comcast to give me a new IP address, which will be a good test of the dynamic piece. I changed the MAC address spoofed by pfSense for the WAN connection to try and help the process along.

A final thought: bandwidth caps are really beginning to annoy me. While not usually a problem, the Comcast cap doesn’t allow much flexibility in testing cloud storage, and I’ve hit the ceiling two months recently. And now, being forced to iPhone tethering, I’m approaching the 2 GB Verizon cap for that. At the rate I’m going I’ll hit the cap before it rolls to the next month. I do have unlimited data for the phone itself, so I use it directly whenever possible.

Hope anyone else in the northeast US is getting back to normal and didn’t have anything worse than an inconvenience that’s forgotten when power is back (and the internet).