Syncing Software

I’ve been a fan and longtime user of Dropbox. I even paid for the 50 GB plan. That subscription coincided with my increased iPad usage, and Dropbox was a great way to get files to it. The subscription expires in May and I’d rather save the money than renew. So it was time to see if I really needed the 50 GB and whether there were alternatives. I’m not tied to a single solution and it’s a good time to see what my options are.

My Sync Requirements:

  • There needs to be some sort of cloud sync so that I’m not always required to have my devices running at the same time.
  • There needs to be computer-to-computer direct sync so I don’t ding my ISP usage cap with files that don’t need to leave the house. Plus, computer-to-computer sync will be faster than going up to the cloud and back down again.
  • Mac & PC
  • OS X’s file bundles (for example, my Bento database is a file bundle) could be especially troublesome. The syncing software needs to maintain the attributes OS X expects.
  • iPhone/iPad & web access to the files, at least the ones stored in the cloud.

What’s not a requirement:

  • Backup – when I got the 50 GB plan I figured it would be a good place to stash files. It’s not that it’s a bad backup solution, but syncing clashes with the simplicity and reliability I want in my backups. Dropbox ended up being a dumping ground for stuff “I might want someday” and was out of control. Not really Dropbox’s fault but I won’t be changing my natural tendency towards being a pack rat.

How Much Space?

The first thing I did was go through the Dropbox folder and delete everything that didn’t need to be synced. My Windows Home Server provides a nice central repository, so all I really needed to sync were the files I’d want on my iDevice or on my MacBook Air when I’m out.

So after the cleanup I was down to 600 MB, well under 2 GB. And even some of that can go once I get a chance to review it.

But Dropbox couldn’t meet all my needs, such as computer-to-computer sync, so this is what I’ve ended up with…

How & What I Sync

I started with the two pieces of software I was already using and familiar with – Dropbox and Windows Live Mesh (formerly various other names) – as I figured between the two of them I’d be covered.

There are two main differences between the two. The first is that Windows Live Mesh (WLM) is designed for disparate folder locations while Dropbox uses its own folder structure. The second is that WLM can ignore the internet completely and just do LAN syncing. Dropbox does do LAN syncing for its speed benefit, but files still go to the internet.

I use WLM to sync directories that can’t (or at least shouldn’t) be moved from their location, such as the draft directory for Windows Live Writer. I can (and did) stop writing this article on my desktop, move to the couch with my laptop, and pick right up again. I also use it for moving my large video files around, which would cause all sorts of problems going over the internet as the gigabytes of video added up.

I have had problems with WLM on my Macs. I had to re-install it on two Macs as it began to crash on startup. It also has a habit of shutting down and requiring that I start WLM again. So for most of my data file syncing I’ll use Dropbox, with WLM handling the computer-to-computer stuff. Most of my data is on the server so this is mainly for stuff I want on my iDevice.

Alternatives?

I looked at SugarSync, which gets generally good reviews. I liked it, and their 5 GB free was more generous than Dropbox. But I’ve had problems with it on my Macs. It’s had to be re-installed on each Mac at least once and it still fails to run on my MacBook Air. And it’s more than just a re-install; it’s an uninstall, then making sure all traces are gone before the re-install. Not being able to run on my Air is a showstopper. It may be something unique to my Macs, but it’s already turned me off to using it since I can’t count on it.

I’m pretty happy with the way the Windows Live Mesh and Dropbox combo is working, but is there anything else I should be looking at?

MacBook Air Trumps iPad

An interesting event happened today. As I was heading out of the house for the day I grabbed my MacBook Air and left the iPad home. In the past I would have taken just the iPad, or in rare cases taken both (and a bag big enough to handle both). I didn’t have any specific computer needs, but if I found the time I’d catch up on my feeds or work on some blog content (such as this post).

While the Air is certainly bigger than the iPad it’s just as convenient to carry. Both fit in a bag that’s really no bigger than the device, and both have the battery to get through the day, so no accessories are needed. While there may be Windows PCs that match the Air’s instant-on and long battery life in an easily carried size, my (admittedly old) netbook never lived up to those expectations.

I’m certainly not saying that everyone should get a MacBook Air instead of an iPad, but it certainly complicates things. The iPad is easier to use “on the go”, but in that case it’s caught in a squeeze between the Air and smartphones. The Air does generally require a working surface, but its instant-on feature makes getting to it as fast as the iPad. So while I was comfortable with my decision not to get an iPad 2, now I wonder if I’ll want an iPad 3 (whenever it arrives) if my current iPad is still running. I suspect I’ll be carrying the MacBook Air outside the house these days, although the iPad will remain in daily use.

Saturday Rants: Recent Pet Peeves

A few recent news articles hit some pet peeves of mine:

AT&T is Buying T-Mobile To Help Customers

Despite reducing the number of competitors by one – and that one having a reputation for good customer service and innovative (or at least different) phones and plans – AT&T claims the merger is pro-consumer. Ars Technica has a good article about just how warped AT&T’s arguments are in their official filing. After a long history of problems with AT&T I’m biased against seeing good in anything they do, but this is ridiculous. Nevertheless, the merger will be approved after some face-saving government concessions.

Yahoo is Saving Data for 18 Months

Yahoo announced they will begin retaining information about the searches we do for 18 months (up from the current 90 days). While there may be reasons to argue against the 18 months, what peeved me was their “to benefit the consumer” spin. Guess they liked AT&T’s spin. Their announcement headline said this included…

… to Put Data to Work for Consumers

And the first sentence was…

Today, Yahoo! is making an announcement of our intention to change our log file data retention policy to meet the needs of our consumers for personalization and relevance, while living up to their expectations of trust.

At least be up front and say the primary reason is so they can make more money from the information they collect. If that includes better targeting of ads we might see the benefit of fewer, less annoying ads, but I doubt that’s their goal (at least the fewer part), and they don’t get into any specifics about how this is better for us.

Look, I don’t expect all services to be free, and if I’m going to get ads they might as well be relevant, but don’t try to tell me it’s all for me. After all, the Yahoo email I paid for had more annoying ads than most free email services. Even if they had been well targeted, they would have been annoying. Maybe their embedded email ads are less annoying these days – I dumped them long ago so I can’t say. But despite the words, I don’t see them as pro-consumer.

Apple’s iPhone location tracking (and the reaction to it)

This one’s a bit complicated and the jury is still out. We agree to location reporting in the iPhone terms of service, but then we get prompted by each 3rd party app as to whether or not to allow it. This gives the impression of choice. On the other hand, they regularly determine location, and there are valid reasons to do so. The question is, why keep such a long history? A plausible explanation is that the tracking is to improve GPS performance by being able to start the search in the right location. And it’s being kept for a year due to a bug, an oversight, or a bad decision by a programmer.

There are plenty of more devious explanations, and at least one security researcher says the info is going to Apple to help populate their location database. What I find interesting here is that the reason given is the same reason that got Google in hot water for their wi-fi sniffing. That same researcher does show where Apple says what they’re doing and provides a “poorly worded” opt-out. As it happens, I couldn’t easily verify any of this on my iPhone since I already encrypt my backup (so I can’t get at the file on my PC) and I always opt out of everything. Although I admit my opting out is mainly because I’m too lazy to read the entire policy and I basically distrust everyone.

But no matter what the reason, it comes down to the same reason I encrypt my iPhone backup. There’s no reason to save the info for so long, so why do it? It can only cause problems and open a can of worms (which is now open).

The second part of this pet peeve is the reaction to it. We have Senators writing letters to Apple. OK, I agree Apple should explain, but there’s no reason for a Senator to get involved except to grab headlines. We wouldn’t want them tackling those pesky budget and economic issues instead.

To top it off, this wasn’t even news. It just caught everyone’s attention now because there was an app for it and some pretty pictures to go along.

What Could Possibly Go Wrong?

And while you may say there’s no real issue unless you’re trying to hide something – if the data is there, someone will want to use it. The Apple iPhone tracking coincided with my reading a story about the Michigan State Police pulling data from cell phones during traffic stops. While denied, it falls into the realm of possibility, and do we really think these devices won’t be abused, or that they’ll be restricted to lawful use (and lawful doesn’t mean moral or just)?

Beta Hardware?

Then I read Harry McCracken’s article about “beta hardware” and at first I agreed, because it seemed these vendors were making people pay for development products. At least Google had the sense (and money) to give away their Chrome Notebooks. I don’t think Harry is necessarily wrong in the details, but the word “beta” gives vendors too much credit.

The more I thought about it, the clearer it became: it’s not beta hardware, it’s a bad product. They’re playing catch-up to Apple and justify it by saying features are coming. Reviews say the iPhone didn’t have this (apps) or that (cut/paste) when it was released, as if that justifies the new entries not having this or that.

I can understand choosing a non-Apple product, but don’t try to justify it by saying it will be just as good as today’s Apple product at some distant date (Harry doesn’t do this – others do, and his article just triggered this response). First off, for that to be true the software will have to change (unless the manufacturer takes the hardware back for an upgrade, as Motorola will do with the Xoom), and software is harder than hardware and frequently sucks despite delivering a promised checklist feature. Case in point: the Windows Phone 7 update fiasco.

If you want a non-Apple device then buy it because the one you’ll get is best for you and is what you want today, not based on some promise. There are plenty of reasons to choose a non-Apple phone, although fewer options in the tablet arena. If the vendor is promoting a product based on future features then it’s a bad device today. Harry’s wrong, at least in the choice of “beta”: these aren’t beta devices, they’re bad devices that the manufacturer claims will get better. I guess the promise means the manufacturer knows it’s bad.

Dropbox Privacy Kerfuffle

Dropbox made news recently when they changed their privacy policy to indicate they would turn over files under government order (which requires they break their encryption).

Despite the sky-is-falling reaction of some, I was actually surprised to learn this wasn’t already in their policy. Even if it wasn’t in writing, I think it was safe to assume that they would comply with the laws of the country they do business in.

While Dropbox makes a big deal about security, I’d never trust a service such as this with info I truly wanted to be confidential (such as medical records, tax returns, and financial records). Even with the best of intentions, mistakes happen – either by them or by me. They allow file sharing; I could share my files in error, or they could push a code change that shares them in error.

I don’t have the encryption keys, they do, so they could also end up in someone else’s hands despite the best policies and intentions. If I want something kept private I encrypt it myself with my own keys.

Now, Dropbox was a little over-eager in their security promotion. They used to say “Dropbox employees are unable to view user files”. Not exactly true, since there are at least a couple of employees who have the keys, so they no longer say that. It doesn’t mean every employee has the keys, but like I said, mistakes happen. Maybe it’s because I took the no-employee-access statement with skepticism, since I knew they had to have the encryption keys if only to run their service, but I don’t consider this more than misplaced marketing. Such a statement could only be true if I created the keys and never transmitted them.

If you really need information to be secure then trust no one: encrypt it yourself, with your own keys (and don’t lose them). I fail to see a real issue with Dropbox since I never expected more, despite a claim that wasn’t 100% true. If I wanted to, I could simply use a TrueCrypt vault inside Dropbox.
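
To make that advice concrete, here’s a minimal sketch of encrypt-before-it-syncs, assuming Python and the third-party cryptography package; the paths and function names are my own examples, not anything Dropbox provides:

    # A sketch of do-it-yourself encryption before a file lands in a synced
    # folder. Assumes the third-party "cryptography" package; all paths and
    # names here are hypothetical examples.
    from pathlib import Path
    from cryptography.fernet import Fernet

    KEY_FILE = Path.home() / ".my_sync_key"   # keep the key OUT of Dropbox

    def load_or_create_key() -> bytes:
        if KEY_FILE.exists():
            return KEY_FILE.read_bytes()
        key = Fernet.generate_key()           # a key only I ever hold
        KEY_FILE.write_bytes(key)
        return key

    def encrypt_into_dropbox(src: Path, dropbox_dir: Path) -> Path:
        fernet = Fernet(load_or_create_key())
        dest = dropbox_dir / (src.name + ".enc")
        dest.write_bytes(fernet.encrypt(src.read_bytes()))  # only ciphertext syncs
        return dest

    def decrypt_from_dropbox(enc: Path, out: Path) -> None:
        fernet = Fernet(load_or_create_key())
        out.write_bytes(fernet.decrypt(enc.read_bytes()))

The point is simply that the key never leaves my machine – the synced folder only ever sees ciphertext.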


It’s been an annoying week reading the tech blogs, but that sums up the news that irked me most. What news pushed your buttons recently?

.CO Domain Bargain

I noticed that Hover (Tucows’ recently re-branded domain registrar, formerly Domain Direct) was offering a deal to transfer in any domain for $10, and I had some .CO domains that would need renewal soon. They are typically in the high $20 range (although I have seen some in the teens). So I decided to give it a try and transferred in the three I had, taking another 10% off with a coupon code. To get another 10% off any domain order you can use my affiliate link or the coupon code osquest. There’s no mention of when the $10 transfer ends but they do say it’s for a limited time. This is my first time using Hover/Tucows but the transfer was painless.

RAID Rant: RAID is NOT Backup!

With the loss of Drive Extender in WHS and my search through RAID alternatives, I’ve noticed a lot of people equate RAID (and RAID-like solutions) with backup. RAID is not backup. RAID provides redundancy for one hardware component to improve uptime (and RAID 0 doesn’t even provide that).

So let’s get RAID 0 out of the way. This is also called “scary RAID” for good reason. With RAID 0, data is striped across multiple physical drives so they appear as one big drive. This is usually done to improve performance, but if any one drive fails then all the data is lost. So clearly there’s no way anyone would consider this “backup”.

Then there’s RAID 1 (mirroring), where the files are written to two drives which are mirrors of each other. Two physical file copies, so that’s backup, right? I see a lot of mentions that this provides two copies of the files. Nope. If one drive fails the other drive can keep the system running or feeding data, but what if:

  • You delete a file? – the deletion is mirrored across both drives.
  • A file is corrupted? – The corruption is written to both drives.
  • The controller/motherboard fails? – The drives are inaccessible until the broken part is replaced. If the controller failure didn’t corrupt the drives and you replace the hardware with like hardware (or correctly reconfigure the software RAID) you’ll probably be able to get the system back. Depending on the OS and RAID implementation you might be able to take one of those drives and attach it to another computer to get at the files. Assuming the failure didn’t corrupt the files.
  • A virus deletes your files? – the deletions are mirrored.
  • A power surge zaps your PC? – It’s as likely to fry two drives as it is one.
  • You screw things up when upgrading/reconfiguring your PC? – Oops
  • A water pipe breaks and floods your computer case (or someone spills a drink into it)? – oh well
  • And so on…

RAID 5 provides redundancy through parity data rather than keeping a mirror, but the potential problems are the same – and if two drives are lost then it’s all lost. (There are more involved RAID solutions that can survive multiple drive failures, but that’s the only additional protection they provide.)
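
To make the parity idea concrete, here’s a toy sketch (plain Python, not a real RAID implementation): the parity block is just the XOR of the data blocks, so any one lost block can be rebuilt from the survivors – but deletions and corruption are XORed into the parity just as faithfully.

    # Toy illustration of RAID 5 style parity; not a real RAID implementation.
    def xor_blocks(*blocks: bytes) -> bytes:
        out = bytearray(len(blocks[0]))
        for block in blocks:
            for i, b in enumerate(block):
                out[i] ^= b
        return bytes(out)

    drive_a = b"AAAA"
    drive_b = b"BBBB"
    drive_c = b"CCCC"
    parity = xor_blocks(drive_a, drive_b, drive_c)  # lives on a fourth drive

    # Lose any ONE drive and its data rebuilds from the survivors:
    assert xor_blocks(parity, drive_a, drive_c) == drive_b

    # Lose TWO drives and nothing comes back; and if a file is deleted or
    # corrupted, the parity dutifully "protects" the bad data instead.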

In the case of Windows Home Server version 1, the folder duplication feature has the same limitations as RAID 1. Redundancy, not backup. I kept all my WHS v1 files duplicated because losing one non-duplicated drive in the pool would have been a huge pain and caused a lot of downtime. So I duplicated it all – and still did backups.

With WHS v2 I’ve become less fanatical about uptime. I have a RAID 5 array for the files that are critical and for when I need more than 2 TB of space in one share. But my video files (the bulk of my data is video) are spread across individual drives. The RAID array provides improved reliability for the files I really need and might not want to wait on a restore or rebuild. Since these are my most critical files they are backed up multiple times (some have 3 backups while the rest have 2 – in addition to the copies on the RAID array). I have more backups of these than of my video files, which aren’t RAID protected.

There’s nothing wrong with RAID, just don’t confuse it with a backup.

Bento Tip: Syncing A Database Between Macs

I use Bento a bit, and I’d been using a symbolic link to point to the database in a Dropbox folder. That worked pretty well, except occasionally the symbolic link would break. I recently eliminated the symbolic link and the new setup seems to be working well.
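
For anyone curious, the symlink approach I’d been using amounts to something like this – a sketch with assumed paths (it’s the equivalent of an ln -s in Terminal):

    # Sketch of the old symlink approach; paths are examples, not my exact setup.
    from pathlib import Path

    bento_dir = Path.home() / "Library/Application Support/Bento"
    real_db = Path.home() / "Dropbox/data/bento.bentodb"  # the moved database
    link = bento_dir / "bento.bentodb"

    if not link.is_symlink():
        link.symlink_to(real_db)  # Bento follows the link into Dropbox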

To set this up you’ll still need Dropbox but then you can skip the symbolic link.

  1. Decide where you want the database in Dropbox. I want mine in /Dropbox/data.
  2. Move the existing Bento database(s) to the new location. By default the database is created in [UserHome]/Library/Application Support/Bento and is called bento.bentodb.
  3. Start Bento while holding down the option key so that the following dialog appears:
    bento_selecteddb
  4. Click the choose button and browse to the database you just moved and select it. It should now be listed as the selected database as shown below:
    bento_selecteddb
    Note that in the above screenshots I checked the “Show This Dialog” option so this dialog always shows and I don’t have to hold the Option key down. This is useful if you have multiple Bento databases. I do but don’t want them all in Dropbox. Bento will continue to open the last database selected unless told to do otherwise.
  5. Click OK and the database will open.

Words of warning: Bento isn’t designed to be opened on multiple computers at the same time, so while the database will sync, be sure to only have it open on one computer at a time. Also keep backups in case the syncing causes bad things to happen. I’ve been syncing this way with Bento 4, although Bento 3 does have the same database selection dialog.
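
Bento has no built-in guard against two machines opening the database, so the discipline is on you. Purely as an illustration of that discipline, a hypothetical advisory lock file inside the synced folder might look like the sketch below – though Dropbox’s sync lag makes this best-effort, not real locking:

    # Hypothetical advisory lock so two machines don't open the database at
    # once. Dropbox sync lag makes this best-effort, not a real lock.
    import socket
    from pathlib import Path

    LOCK = Path.home() / "Dropbox/data/bento.lock"  # example path

    def acquire() -> bool:
        if LOCK.exists() and LOCK.read_text().strip() != socket.gethostname():
            return False  # another machine appears to have the database open
        LOCK.write_text(socket.gethostname())
        return True

    def release() -> None:
        LOCK.unlink(missing_ok=True)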

Windows Home Server 2011: RC1 to Gold Release

I spent Saturday migrating my Windows Home Server RC1 server to the final gold release of Windows Home Server 2011. This was not an upgrade in any sense of the word, but a complete re-install of the OS. If all went well the data would survive: I could flatten the OS drive, re-install, then point everything to the surviving data drives. Despite some careless corner cutting that added, rather than saved, time, it all went as planned.

I did this between other chores on a Saturday afternoon so it wasn’t a continuous process, but it was done in an afternoon. Here’s how it went…

The Plan

I won’t be saving anything related to the OS, add-ins or settings. Everything will be rebuilt and re-configured from scratch. All the data is on separate drives from the OS and will survive the upgrade so that once the shares are recreated it will be available, avoiding the need to restore terabytes of data.

I won’t be saving the PC backups done with Windows Home Server. I don’t save data to my PC so the backups are mainly to provide quicker recovery in the event a PC fails.

Backups and Documentation: Despite the best plans, things go wrong. Plus I was going to be mucking around with the drives all my data is on, so just before beginning I ran one last set of backups and made sure they were usable. I also verified that my documentation matched the current configuration of my two add-ins (Cloudberry backup & KeepVault backup).

Hardware Changes: I’d be replacing a bad Intel NIC with a Rosewill Gigabit Ethernet NIC (Rosewill is the Newegg house brand). I’d bought two of them for another project but figured I’d give one a try since I had it and they’re dirt cheap.

I’ll also be adding an Areca ARC-1200 two-port SATA controller so that I can mirror the OS drive. The motherboard RAID hadn’t gone well so I’m going with true hardware RAID.

The Implementation

My backups have been running well so it didn’t take much time to make sure they were current. Then it was simply a matter of shutting down the server and moving it to the bench.

Installing the NIC and RAID controller was straightforward. The Areca ARC-1200 was also easily configured for RAID 1 through its BIOS. I also upgraded the controller’s BIOS to make sure I was on the latest version before I started using it. The ARC-1200 took 75 minutes to initialize the 320 GB RAID 1 array.

Whenever I install any version of Windows I make sure only the OS drive is connected. This was no different. I didn’t want to mess with data cables so I pulled the power from all except the two OS drives.

I booted from the Windows Home Server 2011 DVD (from Technet). While it would boot from the eSATA-connected DVD drive, the WHS 2011 setup wouldn’t recognize it. But a switch to USB and all was fine. I did have to load the ARC-1200 RAID drivers during setup; I had copied them to a USB thumb drive and they loaded from there. The installation was smooth, requiring a couple of reboots. Once the OS was installed I had to update a bunch of drivers so that Device Manager didn’t show any errors.

Then I shut down the server, connected power to the four drives running RAID 5 off my 3Ware controller, and booted up again. I left the other drives unpowered since I wanted the RAID array to be drive E:, mainly because that was what I was used to seeing it as. Then I powered up the other drives one at a time and made a note of what drive letter each used in WHS 2011. One of the problems with WHS (at least out of the box) is telling which drive is which from the console.

Once all the drives were connected, restoring the shares was as simple as creating a new share (“Add a folder” in WHS 2011 speak) and pointing its location to the directory already on the drive.

whs2011_addafolder

The only hitch was with the shares that were created as part of the WHS 2011 installation. I couldn’t move them off the newly created D: drive because they already existed on the data drives (from my previous installation). This was easily resolved by renaming the directories with the real data, moving the new (but empty) folders into place, and then moving the files from the renamed folders into them.
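
In filesystem terms the shuffle is a simple rename-and-swap; here’s a sketch of the equivalent steps (the share name and paths are made-up examples, not my actual shares):

    # Sketch of the rename/move shuffle; share name and paths are examples.
    import shutil
    from pathlib import Path

    real_data = Path("E:/ServerFolders/Photos")   # data from the old install
    new_share = Path("D:/ServerFolders/Photos")   # empty folder WHS created

    renamed = real_data.with_name("Photos-old")
    real_data.rename(renamed)                     # 1. rename the real data dir
    shutil.move(str(new_share), str(real_data))   # 2. move empty share into place
    for item in renamed.iterdir():                # 3. move the files back in
        shutil.move(str(item), str(real_data / item.name))
    renamed.rmdir()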

Once the shares were recreated, the only thing left was to re-install the KeepVault and Cloudberry add-ins for backup. Both add-ins found the existing backups just fine and synced what was already backed up, so the files weren’t sent again. The KeepVault sync took a couple of hours – longer than I expected, and longer than Cloudberry, despite KeepVault having far fewer files (both in size and quantity).

I also set up the server backup right away, and unlike the release candidate, the backup is working. I’ve been backing up the OS and the directory that holds the PC backups. These are the only items that don’t get backed up elsewhere through my usual methods. But I’ve never gone through a complete restore. Hopefully I won’t be trying it here, but it is on my list to test on another server.

Summary

Overall it was a smooth afternoon of work. The lack of a true upgrade wasn’t a problem since I’d never do anything other than completely overwrite beta software. I’m not a fan of benchmarks since I don’t have the patience to set up good controls, but the rebuilt server feels faster than the RC1 build. Whether it’s the new controller, software improvements or just overwriting some OS corruption will never be known but I’m happy to see snappier performance.

Other than the two backup add-ins I’ve yet to install anything else. I’m going to run things for a week or so to make sure everything is stable before I start making changes.

I’ve yet to try a PC restore from the WHS backups, so while the backups are running I have to stop short of saying they work. Since I don’t keep real data on my PCs this isn’t a huge problem, although I will be testing it since it would provide a quicker rebuild process when my hard drive fails.

I’ve gotten used to the lack of WHS v1’s drive pool and folder duplication, although I’m still hunting for a reason to use WHS 2011 over Ubuntu or a basic NAS product. That may change if the add-ins that appear meet my needs, but for now the server is just a plain old file server. Kind of boring, but with multiple terabytes of my data on the server, boring may be a good thing. Yet I still hear the ghost of Ubuntu Home Server calling to me.

What are your Windows Home Server 2011 plans?

Obvious Lessons Learned

Sometimes I never learn. I ran into two self-inflicted speed bumps while rebuilding my Windows Home Server 2011.

1. Read The Messages – I spent about 20 minutes trying to figure out why WHS 2011 didn’t like the drivers for the RAID controller. Then I finally read the message instead of jumping to a conclusion. The problem was it couldn’t find my eSATA-connected DVD drive. A switch to USB and all was well.

2. Power is important. Balance a power brick near the power strip’s power switch and it’s sure to cut power at the worst time, despite a mental note to be careful. So rather than take 60 seconds to be careful, I nudged the power switch and cut power to the server. The RAID array needed rebuilding and a file copy died mid-copy. I decided to start over rather than risk a problem going unnoticed, this time with the server powered from a UPS rather than the mouse-trap-like power strip.

The install is progressing so I’ll make this a quick post.

Windows Home Server 2011 Available

Microsoft has released Windows Home Server 2011 to Microsoft Technet and MSDN subscribers. I started my Technet download and plan to install the final release on my server this weekend. I’m hoping I’ll be able to wipe out the OS drive but keep the files on the data drive and just re-share them. Tedious to recreate all those shares, but quicker than restoring terabytes from backups.

Frustrations of a Bad NIC

I’d recently mentioned that the performance of my WHS 2011 box wasn’t up to what I expected, and worse, it was inconsistent. Well, Saturday morning it was absolutely terrible, with constant timeouts when I tried to access files on the server.

It was easy enough to isolate the problem to the server rather than the network, since communication between the other computers was fine and fast. But throwing a wrench into this was the fact that copies run from an RDP session on the server delivered files to the PCs just fine. Just to be sure, I swapped ports in the switch in case it was a bad port, and then rebooted all the network devices.

Then I rebooted the WHS 2011 box. Granted, that should probably have been first, but since the RAID array is in a constant inconsistent state I’m trying to stay hands-off until I rebuild it with the final release. That didn’t help, and actually made it worse, as the RDP connection was now unstable, pretty much timing out after every click.

By this time I was seriously considering rebuilding the OS, although I really wanted to avoid it since I’ll need to do it anyway when the final build of Windows Home Server 2011 is available. So, taking stock of the other things to check, I came across the Intel network card I use in the server. After changing the auto-negotiation settings to various hard-coded speed values without improvement, I decided to switch to the NIC built into the motherboard. I had to re-enable it in the BIOS, then download and install the drivers. But after that, all my problems went away.

It wasn’t how I planned to spend my Saturday morning, but when I was done I had much more reliable and stable performance. While peak performance seems slightly under what the Intel card provided, the performance is much more consistent. All the stuttering and hesitation I had seen previously has gone away. I’ve also been able to stream video from the server, while doing other copies, without any problems.

Figures, I always go for the add-on NIC for better performance and have stuck with the Intel brand since it treated me well. On the other hand, at least it was a cheap part that went bad.