Field Notes: Google Two-Factor Authentication

There’s been a lot of discussion recently about GMail’s two-factor authentication thanks to the publicity around the Matt Honan hack. I’ve been using it for a while and figured I’d share my thoughts and experiences. I had been using it on an account I only used for email, so it wasn’t much of a hassle. I recently added it to a second Google account and there it’s been more of a hassle. It’s probably needed more on that account, since it’s used for more than just email, so I’ve kept it enabled. In Google’s case the two factors are a password (something I know) and my phone (something I have).

Here are my notes from using Google’s Two Factor Authentication. For the record, I used my own domain with Google Apps accounts in both cases.

There’s plenty of backup available should I lose or break the phone with the authenticator app:

  • I have the Google Authenticator app on my iPhone, so I don’t need a cellular connection to get the code.
  • As backup I have another phone set up to get the code via SMS.
  • As another backup, there are also printable one-time backup codes. The assumption is that Google can keep these codes secure.
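
For context, the authenticator app can generate codes offline because it uses the TOTP algorithm (RFC 6238, built on HOTP from RFC 4226): the code is derived from a shared secret and the current time, no network needed. A minimal sketch using only the Python standard library (illustrative, not Google’s actual code):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Time-based variant (RFC 6238): the counter is the current 30-second window."""
    return hotp(secret, int(time.time() // interval), digits)
```

Since the phone and Google both know the secret and the time, both can compute the same six digits independently.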

If an app or device doesn’t recognize Google Two-Factor authentication there are Application Specific passwords:

  • “Application Specific” is more a description of the intent, rather than a technical requirement.
  • The Application Specific passwords can be used on multiple devices and applications. I’d prefer they be locked to the first app or device they’re used on.
  • If you use the password in a malicious or poorly written app, the password can be used by someone else to access your email. So common sense still applies when using the application passwords.
  • While 16 characters is a long password, it’s not as complex as it could be. All the passwords are 16 characters and there seems to be a limited character set. While this could be more secure, the passwords would still be extremely hard to crack, so it isn’t a reason not to use them.
  • The application specific passwords provide only limited access to the account (such as email), even if compromised.
  • Application specific passwords are easy to revoke so they can be used to try out a new app and then revoked if the app isn’t used.
  • I’ve had some issues where my iPhone email (for example) decides it needs a new app password and I have to re-enter it. This is a pain as I have to go to the website, generate a new one, then type it into the iPhone.
  • While I can see the last time an application password was used, I can’t tell where it was used. So if a password were stolen I wouldn’t notice unless I stopped using it myself.
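
For a rough sense of what that limited character set costs, here’s the back-of-the-envelope entropy math; the 26-character (lowercase-only) and 94-character (full printable ASCII) alphabets are my assumptions, since the exact set isn’t documented:

```python
import math

def entropy_bits(length: int, alphabet_size: int) -> float:
    """Bits of entropy for a random password of the given length and alphabet."""
    return length * math.log2(alphabet_size)

lowercase_only = entropy_bits(16, 26)   # roughly 75 bits
full_printable = entropy_bits(16, 94)   # roughly 105 bits
```

Either way the passwords sit well beyond practical brute force, which supports the point that the reduced character set isn’t a reason to avoid them.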

Misc Notes:

  • The initial setup is a bit of a pain. When two-factor authentication is turned on, all the existing logons will break and have to be redone.
  • PCs can be made “trusted” and then for the next 30 days it won’t be necessary to enter the code when logging on.
  • If Google Sync is used (in Google Chrome), it’s necessary to use an encryption passphrase specific to Google Sync; the account password can’t be used since an application specific password is required. Well, actually, an app specific password can be used, but it would have to be remembered and reused as the app password for all Google Chrome logons, which goes against the design of the application passwords.

Anyone else using Google two-factor authentication? What’s been your experience?

The OSQuest Trail Log #65: October Blizzard Edition

Things got rolling again on the Quest with upgrades all over the place, new router software, new cloud services, new problems and new frustrations. Lion’s everyplace I want it to be, as is iOS 5. I finally joined a social network and I’m using a service based on the buzzword “cloud”. Then, as if to signal I was going down a bad path, I lost power for 4 days and the internet for 5.

Picture of trees covered with October snow

And Mother Nature keeps right on attacking. Not content to wait until winter officially starts, Mother Nature decided to hit Connecticut with some nice, heavy snow clinging to all the picturesque October foliage, eventually bringing much of that foliage, and the limbs it was on, crashing to the ground, and bringing along power lines for good measure. I lost power on Saturday and just got it back Thursday, with cable/internet following on Friday. So I went through gadget withdrawal for a few days. The picture above is from Saturday afternoon, after a couple hours of snow and before the trees started coming down. Luckily the ones around me missed cars and buildings. While not everyone was so lucky, I was pretty surprised by how many downed trees managed to find open areas rather than other nearby targets. But on to the tech…

The highlight of the month for me was my first podcast as a guest on the Home Server Show podcast.

New Software

I installed CrashPlan backup on Windows Home Server, taking advantage of a discount offering unlimited online backup for $42/year instead of the normal $49. The backup’s been working well, although I’d been hoping to get a bunch uploaded in October since I had plenty of space left in my cap this month; the power outage ruined that. It’s uploaded just over 54GB, with about 15GB in the queue that started uploading once the internet came alive. But it’ll be later in November before I add much more. I want to avoid having to throttle myself by using too much early on. I figure I can do about 100GB a month and stay under my cap, but I want to play it safe.
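
The pacing math behind that 100GB figure is simple subtraction; a sketch, where the 250GB cap and the other monthly usage numbers are assumptions of mine for illustration, not figures from my actual plan:

```python
def monthly_backup_budget(cap_gb: int, normal_usage_gb: int, margin_gb: int = 0) -> int:
    """GB/month left for backup uploads after normal usage and a safety margin."""
    return cap_gb - normal_usage_gb - margin_gb

# Hypothetical: a 250GB cap, ~120GB of normal household use, and a 30GB
# safety margin leaves roughly 100GB/month for backup uploads.
budget = monthly_backup_budget(250, 120, 30)
```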

I moved from using Untangle as both a router and unified threat manager (UTM) to using pfSense as the router while leaving Untangle as the UTM. I’ve been happy with the results and started digging into pfSense a bit during the snow-shortened weekend, more poking around than R & D. I also plan to do some testing to see if a caching proxy will reduce the bandwidth I use. I figure I need to make sure it will cache software and patches in order to make a dent in the bandwidth I use. (The cache in Untangle didn’t actually serve much from its cache when I tested it.)
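
One rough way to test whether a proxy is actually serving software and patches from cache is to fetch the same file twice and compare response headers; caching proxies commonly add an `X-Cache` or `Age` header (the specific header values below are hypothetical examples, and whether your proxy emits them depends on its configuration). A sketch of the classification logic:

```python
def looks_like_cache_hit(headers: dict) -> bool:
    """Heuristic: treat a response as cached if the proxy reports HIT or a nonzero Age."""
    x_cache = headers.get("X-Cache", "").upper()
    if "HIT" in x_cache:
        return True
    try:
        return int(headers.get("Age", "0")) > 0
    except ValueError:
        return False

# First fetch comes from the origin server, the repeat fetch from the proxy cache:
first = {"X-Cache": "MISS from proxy"}
repeat = {"X-Cache": "HIT from proxy", "Age": "42"}
```

If repeat fetches of large update files never classify as hits, the proxy isn’t going to make a dent in the bandwidth.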

Updated Software

It seems like everything I use was upgraded. But the highlights were…

I put Lion on my desktop Mac Mini only to find my fear-imposed upgrade delay was unwarranted. Everything worked fine, with only a minor Synergy frustration due to Lion’s new feature where mouse movement doesn’t wake it. I’ve no plans to put Lion on anything besides this Mini and my Air. The other Macs either have no reason to upgrade or the upgrade would remove features I use.

iOS is also updated to iOS 5, of course. Despite some frustrations I managed to get both my iPhone 4 and iPad 2 upgraded. I’ve experienced shorter battery life for sure, although nowhere near as bad as some complaints I read. I tend to keep things turned off and I hadn’t enabled much of iCloud. I saw the worst performance on the days I was home and had a wireless connection; despite typical usage, the battery drained far faster than when I was in the office with wireless on but no network to connect to. No scientific test, just typical usage each day. By 5 PM at home the battery would be around 20%, while usually above 50% in the office. But I just read Apple has an update in beta that’s supposed to help.

Not really software, but Google Reader saw an update. I use Google Reader on my computers (with the account being used by iPad apps). The timing was bad; the update came as I was grabbing some battery-powered 3G access during the blackout, so the last thing I wanted was change. I’m trying to avoid hating it just because it’s different. I didn’t use any of the discontinued features, so no complaints there. I used to find it easy to blow through a bunch of articles and star the ones I wanted to read later. That’s become a problem. Ignoring the performance problems (very uneven scrolling), the star isn’t in the same place in every post. It’s now at the end of the title, rather than the first thing. And speaking of the titles, while it has a clean look the article titles blend right in. Sure, the star at the bottom of the post is always at the very left, but it’s hard to find and I star based on title. I’ll be looking for a new desktop reader and will use my iPad more to check through the feeds.

Google+

While I still maintain my status as the last human not on Facebook, I finally broke down and joined Google+ when they enabled it for Google Apps users. It wasn’t much later that I lost power so I don’t have much to say about it yet. I did find the Google+ iOS app doesn’t like Google Apps users and tells me to go get an invite when I log on, but the web interface is fine from the iPhone.

iCloud

iCloud was making news and I moved my MobileMe account to iCloud. I wasn’t a big MobileMe user, having been burned by Apple’s cloud services in the past. I think my problem with iCloud (beyond not trusting Apple to keep things running) will be that it requires users to dive into the deep end, accept the way it works, and not expect a lot of options. I gave Photo Stream a try. The problem was my camera fumbling uploaded more bad pictures than intended ones. Not really an iCloud problem, but still a problem. I have no doubt it will improve over time and I’ll be drawn into the iCloud.

Web Work

I spent some time with the plumbing of the website. It seems like I have a bunch of minor issues that I can’t seem to get to (or keep putting off). At least I was able to tackle a logrotate configuration change. I also changed my caching plugin back to WP-Supercache. I always liked the plugin but stopped using it after it broke due to an upgrade. It’s working again and I’m using it. I did make an errant mouse click and enabled compression, which didn’t work (possibly because I have compression enabled in Apache). Unfortunately it went unnoticed in my testing and I didn’t notice a problem until my views went way down.

My change to WP-Supercache seemed to cause another problem which went unnoticed until recently. It doesn’t seem to have been rampant, but it was frequent. I don’t quite understand the problem completely, but I don’t think it was a WP-Supercache bug. In short, my AdSense ads would often display a “Page Not Found” error in the frame for the ad. I set up the ads to display only to new visitors; view a few pages over the course of a couple months and the ads are supposed to go away. I think WP-Supercache would sometimes cache the “don’t display” version of a page, which would cause a problem when a new “display the ad” visitor arrived and the code ran to display the ad.
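
The failure mode is easy to reproduce in miniature: a full-page cache stores whatever the first visitor saw, so any per-visitor decision baked into the page (show the ad or not) gets frozen for everyone who arrives later. A sketch of the problem, with all the names my own invention rather than anything from WP-Supercache:

```python
page_cache: dict = {}

def render_page(url: str, visitor_is_new: bool) -> str:
    """Naive full-page cache: per-visitor output is stored under a per-URL key."""
    if url in page_cache:
        return page_cache[url]          # later visitors get this copy regardless
    html = "<article>...</article>"
    if visitor_is_new:
        html += "<ins>ad slot</ins>"    # only new visitors should see the ad
    page_cache[url] = html
    return html

# A returning visitor primes the cache, so a new visitor never gets the ad markup:
returning = render_page("/post", visitor_is_new=False)
new_visitor = render_page("/post", visitor_is_new=True)
```

The usual fixes are to exempt such pages from caching or move the per-visitor logic to client-side JavaScript so the cached HTML is identical for everyone.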

Home Networking

I had been hoping I had pfSense and dynamic DNS set up to handle IP address changes so I could remotely access multiple home servers using my own domain. Well, when I got my internet back today Comcast gave me a new IP address, as I’d hoped. But alas, no update to DNS, so it’s time to do some more research. I’ll be tackling that this weekend. I’m hoping I just have a pfSense setting wrong. [Update: Got this working so hopefully a write-up soon.]
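
The core of any dynamic DNS client is just comparing the WAN IP against what the hostname currently resolves to, and firing an update when they differ. A sketch of that decision logic (the update call itself varies by provider, so it’s omitted; the hostname and addresses are made-up examples):

```python
import socket

def update_needed(hostname: str, current_wan_ip: str,
                  resolve=socket.gethostbyname) -> bool:
    """True if the DNS record no longer matches the router's WAN address."""
    try:
        return resolve(hostname) != current_wan_ip
    except OSError:
        return True   # an unresolvable record also needs an update

# Injecting a fake resolver avoids touching real DNS while testing:
stale = update_needed("home.example.com", "203.0.113.7",
                      resolve=lambda h: "198.51.100.1")
```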

The Month Ahead

The only thing I really want to get done is setting up those home servers with Dynamic DNS and pfSense. [Update: Rather easy fix, so hopefully a write-up soon.] After that I’ll see what catches my interest. I’m also hoping this past week isn’t a sample of what’s to come this winter.

Google’s “Hack”

This is the non-story that just won’t go away. The big bad Google drove around stealing data by “hacking” people’s wireless networks. Articles such as those at the Huffington Post contain quotes such as “one of the most massive surveillance incidents by a private corporation that has ever occurred.”

Google collects a hell of a lot of data that concerns me more than this. In this case what seems to have happened is Google collected data from unsecured wireless networks as its Street View vehicles drove around. The real lesson here is: secure your wireless networks! Even if the network was protected by the easily breakable WEP encryption, Google would not have gotten the data.

Of more concern than the actual data being collected was that Google used some library code in a project without knowing what it did. I have concerns about how Google collects and uses data along with a big concern about mistakes that could expose that data.

Google’s explanation rings true. It doesn’t make me feel any better about Google’s ability to avoid mistakes, but it doesn’t make me any more worried about Google’s intentions. But our politicians now have an event they can latch onto to appear to be cracking down on privacy. What does make me more concerned is that some governments have requested copies of the data rather than telling Google to destroy it. Luckily some governments (Ireland) have it right and have had Google destroy the data.

Google’s bungling attempt to try and make email social via Buzz concerned me. Google using code they didn’t understand concerns me. Google collecting data that’s already flowing wide open in the air doesn’t concern me.

Sure, they couldn’t do it on the scale of Google, but criminals could drive around doing the same thing, and then use the data they collect.

Update: Well, it looks like Google was throwing away the encrypted data and keeping the unencrypted stuff. Still, it’s more worrisome to me that this seems to be a mistake or carelessness rather than some attempt to collect info.

Google DNS – Close But No Cigar

Among Google’s recent announcements was the introduction of Google Public DNS. I’ve been using OpenDNS and have no complaints. Well, actually, I recently found I had defaulted back to using my ISP’s DNS (Comcast), probably during a router firmware upgrade. When I switched back to OpenDNS I didn’t notice a difference over Comcast. I wouldn’t have noticed at all unless I was in the router config for another reason and happened to see it.

Comcast and OpenDNS both do typo hijacking and display a search page with ads rather than an error page. I went through the process of opting out of Comcast’s typo hijacking. OpenDNS also allows an opt-out for typo hijacking, which I have set. Interestingly enough, Google, the advertising company, doesn’t hijack typos for ads; it displays the error page for typos. But this lack of hijacking wasn’t a benefit for me since my opt-outs were already in place and working fine.
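
One way to check whether a resolver hijacks typos is to look up a name that can’t possibly exist: an honest resolver surfaces NXDOMAIN as a lookup error, while a hijacking one answers with the IP of its search/ads page. A sketch with the resolver injectable so the logic can be exercised offline:

```python
import socket
import uuid

def hijacks_typos(resolve=socket.gethostbyname) -> bool:
    """Resolve a random, nonexistent hostname; any answer at all means hijacking."""
    bogus = f"nx-{uuid.uuid4().hex}.example-{uuid.uuid4().hex[:8]}.com"
    try:
        resolve(bogus)
        return True          # got an address for garbage: typo hijacking
    except OSError:
        return False         # NXDOMAIN raised as a lookup error: honest resolver

def honest(name):
    """Stand-in for a resolver that correctly returns NXDOMAIN."""
    raise socket.gaierror("NXDOMAIN")
```

Running `hijacks_typos()` with the real resolver while pointed at each DNS provider would show whether the opt-out is actually in effect.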

To be honest I didn’t notice any performance difference when I was set to use any of them. When I first switched from Comcast to OpenDNS long ago I did notice improved performance, but not this time. So I went looking for a way to benchmark performance and came across namebench. It’s simple to use and provides useful information.

Just download namebench and run the executable. You’ll be presented with the following screen:

namebench main screen

The “Benchmark Data Source” is a drop-down that lets you pick one of your browsers or the Alexa Top Global Domains as a data source. Picking your most-used browser provides results that are specific to the way you browse. Some people have complained that this could send all your browsing history to one person (the Google developer). Since the source code is public, it’s easy to confirm it doesn’t. But if you’re still concerned, picking Alexa will use generic sites.

Click “Start Benchmark” to get things going. Once the benchmarking is done (it took about 10 minutes for me) a page with the results will open in your browser. At the top will be the information you really want:

namebenchresults

The above result is from a run after I’d already re-configured for its previous recommendations, and OpenDNS is the second-fastest DNS server according to the benchmark. The right box displays the recommended DNS servers. In my case the first one is the internal IP of my local router, so it should be ignored. (I didn’t include it in the screenshot, but you’ll also get detailed info on the servers tested. See the previously linked namebench page for samples.)

The bottom line is Google Public DNS didn’t make the cut. So, while the accuracy of the benchmark may be questioned (as with any benchmark), it’s pretty clear there’s no Google favoritism. M5Net, UltraDNS and Comcast were my recommended DNS servers. Another note: because of caching, the first run of namebench will deliver the most accurate results.
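
At its core this kind of benchmark is just timing a batch of lookups per server and comparing the results. A toy version of that harness, with the resolver injectable (a real run would send queries to each candidate DNS server; this is only the measurement scaffolding, not namebench’s actual code):

```python
import statistics
import time

def median_latency_ms(names, resolve) -> float:
    """Median lookup latency in milliseconds across the given hostnames."""
    timings = []
    for name in names:
        start = time.perf_counter()
        resolve(name)                   # one lookup against the server under test
        timings.append((time.perf_counter() - start) * 1000)
    return statistics.median(timings)

# With an instant fake resolver the median is effectively zero:
baseline = median_latency_ms(["example.com", "example.org", "example.net"],
                             resolve=lambda n: "192.0.2.1")
```

A median (rather than a mean) keeps one slow, uncached lookup from skewing the comparison, which is also why a first, cache-cold run is the most honest one.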

So, I started off by looking at Google Public DNS, but by the time I was done I was off of it. While looking into it I considered the following:

  • This gives yet more of my information to Google, which at its core is an advertising company. Their privacy policy is pretty good and Google hasn’t monetized DNS yet. Of all the info Google has on me, my DNS info is probably the least of a concern. Let’s face it, someone is going to have this data. It’s Google’s recent cavalier comments about privacy, and all the other info they have, that’s the concern.
  • Google doesn’t have to match the info to me to benefit. The additional information they collect about where people surf and how often is a treasure waiting to be mined. They don’t need to put ads on error pages to profit from DNS.
  • Google does continuously hit on speeding up the web so it’s likely they’ll keep improving performance. They have studies showing that slow response on their search results generates lower revenue.
  • They also promote security and Google certainly has the money and talent to keep DNS as secure as possible.

Like my recent foray into Google’s Picasa/Eye-Fi deal, Google Public DNS is yet another Google offering that sounded good but wasn’t quite right for me. Like Picasa, Google DNS will stay on my radar and I’ll check it out sometime down the road. Anyone else trying Google Public DNS?

Google Wants Our Photos In The Cloud

Google currently has a deal going that offers a free Eye-Fi card when you lease 200GB of storage from them for a year. When I first saw it, it seemed like a pretty good deal, and I hate to pass up a good deal. But it’s less of a deal if I don’t really need the space and won’t use the card. So that got me thinking about my options.

The space is split between Gmail and Picasa. I’m not even close to my Gmail limit and I’m not currently a Picasa user. In theory there are also some unofficial hacks that allow the space to be used for file storage, like gDisk for the Mac. But I’m not willing to trust something Google may break at any time, so that’s not a consideration. What I’d be looking to use the space for is backing up my photos. Right now I have just under 20GB of photos and it costs me less than $3/month to keep them backed up offsite. So that’s $36/year, still shy of the $50.

But that assumes I could easily save everything up to Picasa, and I found that wouldn’t be possible. The Picasa 3 desktop allows automatic syncing of its albums to albums on Picasa Web Albums. But this proved to be problematic and not a better solution than plain old backup via Jungle Disk. The deal-breakers were:

  • Picasa is limited to 1,000 albums with up to 1,000 photos in each album. This sounds like a lot, but the 1,000-album limit is a deal breaker for me. I keep my files in a directory structure and the number of directories already exceeds 1,000. I don’t want to do any dragging and dropping to create new albums just for syncing since that’s prone to error. Sure, I have plenty of directories with one or two photos, but I don’t want to re-organize everything; I’m set in my ways.
  • Deleting entire albums from the Picasa desktop did not delete the album from the web. Photos within albums deleted just fine. Deleting all pictures in a folder automatically deleted the folder, so it’s not like I could keep the folder behind until it synced the deletions.
  • RAW image files were synced to the web as JPEGs, so it wouldn’t be a true backup.
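
Checking whether a photo tree fits under those limits is a quick walk of the directory structure. A sketch, where the one-directory-equals-one-album mapping mirrors how the Picasa desktop syncs folders (the extension list is my own guess at what counts as a photo):

```python
import os

def album_sizes_from_tree(root: str) -> list:
    """Treat each directory under root as a prospective album; return photo counts."""
    exts = {".jpg", ".jpeg", ".png", ".nef", ".cr2"}   # assumed photo extensions
    sizes = []
    for dirpath, dirnames, filenames in os.walk(root):
        photos = [f for f in filenames if os.path.splitext(f)[1].lower() in exts]
        sizes.append(len(photos))
    return sizes

def fits_picasa_limits(sizes, max_albums: int = 1000, max_per_album: int = 1000) -> bool:
    """True only if the album count and every album's photo count are within limits."""
    return len(sizes) <= max_albums and all(n <= max_per_album for n in sizes)
```

With over 1,000 directories, `fits_picasa_limits(album_sizes_from_tree(root))` comes back False for my tree before photo counts even enter into it.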

While a lot of people like Picasa, there was nothing that caught my attention and would compel me to use it. I’ll keep looking at it and may yet find some compelling feature, but for now I’d have a hard time justifying 200GB for Picasa. Realistically I’d be better off with a lower priced plan.

Then there’s the Eye-Fi card. If it was worth the cost, then I could consider the $50 as buying the card, with the Google storage as the free product. The version offered is the Eye-Fi Home Video, which has a list price of $69; I can’t find it online anywhere for a street price. The closest card is the Eye-Fi Share Video, which sells for $73 at Amazon. If I had to guess, I’d say the “catch” is that since the Home Video card doesn’t typically include any online component, the only online options are Picasa and YouTube. These are the only online services specifically mentioned in the offer. The Share Video allows sharing with more services. Other, more expensive, cards include geotagging of photos, which would add a potentially useful feature.

I like the idea of being able to load pictures from my camera to my PC automatically, but the Eye-Fi card doesn’t offer anything else that’s compelling to me.

So while the Google/Eye-Fi offer does seem like a good deal, I’m not yet convinced it’s worth $50 to me. I’m still intrigued by Picasa and the web album component, so I’ll keep considering it.

I also decided to look at some alternatives:

  • SmugMug offers online albums along with a “SmugVault” that can be used to store any type of file (such as RAW files) but it’s a subscription service and would cost more than what I have now.
  • The old standby Flickr is $25/yr for unlimited storage. Still, it’s not a good solution for backup. There are plenty of Flickr add-ins and plug-ins, so I could probably find one to do syncing, but it still wouldn’t be a true backup.
  • I already use Windows Live Photo Gallery to organize my photos and like it. Plus there’s a free 25GB for online photo albums. But like the others, it’s lacking as a backup solution.

So, the bottom line is Jungle Disk remains the way I back up my photos. I’m really not surprised since it’s cheap and easy. Picasa still has my attention if I want to do some online albums, and the Eye-Fi card would offer some convenience. But I’d probably want the version that does geotagging (although I haven’t done any research to see how well it does that). I may spend the $50 in a moment of weakness since it is a good deal, but for now I won’t be clicking the button to upgrade storage and order the card.

Google Stealth Updater

I’ve been setting up a new Windows 7 PC and I just installed Google Gears for Firefox. I noticed a new task under the Windows Task Scheduler called GoogleUpdateTaskMachine. It’s set to run at every login and when the computer is idle. I have mixed emotions about this. I think it’s good to keep software up to date, and for many people or PCs it may be the way to go. Yet this is the company that recently deemed the entire web to be malware due to human error, so it’s clearly not suitable everywhere.

So, the problem I have is that the updater runs only in stealth mode and updates all Google software at the same time (assuming there are updates). There are no notifications and no options to delay updates. So I went into the scheduler and disabled the task. When I’m willing to take updates I’ll run it on demand. I’ll probably also write some sort of notification wrapper to put around their program so I can schedule it regularly and not have to remember to do it. But ultimately Google should add the notification ability to the updater itself. Microsoft and Apple do it; so can Google.

Here are screen shots of the task in Windows 7 (click for full size):

GoogleUpdater1  GoogleUpdater2  GoogleUpdater3  GoogleUpdater4  GoogleUpdater5

I just clicked “disable” to disable the task in the task scheduler.

There’s also a service called “Google Updater” that I changed from Automatic to Disabled, although it wasn’t actually running when I checked it.

Google Gears is an add-in to Firefox, which has its own update mechanism (that includes warnings and options to delay), yet they saw the need to install their own updater. No argument from me that Google has a lot of smart people working for it and has been hugely successful selling ads. But they seem to have this attitude that they know what’s best for everyone and we don’t need to know what they’re doing. A scary thought for a company that collects so much information. Of course, I still use GMail, Google Apps and Google search.

Update: I installed Google Chrome and it installed a second copy of the updater and created a second update task, this one in my user profile rather than under Program Files (Google Chrome installs in the profile also).

Google Chrome

As you’ve probably already heard, Google released its own browser last week, called Google Chrome. It’s a beta release and so far only a Windows version is available, although OS X and Linux versions are in the works. It certainly received a lot of positive press.

There’s been a lot of speculation about why Google released it, from the beginnings of a Web/Google OS to market research. My own theory is it’s partly to make money. While the browser is free, Google pays out a lot of money to the Mozilla Foundation and Apple when people use the Google search box in their browsers and then click on an ad. If Google owns the browser, they don’t pay out the commissions.

I also buy the market research angle, both to see what people use and want on the web and to bolster their advertising business. Google has been working more and more to keep people signed in to Google so they can track their browsing. Now they have the browser, and unless people turn off the tracking (and sacrifice some features) Google will gather a lot more information, even when you’re not on their site.

I use Google for a lot of things, but I have to admit I’m getting a bit uneasy about the scope of their data collection, although I admit I stick with Google products when they work for me. (I’m still using GMail.) As if to prove they have an insatiable appetite for data, Google is launching their own satellite. OK, Google doesn’t own the satellite, but their logo was on the rocket and they have the exclusive rights to the data among online mapping sites.

Since Google Chrome is Windows only I won’t be using it on any regular basis, whether it’s good or not.

According to the New York Times, today is Google’s 10th anniversary of incorporation, and they have some comparisons between Google and Microsoft.

The OS Quest Trail Log #8

Another short update for the log.

I spent some time trying to install Boot Camp on my iMac. It couldn’t resize the existing partition to make room for Boot Camp. I could image to another drive, then erase and image back, but I decided to try out iDefrag and explore the whole “OS X doesn’t need defragging” thing. My 500GB drive with about 250GB of data had 0.5% fragmentation after being used for about 9 months. As expected (because they’re large and written to often), the Parallels VM drives were the most highly fragmented, as were some Aperture library files. Songs in my iTunes library were also highly fragmented, which did surprise me. After a night of running iDefrag my hard disk was compacted and I could repartition it for Boot Camp. So I’ve added a Windows Vista install to the mix, installing it with Boot Camp on my iMac.

Software of Interest

OpenOffice.org 2.3 has been released. The release notes for the open source application provide the list of enhancements and security fixes.

iStat Pro 4.2 has been released for OS X. iStat Pro is a dashboard widget that displays numerous OS stats.

The popular Carbon Copy Cloner has been updated to version 3.

Acorn has been updated to version 1.0.1. The update is mainly bug fixes but has a few minor new features. Acorn is an image editor for the Mac that has a $40 intro price (there’s also a 30-day full-featured evaluation version).

Tips

I’ve been having network problems running Vista under VMWare on my MacBook. Every once in a while I lose network connectivity. Everything shows as “working” but it’s not. I do a repair and all is well. Scott Hanselman has a post showing how to “Reset the crap out of your network adapters in Vista.”

Links & News

Yahoo acquired Zimbra for $350 million. There’s a Yahoo blog entry and press release. Zimbra provides email and collaboration software. Speculation is that this will help Yahoo create an offering to compete with Google Apps.

IBM announced I.B.M. Lotus Symphony. Symphony is an office productivity suite based upon the open source OpenOffice.org. The announcement follows the recent announcement of IBM formally joining the OpenOffice.org community.

Google added the long-awaited presentation app to Google Docs & Spreadsheets. It’s called “Presentation” (and Google Docs & Spreadsheets is now just Google Docs).

Google Apps for My Domain – Part 1

This article is obsolete. Images and broken external links have been removed.

I’ve completed my move to Google Apps and now all my mail goes into my inbox there, one way or the other. In Part 1 I’ll cover the domain setup and IMAP mail migration using the migration tool, while in Part 2 I’ll cover the features that are available to all GMail users.

My reasons for moving to Google Apps were:

  • Sometimes they can be a bit creepy but I trust them as much as I trust any other ISP or mail provider.
  • I want to provide email to family members.
  • My current setup has my mail provided by Bluehost as part of my hosting service. This pretty much puts me in charge of the email server. I just don’t want to have to worry about backups and email problems. It was OK when I was the only one using it, but if I’m going to bring others on board it’s just a disaster waiting to happen.
  • Email is not tied to an ISP.
  • GMail has the best spam filter I’ve ever used.

Google Apps includes Mail (including Talk & Calendar), Docs & Spreadsheets, Personal Start Page and Page Creator. There are two versions, free and Premium. Free allows 2GB for email and is ad supported. Premium allows 10GB for email and allows the ads to be turned off. Premium also has a 99.9% email uptime guarantee, along with mail migration tools and integration tools a business may look for. My only interest in Google Apps is for email.

I started with the free edition but quickly signed up for the 30-day Premium trial so that I could use the IMAP mail migration tool that’s included.

Domain Structure

The domain I use for email is my primary domain with my Bluehost account, but there’s no website associated with it. While I *should* be able to use the same domain as the primary domain with Google Apps, I decided to be cautious since I’ve never done this before. I registered a new domain with 1&1 and use it as the primary domain with Google Apps. The domains I’ll use are (not the real names):

myfamilyblue.com – this is the primary domain with Bluehost and the domain I currently use for email. I want to keep using this domain for email addresses.

myfamilyga.com – this is a new domain I’ll register and use as the primary domain for Google Apps. This will be available for email addresses and deliver to the same mailboxes as the other domain, but I won’t hand out the domain name.

In addition, while I can change MX records myself with Bluehost I have to go through tech support to change CName records. With 1&1 I can change both MX and CName records. This means I can make changes myself without having to go through tech support. This will be less annoying to me and less annoying for them if I decide to undo changes.

For the subdomains, I’ll want mail.myfamilyga.com to access mail, but I’ll use the default URLs for the other tools. You don’t need to use subdomains since Google Apps will give you URLs, but I wanted the sub-domain for easy access to the frequently used mail. I can set up redirection from the subdomains of myfamilyblue.com to the Google Apps URLs.

There are additional restrictions if you buy the domain from Google, such as not being able to cancel Google Apps for a year. I’ll use my own domain that’s already registered.

Setting Up the Domain

  1. I registered the new domain, myfamilyga.com, at 1&1 and waited for the DNS to replicate.
  2. Then I registered with Google Apps for Domains. I set up the first user during registration, and this will be the admin ID.
  3. Google does create a test address so you can test email before changing your MX records. The address is displayed when you first set up Google Apps.
  4. I needed to verify the domain with Google before the services would actually start working. Google provides a couple of ways to do this: either copy a specific HTML file to the site or create a CName record. I went the CName record route since I wanted one anyway. Google provides instructions for various domain hosts and I used the ones they provided for 1&1. In the case of 1&1 I needed to create a sub-domain, then go in and create a CName record for that sub-domain pointing to ghs.google.com. I didn’t have to wait for this to replicate before I could continue, although it does need to replicate before email can be fully used.

    Note: It’s a bit hidden in the help, but Google also allows an MX record to verify domain registration. So if your mail system is ready to go you can just create the MX record. Remember, mail delivery will go to Google once the MX record is created, so make sure all users are created if they have mailboxes on another server. My domain verification seemed slow, so I created an MX record and then verification completed immediately. It may have been a coincidence.

  5. The next step is to set up the users, which also creates the mailboxes. I had already created a user name for myself while setting up Google Apps, so I set up nicknames for all the other mailboxes and forwarding addresses that I had set up on the old myfamilyblue.com.
  6. The next step is to change the MX records for the domain. As soon as the MX records are changed, all email will start going to GMail, so you’ll want all the users set up before making the change. In my case I have a new domain, so I changed the MX records immediately to give them time to replicate. The MX record information provided by Google is here. The setup may vary depending on your domain host. Just make sure the entries are in the order listed by Google and that the priority values match; with MX records, a lower number means higher priority. My setup for the 1&1 MX records is shown below (click for full screen).
  7. Since I wanted multiple domains reporting into Google Apps I went into the “Domain Settings” section, “Domain Names” tab and added the myfamilyblue.com domain as an alias. Then I went to Bluehost and changed the MX records. Here’s how to set up the MX records at Bluehost.
  8. Test mail delivery to the users that have been set up. It may take time for the MX records to take effect.
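For reference, the MX entries Google listed for Google Apps at the time looked roughly like the sketch below. Check Google's own setup page for the current hosts and priorities before copying anything:

```
; Hypothetical MX records for myfamilyga.com; a lower priority value
; is tried first, so ASPMX.L.GOOGLE.COM is the primary mail host
@    IN    MX    10    ASPMX.L.GOOGLE.COM.
@    IN    MX    20    ALT1.ASPMX.L.GOOGLE.COM.
@    IN    MX    20    ALT2.ASPMX.L.GOOGLE.COM.
@    IN    MX    30    ASPMX2.GOOGLEMAIL.COM.
@    IN    MX    30    ASPMX3.GOOGLEMAIL.COM.
```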

You can use the NSLOOKUP tool at kloth.net to see if the CName and MX records have changed on your DNS server. Enter your domain in the domain field and enter the DNS server (from your hosting/DNS provider) in the server field, then select the record type from the dropdown list. If you registered a new domain in step 1 it may take time for the change to replicate through the internet. For the first 48 hours the query may show that your DNS server has the correct information while the rest of the internet doesn’t yet know that your domain info is on that server.
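The number a lookup tool shows next to each MX record is a preference value: sending servers try the lowest number first and fall back to the higher ones. A minimal Python sketch of that ordering, using made-up record data:

```python
# Hypothetical MX record set (host, preference) as a lookup tool
# might return them, deliberately listed out of order.
records = [
    ("ALT1.ASPMX.L.GOOGLE.COM.", 20),
    ("ASPMX.L.GOOGLE.COM.", 10),
    ("ASPMX2.GOOGLEMAIL.COM.", 30),
]

def delivery_order(mx_records):
    """Sort MX records into the order a sending server would try them
    (lowest preference value first)."""
    return [host for host, pref in sorted(mx_records, key=lambda r: r[1])]

print(delivery_order(records))
# ['ASPMX.L.GOOGLE.COM.', 'ALT1.ASPMX.L.GOOGLE.COM.', 'ASPMX2.GOOGLEMAIL.COM.']
```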

IMAP Mail Import

I registered for the free trial of the premium version so I could use the IMAP migration tool. My Bluehost email was in IMAP mailboxes and made up the bulk of my email.

The IMAP email migration tool is under the “Advanced Tools” tab (premium edition only). I set up the server connection to Bluehost. For server software I picked “Cyrus” (the first choice, by trial and error), no security, and port 143. Some mail systems may require an “IMAP Path” such as “Inbox”. I told the wizard I’d specify a few accounts and then entered the user ID and logon information for the accounts to migrate. I was pulling everything into my one new GMail mailbox.

The migration took some time, about 45 minutes in my case, and is dependent on quantity and size. A progress bar displays the status or you can click into the details and see how many emails have been migrated. As the mail was pulled in the migration tool added two tags, one was the email address of the old mailbox and the other was the full folder path that the email was in. The tagging was an unexpected and nice bonus.
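The labeling behavior can be sketched as a small function. This is a hypothetical reconstruction of what the tool appears to do for each message, not Google's actual implementation, and the addresses are placeholders:

```python
def migration_labels(source_address, folder_path):
    """Return the two labels the migration tool appeared to attach to
    each message: the source mailbox address and the IMAP folder path."""
    return [source_address, folder_path]

# A message that lived in INBOX/Receipts on the old mailbox ends up
# carrying both labels, so it can later be filtered by source account
# or by its original folder.
print(migration_labels("ray@myfamilyblue.com", "INBOX/Receipts"))
# ['ray@myfamilyblue.com', 'INBOX/Receipts']
```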

My AOL My eAddress mailboxes are also IMAP mailboxes. I tried the migration tool on them but always received errors soon after the migration began. I only had about 100 emails in those mailboxes and only a couple of folders. So after a few migration failures I went to plan B. The My eAddress mailboxes were already set up in Thunderbird so I created a new IMAP mailbox on Bluehost, added it to Thunderbird and dragged the AOL email to the new account. Then I used the IMAP import utility to pull it into GMail.

Summary

At this point I had GMail working in my own domain. I really don’t have an interest in the other Google App pieces.

Some things to keep in mind:

  • I have two domains. When I set up a user ID it gets one mailbox that is addressable with both domains. So ray -at- myfamilyblue.com and ray -at- myfamilyga.com deliver mail to the same user mailbox.
  • Nicknames can be set up for users. I consolidated all my myfamilyblue.com mailboxes and forwarding addresses into one GMail mailbox by setting up a nickname for each one.

In part 2 I’ll cover importing mail from POP accounts (such as my other GMail accounts) and consolidating all my email in this one mailbox. All of this is available with regular GMail accounts.

Google Video Sales To Cease

Google will be exiting the video sales business. The few people who purchased videos from Google Video won’t be able to watch them after August 15th, thanks to the joys of DRM. As compensation, Google is giving people two bucks to spend via Google Checkout. On the positive side, the fact that the site is shutting down suggests not many people bought videos in the first place. Another reason to avoid DRM, as if one was needed.