
The OS Quest Trail Log #2

It’s only been four days since my last log entry, but I want to get on a weekly cycle with these, and weekends seem to be the best time to do it. So here goes…

Google Apps

I continue to use Google Apps and like it. Really all I use it for is email, but it’s allowed me to consolidate all my email addresses into one mailbox for delivery. I finally got the hang of using search and labels (aka tags) rather than folders and it makes things much quicker and more efficient.

I never thought I’d see the day, but I no longer use a desktop email client; it’s all web based. That does scare me a bit, so I’ll probably set up Thunderbird or Mail.app to suck down the email on a regular basis so I have a local copy.

MacMozy

My Mozy Mac backup software had been a solid performer until I updated to the latest version on the 9th. Today the backup hung in a “Backup Started” state. It’s been that way for 10 hours. I just clicked cancel and it’s still hung. If this is the same problem I had several versions ago it’ll stay that way until a reboot (or until I manually kill and restart all the processes).

Blog at WordPress.com

My new blog is going along well. Looks like I haven’t been wasting my time and will probably make it public next week. Because WordPress.com hosts the blogs, there are restrictions on what can be changed and added. The boundaries are good for me as they keep me from going down the ratholes of themes, plugins and widgets. This blog is for the ratholes.

.Mac Returns

I subscribed to .Mac again. It’s one of those things I couldn’t stay away from. Now that I have a laptop that I use a lot it’s beneficial for me to sync with it. Standalone Sync software costs $50 or so. .Mac is an extra $30 or so (don’t buy it from Apple unless you get it bundled with a new Mac). The extra disk space, now 10GB, helps a lot since I can use it as a data directory for just about everything I want on the laptop and it will automatically be backed up and synced with my iMac. I’m still liking Yojimbo and will probably keep it so the .Mac sync for Yojimbo is a real plus.

New Software

My copies of iLife ’08 and iWork ’08 arrived late in the week. I haven’t had a chance to really dig into the software.

For iPhoto ’08, Events seem to be simply rolls renamed, except now they make sense and the new features make them more useful. Of course, I got burned: rolls were so useless that I simply combined all my photos into about a dozen rolls of over 1,000 photos each. Getting them into events may be more trouble than it’s worth. I haven’t had a chance to dig into anything else.

For iWork (yea, I broke down and bought it) I like Pages now. It seems more accessible to someone like me who just needs a simple document every now and then. Numbers seems really cool. In typical Apple fashion they seem to be emphasizing looks. There are 18 templates, most of which are heavy on the eye candy. The UI elements that stand out are related to styles and formatting.

Bits and Pieces From the Web

Power Replacements for Built-In Windows Utilities (Lifehacker) – All free

Microsoft has released Vista Performance and Compatibility updates – But not through Windows Update yet.

Wired has a BitTorrent starter Guide.

iMovie ’08 Users Can Use iMovie ’06 (TUAW) – Some people don’t like the completely rewritten iMovie ’08. Apple has a download available for them so they can also use iMovie ’06.

How have your travels along the PC trail been this week?


The OS Quest Trail Log #1

It’s time to introduce the “Trail Log” to the OS Quest blog. It seems like the type of thing that should be done on a weekend, or maybe just once a month, or maybe whenever I can fill a post. (I’ve really thought this through.) I’ll recap the things I’ve been working on and the things I’m thinking of working on in my little world of personal computers. I’ll also throw in tidbits I’ve come across that don’t warrant their own post (or that I don’t have time to do justice).

Blog “Re-envisioning”

This is just a fancy way of saying I’ve been looking for a way to keep blogging and computers interesting by changing things up. The Trail Log is one way to keep blogging regularly while moving among different projects depending on what I feel like doing at the moment, without feeling the need to finish something up just so I can post about it.

And let’s face it: by keeping track of things as I move along, rather than at the end, I’ll be able to answer the question “What the hell was I thinking?” six months later.

Google Apps (formerly known as Google Apps for Domains)

I started considering Google Apps mainly as a family email solution. For better or worse I decided to jump right in and committed to it. Bluehost does provide email and I’ve been using that primarily. But two things concerned me. The first was the lack of SSL or other secure connections (this can be remedied at added cost) and the second was that I really didn’t want to be my own email support desk if I added family members.

Naturally I made it a little more complicated than it had to be but I ended up where I wanted. I’ll provide more details in a future post.

While some of the things Google does creep me out, I basically trust them as much as my ISP (AT&T/Yahoo) and other email choices. Google shows me more respect using their free product than I get from AT&T/Yahoo as a paying customer. I took the Premier free trial so I could use their email migration tool but I’ll probably go back to the free version. I have three domains pointed to Google Apps for email (like I said, more complicated than it needed to be).

WordPress.com

WordPress.com is the free blog hosting service offered by the WordPress people. My motivation here is two-fold:

  1. Due to the whole “blog re-envisioning” thing I’m considering doing another more general blog and I don’t think I want to make this one more general.
  2. Along with family email there may be some family blogs or websites in the future so I want to get familiar with it and the best way I can do that is use it.
  3. (yea, I know I said two-fold) A minor reason is I don’t want to add the full support for another website or blog at this time.

Photobucket/Flickr/Picasa

This keeps with the general trend of web services that family members can use without creating a lot of support issues for me, at least not issues specific to keeping software running. In addition, WordPress.com seems to lend itself to hosting media on one of these sites.

I’m leaning toward Photobucket but haven’t decided. Any suggestions or potholes?

Google Reader

I’ve also switched to Google Reader over the last month. When Google Reader first came out I didn’t like it and I guess just didn’t get it. I’d been using Yahoo’s web email client to read some feeds and the Mac-specific NewsFire (received with a MacHeist bundle). I never really liked either one. I tried Google Reader since it was browser based and I could get to it from anywhere, unlike Yahoo (my company blocks web email, so the built-in RSS was collateral damage) and NewsFire. After a couple of weeks of forcing myself to use Google Reader my brain cells lined up and I got “it”, the “it” being the benefit of RSS over visiting the sites from my blogroll links. The couple-of-weeks timeline isn’t entirely fair to Google Reader; I’d frequently just go to the website instead, especially the first week, which was more about not using the old reader than actually using Google Reader. But now I’ve gone from a handful of subscriptions to 78 and I’m more efficient.

What RSS reader are you using? Any experience with Google Reader?

Yojimbo

I’ve started using Yojimbo (a Mac info organizer by Bare Bones Software). I was going to look at various options before picking one but I decided to save time and go with Yojimbo. It doesn’t have the longest feature list but it seems to be the easiest to use. I don’t make a living doing research, so I’ll give up features for ease of use. I’m still in the trial period and only have a few records entered, but unless I find a problem I’ll be buying it.

Bits and Pieces from the Web

Lately I’ve been reading a lot of articles that annoy me, and I find that AT&T is at the root. In another case of AT&T disrespecting their customers, CrunchGear has an article about AT&T raising rates and cutting service to customers using legacy systems. Come on, AT&T, wanting them off is fine: send them a letter saying the service is ending by XX/XX/XX. Maybe even show good faith and offer a special deal to upgrade. Instead you screw them.

During last Tuesday’s Apple event TUAW.com posted an article about someone who noticed a new icon on Steve Jobs’ PC and speculated it was for Numbers. I realize Apple is the master of hype, but is it really a scoop, and all that important, to try to “break” news during an event that will make the announcement anyway?

Kudos to Talino.org for posting an article on how to delete the U3 partition from USB thumb drives that have one. I had to do it once and I couldn’t do it from a Mac. The article explains how to do it when the software’s been broken by a reformat attempt on a Mac. Guess I was lucky that Windows was handy for me. U3 is such a bad idea that I avoid buying USB keys with it even if I have to pay a dollar or two more.

Since I’m bashing AT&T I’ll end with one more anti-AT&T mention. AT&T made news by saying no one wants $10 DSL. They say they’re not hiding it, although others say it’s hard to find, and even they admit it’s not shown as an option until a phone number is entered.

That’s all for the first Trail Log. What have been your experiences this week?


SiteUpTime – Web Site Monitoring

I came across SiteUpTime.com which does web site monitoring, as the name implies. They offer a free plan which I just signed up for to monitor The OS Quest.

The free plan includes:

  • 1 Monitor
  • 30 or 60 minute check intervals
  • 4 Monitoring Locations
  • Email Alerts
  • Monthly Reports
  • Online Statistics
  • Control Panel
  • Web Server Monitoring (http)
  • Email Server Monitoring (pop3)
  • Email Server Monitoring (smtp)
  • FTP Server Monitoring (ftp)
  • Public Statistics

There are some requirements on the free account. You need to link to their site (I added a badge to the right sidebar) and you need to agree to receive occasional emails about their services. Plus, you can only have one free account.

The way the system works is you specify a primary location (of the four) when creating the monitor. If the site is not accessible from the selected monitor then additional locations will be checked. The locations are San Francisco, Chicago, New York and London.
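
SiteUpTime doesn’t publish how its probes work, but the check-then-confirm pattern described above looks roughly like this sketch; the check_from() helper, the default primary and the URL are made up for illustration, not anything from SiteUpTime.

```python
# Illustrative only -- a toy version of "check from the primary location,
# then confirm from the other locations before declaring the site down."
import urllib.request

LOCATIONS = ["San Francisco", "Chicago", "New York", "London"]

def check_from(location, url, timeout=10):
    """Stand-in for a probe running at a monitoring location; here it just fetches the URL."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

def site_is_down(url, primary="New York"):
    """Report 'down' only if the primary and every other location fail."""
    if check_from(primary, url):
        return False
    others = [loc for loc in LOCATIONS if loc != primary]
    return not any(check_from(loc, url) for loc in others)

print("Down!" if site_is_down("http://www.example.com/") else "Up.")
```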

They also have Premium and Advanced plans which increase the number of monitors, the frequency of monitoring and the types of services that can be monitored.

It’s worth noting that you’re monitoring services, not servers. So if you want to monitor FTP, web and email on one server you’ll need three monitors.

I just signed up, so I can’t really say if the service is as good as it seems. But it will be interesting to see how Bluehost does, even though a 30-minute interval leaves a lot of time for unnoticed server reboots.


SuperDuper!: Disk Cloning/Backup For Mac

Images in this post have been lost.

I’d purchased and used SuperDuper! with my Mac Mini, but when I moved to the iMac I stopped using it. I had used it as a cloner and I didn’t really want to do clones as my primary backup anymore (so I thought), so I switched to Apple Backup. Apple Backup has its faults (that’s a different story, so see my Apple Backup post), so I decided to give SuperDuper! another spin for doing backups.

I don’t remember if I ever used SuperDuper! on my iMac, but it was already installed so I fired it up. I was immediately prompted that there was an upgrade ready, version 2.1.4 (82) [from 1.5.5 (v74)], so I clicked the button to do the upgrade. It came up as an unregistered version, which answered the question of whether I had used it. The unregistered version does provide a subset of features that will never expire, but I dug out the registration number and plugged it into the software.

Installation

Installation is simple. The download is a disk image file (.dmg); open it by double-clicking and drag the SuperDuper! icon to your Applications folder.
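
For anyone who prefers the command line, the same install can be scripted. This is just a sketch using Python’s subprocess; the .dmg name, volume name and app path are assumptions about the download, not anything from Shirt Pocket.

```python
# Sketch of a scripted install: mount the disk image, copy the app, eject.
# File and volume names are assumed; adjust to match the actual download.
import subprocess

dmg = "SuperDuper!.dmg"  # hypothetical name of the downloaded disk image
subprocess.run(["hdiutil", "attach", dmg], check=True)
subprocess.run(["cp", "-R", "/Volumes/SuperDuper!/SuperDuper!.app",
                "/Applications/"], check=True)
subprocess.run(["hdiutil", "detach", "/Volumes/SuperDuper!"], check=True)
```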

Using SuperDuper!

First I started with the basics. I wanted a backup of my home directory, not a clone of the hard disk. This way my backup uses less space and I can put other files on the target drive. When SuperDuper! starts, the main screen is displayed.

The first time the program starts, the fields are blank.

The “Copy” field is the source. You can pick any source drive: internal, external, iPod, USB thumb drive, etc. You do not pick a directory in this field, but you can limit the copy later. For the “To” (destination) field you also pick any drive (other than the source). The destination has to have enough space for the actual files but can be smaller than the source if there’s enough free space for the files.

The “Using” field contains the pre-built (out of the box) scripts for file selection. You can also create your own scripts, but that’s beyond the scope of this review. The “Using” (script) choices are shown in this screenshot.

“Backup – user files” will backup your home directory. “Backup – all files” will clone the entire hard disk except for certain files that Apple says shouldn’t be cloned.

The “Sandbox…” selections are a nice feature of SuperDuper! but aren’t really backups. The sandbox creates a bootable copy of your system on another drive (or partition), but the data is shared with your regular boot drive. The sandbox helps you recover quickly if your boot partition fails, but your data is not duplicated.

Currently, I only use SuperDuper! to back up my user directory on my iMac. On my Mac Mini I cloned the entire drive and could boot off of either drive at any time. I’ll concentrate on my iMac here. I back it up to a disk image. There are two disk image destination options. The first, which is the one I use, is the “Read/Write ‘Sparse’ Image”. The sparse image can be used over and over and it will grow to accommodate the data. The second type is a “Read Only Disk Image”, which is recreated in full each time. “Read Only Disk Image” is recommended when multiple systems are being restored from a single image.
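
As an aside, a read/write sparse image like the one SuperDuper! writes to can also be created by hand with OS X’s hdiutil. SuperDuper! doesn’t require this step; the size and volume name below are just example values.

```python
# Create a read/write "sparse" image by hand (illustration only).
# The file starts small and grows toward the -size cap as data is added.
import subprocess

subprocess.run([
    "hdiutil", "create",
    "-size", "200g",           # upper bound; the actual file grows as needed
    "-type", "SPARSE",         # read/write sparse image (.sparseimage)
    "-fs", "HFS+",
    "-volname", "HomeBackup",  # example volume name
    "HomeBackup.sparseimage",
], check=True)
```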

Then you have a choice of options. You can repair permissions before the copy (only available when doing a full clone). You can do a “Smart Update”, which means it will only make the changes (add/modify/delete) necessary to make the destination match the source; it’s much quicker to do the backup this way. You can also optionally erase the destination and start fresh. Other options include copying only new or updated files (nothing is deleted); these last two options are intended for merging images and should not be used to create a bootable disk.
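
To make “Smart Update” concrete, here’s a toy sketch of the idea, not SuperDuper!’s actual copy engine: copy anything new or changed to the destination, and delete anything on the destination that no longer exists on the source. Paths are examples.

```python
# Toy "smart update": make dst match src with the minimum add/update/delete work.
# Not SuperDuper!'s engine -- just an illustration of the concept.
import filecmp, os, shutil

def smart_update(src, dst):
    os.makedirs(dst, exist_ok=True)
    cmp = filecmp.dircmp(src, dst)
    for name in cmp.left_only + cmp.diff_files:      # new or changed: copy over
        s, d = os.path.join(src, name), os.path.join(dst, name)
        if os.path.isdir(s):
            shutil.copytree(s, d, dirs_exist_ok=True)
        else:
            shutil.copy2(s, d)
    for name in cmp.right_only:                      # gone from source: delete
        d = os.path.join(dst, name)
        shutil.rmtree(d) if os.path.isdir(d) else os.remove(d)
    for name in cmp.common_dirs:                     # recurse into shared subfolders
        smart_update(os.path.join(src, name), os.path.join(dst, name))

smart_update("/Users/me/Documents", "/Volumes/Backup/Documents")
```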

You can also tell SuperDuper! what to do when it’s done: quit SuperDuper!, nothing, shutdown computer, sleep computer, restart from the destination, or set destination as startup disk. The last two options are only available if a full clone (not just user files) was done to a physical drive (not an image file).

I didn’t use any of the Advanced options, but they include running a shell script before or after the copy, copying the ACLs, and creating a disk image of the backup (in addition to your backup). You can also automatically install a package on the destination once the copy is done.

Once you’ve set everything the main screen will tell you what’s going to happen.

You can click “Copy Now” to start immediately. You can also click “Schedule” to set up a recurring schedule. Multiple schedules can be configured and the options for each schedule are presented on one screen.

Like the main screen, what’s going to happen is clearly described.

The Mac cannot be asleep (or off) when the scheduled time arrives, but unlike Apple Backup it’s not affected if the Mac is scheduled to wake just before the scheduled backup time. Even a “Smart Update” can take some time if it has a lot of files to check (like your entire drive), but this is true of any backup software. Still, “Smart Update” takes a fraction of the time on my Mac. Last night it took 14 minutes in total. It had to evaluate 186.2GB containing 159,641 directories, 726,663 files and 30,547 symlinks, and it copied 4,905 items totaling 7.95GB, which was 556 directories, 1,915 files and 2,434 symlinks.

Restoration is simple. To restore individual files, just attach the external drive and drag the files back. If the backup was made to an image file, simply mount the image and drag the files back. To recover from a complete disaster you can either boot from the cloned drive (if a full clone was done) or boot from the OS X DVD and run Disk Utility to restore from the external drive or image file. The SuperDuper! manual has complete (and short) instructions. The important thing to remember is that SuperDuper! doesn’t have any native restore functions. It clones/saves the files in a way that can be accessed through OS X or, in the case of a complete failure, through a standard OS X restore process. You do not need SuperDuper! to do the restore.
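
For a small “grab a few files back” restore from an image-based backup, the steps amount to mount, copy, eject. This sketch assumes the sparse image and paths from the earlier example; it’s just one way to do it.

```python
# Restore a single file by hand from an image backup (paths are examples).
import subprocess

subprocess.run(["hdiutil", "attach", "HomeBackup.sparseimage"], check=True)
subprocess.run(["cp", "-Rp", "/Volumes/HomeBackup/Documents/report.pages",
                "/Users/me/Documents/"], check=True)
subprocess.run(["hdiutil", "detach", "/Volumes/HomeBackup"], check=True)
```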

SuperDuper! also includes the ability to create copy scripts so you can customize what does or doesn’t get copied. You can either modify the four standard scripts or start from scratch.

Summary

Pros

  • Excellent value: I bought the software back in September of 2005 when it was version 1.x. When I fired it up this week it was upgraded to version 2.1 at no additional cost. I couldn’t find an upgrade policy on their website, but I’ve never been charged for an upgrade since I purchased it.
  • You don’t need SuperDuper! to do a restore. No hunting for or configuring a program to get your files back.
  • The scheduling option works when the Mac has just woken up, unlike Apple Backup (sorry, had to mention it again; it’s a pet peeve of mine).
  • Easy-to-understand interface that clearly says what it will do based on your selections.
  • Backs up all attributes. I did not have any problems with missing metadata when restoring files. This is probably in part due to Finder being used to restore the files, but the quality of SuperDuper!’s copy engine shouldn’t be minimized: Finder couldn’t copy what’s not there.
  • If you clone the entire disk then recovery is as quick as booting from the backup disk if your main drive fails completely.

Cons

  • Basic disk cloning/file copying only. It lacks more advanced backup features such as keeping historical versions. You could use the scheduler to set up a rotation to different backup locations, but this would use a lot of disk space.

SuperDuper! is a low cost way of getting quick, reliable safety backups. It doesn’t include advanced features like encryption (useful for safely storing backups outside your house) or the ability to manage multiple or historical backups. What it does do is reliably and easily clone disks and copy data as well as or better than anything I’ve seen. If you don’t need those advanced features then SuperDuper! is all you’ll need. If you do want them, you’ll probably still get SuperDuper! as a disk cloner and then license the full version to use it for quick, reliable safety backups.

Try/Buy

SuperDuper! is currently available for $27.95 from Shirt Pocket Software. There’s a trial version available and it can be used forever. The features missing from the trial version are scheduling, Smart Update, sandboxing and scripting. If you’re going to be using SuperDuper! for backups you’ll want the licensed version.


Backups – Part II – My Modern Era

In part one I covered my “formative years” where backups were little more than creating multiple copies of the files the best way I could. Since I always got my files back when I needed them I could claim it worked, but there was luck involved too. My move to using a Mac as my primary desktop a couple years ago was a good opportunity to examine my backup strategy.

Backups for the Mac ended up being easier than I expected. At the basic level I could simply buy an external FireWire or USB drive, hook it up, and clone my disk. There were several options for doing this. Carbon Copy Cloner and SuperDuper! were common at the time; Carbon Copy Cloner is donationware and SuperDuper! is now $27.95. SuperDuper! has a free trial and the basic cloning feature is not time limited.

So I got a FireWire drive that stacked with my Mac Mini and downloaded a trial copy of SuperDuper!. I picked SuperDuper! to try because of its advanced features and because it seemed slightly easier to use. At the time CCC didn’t include a scheduling ability and SuperDuper! did. I ended up buying SuperDuper! and my backup routine simply became a nightly scheduled task to clone my disk. SuperDuper! also had the ability to clone to a disk image rather than a physical hard disk, which allowed me to do additional backups when I needed them.

Under Windows I had classified my data by importance so I could concentrate on regular backups for my critical data. By cloning the entire disk I eliminated the need to do this. But all my data was still stacked on my desk, all cabled together. For a while burning to CD seemed like an acceptable option, but it still had the flaw of requiring me to actually do something to get the backup done.

So I looked into and eventually purchased a .Mac subscription (if you’re considering .Mac, look at Amazon.com or another seller to save money over the Apple direct price). I decided to get one primarily to be able to use Apple Backup to back up my critical files to iDisk. But being paranoid, I didn’t trust iDisk security to protect the files. That led to purchasing a copy of StuffIt since it had the ability to create encrypted archives.

Once all that was combined I ended up with a process where StuffIt created a full (all files) encrypted archive of the directories I wanted to back up on a weekly basis. Then on the following days it created an encrypted archive of just the files that changed. Then Apple Backup backed the encrypted archives up to the hard disk.

The main benefit of this setup was that I didn’t actually have to do anything to get the backups (well, I had to leave the Mac turned on); everything was scheduled. While I’d had problems using Apple Backup to back up to CD, my test restores of these backups always worked.

Then in January I got a new iMac and it was time for another change. All I was using StuffIt for was the scheduled encryption; they had released two newer versions and I didn’t see the point of upgrading. Besides the cost (eventually I’d be forced into an upgrade), StuffIt just seemed to add too much to the system and I didn’t want it on my iMac. So I switched to using the built-in OS X encryption to create an encrypted disk image, which could then be copied to my iDisk. The downside of this is that the entire disk image was copied each time, which can be time consuming since my upload speed is rather low. But those files only changed once or twice a week, so an Automator action made it easy enough to kick off the update. All in all it was a workable solution.
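
The “built-in OS X encryption” step can also be done with hdiutil. This is a rough sketch of the idea rather than the exact Automator workflow; the size, volume name and iDisk path are assumptions.

```python
# Create an encrypted sparse image with OS X's built-in image encryption,
# then copy the whole image to iDisk (the entire file goes up each time).
import subprocess

subprocess.run([
    "hdiutil", "create",
    "-size", "300m",
    "-encryption", "AES-128",      # built-in disk image encryption
    "-type", "SPARSE",
    "-fs", "HFS+",
    "-volname", "ImportantFiles",  # example volume name
    "ImportantFiles.sparseimage",
], check=True)                      # hdiutil prompts for the passphrase

subprocess.run(["cp", "ImportantFiles.sparseimage",
                "/Volumes/iDisk/Backups/"], check=True)  # assumed iDisk mount point
```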

Then I moved away from cloning the HDD and simply used Apple Backup to back up my home directory to an external drive. I also began using Mozy instead of my iDisk. At first (when Windows was the only option) I copied the encrypted disk image to my Windows machine so Mozy could back it up from there.

Right now my backup process is:

  • Nightly Apple Backups of my home directory (minus the iTunes library) to my external drive.
  • Nightly Apple Backups of my iTunes purchased music to my external drive.
  • Back up my important files (the ones in the encrypted disk image) to Mozy using the Mac beta.
  • Copy the encrypted disk image of my important files to my iPod, which is away from my PC when I’m away from my PC. This is set up as an Automator action so it’s fairly simple and quick. It’s the only part of my backup process that’s not scheduled.

I’m not completely happy with Apple Backup so I’m hoping Mozy or something similar will prove to be a suitable replacement. My gripes against Apple Backup are:

  • It has issues running a scheduled backup when the computer wakes from sleep.
  • The first backup is a full backup. Every backup after that is incremental. This means I’ll eventually run out of space on the target disk. I then have to manually delete the files and trigger a new full backup. If I don’t manually trigger the full backup it will do another incremental even though the earlier files are gone. (When using iDisk there’s an option to delete the iDisk backups, which will automatically force a full backup the next time; there’s just no such option when the target is an external drive.)
  • Having all those incrementals makes me nervous that one in the middle will go bad and I won’t be able to restore. But this could just be my paranoia.

In the interest of full disclosure, the link to Mozy is a referral link. If you use it to sign up for a free Mozy account I get another 256MB of backup space added to my free account when you do your first backup.


Backups – Part 1 (My Formative Years)

I seem to be slipping into a backup theme in pending posts, plus it’s time for me to adjust my backup strategy at home. So, I figured I’d write up my backup related history and biases. This is part 1, so that must mean there will at least be a part two. I’m hoping I don’t need a part 3, but we’ll see.

In the beginning…
Back in “the days of DOS” my backups consisted of floppy disks. (Remember those? And I don’t mean those smallish 3.5″ ones. I mean big floppies that actually flopped.) Usually a “backup” meant making a copy of the diskette, or copying the files from the hard drive to a diskette (and yea, from one floppy to another back when there was no hard drive). Diskettes failed, so there were usually multiple copies, and I was forever trying to remember which was the latest copy and what files were on each disk. (Labels? Too much trouble. Eventually I figured out pencils would write on the diskette itself. There was just never a pencil around when I needed it.)

Then technology moved forward and software was created to do backups, to floppies. This solved the problem of numerous, badly labeled diskettes. Now there could be a labeled box with labeled diskettes, the box being “Set 1”, “Set 2”, etc. and the diskettes numbered one through whatever. In other words, “a system”. This was great for organization. But then there’d be a need to recover a file and the technology brought new fun to the restore process, usually a bad diskette in the middle of the set. Oh yea, you also had to be sure you had extra copies of the backup software since it was needed to read the disks. Technology crept forward and tried to address those problems. But it was never quite right.

Then technology moved forward again and floppies were replaced with backup tapes and even stuff like Zip drives. Less swapping, but now when a single piece of media went bad you lost lots of data. Maybe it was me, but I never had a backup system with media I could really trust. I’d have many copies and somehow managed to survive. But it was a painful existence, and those tapes were expensive.

Lessons learned, habits formed…
Some of the lessons I learned in these early days were:

  • Data on my computer is very organized. Data will tend toward disorganization once it leaves the bounds of my computer unless I’m forced to fence it in. If there’s a flat surface I will put something on it. If that something is also flat I will put something else on it. And so on. (FYI – small flat things like diskettes or CDs tend to get larger flat things like paper and magazines put on top of them.)
  • I think I have a good memory, at the time a file is backed up. This means minimal or no labels. Write a date on a disk and I’ll know what’s there. I’m wrong about that; I don’t have a good memory. But I can never remember that fact and am forced to repeat my mistakes. On the plus side, swapping disks to find a file can be done while watching TV and is a mindless activity. It also provides an incentive to protect the original data source in order to avoid the whole exercise.
  • I don’t want to spend time addressing those first two bullet points if it takes any time at all. There were programs to catalog tapes. Labels do exist. They just didn’t exist in my house. It’s a personal failing. I accept that but never attempted to correct it.

Some of the early habits formed were:

  • I always had a copy of current and important work. Even in the early days I’d have a batch file that would copy the current day’s work or important directories to a floppy. The more advanced versions would zip the files first. Even after CDs were around, floppies were easier and quicker. I tended to have many copies of these as a crude versioning system and also as a way to have a second (or third, or fourth) backup.
  • I tended to organize my computer hard drive in a way to make backups easier. Files were “archived” to a section of the disk when I didn’t need them or I knew they’d never change. Then they’d be backed up (or copied) to a couple floppies/tapes/CDs and filed away. Since this was an infrequent event I’d actually take time to label things.
  • Backup media will go bad. I always had extra backups, even if it meant rotating backup sets where that “second” set was older.
  • Restore some files every so often and make sure they work, especially for archived backups. If the restore failed I made a copy of the second set.
  • I backed up data, not programs or program configurations. Whenever I re-installed my PC it was usually for the purpose of cleaning things out, so I didn’t want anything except data from my old PC. I kept copies of the program disks, but if a program went bad it was a re-install, not a restore. This habit solidified in my Windows days. It’s changed a bit for Linux and OS X, not so much because there’s no need to clean up, but because all the configuration is file based and it’s easy to copy and restore (or delete if it’s suspect).
  • I tended to keep my backups small as it was just easier to deal with. I was never into using clones (like Ghost) as a way of doing backups.

I got my data files, now what…
Luckily I never got bit by this (just close calls) but just having the data wasn’t enough. This became a problem as I abandoned Windows and moved to Linux. Moving the data wasn’t a problem but I had archived older data. There came a time when I had to pull out some archived data that had been created with Windows software. I had rebuilt my Windows PC without most of the software. I had to hunt around to find copies of the original programs to read the files. Then I had to install the programs to read the data.

Some of the files were simply scans but they had been saved in a proprietary format. My move to Linux had actually solved this issue since all the scans were now standard graphic files or pdfs. There are numerous viewers for them, on every operating system I’m likely to use.

So the new lessons learned here were:

  • Data wants to be widely viewed. I now use a standard format whenever possible so it can be read in whatever is handy. If a standard format does go obsolete it’s time to save some viewers.
  • If the data is proprietary make sure the program that created it is with the backup files. There’s still a catch here in that you need to have an OS that will run the software. See the previous bullet and avoid the whole issue.

The Three Horsemen…
Alright, it’s four horsemen and they bring bad things. But I’ve found I group my data into three categories, which prevents bad things.

  1. The really important stuff (in my life). This is mainly financial or “life” records. Stuff that will cause me financial loss or extreme hardship if they are lost. Generally, this is the stuff I also need to keep locked up so it doesn’t fall into the wrong hands. These are also the things I want in a standard data format, or lacking that I’ll include the software to read the file. I’ll also throw things in here that are easy to save (I have a lot of text files in this category) even if their loss is minimally annoying. An address may not be critical, but it’s easy enough to save. It’s also the things I’d need in the event of a major catastrophe that affects more than my hard drive (like a house fire).
  2. Files I want because losing them would entail some financial loss or the loss of time to recreate them. Examples are some MP3 files, important pictures, videos and software installation files. Losing these would be a loss, but one I’d get over with minimal pain.
  3. Files I won’t miss or can easily replace if they’re lost. This can include old software and PC configuration files. I also include MP3 files that are both on my iPod and on physical CDs in this category. If things go terribly wrong and I lose both the PC and the iPod I can re-rip them. Miscellaneous pictures and videos also fall into this category. If I lose these I may never know it, or the impact will be no more than a small speed bump.

While I’ve never formally thought of it this way until now, I’ve been doing my backups this way for a long time. In the old days the category three files might have been too much trouble to back up and they wouldn’t get done. Then CDs came along and I’d burn them to CD every so often. I’d verify the backup when it was made but usually never again.

For the category two files I’d be a little more conscientious and make sure I got backups every week (or month) and test them every so often. These tended to be files that didn’t change that much so occasional backups were no big deal.

I’m paranoid about the category one files and always had some backup routine to make sure they got backed up. Historically, these files were always relatively small (even today this backup is only 200MB).

The three categories are really just a way of putting a value on the data so I know how much I want to pay, in either time or money, to back them up.

Part 2 – Enter The Modern Age
Until I got my first Mac my strategy was basically “copies everywhere, whenever I got a chance.” Around the time I got my first Mac my backups changed from a process where I had to be there and do something to one that was automated and didn’t require media swaps. So in part two I discuss the specific backup techniques I’ve used recently.