Website Performance Tools

I mentioned in my previous Trail Log that my look at the new Site Performance feature in Google Webmaster Tools led me down a rat hole looking into ways to optimize my website. Since I use WordPress I approached this as WordPress optimization, but the reality is that most of it was basic web optimization. I did use these tools to help decide which WordPress plug-ins and features to use; plug-ins or features that degrade performance for little benefit were dropped. In this article I’ll discuss the tools the feature led me to. I’ll cover the optimization changes themselves in another article.

Google Webmaster Tools Site Performance

This is a Google Labs feature so it’s experimental. The information is not collected by Google’s web crawlers; rather, it’s collected from Google Toolbar users who have the PageRank feature enabled.

The top of the Site Performance page will display a graph showing performance history. This is what mine looks like today.

performance

It will also say what percentile your website falls into. When I first went in a couple of days ago 73% of sites were faster than mine; today 66% are faster. While I did make some optimizations, that seems like a big change for such a short time, so I suspect much of it is due to the pages people happen to be visiting. Unfortunately I didn’t keep track of load times, but today it tells me page loads average 4.2 seconds. I’ve no idea what caused the September spike. The October spike is around when I changed to the Thesis theme, although I’m not sure why it spiked then dropped; it’s not like I changed the theme a lot. There may have been a plug-in I was trying and later stopped using.

They also list example pages along with suggestions to improve performance on each page. They don’t fudge the results in their favor: compressing the Google Analytics code snippet was one of the suggestions. I didn’t spend much time on this information. Instead I installed the Page Speed add-on for Firefox that was promoted on the page.

Page Speed Add-On for Firefox

The Page Speed add-on plugs into the Firebug add-on so you’ll need that first. The plug-in is simple to use. Just start Firebug, load the page and wait for the load to complete entirely, then click the “Analyze Performance” button.

You’ll get a report showing the different categories of possible improvements. If you click a category name you’ll go to a help page where you can learn more. The suggestions are listed in priority order and color coded red, yellow or green. This was the jumping-off point for most of the enhancements I made. Some of them I’d never heard of, like “minify CSS”, while others just required tweaking an existing configuration, such as enabling gzip compression for JavaScript. There are also items that aren’t realistic for me to implement, things that are core WordPress code such as an inline script in the header that appears before the CSS files.
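As an aside, a quick way to check whether the gzip tweak mentioned above is actually working is to request a JavaScript file with compression allowed and see if the server answers with a compressed response. This is only a command-line sketch; the URL is an example and the path will differ on your own site:

    # Fetch a JavaScript file while advertising gzip support, dump the response
    # headers, and show the Content-Encoding line if compression is active.
    # The URL is an example; substitute a real script from your own site.
    curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" \
      http://example.com/wp-includes/js/jquery/jquery.js | grep -i "content-encoding"

If the server is compressing JavaScript you’ll see “Content-Encoding: gzip” in the output; if the command prints nothing, the file is being sent uncompressed.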

Page Speed will also provide optimized versions of the files that need it. These can be a little hard to deal with since they’re given long, random names. Links are provided so you can open them up and cut and paste. In the case of images I found it easier to use another tool and just compress the original images rather than hunting down the Page Speed-generated files. In the case of CSS, where I only had a few files, I just used the Page Speed minified files.

Between the suggestions on this page and the help links I was able to make significant improvements in performance. Unfortunately I didn’t do timings in the beginning, and Page Speed doesn’t do actual timings, so while things are better I’ve no idea how much better.

The specific optimizations are best covered in a follow-up post.

Web Page Analyzer & Pingdom Tools

As I mentioned, Page Speed didn’t give me any timings. Once I’d hit the big things Page Speed identified I wanted to look at specific timings. Web Page Analyzer is a website that does what its name says. It provides a plethora of information. Like Page Speed, it reports which optimization techniques are in use and which aren’t. Its report is all text, with a wealth of detail along with suggestions.

Pingdom Tools report

Like Web Page Analyzer, Pingdom Tools is web based and breaks down the load time for each item on the page. It presents its information in a graphical report like the one shown here, along with a summary box with stats such as total load time and the number of objects broken down by type. Unlike the other tools it doesn’t offer any suggestions.

The Pingdom Tools graph is a nice quick way to see what order things load in and how long each item takes. It makes it easy to drill down into what’s taking the most time. Using this I found a plug-in that was trying to load a missing file via CSS. It also called out another plug-in that was loading a file of over 170KB via CSS, even on pages where the plug-in wasn’t used. Removing that plug-in alone cut the load time on the example page by 50%. That plug-in just wasn’t worth the performance cost.
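For drilling into a single suspicious asset like that, a rough command-line check (a sketch with a placeholder URL, not something Pingdom Tools provides) can confirm the status code, size and download time:

    # Print the HTTP status code, bytes downloaded and total time for one asset.
    # Replace the URL with the CSS, script or image you're investigating.
    curl -s -o /dev/null -w "%{http_code} %{size_download} bytes %{time_total}s\n" \
      http://example.com/wp-content/plugins/example-plugin/style.css

A 404 status points to a missing file like the one the first plug-in was requesting, while a large byte count flags the kind of 170KB surprise the second plug-in was causing.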

I found Pingdom Tools first and it helped me find the big hits that Page Speed wasn’t calling out. I’ll continue to use both: first Pingdom Tools to quickly call out the really slow objects, then Web Page Analyzer to get into the nitty-gritty. Web Page Analyzer provides detailed information like Page Speed does, but adds timings. The Web Page Analyzer timings are based on object size rather than the actual response from my server. I like this because it makes it easier to gauge the changes I make and provides a consistent benchmark. Pingdom Tools and Site Performance are based on actual experience, which takes into account my server performance and any network congestion, so they can be used to gauge the response a user is likely seeing.

OptiPNG

OptiPNG isn’t a benchmarking tool; rather, it’s the one program I added to my web page creation workflow. As the name suggests, it optimizes PNG files. It will also convert GIF, BMP and other lossless image formats to PNG and optimize them. It’s an open source command-line utility. At the basic level it’s easy to use: just pass a file name (wildcards allowed) and it will optimize and replace the file. There are various optimization levels along with other options (such as saving a backup file) that can be set on the command line.
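For example, an invocation along these lines (a rough sketch rather than my exact command) covers the basics:

    # Optimize every PNG in the current directory at optimization level 2,
    # keeping a backup copy of each original file.
    optipng -o2 -backup *.png

    # Convert a GIF to an optimized PNG (writes logo.png alongside the original).
    optipng -o2 logo.gif

Higher -o levels try more compression trials and take longer, so a low level is usually a reasonable trade-off for a posting workflow.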

YSlow

YSlow is a Firefox add-on from Yahoo that plugs into Firebug like Page Speed does. YSlow has been around awhile and I’d heard of it, but it wasn’t at the top of my mind and I didn’t come across it until late in the process. So, while I now have it installed, I haven’t really used it. It provides a nice scorecard to help call out problem areas and it provides links to more information. It looks like it presents information similar to Page Speed but with the addition of some summary statistics. If anything, there’s too much information. At this point I don’t want to delay this article until I’ve had time to explore YSlow; it would take too long and probably wouldn’t change anything for me. Of course, I may regret that statement in the future. Any YSlow fans out there?

Summary

This covers the five tools that I’ve gravitated to while exploring WordPress optimization. They helped me see what was going on with my website and within WordPress. While it was nice to see some performance improvements, the exploration was great fun, which was the primary motivation to continue. (It’s five tools because I exclude YSlow since I didn’t use it, but I wanted to mention it since it certainly seems useful.)

Now comes the tedious task of optimizing individual pages to fully optimize my website. I’ve already got the site-wide optimizations done. Images that appear on all pages are compressed, my CSS is minified and gzip compression has been enabled for JavaScript. Unfortunately I hadn’t yet learned about the benchmarking websites, so I don’t have solid numbers for the improvement, but these changes got rid of the red in Page Speed. I’ve also changed my posting workflow so anything new is optimized when it appears. Hopefully my Site Performance stats in Webmaster Tools will improve over time.

Next on the agenda is optimizing my most visited pages. The main effort there will be to compress and optimize any images; in most cases the rest of the page optimization is already covered by the site-wide changes.

The OS Quest Trail Log #46: Housekeeping Edition

It’s been over a month since I’ve done a Trail Log and I got some time for blog updates this weekend, so I might as well do one. The day job has kept me busy and pretty well burnt out by the time I get home, so I haven’t dived deeply into anything for a while. But recently I’ve started to take a look at things that have piqued my interest.

Optimization

I’ve been on a Google kick lately, first looking at Picasa Web (because of the Google Eye-Fi offer) followed by Google Public DNS. Then I went down an unexpected rat hole with the newly added Site Performance feature in Google Webmaster Tools. This was actually a quite enjoyable foray into site optimization and I learned a lot in a short amount of time. One thing I learned is that I have a lot more to learn. I’ll have one or two posts out in the next couple of days on this and other tools, along with the optimizations I made. I’ve already made some changes, like compressing the header and other images that appear on multiple pages. This was all pretty ad hoc and I didn’t keep very good notes, but I want to start optimizing some of my most viewed pages so I’ll write up the posts as I do them.

Regular visitors may also notice the social bookmarking links at the bottom of each post have changed. This is also a result of the optimization testing. I tried several and they all had a pretty significant performance impact; several others I tried were just as bad for other reasons. The one I’m using now, Social Bookmarks, provides the best performance. Even so, I can’t use all its features and there was a bug in its CSS that I had to fix (it was trying to load a non-existent image). This was all part of the fun, since trying to optimize things required me to figure out how they worked. It comes as no surprise that all those web 2.0 bells and whistles are expensive.

Along the optimization lines I’ve also been researching a new web host. I really like Slicehost and have no complaints; if I were running a business I’d be staying with them. They have solid support. The best part is their server has been reliable enough that I’ve rarely needed support, and then only to have them do things I couldn’t do myself (like kernel upgrades before they added that ability to the console). But the reality is there are now competitors offering more hardware for the buck. After the holidays I’ll pick one and give it a try. The question is whether I can get the same or better for less money. No sense starting now since it would sit idle most of the time.

Other Website Changes

I’ve made a couple of other changes, but only one is worth mentioning. I had been closing comments 90 days after an article was published as a way to stop spam. I turned this off and re-enabled comments. If a certain post attracts spam and isn’t otherwise worth commenting on I’ll turn off comments for that post. I don’t check the spam queue for false positives, so if you post a comment you should limit the number of links. Also, when moderating comments I’m more likely to mark a short “nice post” type comment as spam rather than check its links.

I’m turning comments back on because I’m even worse at answering email than I am at replying to comments. At least with comments others can join in. I still have a couple of emails I need to respond to. Still, feel free to send an email. I’ll just apologize in advance for the slow response.

Frustrations

Of course, things haven’t been frustration free. As I started writing this article my netbook woke up in the other room and started doing a backup to my Windows Home Server. This serves as a reminder that when a PC does a backup to my WHS it slows the server down to a crawl for everything else, so tonight my streaming video stopped. It’s not usually a problem since the backups occur in the dead of night, but occasionally they don’t. Eventually I’ll have to dig into this. Unfortunately, once the problem occurs I can’t get a remote connection to see what’s going on, so I actually have to set up a test and be prepared. On the plus side, the problem seems consistent, so once I set things up it should happen the first time I test.

The new Handbrake also stopped working on me. I’d upgraded it and it was working without a problem. Then I applied Apple’s latest updates and it stopped (that Mac Mini is still on Leopard). I’ll try a Handbrake re-install first. VLC is also pretty old, so my next step will be to try to update that since Handbrake uses its library. It’s just one of those annoying things that’s not hugely important yet.

DNS Errata

In between writing my Google Public DNS article and now, I listened to the latest Security Now podcast by Steve Gibson. In it, he mentions a DNS benchmarking tool that he’s written. He’s written some great utilities so I’ll be sure to check it out. Steve is also my hero because he writes his utilities in assembler; they rarely need bug-fix updates and they’re nice and small. His DNS benchmark utility is 150KB while the namebench utility is over 8MB even when compressed. And it’s not because his is command line only; it’s a Windows GUI too.

Happy Holidays

That’s about it for this Trail Log. I might get another Trail Log in before the end of the year since I have some days off, but there’s no guarantee. I’ll be happy if I get the optimization article(s) done this weekend. The coffee’s on, so we’ll see how long I can go. Thanks to the miracle of WordPress post scheduling I’m hoping to get pretty far into the future so this site isn’t so dormant. Happy holidays to everyone.

Google DNS – Close But No Cigar

Among Google’s recent announcements was the introduction of Google Public DNS. I’ve been using OpenDNS and have no complaints. Well, actually I recently found I had defaulted back to using my ISP’s DNS (Comcast), probably during a router firmware upgrade. When I switched back to OpenDNS I didn’t notice a difference over Comcast, and I wouldn’t have noticed the change at all if I hadn’t been in the router config for another reason and happened to see it.

Comcast and OpenDNS both do typo hijacking and display a search page with ads rather than an error page. I went through the process of opting out of Comcast’s typo hijacking, and OpenDNS also allows an opt-out for typo hijacking, which I have set. Interestingly enough, Google, the advertising company, doesn’t hijack typos for ads; it displays the error page. But this lack of hijacking wasn’t a benefit for me since my opt-outs were already in place and working fine.

To be honest, I didn’t notice any performance difference when I was set to use any of them. When I first switched from Comcast to OpenDNS long ago I did notice improved performance, but not this time. So I went looking for a way to benchmark performance and came across namebench. It’s simple to use and provides useful information.

Just download namebench and run the executable. You’ll be presented with the following screen:

namebench main screen

The “Benchmark Data Source” is a drop-down that lets you pick one of your browsers or the Alexa Top Global Domains as a data source. Picking your most used browser provides results that are specific to the way you browse. Some people have complained that this could send all your browsing history to one person (the Google developer). Since the source code is public it’s easy to confirm it doesn’t. But if you’re still concerned, picking Alexa will use generic sites.

Click “Start Benchmark” to get things going. Once the benchmarking is done (it took about 10 minutes for me) a page with the results will open in your browser. At the top will be the information you really want:

namebench results

The above result is from a run after I’d already re-configured for its previous recommendations, and OpenDNS is the second fastest DNS server according to the benchmark. The box on the right displays the recommended DNS servers. In my case the first one is the internal IP of my local router, so it should be ignored. (I didn’t include it in the screenshot, but you’ll also get detailed info on the servers tested. See the previously linked namebench page for samples.)

The bottom line is that Google Public DNS didn’t make the cut. So, while the accuracy of the benchmark may be questioned (as with any benchmark), it’s pretty clear there’s no Google favoritism. M5Net, UltraDNS and Comcast were my recommended DNS servers. Another note: because of caching, the first run of namebench will deliver the most accurate results.
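If you just want a rough manual spot-check of a couple of resolvers without a full namebench run, timing a single lookup with dig works too. This is only a sketch using the well-known public addresses (8.8.8.8 for Google Public DNS, 208.67.222.222 for OpenDNS), and the same caching caveat applies, so repeat the queries and be suspicious of very fast answers:

    # Time one lookup against each resolver; dig prints a "Query time" line.
    dig @8.8.8.8 www.example.com | grep "Query time"          # Google Public DNS
    dig @208.67.222.222 www.example.com | grep "Query time"   # OpenDNS

It’s no substitute for namebench, which tests against your own browsing history, but it’s a quick way to sanity-check a single server.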

So, I started off by looking at Google Public DNS, but by the time I was done I had moved off of it. While looking into it I considered the following:

  • This gives yet more of my information to Google, which at its core is an advertising company. Their privacy policy is pretty good and Google hasn’t monetized DNS yet. Of all the info Google has on me, my DNS info is probably the least of my concerns. Let’s face it, someone is going to have this data. It’s Google’s recent cavalier comments about privacy, and all the other info they have, that’s the concern.
  • Google doesn’t have to match the info to me to benefit. The additional information they collect about where people surf and how often is a treasure waiting to be mined. They don’t need to put ads on error pages to profit from DNS.
  • Google continually pushes for speeding up the web, so it’s likely they’ll keep improving performance. They have studies showing that slow response on their search results generates lower revenue.
  • They also promote security and Google certainly has the money and talent to keep DNS as secure as possible.

Like my recent foray into Google’s Picasa/Eye-Fi deal, Google Public DNS is yet another Google offering that sounded good but wasn’t quite right for me. Like Picasa, Google DNS will stay on my radar and I’ll check it out again sometime down the road. Anyone else trying Google Public DNS?

Google Wants Our Photos In The Cloud

Google currently has a deal going that offers a free Eye-Fi card when you lease 200GB of storage from them for a year. When I first saw it, it seemed like a pretty good deal, and I hate to pass up a good deal. But it’s less of a deal if I don’t really need the space and won’t use the card. So that got me thinking about my options.

The space is split between Gmail and Picasa. I’m not even close to my Gmail limit and I’m not currently a Picasa user. In theory there are also some unofficial hacks that allow the space to be used for file storage, like gDisk for the Mac, but I’m not willing to trust something Google may break at any time so that’s not a consideration. What I’d be looking to use the space for is backing up my photos. Right now I have just under 20GB of photos and it costs me less than $3/month to keep them backed up offsite. So that’s $36/year, still shy of the $50.

But that assumes I could easily save everything up to Picasa, and I found that wouldn’t be possible. The Picasa 3 desktop allows automatic syncing of its albums to albums on Picasa Web Albums, but this proved to be problematic and not a better solution than plain old backup via Jungle Disk. The deal-breakers were:

  • Picasa is limited to 1,000 albums with up to 1,000 photos in each album. This sounds like a lot, but the 1,000 album limit is a deal-breaker for me. I keep my files in a directory structure and the number of directories already exceeds 1,000. I don’t want to do any dragging and dropping to create new albums just for syncing since that’s prone to error. Sure, I have plenty of directories with one or two photos, but I don’t want to re-organize everything; I’m set in my ways.
  • Deleting entire albums from the Picasa desktop did not delete the album from the web. Photos within albums deleted just fine. Deleting all the pictures in a folder automatically deleted the folder, so it’s not like I could keep the folder around until it synced the deletions.
  • RAW image files were synced to the web as JPEGs, so it wouldn’t be a true backup.

While a lot of people like Picasa, there was nothing that caught my attention or would compel me to use it. I’ll keep looking at it and may yet find some compelling feature, but for now I’d have a hard time justifying 200GB for Picasa. Realistically I’d be better off with a lower priced plan.

Then there’s the Eye-Fi card. If the card itself were worth the cost, then I could consider the $50 as paying for the card and treat the Google storage as the free product. The version offered is the Eye-Fi Home Video, which has a list price of $69; I couldn’t find it online anywhere to get a street price. The closest card is the Eye-Fi Share Video, which sells for $73 at Amazon. If I had to guess, I’d say the “catch” is that since the Home Video card doesn’t typically include any online component, the only online options are Picasa and YouTube; these are the only online services specifically mentioned in the offer. The Share Video allows sharing with more services. Other, more expensive cards include geotagging photos, which would be a potentially useful feature.

I like the idea of being able to automatically load pictures from my camera to my PC, but the Eye-Fi card doesn’t offer anything else that’s compelling to me.

So while the Google/Eye-Fi offer does seem like a good deal, I’m not yet convinced it’s worth $50 to me. I’m still intrigued by Picasa and the web album component so I’ll keep considering it.

I also decided to look at some alternatives:

  • SmugMug offers online albums along with a “SmugVault” that can be used to store any type of file (such as RAW files), but it’s a subscription service and would cost more than what I have now.
  • The old standby Flickr is $25/year for unlimited storage. Still, it’s not a good solution for backup. There are plenty of Flickr add-ins and plug-ins so I could probably find one to do syncing, but it still wouldn’t be a true backup.
  • I already use Windows Live Photo Gallery to organize my photos and like it. Plus there’s a free 25GB for online photo albums. But like the others, it’s lacking as a backup solution.

So, the bottom line is that Jungle Disk remains the way I back up my photos. I’m really not surprised since it’s cheap and easy. Picasa still has my attention if I want to do some online albums, and the Eye-Fi card would offer some convenience, but I’d probably want the version that does geotagging (although I haven’t done any research to see how well it does that). I may spend the $50 in a moment of weakness since it is a good deal, but for now I won’t be clicking the button to upgrade storage and order the card.