
The OS Quest Trail Log #22: Abridged Edition

This week’s Quest included progress in the Ubuntu Server series with articles on setting up iptables and getting comfortable with Ubuntu. So now I’m at the point where I can start installing the server software; MySQL will be first up. I also started down the Windows Home Server path and I’ve been looking at some add-ins to move beyond simple file sharing and PC backups.

I’m starting a new day job next week, so this week will be busy transitioning the old stuff and next week will be spent getting up to speed on the new stuff. The next week or two are likely to be light ones on the Quest. But then again, I need to have some fun.

Software Updates

Transmission 1.01 was released, a minor upgrade to version 1.00 of my favorite BitTorrent client. The update is available through the program’s own auto-update feature. Changes are covered on the Transmission page and include performance and OS X specific improvements.

1Password by Agile Web Solutions has been updated to version 2.5.9. The update is available through the program’s own auto-update feature or as a direct download. Changes in 1Password 2.5.9 include a new password strength meter among over 40 new features, changes and fixes.

WordPress is now using PHP 5. I switched over to PHP 5 on my server and all seems well. I’m using the latest version of WordPress and the few plug-ins I use are also current and actively developed. I’ll post more info once I know things are working OK. Let me know if you have any problems with the site. Active development of PHP 4 ended at the end of 2007, although security updates will continue until August 2008.


Ubuntu Server Project #5: Getting Comfortable With Ubuntu

This is a bit different than the other posts as I won’t actually be installing any major software. Instead I’ll be customizing Ubuntu to make it easier for me to use and finding programs to monitor my server.

System Information

First I’ll want some commands that tell me about the system. Since there’s only 256MB of memory allocated to this Ubuntu Server virtual machine I’ll want to keep tabs on memory usage. I can do this with the free command, using -m to have the info displayed in easy-to-read megabytes.

free -m

This will display the amount of memory used.
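The screenshot of the output is lost, but free -m output looks roughly like the following (the 16 used and 233 free on the buffers/cache line are the numbers from this post; the rest of the numbers are illustrative):

             total       used       free     shared    buffers     cached
Mem:           249         94        155          0         12         66
-/+ buffers/cache:         16        233
Swap:          511          0        511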


The first line includes cached memory so I’m more concerned with the second line which shows I’m using 16MB and have 233MB free. The third line shows I’m not using any swap space which is nice. This will be my baseline and I can monitor it as I install software.

If I want more detailed memory usage I can use cat /proc/meminfo.

If I need a reminder of the version I’m using I can use cat /etc/issue which will display the Ubuntu version. lsb_release -a can also be used to display version information.
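On this install the output should look something like the following (a reconstruction rather than the original capture; 7.10 is the Gutsy Gibbon release):

No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 7.10
Release:        7.10
Codename:       gutsy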

The top command displays information on running processes and system resources. It’s updated in real time and you can exit by typing q. Pressing <shift>-<m> while top is running will sort the processes based on memory usage.

uname -a prints the machine name and kernel information along with a few other things.

image lost

As the above output shows it was necessary for me to use a different kernel in order to run Ubuntu under Parallels.

df -h can be used to display disk usage. The -h means human readable, so sizes are shown in units like MB and GB rather than in blocks.

Screen

Screen is a terminal multiplexer that allows multiple sessions in one terminal window much as the console does. In addition, it provides the ability to disconnect a session and return to it later, or continue processing if a session is interrupted.

To install screen I execute:

sudo aptitude install screen

As a side note: Even though I left the Ubuntu Server CD image connected to the VM I had to mount it manually for aptitude to use it. I issued mount /cdrom to mount it.

There’s a good screen tutorial at Linux Journal so I won’t go into it here.
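The tutorial covers screen in depth, so just a few basics for quick reference (these are screen’s standard key bindings): inside a session, <ctrl>-<a> <c> creates a new window, <ctrl>-<a> <n> switches to the next window, and <ctrl>-<a> <d> detaches the session while leaving it running. To reattach a detached session:

screen -r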

Build-Essentials

Build-essential is an Ubuntu meta-package of programs that are frequently needed to properly build and install other programs, so I want to install it. I run:

sudo aptitude install build-essential

The install is problem free.

Shortcuts (Aliases)

There are some commands I’m going to be using a lot. To save time typing, especially since my typing is pretty bad, I set up some aliases. I open my bash configuration file in the nano editor so that I can add them:

nano ~/.bashrc

I scroll down until I find the Alias Definitions section.

image lost

I uncomment the three lines that enable a separate alias file by removing the # at the start of each line. I could add the aliases directly in this file, but I like the idea of using a separate file just for the aliases. I save the file then use nano to create the ~/.bash_aliases file.
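In the stock Ubuntu .bashrc those three lines should look like this once uncommented (your copy may differ slightly):

if [ -f ~/.bash_aliases ]; then
    . ~/.bash_aliases
fi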

nano ~/.bash_aliases

I add the following aliases to the file:

alias free="free -m"
alias install="sudo aptitude install"
alias newalias="nano ~/.bash_aliases"
alias remove="sudo aptitude remove"
alias update="sudo aptitude update"
alias upgrade="sudo aptitude safe-upgrade"

The first one makes it slightly easier to get free memory, the third opens the alias file for editing, while the others simplify the aptitude command lines. To run a command I can just type the alias, adding any necessary command-line options after it. It’s necessary to logout and login when making these changes since the bash configuration is only read during logon.
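As a side note, the logout can be avoided: the configuration can be re-read in the current session (this is standard bash, not something from the original write-up):

source ~/.bashrc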


Well, I’ve got aliases to make my life easier and I’ve got system utilities to monitor resource usage as I install new software. Next on the agenda is the MySQL installation.


Ubuntu Server Project #4: Iptables Firewall

Continuing along the security theme set by the previous article I’ll configure some simple iptables firewall rules for my Ubuntu Server virtual machine. Iptables can be pretty complicated and I won’t attempt to go into great detail. Since this is a virtual machine only accessible from within my home network I have the luxury of being able to play without having to actually be concerned with security. So iptables will be set up for the experience and for future testing.

Iptables is installed with every Ubuntu installation so there’s nothing new to install. We just need to configure the rules that iptables needs to use. Since I’m setting up a web server I’ll create rules to allow SSH (port 22222), HTTP (port 80) and HTTPS (port 443) traffic.

I’m going to create two files that contain the iptables rules. One will be used for testing and the other will be for production. The production rules will be permanent and load during reboots. The test rules will be in file /etc/iptables.test.rules and the production rules will be in file /etc/iptables.prod.rules.

The Rules

I connect to the Ubuntu server using SSH from the terminal on my Mac. Everything done related to iptables has to be done as root so I issue the command:

sudo -i

and enter my password when prompted. Now I won’t have to use sudo as a prefix for each command.

For my first step I’ll save any existing rules to the production file using the command:

iptables-save >/etc/iptables.prod.rules

On my freshly installed Ubuntu server this generated the following file contents:

image lost
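The screenshot is gone; on a fresh install the saved file generally looks like this (a reconstruction, not the original capture):

# Generated by iptables-save
*filter
:INPUT ACCEPT [0:0]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
COMMIT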


To list the current filter rules on the screen I run iptables with the -L switch.

iptables -L

which results in the following information:

image lost
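That screenshot is gone too; on an unconfigured system the output looks like this (again, a reconstruction):

Chain INPUT (policy ACCEPT)
target     prot opt source               destination

Chain FORWARD (policy ACCEPT)
target     prot opt source               destination

Chain OUTPUT (policy ACCEPT)
target     prot opt source               destination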


What the above means is that anything from anyone on any port will be accepted. I’m not a fan of the theory that as long as nothing is running on the ports then nothing needs to be blocked. I am a fan of blocking everything except the traffic this server is intended to handle. So I’ll be setting up some rules to restrict traffic. Initially I’ll be doing this in the /etc/iptables.test.rules file. During this time I’ll keep my existing terminal connection active and start a second session just to be sure. This way if a test rule blocks SSH I’ll have an existing connection I can use to make the change. (OK, it’s a VM on my Mac so no second session, but if it was a remote server I’d set up the second session as a safety measure.)

I start off with some very simple rules which are based on information found in the Ubuntu Documentation Iptables HowTo. Rules are processed top to bottom and once a decision is made about a packet no more rules are processed.

A lot of traffic on the server uses the loopback interface and we want to allow it all. No reason to stop intra-server communication. So I add the lines:

-A INPUT -i lo -j ACCEPT
-A INPUT -i ! lo -d 127.0.0.0/8 -j REJECT

The first line says to accept all traffic on the loopback interface. The second rule says to reject all traffic that uses the loopback address but isn’t on the loopback interface. -A means append the rule to the chain. INPUT is the chain to add the rule to. Valid chains are INPUT, FORWARD and OUTPUT as shown in the previous screenshots. -i means to only match if the traffic is on the specified interface. lo is the loopback interface. -j is the action to take with the packet. Valid actions are ACCEPT, REJECT (reject and notify the sender), DROP (silently ignore) and LOG. The ! in the second line means “not”, so in this case it means traffic not on the loopback interface. -d indicates the destination address, which can be a single ip address or a network. In this case it’s the loopback network.

Then I’ll add a rule to continue to accept all established connections:

-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

State matches are described in greater detail at faqs.org. But this rule says to accept all traffic for an ESTABLISHED connection, one that has seen traffic in both directions. It will also accept traffic for new connections if they’re associated with an established connection; these are the RELATED packets.

Next I’ll allow all outbound traffic. I’ll leave restricting outbound traffic for another day.

-A OUTPUT -j ACCEPT

Now I’ll enable web traffic on the common ports of 80 for HTTP traffic and 443 for HTTPS traffic.

-A INPUT -p tcp --dport 80 -j ACCEPT
-A INPUT -p tcp --dport 443 -j ACCEPT

The -p specifies the connection protocol used, in this case tcp, and --dport indicates the destination port.

Now I’ll allow SSH traffic. Use the same port specified in the sshd_config file. In my case it was port 22222.

-A INPUT -p tcp -m state --state NEW --dport 22222 -j ACCEPT

In this rule the state parameter is used to allow the creation of NEW connections. The previously defined rule for established connections will apply once the connection is created by this rule.

Next up is a rule to allow pings.

-A INPUT -p icmp -m icmp --icmp-type 8 -j ACCEPT

In this rule icmp is the protocol used. A complete list of icmp types is at faqs.org, which shows 8 as an “echo request” type.

Now I’ll create a rule to log incoming packets that are denied by iptables.

-A INPUT -m limit --limit 5/min -j LOG --log-prefix "iptables denied: " --log-level 7

This rule will log denied packets, up to 5 a minute. It will prefix the log entries with “iptables denied: “. The LOG action doesn’t stop rule processing, so the packets will continue on to any following rules. The reason we know these packets will be refused is that the only rules that follow reject everything. So if a packet has reached this rule there isn’t a chance for it to be accepted.

So the rules to deny any remaining packets are:

-A INPUT -j REJECT
-A FORWARD -j REJECT

The rules file needs to begin with *filter and end with COMMIT. The complete iptables rules file is available as a text file.
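Since the linked text file may no longer be available, here is the complete /etc/iptables.test.rules file assembled from the rules above:

*filter
-A INPUT -i lo -j ACCEPT
-A INPUT -i ! lo -d 127.0.0.0/8 -j REJECT
-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
-A OUTPUT -j ACCEPT
-A INPUT -p tcp --dport 80 -j ACCEPT
-A INPUT -p tcp --dport 443 -j ACCEPT
-A INPUT -p tcp -m state --state NEW --dport 22222 -j ACCEPT
-A INPUT -p icmp -m icmp --icmp-type 8 -j ACCEPT
-A INPUT -m limit --limit 5/min -j LOG --log-prefix "iptables denied: " --log-level 7
-A INPUT -j REJECT
-A FORWARD -j REJECT
COMMIT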

Enforcing the Rules

I save the rules to /etc/iptables.test.rules and then run the following command to load them in:

iptables-restore </etc/iptables.test.rules

Then to see if anything actually changed I run iptables -L and compare it to the previous results. As the screenshot below shows they are different.


image has been lost

Now it’s time to test the critical SSH connection. I open a new terminal window and try a connection. It works and the other rules seem correct so I’m all set. If it failed I’d still have my existing connection to fix the problem (assuming the rules to allow existing connections took effect).

Now I need to make these rules permanent. First I’ll save them to my production rules file:

iptables-save >/etc/iptables.prod.rules

Now I need to make sure the rules are loaded at startup. I load the file /etc/network/interfaces in the nano editor. I add the following line at the end of the loopback section:

pre-up iptables-restore </etc/iptables.prod.rules

The screenshot below shows my updated interfaces file.

image has been lost
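As a stand-in for the lost screenshot, the loopback section of a stock interfaces file with the new line added looks something like this:

auto lo
iface lo inet loopback
pre-up iptables-restore </etc/iptables.prod.rules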

The final test is to restart Ubuntu server and make sure the rules are still in place.

So now I have a basic server setup and it’s running a simple firewall. I’ll probably spend a little time exploring Ubuntu before I start installing the server software.



The OS Quest Trail Log #21: Macworld Edition

Macworld is over and the good news is there’s nothing I feel the need to run out and buy or regret not waiting for.

The MacBook Air was the big new product of the event. To me, this fulfilled the rumors about Apple moving into the enterprise. This is an executive suite computer if I ever saw one. It’s not intended to replace the MacBook or MacBook Pro, and if success is defined as selling more than either of those models then it’s doomed to failure. But if success is defined as raising Apple’s image then it’s a hit. All ultra-portables have compromises. Dropping the built-in optical drive is a frequent compromise, and Apple’s Remote Disc seems like an original solution. Besides, there’s an optional external drive, which is usually the case for other ultra-portables too.

The lack of an ethernet port is an interesting compromise but probably not a huge problem for the people who’ll buy this. There’s a USB adapter for those who want it. In my home, where my MacBook is my second Mac, I never connect via ethernet and always use 802.11n. I don’t think I’m alone, and I do think the MacBook Air is a second Mac, not a person’s only Mac.

In any event, the Air isn’t for me. Hopefully some of its technology will work its way into other Macs by the time I’m looking to replace my current machines.

I see iTunes Movie Rentals and Apple TV Take-Two as related products. The updates to Apple TV break its link with a computer. It can now stand on its own in the living room. The price drop helps too. As an Apple TV owner (and fan) I’m looking forward to the update.

The rental service is interesting. The terms are hardly unique to Apple (Amazon Unbox has the same 30-day/24-hour windows for one) yet a lot of people seemed to think Apple is the first in the space. It will be interesting to see if Apple raises the profile of online rentals. Even though I’m an Apple TV fan, I’m skeptical about this really taking off.

The last product introduced was Time Capsule. This product falls into the “why not?” category. Can’t say I’ve seen similar products but it’s hardly revolutionary. Integration with Time Machine on multiple Macs is nice. The pricing is reasonable when compared to the original Airport Extreme with the hard drive taken into account. I’m just not going to buy this to replace existing hardware and I don’t think many others will either. Hopefully this means an update to Time Machine will enable it to work with any networked disks.

Software Updates

Apple pushed out some security and Macworld related updates this week. In addition to QuickTime 7.4 and iTunes 7.6 I also received two other updates.

iMovie 7.1.1 was described as addressing…

…issues when publishing movies to a .Mac Web Gallery, improves overall stability, and addresses a number of other minor issues.

I don’t use iMovie so couldn’t say what effect, if any, this update had.

Front Row 2.1.2 was also released and has the generic description…

…provides for bug fixes and improved iTunes compatibility.

I use Front Row mainly for playing DVDs and iTunes Videos on my Intel Mac Mini. I haven’t noticed any difference when using it.

Ubuntu Server

Things are moving along with my Ubuntu Server VM. OpenSSH was set up on the server and public/private keys were set up to connect from my Mac. This week I hope to get the firewall set up (with just basic settings for testing) and then start installing the actual server software.


Frustrations

This week was remarkably free of frustrations.


The week ending was uneventful. I’m hoping the week ahead lets me spend time with Windows Home Server in addition to continuing to build the Ubuntu server.


Ubuntu Server Project #3: Networking & SSH Setup

This post is obsolete and screenshots have been removed.

This is the third installment in my Ubuntu Server Project series which documents my efforts to get a working copy of WordPress running on Ubuntu Server 7.10. It’s summarized, with links to past articles, on my Linux page.

Ok, technically security should have been set up immediately after installation so this should have been the second installment and not the third. But Ubuntu was a VM on my own desktop and wasn’t on the Internet so I wanted everything nice and up to date before proceeding.

Setting Up Networking

I could keep working with the Ubuntu 7.10 Server locally on my desktop and get right to the installation, but I want to start dealing with it as if it’s a remote server. So the first thing I’m going to do is get the IP address and MAC address of my Ubuntu server so I can connect remotely. I log onto the console and issue the command ifconfig to get the ip address along with the mac address. The screenshot that showed the results is gone, but the relevant pieces are described below.

It’s worth mentioning that I set up the Parallels vm to connect via a bridged network so it gets its own unique ip address rather than sharing (via NAT) with the host OS. While the IP address will probably stay the same, it’s assigned by DHCP and could change. It’s my internal DHCP server (actually my Airport Extreme) so I’m going to reserve the DHCP address for this Ubuntu Server instance. To do that I need both the IP and MAC addresses.

I’m concerned with the adapter labeled eth0. The ip address is on the second line and is labeled inet addr. The mac address is on the first line and is indicated by HWaddr. Most home routers can do DHCP reservations although the methods vary. Look for the term DHCP reservation. All you should need is the ip and mac addresses. A note for Airport Extreme users like me: even though there’s no good reason for it, adding a DHCP reservation forces a router restart.
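A sketch of the relevant part of the output (the HWaddr is a placeholder and the Bcast and Mask values are illustrative; the ip is the one used throughout this post):

eth0      Link encap:Ethernet  HWaddr 00:1c:42:xx:xx:xx
          inet addr:10.0.1.200  Bcast:10.0.1.255  Mask:255.255.255.0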

If you don’t want to set up a reservation you can just look up the ip address whenever it changes and you can no longer connect.

Installing SSH Server

I want to install an SSH server so I can securely connect to the server remotely. (Remember, I’m treating this like a remote server.) I log on to the Ubuntu console and run the following commands:

I want to make sure my package list is up to date:

sudo aptitude update

Then to install the SSH server:

sudo aptitude install ssh

Aptitude lists the packages that will be installed (the screenshot has been lost) and I approve the installation, which finishes without error. SSH server is installed and I’m done with the SSH server install. For information about aptitude see my previous article.

The whole point of SSH is security. In the next step we’ll see that our first SSH connection from a workstation says the host is unknown and provides a fingerprint. Now, this is an internal private network and the host is really a VM running on the same machine and we’ll be connecting via IP address. But for security purposes we’ll get the “RSA key fingerprint” while we’re here. I execute the command (on the Ubuntu server):

ssh-keygen -l -f /etc/ssh/ssh_host_rsa_key.pub

Note that I don’t need to use sudo. As the extension .pub implies, this information is public for all to see. The response I get is:

2048 64:93:11:41:b7:31:cf:66:41:cb:7c:4f:37:3b:89:e8 /etc/ssh/ssh_host_rsa_key.pub

That long colon-delimited number is the server’s RSA key fingerprint. Whenever I attempt an SSH connection from a new machine I will be presented with that number. If it doesn’t match then I’m connecting to another machine, either by error or by mischief.

There’s also another type of key generated during the install, a dsa key, which is another form of key signature. To get this fingerprint execute:

ssh-keygen -l -f /etc/ssh/ssh_host_dsa_key.pub

From this point on I will do everything on my Mac and treat the Ubuntu Server as if it’s a remote server, even though doing the server steps from the console running Ubuntu and the local steps from terminal would be simpler.

Setting Up SSH Public/Private Key

SSH provides secure, encrypted access to the server’s console. I’ll set up a public/private key pair for my iMac and the server; this way when I want to connect I don’t need to enter a password. Public/private keys should only be used when the local workstation is secure, since anyone who has access to the workstation can access the server.

I’m going to test the SSH connection before proceeding. I open terminal on my Mac and execute:

ssh ray@10.0.1.200

I’m told the authenticity of the host can’t be established and I’m presented with that fingerprint. It matches what I know to be the server so I type yes to continue connecting. Then I’m told Warning: Permanently added '10.0.1.200' (RSA) to the list of known hosts. This means future SSH logons from this machine will not generate the authenticity prompt. The SSH connection is working.

I logout of the connection but stay in terminal. (I could just open another terminal window, but I’m easily confused.)

First, I’ll create a folder on my local Mac to hold the keys. I execute:

mkdir ~/.ssh

This folder may already exist, and should have been created when the server was added to the known hosts list. If it does exist you’ll get an error that it can’t be created and you can move on. The ~ indicates your user home directory. The folder will be created in your home directory and the “.” means it will be hidden (at least in Finder).

Now I create a public/private key combination for my Mac by executing:

ssh-keygen -t rsa

This will generate a public/private key using rsa encryption. Two files will be created in ~/.ssh called id_rsa and id_rsa.pub. The private key is id_rsa and should never be put in any public place. The public key is id_rsa.pub. During the key creation I was asked to confirm where I wanted to put the files and if I wanted a passphrase. I accepted the default for location and hit enter for an empty passphrase.

Now I copy this to the server using the secure copy command.

scp ~/.ssh/id_rsa.pub ray@10.0.1.200:/home/ray/

This will copy the public key file to my home directory on the server. I’m prompted for a password, but since scp encrypts the connection it’s safe to enter it. Change the ip address to your own address and substitute your ID for ray.

Now I need to configure the public key on my Ubuntu server. Still in terminal I execute

ssh ray@10.0.1.200

and enter the password to connect to the Ubuntu server console. I’ll create a directory for the authorized public keys and move my key into it, changing the name of the file in the process.

mkdir ~/.ssh

mv ~/id_rsa.pub ~/.ssh/authorized_keys

This moves the id_rsa.pub file into the newly created .ssh directory and renames it to authorized_keys. Now I need to set the permissions for the directory and file.

chown -R ray:ray ~/.ssh

This changes ownership of the directory. -R means to apply recursively and I’m saying to change the owner to the user and group ray. Substitute whatever ID you created.

chmod 700 ~/.ssh

chmod 600 ~/.ssh/authorized_keys

This changes the access permissions for the directory and file. The 700 means only my ID can read, write, or execute files in the directory. The 600 means only I can read or write the file (no execute privilege).

Now I need to configure the SSH server.

Execute:

sudo nano /etc/ssh/sshd_config

The sshd_config file is loaded in the nano text editor. Scroll up and down using the arrow keys. Help is along the bottom; ^ means the control key.

Scrolling down the file I make the following changes:

port 22222

Near the top you’ll see Port 22. For security purposes it’s good to change this port number, since it makes it a little harder for people to find the SSH server. You need to pick a port above 1024 that’s not being used on your system. Port numbers in the range 1024 to 49151 may be registered and used by specific applications. Port numbers between 49152 and 65535 are dynamic and aren’t reserved for any particular use. You can pick any port above 1024 as long as it won’t be used by something else on your server. A list of registered ports is maintained by IANA. I picked 22222 because it’s easy to remember and not currently registered to anyone.

PermitRootLogin no

This means the root user can’t log in through ssh. This is a bit redundant with Ubuntu since the root user can’t log on in a typical installation anyway.

AuthorizedKeysFile %h/.ssh/authorized_keys

I just needed to uncomment this by removing the # at the beginning of the line. Notice it points to the public key file we created (%h is expanded to the user’s home directory).

PasswordAuthentication yes

I uncomment this so that I can log on with a password in addition to keys. The key will be used if available; if not, there will be a password prompt. If all your PCs are secure and can use public/private keys you can set this to no, which means the keys must always be used. Just don’t lose the keys.

X11Forwarding no

There’s no GUI on this server so I turned this off.

UsePAM no

I’m not using the PAM module.

I added the following new lines at the end of the file.

UseDNS no

I’ve seen that some past issues were resolved with this setting, and I don’t need DNS lookups for my clients.

AllowUsers ray

This specifies which users are allowed to connect via SSH. Separate multiple users with spaces.

I write the file with ^O and then exit with ^X. (^X will prompt to save, but I’m paranoid and save first anyway.)
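For reference, here are all the sshd_config changes and additions from this post in one place:

Port 22222
PermitRootLogin no
AuthorizedKeysFile %h/.ssh/authorized_keys
PasswordAuthentication yes
X11Forwarding no
UsePAM no
UseDNS no
AllowUsers ray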

Finally I need to restart SSH so I enter:

sudo /etc/init.d/ssh restart

Then I logout and login again. If everything is set up right I shouldn’t be prompted for a password, and I’m not. The proper ssh command (from the OS X terminal) with the port change is:

ssh -p 22222 ray@10.0.1.200

If you want to enable the dsa key instead, or create the dsa keys in addition to the rsa keys you can repeat the process, substituting dsa for rsa. Instead of the command mv ~/id_rsa.pub ~/.ssh/authorized_keys you will need to concatenate the new file with the authorized_keys file. Use the following command to do this after copying id_dsa.pub to your home directory.

cat ~/id_dsa.pub ~/.ssh/authorized_keys >~/.ssh/newkeys

You can chain multiple key files together in one command. Then copy the newkeys file over the authorized_keys file:

cp ~/.ssh/newkeys ~/.ssh/authorized_keys

To delete the id_dsa.pub file from your home directory after it’s concatenated to authorized_keys run:

rm ~/id_dsa.pub
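As a side note, the temporary newkeys file can be skipped entirely by appending the new key directly (standard shell redirection, not the method described above):

cat ~/id_dsa.pub >>~/.ssh/authorized_keys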

I can repeat the public/private key generation from my other computers and use the above concatenation command to add their public keys to the authorized keys list, or stick to passwords since I won’t be using those computers very often.

So the server is up and running and we can securely connect. Next up I’ll get a basic firewall going and then I’ll finally be ready to install some software.

Additional Reference

OpenSSH Quick Reference (PDF)

SSH Host Key Protection – An article from Security Focus that describes the use of SSH and provides some background.

OpenSSH.com is the OpenSSH project website, which has an OpenSSH FAQ.


Ubuntu Server Project #2: Updating the Install and the Basics


This is the second installment in my Ubuntu Server Project series which documents my efforts to get a working copy of WordPress running on Ubuntu Server 7.10. It’s summarized, with links to past articles, on my Linux page; or go straight to the previous article about installing Ubuntu Server 7.10.

My experience with the *nix command line is limited since there’s always been a GUI available. I think the most I did was over 10 years ago when I did some work on HP-UX. So I’ll be starting with the very basics, and will probably get some things wrong.

First up I’ll be needing some command line basics.

Getting Help

The commands I’ll be using have man (think manual) pages on the system (at least the ones I’ll be using at first do). So first up I’ll need to know how to use man. The syntax of man couldn’t be simpler:

man command

where command is the Ubuntu command for which you want the manual page.

I’ll be using aptitude to update my Ubuntu install so I issue the command:

man aptitude

and the man page for aptitude is loaded. To navigate use the <spacebar> to move forward and the <b> key to move back. Hit the <q> key to exit. Man also has a bunch of switches to search and use numerous other features but I don’t need those now.

You can also get help for most commands by typing the command followed by the -h parameter. The text will probably be more than a screen can handle and some of it will scroll off the top. To scroll up use <shift>-<pageup> and use <shift>-<pagedown> to head back down. I’ve read that <shift>-<uparrow> and <shift>-<downarrow> can be used but they don’t work for me; it could be a Mac/Parallels thing rather than bad info. To get back to the command prompt just start typing your command or hit any key (be sure to delete anything that was typed).

Virtual Consoles

Ubuntu, and Linux in general, has virtual consoles even when at the command prompt. There are six of them. To switch virtual consoles type <alt>-<Fn> where n is a number 1-6. For Mac users the alt key is the option key. Also, for fellow users of the new Mac keyboards, the function keys default to their special features (Dashboard, Spaces, volume, etc.) so you’ll need to hold the <fn> key too. Or you can do like I did and go into the keyboard section of System Preferences and enable standard function keys. Then you’ll have to use <fn> for the special features, but not for apps or to switch consoles.

With virtual consoles I can use one for the man page and another for the actual commands. In addition, each console requires its own logon so different IDs could be used. Long commands can run in one console while I work in another.

Aptitude vs. Apt-Get

Ubuntu uses Debian’s Advanced Packaging Tool (apt). I came across two commands for managing packages from the command line: aptitude and apt-get. They seemed similar but different, so I figured I needed to pick one and stay with it. I decided to go with aptitude. I did read that mixing in apt-get after using aptitude could cause problems, because aptitude wouldn’t know about all the dependencies.

Aaron Toponce has a recent article with well laid out logic for aptitude which is based on this older explanation of aptitude. But there does seem to be a minor religious war over the best package management system.

Since I’m starting with a fresh system it seems aptitude is the way to go. It just made sense. Besides, if I get tired of the command line aptitude has a curses interface (menu system).

Apt-get does have super cow powers while aptitude does not, which is the only reason I considered using apt-get.

Updating Ubuntu Server 7.10 – It’s Why We’re Here

The whole goal here was to update my original Ubuntu installation and now I’m finally ready.

I’m logged onto the console with my ID and I need to enter two commands. I’ll be starting each command by specifying sudo which will run the command as the superuser. I’m using the default configuration so I’ll be asked to authenticate with my password which is the id/password created during the Ubuntu Server installation.

The two commands are:

sudo aptitude update

As I mentioned, sudo means run as superuser, aptitude is the package manager I’m using and update is the action that aptitude will perform. Update tells aptitude to get a list of new/upgradable packages.

sudo aptitude safe-upgrade

The safe-upgrade action tells aptitude to upgrade packages to their latest versions. Packages will not be removed unless they are unused, and packages which are not currently installed will not be installed. If it’s necessary to remove or install one package in order to upgrade another, this action may not be able to handle it; full-upgrade can be used in that situation. Aaron Toponce has an article describing the difference between safe-upgrade and full-upgrade. As the name implies, safe is more conservative. If it fails to update a package I can do further research and make a decision. Full-upgrade was formerly called dist-upgrade.

I issue the update command and the package info is quickly updated. The safe-upgrade command upgraded 16 packages without error.

I then re-issued each command to make sure there weren’t any further updates. There weren’t so I’m done. I saved a snapshot in Parallels and shut things down.

Summing Up

Even though this was the basics it does cover the things I had to learn to get going. Rather than following a book and having it set the agenda I figure I’ll learn as I go. Good idea or not?

If you think you’d prefer apt-get due to the super cow powers type apt-get moo to see if you want the feature.



The OS Quest Trail Log #20

CES is over and Macworld is about to begin. Can’t say anything at CES caught my attention. Stuff was just bigger, thinner and less expensive. The big news last week seemed to be Warner’s move to Blu-ray and that was before CES.

I can’t say I’m all that excited about Macworld either. Curious? Yes. Excited? No. Rumors are flying of course. Apple got the Mac Pro upgrades out of the way this week so they aren’t looking for keynote filler. My lack of excitement has more to do with the fact that I won’t be buying any of the rumored products, at least not this year. iTunes Store movie rentals are intriguing, depending on price. I won’t pay ten bucks to “own” the iTunes locked movies when the extras-laden DVD is usually only a few bucks more. But a movie rental may be worth it. My Netflix DVDs have been sitting in the house longer and longer lately, so pay as I go may be cheaper. Still, it’s hardly an earth-shattering change.

The big event on the OS Quest this week was a complete cleaning of the OS Quest data center. Same old hardware and software, but now nice, shiny and uncluttered. The picture in that post shows the data center desk one week after its cleaning and still uncluttered.

Frustrations

.Mac gave me my first real bit of aggravation in a long time. For performance reasons I sync my iDisk to the local disks of my Macs and it’s worked remarkably well since the changes to .Mac around the time of Leopard. But this week was shades of .Mac past, when the same 16 files were always out of sync between the remote and local iDisks. No matter how many times I told it to use the .Mac version it would ask again at the next sync (I also told it to keep the PC version numerous times with no success). These were all files which hadn’t actually changed. So I ended up having to blow away the local iDisk (simply by turning local syncing off) and then turning it back on, which pulled everything back down. It’s been fine since.

This led to my Mozy surprise, which actually wasn’t caused by Mozy. When local disk syncing is turned off, .Mac creates a copy of your old local iDisk in a sparse image file on your desktop, which can be quite large. My ~1GB of iDisk data resulted in a 34.3 GB disk image, even though .Mac itself is limited to 10GB and I’ve allocated up to 9GB to iDisk. The next day I noticed Mozy was backing up over 35GB of new data. The iDisk archive file didn’t hit me right away, especially since the amount of data was so large. At first I thought it was backing up everything again. I didn’t notice the large file size until I traced back the new files in the Mozy log.

I’m a fan of Apple’s Mighty Mouse but it added to this week’s frustration. One came with my iMac and I gave it a try over my previous trackball and liked it so much I’ve since bought another. The scroll ball was the clincher. But the scroll ball develops an annoying habit of scrolling every way but down. My original had the problem and I relegated it to a little-used Mac. But the newer one developed the same problem this week and it’s relatively new. So it was either fix the problem or go for a warranty replacement. A quick Google search found a thread at Mac OS X Hints that showed I wasn’t alone. In my case, holding the mouse upside down and pressing down hard while scrolling, along with compressed air, fixed the problem. The thread has other solutions, all the way up to a link showing how to take the mouse apart. I’ll have to try it on my original Mighty Mouse the next time I fire up that Mac.

And of course photography brought its own frustrations, but those are covered below in the photography section.

Software Updates

TextExpander 2.0.4 was released by SmileOnMyMac. According to SmileOnMyMac the update:

fixes a problem with Dvorak-Qwerty keyboard layout, as well as a problem with the snippet type failing to stick.

The update is free to version 2 owners.

I did have a minor problem with the install. Despite following the install directions to the letter I ended up with two TextExpander icons in my menu bar. Further research showed I had two textexpanderd processes running. I killed them both, then went into System Preferences and selected TextExpander, and the process started up again, just one of them this time.

I’ve been using Transmission as my BitTorrent client since upgrading to Leopard. It’s a small, simple, fast client. It recently left beta and is now at version 1.0.

NetNewsWire went free with version 3.1 earlier this week and then came out with NetNewsWire 3.1.1 later in the week. The new version fixed a crash on startup bug and a split version bug, and won’t collect attention data for authenticated feeds. I’ve been using NetNewsWire since it went free and I like it so far. I haven’t experienced any of the bugs so can’t say if they’re actually fixed. I’ve been less enthralled by NewsGator Online and FeedDemon. NNW is for Mac and FeedDemon is for Windows, although they are not the same app for different platforms. The change from Google Reader to NNW was easier than to the online and FeedDemon versions. The upgrade is done by the application itself and is automated, but NNW will shut down and restart.

Jungle Disk 1.50 was released for Windows, OS X and Linux. I like the concept of Jungle Disk and spent the twenty bucks to buy it. I still don’t use it as my primary backup tool but it keeps getting better. At twenty bucks for lifetime upgrades it’ll never be cheaper. Since it’s cross-platform and allows files to be opened while on S3 it’s inevitable that I’ll start using it. The main downside (for me) is that the Amazon S3 pricing is open-ended, and after about 30GB it costs more than Mozy’s $5/month. Until this release it also lacked some of the features of Mozy (like block-level backups), but this release supports Jungle Disk Plus which adds many of those features. All the changes in this version are covered in the release notes.

Jungle Disk for Windows Home Server 1.02b Beta was released back on December 16th, the day after I installed the previous version. When I saw the regular Jungle Disk update I went and checked the WHS version and sure enough, it was updated back in December. I had experienced one potential JD problem: at one time I logged off the console with the Jungle Disk pane active in the console. When I tried to log back into the console from any machine, the console wouldn’t display properly and wouldn’t respond to the mouse. I had to reboot the server. Other than that JD has been fine, but I updated anyway. Jungle Disk for WHS is still in beta so the software is free at this time and is available here. You still need an Amazon S3 account.

Ubuntu Server

I got back on track with the Ubuntu Server project. It’s been added to my Linux page and you can follow along there if you miss it in the feed. You should see my next post on Monday or Tuesday. I purposely didn’t install any GUI with Ubuntu Server to force me to the command line. It’s been a long time since I’ve been in that situation so I had some fun just getting familiar with using the console.

Photography

I finally got around to processing the pictures I took over the holiday. It wasn’t a pleasant experience. I bought a new flash to get away from the built-in flash. This did result in more even lighting, although the pictures came out on the dark side and had to have the exposure bumped in processing. That wasn’t really the frustrating part; I just need to get used to the flash and camera by using them more. I was happy the spotlight glare and redeye were gone.

The frustration started when I went to Aperture. Now, I admit much of the frustration was due to my lack of knowledge about Aperture and digital photography post-processing in general. But for a company with a reputation for elegant UIs I found Aperture to be one frustrating piece of software. Just to make sure I wasn’t nuts I installed the Lightroom demo again and did find it much easier to get into and fix up the photos. Still, I’m sticking with Aperture and I’m sure things will improve over time.

I also pulled the pictures into iPhoto and again found it much easier to fix them up. And iPhoto was much easier when it came to printing. Despite having calibrated my monitor and using the ICC profiles for printing from Aperture, the printout was much darker than the screen, while the printout from iPhoto, which doesn’t use the ICC profiles, came out much closer to what was on the screen. To add to my confusion with Aperture, the best printout came when I used the ICC profile and left ColorSync on in the print driver. My understanding is this is just wrong, as ColorSync should be off when ICC profiles are used.

Just makes me realize I have a lot to learn and need to take a lot more pictures.

So I did take some pictures around the house to work with the flash and some natural lighting. But when I put the SD card into the card reader several photos showed as blank in iPhoto. I opened Aperture and it either couldn’t read them at all or rendered them as garbage. Before swearing at my Mac I popped the card back into the camera, and sure enough, even though it saw the picture files it wouldn’t display them either. So, some time lost and a lesson learned: I’ll set the card to read-only before putting it into my computer’s card reader.

Website News

The News & Links section usually appears at the end of the Trail Log posts but I’ve decided to kill the feature, at least for now. Based on the stats I have for clicks they won’t be missed, but feel free to chime in in the comments if you want them back.

The Links and Articles pages are also going to exit in the near future. Neither gets much activity, so they’ll be dropped and I can eliminate the plug-ins.


The Security Quest articles which usually appear on Wednesdays will also be cut back, probably to once a month following Microsoft’s Patch Tuesday, or whenever I have enough to write about. It’s all about eliminating the need to post to meet a schedule and posting about whatever I’m working on instead. They weren’t getting much traffic as it is, although there were exceptions. If all you want is the security content you can subscribe to the Security category RSS feed or register to get the security articles via email.


Security Quest #17: Microsoft Edition

Another second Tuesday of the month and another set of Microsoft patches. I realize it’s important to patch vulnerabilities as soon as possible and this monthly release schedule tends to go against that, but I like the consistency and ability to plan.

Anyway, this week brought two patches. The first is MS08-001, titled “Vulnerabilities in Windows TCP/IP Could Allow Remote Code Execution”. This affects all supported desktop OSes. It’s rated as Important for Windows 2000 and Critical for all flavors of Windows XP and Windows Vista. I didn’t have any problems applying this update to my two Windows XP SP2 installations. There wasn’t any update through Windows Update for my Vista SP1 RC1 install so I don’t have any experience with that one.

MS08-002 is titled “Vulnerability in LSASS Could Allow Local Elevation of Privilege” and is for Windows 2000 and Windows XP on the desktop. It’s rated as Important. If someone already has logon credentials they can use this vulnerability to elevate their privileges.

There’s no cumulative IE update or any Office updates this month.


Microsoft Security Resources

Additional security resources from Microsoft:

Microsoft Security Newsletter is a monthly e-mail covering security topics from Microsoft. To subscribe you’ll need a Microsoft Live ID (formerly Passport) although the newsletter can go to any email address. You’ll also be required to provide a name. By default the box to also receive other Microsoft emails is checked, so be sure to uncheck it (unless you want the emails). You can also view the latest newsletter without subscribing.

Microsoft provides several levels of security notifications via several methods. They provide either basic or comprehensive alerts along with additional non-vulnerability advisories and a blog. Delivery methods include email, RSS, Windows Live Alerts and the website.

A security bulletin search is provided that allows searching by date, product and severity rating.

They also have a new (at least to me) Malware Protection Center that lists information about malware and provides links to Microsoft tools.

Spam Counts

This week’s spam counts:

Primary Mailbox 30-day spam count: 2

This is down one from last week and none of it is new.

Public Mailbox 30-day spam count: 156

Down 20 from last week with new spam this week at 21 pieces.

Website comment and trackback spam: 7,573

This is up 73 from last week.


The OS Quest Trail Log #19: New Year Edition

Well, we just kicked off a new year, at least for those of us who use the Gregorian calendar. Despite the title this isn’t a predictions or resolutions post.

Actually, I think the first topic is more for me than for you. I’ve become hooked on posting to the site and I figure it’ll be easier to break the addiction if I write down that I’m breaking it. I’ll be posting less so I can spend more time working on projects and learning new stuff. I’ll post about the progress when there’s something worth talking about. It’s more a mindset change on my part since I’ll no longer be working on things based upon how well they fit into a posting schedule.

The initial project I want to work on is getting WordPress running on Ubuntu Server (in a VM). It’s not so much about getting WordPress up and running as it is about getting a LAMP stack running. It’s just easier for me if I have a goal that can define success, instead of “install and play”. Windows Home Server is also intriguing me and I look forward to spending more time with it.

Coinciding with my plan to post less and learn more, I came across a couple of reviews of software that was on my list to write up since I use and like them. These reviews are so good I crossed them off my list and link to them in the “Reviews” sections below.

Frustrations

Like it says up top, it is a frustrating journey and there were a couple minor scares this week.

I’m currently working on an iMac where Spaces has decided to stop jumping to the Space where the app is when I switch to or start an app. Instead it brings the menu to my current Space, leaving the application windows in the far away Space. I suspect a reboot will fix it but I haven’t wanted to shut everything down. But this is the second night and it’s become most annoying.

Not to be outdone when I started my MacBook yesterday the keyboard went south. The numlock LED was on (I forgot it had one and where it was). The caps lock LED was also on, although the indicator in the password field was indicating the opposite status. But that didn’t matter since I couldn’t type in the password field. The mouse worked so I powered off and on and then things were fine. I think I’d have felt better if it stayed broken or if I at least had to do something to fix it (disconnect a drive, uninstall software, anything). I hate intermittent problems.

Software Updates

Mozy

I upgraded MacMozy to version 0.9.0.0 a couple days ago. When this update notice first came out the link was broken (both in Mozy and on their website) and the software couldn’t be downloaded, but I was able to get it Wednesday. I haven’t noticed any real problems but did come across one anomaly. Activity Monitor reports Mozy as not responding.

image lost

As you can see, Activity Monitor does show CPU activity (it changes). This is the case both when no backup is occurring and when one is. Backups run just fine and on schedule. I’ve also restored a couple newly created and edited files and they’re fine. Among the changes was “decreased cpu usage of status icon”, but the status icon is the “Mozy Status” process.

Another change in this version is an option to show hidden files. The timing was ideal as I had begun to back up my iDisk using Mozy. When I selected the entire iDisk it told me I was going to back up a couple hundred gig, even after I deselected all the common .Mac hosted folders such as Groups. What I found is a hidden file in iDisk called .filler.idsff that Mozy saw as almost 200GB. By enabling hidden files I could exclude it. This is important for me because selecting the entire iDisk and deselecting directories I don’t want means new directories will be backed up automatically. If I had to select directories individually I’d have to remember to add newly created ones.

In case you’re wondering I keep my iDisk local on my Macs so I get better performance. Even though .Mac can be considered a backup there’s always the possibility a deleted or corrupt file will get synced out of existence so I wanted a backup and I wanted Mozy’s 30-day history.

WordPress 2.3.2

WordPress came out with a security update and I installed the update over the weekend. I took the opportunity to update my plug-ins and Mint at the same time. There’s not much to add beyond what I already said.

On a related topic, WordPress 2.4 will be skipped and WordPress 2.5 is still scheduled for March. I have to say I’m happy. I like to keep software up to date, but quarterly version releases are too much for my taste when those releases add or change features. And that’s from someone who likes updating software. It would be different if the threat of a forced update wasn’t always looming on the horizon due to a newly found vulnerability. It’s easier to upgrade when the only changes are security related; updates that include feature changes take longer.


Windows Home Server

I continue to like Windows Home Server. I copied my Aperture library to a share and it seemed to work fine, but I was uncomfortable leaving it there and decided not to keep it there. Between being a network drive and not being a native OS X format I was concerned about stability. Besides, Aperture isn’t really an application I share between Macs, and it would screw up my backup plan, which centers around files on my iMac’s local drive.

I have been running a small iPhoto library off a share without an issue but I haven’t done thorough testing.

An alternative for Aperture, iPhoto or any Mac software that saves data in a bundle is to create a sparse disk image file and put that on the network share. It can then be mounted to access the data. I didn’t have any problem running this off a home server share either while doing some quick testing. But that also affects my backup plans as it’s now one big file. Even though the bundle looks like one big file, my file syncing and backup software sees the files inside it and can deal with them individually.

HP has announced that my HP MediaSmart Server will be getting an update involving PacketVideo technology which should add “advanced graphics such as thumbnails of photos, in-menu browsing and album art” and improve streaming to other digital devices. 64-bit Vista support will also be added, along with McAfee anti-virus software (for the server side). McAfee will only be free for seven months. I’m not sure I want McAfee running on any of my boxes. My experience with them (years ago) is that they took over the machine almost as badly as Symantec, and I swore off both McAfee and Symantec.

Jungle Disk has a beta version of their backup software for Windows Home Server available for free. It doesn’t have all the features of their current Windows/OS X/Linux software, but they seem to have plans to add the features that make sense (like block-level backups). The software is free during the beta but requires an Amazon S3 account. Their regular software is a reasonable $20 for lifetime upgrades, so I wouldn’t expect the WHS version to cost more than that. I’ve been running the beta with just minor, already-reported issues.

News.com has a short article about how Windows Home Server remains a tough sell. It’s the last two lines that caught my attention:

One area that Microsoft may look at to boost the popularity of the Home Server is having the software work better in households that have both Macs and Windows PCs.

“That’s something we are taking a close look at,” VanRoekel said, though he added that Microsoft has “nothing to announce.”

That can only be good for me.


Reviews

Transmit

Transmit by Panic Inc. is an FTP client for the Mac and it’s become a favorite of mine. I started with Fetch because it did what I wanted at the time but I eventually added Transmit and it’s what I now use exclusively. I use it every day if scheduled tasks are taken into account.

Shawn Blanc has written a thorough review of Transmit. Like Shawn, favorites are one of my favorite features (pun intended) since they’re more than just links. I like his idea to add a notes feature to favorites. He also mentions my biggest pet peeve about Transmit:

The basic interface of Transmit is perfectly blunt. You’ve got “Your Stuff” on the left and “Their Stuff” on the right.

Your Stuff is what’s on your computer, and Their Stuff is what’s on the server. I like the idea, but I do think it could be named better. Just because a file is on another server doesn’t mean it’s “theirs”. I would prefer to see these named as “Here” and “There”, or “Local” and “Over Yonder”.

Every time I see the screen I mutter “it’s all my stuff”. It’s probably embarrassing to admit, but I had a hard time getting my head around that; I would always have to think twice or even three times when I did a synchronize that would delete files. At least now I’ve used it enough to automatically think local and remote.

The next review is also by Shawn and both are part of his series of reviews titled “Some of the Greatest Software Available for your Mac“, which is still a work in progress. NetNewsWire (a desktop RSS reader) is already reviewed and 5 more apps along with one piece of hardware are also on the list. The reviews are so good I can’t fault him that Mint isn’t Mac software (although there are OS X widgets available) and his ninth item isn’t even software.

Mint

Mint is web site stats software. Like the Transmit review, Shawn’s review of Mint is extremely thorough, starting with some history. If you’re looking for a web site stats package check out Mint and Shawn’s review of it. Mint is hosted in your domain and requires MySQL and PHP. There’s community support (and in some cases plug-ins) for WordPress, Movable Type and others. In my case I was able to implement Mint with a plug-in so I didn’t even have to edit any templates.


News and Links

WPCandy.com published a new advanced reference sheet for WordPress to add to their previous WordPress help sheet.

Firefox has a bug that can allow a malicious hacker to spoof a validation dialog. The link is more appropriate to the Wednesday security links but Firefox will probably be updated by then.


Security Quest #16: WordPress Edition

WordPress has released version 2.3.2 which it calls an “urgent security release”. WordPress 2.3.2 contains a total of 7 bug fixes. The security vulnerability would allow someone to see future posts by giving access to draft posts. Sixteen WordPress files were changed in this update.

This version will also suppress some DB error messages to avoid giving out too much information. The error messages will still be displayed if debug mode is enabled. Details on all the changes can be found at Westi on WordPress.
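
If you do want the full error output back while troubleshooting (on a test site, not a live one), debug mode is controlled by a constant in wp-config.php. A minimal sketch:

<?php
// In wp-config.php: enabling WordPress debug mode restores the database
// error output that 2.3.2 otherwise suppresses. Leave this set to false
// (or leave it out entirely) on a live site so visitors don't see the
// internal details.
define('WP_DEBUG', true);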

The update was released on the 29th and I got around to installing it this past weekend, along with updating numerous plug-ins. The update wasn’t too tough, but mainly because I assumed things would work OK and didn’t do much testing. I had seven plug-ins to update, although only five were actually in use. Against common sense I updated all the plug-ins and WordPress itself on my test site without doing a backup first. I replaced all the WordPress files rather than picking out the 16 that changed. There shouldn’t have been any DB changes, but I ran upgrade.php on my test site just to be sure and was told there were none.

Updating the live site was just a matter of copying the new WordPress and plug-in files up to it. But in this case I did do backups first.

WordPress Update Notifications

With WordPress 2.3, notifications about updates began to be included in the admin panel. If WordPress itself needs to be upgraded there’s a message along the top of the admin panel and down in the footer too. It’s nice not having to go looking for updates on a regular basis, even if it doesn’t alleviate the annoyance of the moment when an unexpected update notification pops up. The plug-in page also displays info on plug-ins that are out of date, although this requires the plug-in to be hosted in WordPress.org’s plug-in library.

Some plug-ins don’t provide very much information about the update, so it’s hard to know whether it’s worth applying. I’ve avoided updating just because a plug-in says there’s an update available. Instead I tend to group updates together for when I have time or when I need to install a security-related update (like this time). Some plug-ins update frequently, like the one that was updated twice (at least) this month. I found that out when the update I had downloaded two days previously was already out of date when I applied it.

There have also been other little things that make doing updates easier, like a link to deactivate all plug-ins at once.

WordPress Anti-Spam

The Akismet anti-spam plug-in is included with WordPress and it’s probably what most people use. It’s free (for non-commercial use on blogs that make less than $500/mth), so that’s a plus. The actual spam detection occurs on Akismet’s servers. This means your server doesn’t have to handle the processing, which can be a benefit. But it does mean that if the Akismet servers are busy your comments may not be processed and spam may get through. Paid Akismet users do get priority. Another benefit, at least in theory, is that Akismet can take the knowledge learned as it processes everyone’s comments and use it to help everyone. I used it at first and have to say it worked well, but it did let some stuff through, especially trackback spam.

I started using Spam Karma 2 back in October and it’s worked almost flawlessly. I seem to recall a comment/trackback or two getting through but can’t remember anything specific. I also can’t recall it eating any legit comments. While the ability to tweak the settings is nearly endless, I pretty much stuck to the defaults. The plug-in was last updated in May and the author recently announced another update is pending. But then he says:

This will also likely be the last update to Spam Karma (which should still give us all quite a few months respite from spam). Barring any unforeseeable circumstances, there will be no more compatibility update to try and keep up with WordPress’ habit of breaking compatibility with each of their [numerous] releases. Furthermore, there is increasingly little point in “competing” against Akismet, when it is bundled and marketed as the principal WordPress antispam tool (even if I personally do not like its approach).

It’s probably an unfair comparison, but the bundling of Akismet reminds me of the bundling of IE with Windows. (Though Akismet is a plug-in, so it’s easily avoided, unlike IE.) Still, Spam Karma 2 will work for the foreseeable future, hopefully through the next couple of WordPress upgrade cycles.

Dozens of other spam tools are available through the WordPress Codex.

Email Address Harvesting

There are several plug-ins available to protect email addresses from being harvested from WordPress. For a while I used the Email Immunizer plug-in and it seemed to work well. It lets email addresses be typed normally, then converts them to their HTML-entity equivalents: humans can still read them, but spam bots scanning for plain-text addresses miss them. The downside is that if the plug-in breaks or stops working, the addresses will appear in plain text for the bots again. I stopped using it simply to reduce the number of plug-ins I run. There are several similar plug-ins at the previous spam tools link.
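
The underlying trick is simple enough to sketch. This is my own minimal example of the entity-encoding approach, not the plug-in’s actual code:

<?php
// A minimal sketch of entity-encoding an email address. Each character is
// replaced with its decimal HTML entity; browsers render the result
// normally, but naive harvesters scanning pages for plain-text addresses
// won't match it.
function obfuscate_email($email) {
    $encoded = '';
    for ($i = 0; $i < strlen($email); $i++) {
        $encoded .= '&#' . ord($email[$i]) . ';';
    }
    return $encoded;
}

echo obfuscate_email('user@example.com');
// Prints &#117;&#115;&#101;... which a browser displays as user@example.com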

Backups

As with any set of security measures, data backups have to be included.

The WordPress Database Backup plugin can be used to back up the WP database. I only use this occasionally as I’ve had some problems with it. If I try to back up all the tables I inevitably exceed the CPU quota with my web host and get locked out for a minute or two. I still use it to back up the basic tables before an upgrade. I also had problems when trying to schedule backups through the plugin; again, my web host didn’t seem to like it. The plugin has been updated since I tried scheduling backups, but I’m not entirely comfortable sending a copy of my SQL database through email anyway.
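
If the plug-in gives you trouble, dumping just the core tables yourself is straightforward. A rough sketch of the idea, run from the command line rather than a web request to keep the load down; the table list and credentials are placeholders, so adjust them for your install:

<?php
// A rough sketch of dumping only selected WordPress tables to a file.
// Backing up a few tables at a time keeps the load well under a shared
// host's CPU quota. Credentials and table names below are placeholders.
$conn = mysqli_connect('localhost', 'db_user', 'db_pass', 'wordpress');
if (!$conn) {
    die('Connect failed: ' . mysqli_connect_error());
}

$tables = array('wp_posts', 'wp_comments', 'wp_options', 'wp_users');
$out = fopen('wp-backup-' . date('Ymd') . '.sql', 'w');

foreach ($tables as $table) {
    // Record how to recreate the table...
    $res = mysqli_query($conn, "SHOW CREATE TABLE `$table`");
    $row = mysqli_fetch_row($res);
    fwrite($out, "DROP TABLE IF EXISTS `$table`;\n{$row[1]};\n\n");

    // ...then write each row out as an INSERT statement.
    $res = mysqli_query($conn, "SELECT * FROM `$table`");
    while ($row = mysqli_fetch_row($res)) {
        $vals = array();
        foreach ($row as $v) {
            $vals[] = is_null($v)
                ? 'NULL'
                : "'" . mysqli_real_escape_string($conn, $v) . "'";
        }
        fwrite($out, "INSERT INTO `$table` VALUES (" . implode(',', $vals) . ");\n");
    }
    fwrite($out, "\n");
}

fclose($out);
mysqli_close($conn);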

These days I’m more likely to use the built-in WordPress export feature to save all my posts, comments and categories to a local file than to use the WPBackup plugin, although the next two items are my primary backup methods.

I also use my web host’s own backup facility to back up my SQL databases and download the backup to my local computer.

To back up all the files on the site I schedule a nightly backup with Transmit.

WordPress Security Resources & Links

Some additional WordPress security resources:

BlogSecurity.Net – A site with information and tools related to blog security. Most of their content is related to WordPress.

The WordPress Development Blog will bring news of the latest releases.

Help Net Security is a general network security site that contains a lot of WordPress information. Their latest WordPress article is a list of WordPress security plug-ins.

Bad Neighborhood and the Bad Neighborhood blog are primarily SEO-related sites, but they also host the WordPress Login Lockdown plug-in, which can be used to prevent brute-force attacks that try to guess your WordPress admin password.

This article at Quick Online Tips has three suggestions for securing a WordPress blog, such as removing the version info from the header and preventing a listing of what’s in your plug-ins directory. Both of those are quick to do by hand, as sketched below.
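
A minimal sketch of those two fixes. The directory-listing part is just an index.php dropped into wp-content/plugins/, the same trick WordPress itself uses; the version info lives in your theme:

<?php
// Dropped into wp-content/plugins/ as index.php, this makes a request for
// the bare directory return a blank page instead of a file listing.
// (WordPress ships similar "Silence is golden" placeholder files.)
//
// For the version info, edit the theme's header.php and delete the
// generator meta tag -- the line that echoes bloginfo('version') into a
// <meta name="generator" ...> tag.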


Spam Counts

This week’s spam counts:

Primary Mailbox 30-day spam count: 3

This is down one from last week and none of the spam is new; the last one arrived on the 13th.

Public Mailbox 30-day spam count: 176

The total is unchanged from last week but there was plenty of new spam.

Website comment and trackback spam: 7,500

That’s 96 new ones since last week.


Other News & Links

Some non-WordPress news & links that caught my attention this week.

ArsTechnica.com: Adobe, Omniture in hot water for snooping on CS3 users – A little more info about the snooping being done by Adobe CS3. But no word from Omniture about the curiously crafted URL the info is sent to.

CNet.com: Problems updating the Flash player in Firefox? Here’s help – The article sums up why I hate the Flash player: it takes a rather long article to explain the steps necessary to remove old, vulnerable versions of Flash Player.

Davidairey.co.uk: WARNING: Google’s GMail security failure leaves my business sabotaged – David had his GMail account hacked through a vulnerability (since fixed), which led to his domain name being stolen from him.

Dynamoo.com: Js/snz.a – likely false positive in eTrust / Vet Anti-Virus – Another probable false positive which will hopefully be fixed by the time you read this.

Lifehacker.com: How to Selectively Share Google Reader Feeds – There’s been a bit of a dust-up over Google automatically sharing Google Reader shared items with all contacts. Here’s a way to selectively share feeds.

Security Fix – Brian Krebs on Computer and Internet Security (washingtonpost.com) – The Storm worm is now spreading via Google’s Blogspot blogs.

Techdirt.com: Will Patent Battles Make Your Computer Less Secure? – TechDirt is concerned that patents could be used to hold back progress and make PCs less secure.

UneasySilence.com: Lies, Lies and Adobe Spies – No specifics as to what’s going on here, but Adobe CS3 seems to be calling home and trying to obscure exactly what it’s doing by using a website name designed to look like a local IP address.