Saturday, 29 November 2008

Upgrade to Ubuntu 8.10 Intrepid

I upgraded my laptop (Lenovo x300) to Ubuntu 8.10 a few weeks ago. The rumour was that power management was better, and I was longing for a kernel that handled the sound on the x300 without a re-compile of the driver every time I updated the kernel.

The upgrade went smoothly, although it took a very long time. The default Ubuntu mirror for Canada seems to be very slow these days. (I've since switched to a different mirror, which seems a lot faster.)

Two things required some work. First, suspend and resume screw up the wireless until you add a line to /etc/pm/config.d/00sleep_module. To start, figure out which driver you're using for wireless. Do
lshw | more
Look for the line that says "wireless" by typing "/wireless" at the more prompt. Then look for the next line with "driver" in it. In my case it says "driver=iwlagn". Now edit the file and add the required line:
sudo gedit /etc/pm/config.d/00sleep_module
Add the following at the very end:
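The line in question follows the pm-utils convention of listing modules to unload before suspend; assuming the iwlagn driver found above, it would look like this (substitute your own driver):

```shell
# /etc/pm/config.d/00sleep_module
# Unload the wireless driver before suspend; pm-utils reloads it on resume.
# Substitute your driver if lshw showed something other than iwlagn.
SUSPEND_MODULES="iwlagn"
```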
The other problem was more mysterious. CBC Radio's website wouldn't play after upgrading to 8.10. It had worked for me in 8.04 after some fooling around, but after the upgrade I couldn't get it going again. I had given up after wasting almost a whole day on the problem; then, after a few software updates, it started to work. For the record, I'm using gnome-mplayer to play Windows Media Player material.

Thursday, 27 November 2008

Geek Humour

Two jokes that are just so impossibly geeky that I have to repeat them. My apologies if they're so old you've seen them already.
There are 10 types of people in this world: Those who understand binary numbers and those who don't.
Heisenberg is speeding down the highway and is pulled over by the police. The officer comes up to his window and says, "Do you know how fast you were going sir?" Heisenberg replies, "No. But I know where I am."

Wednesday, 15 October 2008

NetBeans Rake Menu Missing

In NetBeans 6.1, I created a new project and the "Run Rake Tasks" menu was empty (it only said "Refresh List"). I found someone else through Google who had the same problem, but no solution other than "use NetBeans 6.5 beta".

After fooling around a bit, I simply un-installed Rails 2.1.1 and re-installed it and voila, the "Run Rake Tasks" menu was populated with all its tasks. Bizarre.

P.S. It wasn't completely random that I chose to re-install Rails 2.1.1. When I tried to check on the plug-ins I had installed, I got a message that said to install Rails 2.1.1. My gems list said it was already installed, so I decided to un-install it and then install.

Tuesday, 30 September 2008

Unit Dose Roll-Out Part III

Another key element to our success: Bring candy to nurses. Not only do you make nurses happy because they get candy, but you're also showing that you understand their culture and are at least a little bit willing to move yourself closer to it.

Just make sure you don't buy only "granny candy". As project managers and senior nurse educators, we tend to be closer to the end of our career than to the beginning. Don't forget that a significant number of nurses nowadays carry iPhones and have a huge number of friends on Facebook.

Sunday, 28 September 2008

Unit Dose Roll-Out Part II

The machines we're using to package the medications are the FastPak EXP from Automed (AmerisourceBergen). They have an awesome pre-installation support team. The front-end sales people were so-so -- your mileage will vary, of course, depending on the region. The sales team was Western Canada; the pre-installation support is for all of Canada.

The machines themselves have a number of quirks. Nothing that can't be worked around, but don't believe that you won't have to make any decisions yourself. Also, since we're running three machines, we've written our own little database scripts to keep the data in the machines synchronized. There's no way you should try to do it by hand, although I suspect that's what most people do because the vendor doesn't have anything to help.

The main competition to Automed are the Pacmed machines from McKesson. There are some differences between the two that will require a change to your extract or interface from whatever Pharmacy Information System you're using. Nothing big, but in software even a small thing can cost a lot of money. It's worth looking into the interface in detail if you're looking at switching from Pacmed or running both in parallel.

Because we're packaging all regularly scheduled oral solids (with some exceptions) we've found that our Pharmacy Information System wasn't really set up to handle some of the scenarios. Our distribution model seems to be different from the typical hospital pharmacy, but I don't have enough experience with hospital pharmacies to say if these challenges would generalize to other installations.

Saturday, 20 September 2008

Successful Unit Dose Roll-Out Part I

We've begun to roll out a just-in-time unit dose medication distribution system at GF Strong, UBC and Vancouver General Hospitals. Just-in-time unit dose medication distribution increases patient safety by making it easier for nurses to do what they always do: provide quality care, including medications.

Nurses are loving the new approach. "You just saved me ten minutes", "I really like it", and a big two thumbs up are some of the comments I've heard as I provide go-live support on a pair of medical nursing units.

I'd say there are two major reasons why it's going so well: First, the system is intrinsically good for nurses. Nurses are over-worked, under-paid, and totally committed to their patients. Anything that improves patient safety while making their job easier is going to be a hit.

The second reason is the excellent communication and training by our team of nurse educators. We have to train about 3,000 nurses. We started four weeks before the first units went live, and will continue up to the last go-live week in December. With four nurse educators giving a half-hour session, we're reaching 100 percent of the nurses on most nursing units, and well over 80 percent on the rest.

On any future front-line health care projects I may do, I'm going to insist on the budget to adequately listen to and train the front-line health care providers. This has been so key.

I've been the project manager on this project for just over a year. It's been a complicated, multi-faceted project with a lot of challenges. It's totally satisfying to see a successful start to the roll-out. We're phasing in nursing units for the next three months, so I'm sure there'll be some challenges along the way, but it's clear that we've got a winner.

Stay tuned for future posts about why this project is so successful, and what the challenges have been.

Tuesday, 2 September 2008


According to this, in a test of browsers against media-intensive sites, Microsoft Internet Explorer 8 takes more memory than Windows XP does (did). Bill (or Steve or whoever): That's not what's meant by the phrase, "the browser is the operating system."

Sunday, 10 August 2008

Synching a Dell Axim X30 with Ubuntu

I have an old (?) Dell Axim X30 PDA that I use mainly as an address book and MP3 player (I added a memory card so I can listen to podcasts while walking the dog). Now that Ubuntu is my primary desktop OS, I wanted to be synching contacts and sound files with Ubuntu.

The SynCE project makes this possible. The documentation is pretty good, but as usual I managed to make it hard for myself. Here's what I did:
  1. Make sure the X30 is not plugged in to the computer
  2. sudo apt-get install synce-hal librra0-tools librapi2-tools
  3. Plug in the device
  4. synce-pls
The last line should show what's in the top-level directory of the X30.

Note that I was already running kernel 2.6.24-19, so I didn't have to rebuild the modules as described in the documentation. If your Ubuntu 8.04 is up-to-date, you'll be running at least this kernel.

My problem: All the installation instructions warn you to blacklist the ipaq module if you have connection problems. So I went ahead and blacklisted it before I even started. Then I fumbled around for a few hours trying to connect unsuccessfully and searching for information.

The X30 only supports Windows Mobile 2003 Second Edition. I don't know all the technical details, but I know it means it used a somewhat different protocol for connecting. In my search for answers, I found enough examples of people successfully connecting to X30s that I kept going. I also found enough references to the "old protocol" or "serial protocol" that I finally realized I should try allowing the ipaq module. I removed the blacklist and, presto, it worked.

Tuesday, 5 August 2008

Linux -- Ready for My Mom, But Not for Geeks

I think Ubuntu 8.04 is ready for anyone who wants a good desktop computer with a decent set of office productivity tools. The installation and update experience with Ubuntu is as good as or better than with Windows, and OpenOffice does what the vast majority of ordinary users need it to do. My mother is using Ubuntu for e-mail and web surfing and having no problems (beyond what she'd have with Windows, seeing as she's never used a computer before).

Where I'm getting blocked is when I want to do the power-user type things: e.g. sync to a PDA that only has Windows Mobile 2003 on it or use cutting edge devices like a Lenovo x300. That's when I find I have to go to Linux Geek Land. At least I'm only recompiling modules, and not the whole kernel :-)

So in that sense, Linux is ready for the average person's desktop, it's just not ready for us geeks yet.

Friday, 18 July 2008

Why IT's So Hard

Why is providing reliable IT infrastructure so hard? Here's a good example.

There was a fire in downtown Vancouver this week that knocked out power to a good part of downtown for up to three days. Angela noted that the Internet was slow the day of the fire. I know there's a major network hub in the area of the fire at Harbour Centre, and I suspected that something had gone wrong there, despite all the precautions that would have been taken. Now I have proof.

The fire knocked out power to the network hub, and the generator kicked in as planned, but the Vancouver Fire and Rescue Services were sucking so much water to fight the fire that the generator had to shut down because it wasn't getting enough cooling water. Not only was that hard to predict, it would have been really hard to test -- I suppose the Fire Department would have loved an excuse to play with their hoses, but I'm not sure the City would have wanted them to run a test that tried to use up all the water in downtown Vancouver.

Saturday, 5 July 2008

Keyboard Layouts in Ubuntu

I type enough in Spanish that I like to be able to switch keyboard layouts between English and Spanish, rather than type Alt-whatever to get the Spanish characters (e.g. ñ, ¡, ¿).

In Ubuntu 8.04 (Hardy), go to System--> Administration--> Language Support and install Spanish. This requires a reboot. Truth be told, I'm not sure you have to do this step just to get the keyboard layout, but if you type enough in a language that you want the keyboard layout, you probably want the rest of the language support as well.

To install control over the keyboard layout:
  1. Right-click on the top panel (aka menu bar, aka tool bar) and select "Add to Panel..."
  2. Select "Keyboard Indicator". You'll see the indicator appear in the panel. In my case, it said "USA"
  3. Right click on the indicator and select "Keyboard Preferences"
  4. Select the "Layouts" tab
  5. Click the "Add..." button
  6. Select the keyboard layout you want. For those looking for Spanish, note that there's a layout for Spain and a layout called "Latin American". The keyboard I bought in Guatemala is actually the "Spanish" layout. You'll have to figure out what works for you
  7. Click OK until you're back to what you wanted to be doing
Of course, that doesn't change the physical keycaps. When you're using the Latin American layout with a USA keyboard, it's sometimes hard to find some of the special characters. They're not in the same place on the two keyboards. Fortunately, we have yet another nice advantage of Ubuntu over Windows. You can right-click on the keyboard indicator and select "Show Current Layout" and it gives you a picture of your keyboard. You can even print the layout.

Sunday, 15 June 2008

Sound with Ubuntu on Lenovo x300

One of the known issues with Ubuntu on a Lenovo x300 is the sound (up to and including Ubuntu 8.04). There are a few links out there that point to solutions. The one that worked for me is this one in Mikko's Blog. One very important note: Check which version of the kernel you're running before you start.
uname --release
Then substitute your kernel version in the rm command. I also got a bit confused by the statement under the "Sound" heading that said to remove the old sound modules. I eventually figured out that he must have been referring to the rm command in his script, rather than actually removing the running modules.
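The substitution itself can be scripted so you don't retype the version. A sketch (the exact directory being removed comes from Mikko's script, so the sound path below is an assumption for illustration):

```shell
# Capture the running kernel release once (same as `uname --release`)...
KVER=$(uname -r)
echo "$KVER"
# ...then splice it into wherever the script hard-codes a version, e.g.:
echo "/lib/modules/$KVER/kernel/sound"
```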

FYI: My rather short list of Ubuntu and Lenovo x300 links is here.

Friday, 13 June 2008

The Truth

From the Linux Loadable Kernel Modules HOW-TO: "This is Unix, and explanatory error messages are seen as a sign of weakness."

Tuesday, 3 June 2008

Lenovo x300 and Ubuntu

I'm writing this from my new Lenovo x300 on which I installed Ubuntu 8.04. It was way too easy -- just pop in the CD, answer a few questions, and wait. I'll give more reviews in the days to come.

Saturday, 10 May 2008

This is Ironic Somehow

The Canadian Internet Registration Authority (CIRA) sent me an e-mail telling me that they were changing the "whois" data so they don't give out personal information of domain owners. Thunderbird thought it was a scam e-mail.

Thursday, 24 April 2008

Is This the Beginning of the End for Microsoft...

...or... The end of the... or whatever.

Here you can read about Microsoft explaining their HealthVault product (sorry, I think you might need to register).

NetBeans Out of Memory Updating Ruby Gems

I got a message "Exception in thread "main" java.lang.OutOfMemoryError: Java heap space" while updating Ruby Gems in NetBeans 6.0.1 on Windows XP. It took a little longer than usual for me to find reports of this problem using Google, but when I did I found that it is a known problem.

Someone suggested changing the command line arguments to NetBeans to increase the size of the memory for the JVM, but that didn't work for me. Instead, I figured out how to load gems from the command line, and once I did that one time, I was able to use the gem manager in NetBeans.

So, in more detail: Originally I had installed NetBeans from a privileged account using all the default options. To get new gems, in a non-privileged account, I had to create two environment variables with the following values:
set JAVA_HOME=C:\Program Files\Java\jdk1.6.0_05
set JRUBY_BASE=C:\Program Files\NetBeans 6.0.1\ruby1\jruby-1.0.2
Your values may differ, particularly the version numbers. I wrote it here as you'd do it in the command prompt window, but I actually did it using the System control panel.

I opened a command prompt window and did:
%JRUBY_BASE%\bin\gem install login_generator
After a couple of minutes, everything seemed to end normally. I did Tools->Ruby Gems from a running NetBeans instance and didn't get the desired results, so I restarted NetBeans and then Tools->Ruby Gems got me a list of gems, including login_generator as an installed gem.

Monday, 21 April 2008

MySQL From Remote Host

I couldn't get MySQL Administrator on a Windows XP desktop to connect to a MySQL instance I had running on an Ubuntu 6.06 server that I built as a LAMP server. I was getting:
Could not connect to the specified instance.

MySQL Error Number 2003
Can't connect to MySQL server on 'server' (10061)
I had to edit /etc/my.cnf (or /etc/mysql/my.cnf depending on where yours is stored) on the Ubuntu server to comment out the "bind-address" line, then restart the server. I also had to add a non-root user with all privileges. In fact, you have to add two users as described here:
use mysql
grant all privileges on *.* to 'user'@'localhost' identified by 'password' with grant option;
grant all privileges on *.* to 'user'@'%' identified by 'password' with grant option;
flush privileges;
There are some posts that show how to enable remote logins by the MySQL "root" user, but I prefer not to do it that way.
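The bind-address edit itself is a one-liner. A sketch (back up the file first; as noted, the file may live at /etc/my.cnf on some systems):

```shell
# Comment out bind-address so MySQL listens on all interfaces,
# then restart the server to pick up the change.
CNF=/etc/mysql/my.cnf
sudo cp "$CNF" "$CNF.bak"
sudo sed -i 's/^bind-address/#bind-address/' "$CNF"
sudo /etc/init.d/mysql restart
```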

Tape Rotation with Bacula

I love the topic of backups. I say that because it's IT's dirty secret. No one should keep data in one place only, yet it's very difficult to set up a backup solution. Different organizations have different needs, and so backup software has to provide a lot of options. But the need for options means when you just want to get basic backup running quickly, it's a challenge.

This post is part of a series about rolling your own backup solution. There are other ways to do it, but I wanted to do my own solution one more time...

I'm backing up a Windows XP desktop and a Windows XP laptop, a Dell SC440 which is the VMWare host, plus a number of Linux VMs that provide my basic infrastructure: DNS, DHCP, file server, Subversion server, test platforms for software development, and the backup server itself.

I chose tape in part because I can take the backup off-site. I'll take a tape off-site once a week. That means I might lose a week's worth of work if my house burns down, but I'm not ready to invest in the time and effort to swap tapes every day, either.

The Bacula documentation has a good section on backup strategies, but none of them include mine. I'll have to figure it out myself.

Bacula manages tapes in a tape pool. A pool is just a group of tapes. (Bacula calls tapes "volumes".) I want to let Bacula fill up one tape per week before it uses another, which is the default behaviour. At the end of the week, I want to eject the tape and use another. I'll let Bacula automatically recycle the tapes, meaning that after a week (in my case), Bacula will reuse a tape, overwriting the old backups on it.

Anyway, I started with a rotation to do a full backup Sunday night, incremental backups all week, and then eject the tape Saturday night after the last incremental. With three tapes I would always have last week's tape off site, except on Sunday.

I had really only just started when I realized that that's a lot of tape wear, given that the off-site swap happens once a week and that I have a fair bit of disk space on my main server. So my next idea is:

Take a full backup Monday night to disk, and incrementals up to Sunday night. Then, Monday morning write the whole disk volume to tape and take it off-site. That way I only run the tape once a week, and hopefully in a scenario that minimizes the chance of shoe-shining. I'll write the data to disk without compression, and let hardware compression compress the data to tape.

This also has the nice property that last week's backups are also on the disk (if I have enough disk space), so if I need a file I can get it from disk rather than retrieving the tape.
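The weekly recycling described above corresponds to a Pool resource in bacula-dir.conf, roughly like this (a sketch; the pool name and retention values are assumptions to tune for your rotation):

```
Pool {
  Name = Weekly
  Pool Type = Backup
  Recycle = yes                 # reuse volumes once they expire
  AutoPrune = yes               # prune expired jobs automatically
  Volume Retention = 6 days     # a volume can be overwritten after ~a week
  Volume Use Duration = 6 days  # stop appending to a volume after a week
}
```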

Friday, 18 April 2008

Securing DNS/bind/named

This is another late posting of some notes when I built some new infrastructure servers on VMs to replace my aging PowerPC Macs that ran my network.

The security info I got when my ISP told me I had a badly configured name server requires that you create a /var/named directory:

sudo mkdir /var/named 
sudo chgrp bind /var/named 
sudo chmod 770 /var/named 
sudo chmod g+s /var/named 
sudo mkdir /var/log/named 
sudo chmod 770 /var/log/named 
sudo chmod g+s /var/log/named

Wednesday, 16 April 2008

Building A DHCP/DNS Server

Months ago I built a DHCP/DNS server from scratch. Most of these notes I made at the time I was building it, meaning to fix them up within a day or two and post them. Of course, I kept doing other things before finishing the documentation, so here are my rather raw notes. This was for Ubuntu 6.06 running on VMWare Server.
  1. Create a new VM with a 2 GB disk; don't preallocate, and make sure all disks are less than 2 GB. Only give it 64 MB of RAM
  2. Attach the Ubuntu .iso to the CD and start the VM
  3. Build with the options you want
  4. Do the following:
    sudo apt-get update
    sudo apt-get upgrade
    sudo apt-get install ssh ntp-simple snmpd snmp bacula-client build-essential linux-headers-$(uname -r)
  5. Install VMTools:
    cd /tmp
    sudo mount /cdrom
    sudo tar xvfz /cdrom/VMwareTools-1.0.0-27828.tar.gz
    cd vmware-tools-distrib
    sudo ./
  6. Edit /etc/dhcp3/dhclient.conf to send host-name "netres01";
  7. Restart the network to get into DNS and DHCP (if you already have one)
  8. Install DHCP and DNS and stop the services:
    sudo apt-get install dhcp3-server bind9
    sudo /etc/init.d/bind9 stop
    sudo /etc/init.d/dhcp3-server stop
  9. Since this is a DNS server, I'll give it a fixed IP address. Edit /etc/network/interfaces. Edit the forward and reverse zone files.
auto eth0
iface eth0 inet static
pre-up iptables-restore < /etc/iptables.rules
post-down iptables-save -c > /etc/iptables.rules
You have to kill the existing dhclient process because ifdown/ifup doesn't (it wouldn't know how, really).
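Killing the stale client can be done with pkill before bouncing the interface. A sketch:

```shell
# The old dhclient keeps running after ifdown; get rid of it,
# then bring eth0 back up with the new static configuration.
sudo pkill dhclient
sudo ifdown eth0 && sudo ifup eth0
```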

Change the key for the DNS server before starting it, or you'll have to manually look up the pids and kill the named processes. rndc stops working because the key has changed since named started.

If you had a name server from DHCP before, it will still be in /etc/resolv.conf.

The biggest thing is to get the permissions right on the /etc/bind directories and files.


Set up the new DNS first and get it working.

Monday, 14 April 2008

Challenge # 42 of Healthcare IT

Many who've worked in healthcare IT believe it's more difficult than IT in other contexts. Everyone has their reasons. I'd like to add mine here.

Mistakes in healthcare are really bad. They literally lead to people's health being compromised, or in the worst case, people dying. Projects are about doing something new. Doing something new is about making mistakes and learning from them, or at least trying out new ideas, some of which will turn out to be wrong.

Sometimes these two things are in direct contradiction. More often it leads to all sorts of misunderstandings between the healthcare team and the external project team that are hard for either side to recognize, let alone overcome.

For example, it's pretty standard practice on a project to do a design and put it in front of a group of people for review. While it can be hard to listen to others criticize your design after all the work you've done on it, we all get used to it.

Now imagine you're a nurse, doctor or pharmacist. All your life you've been terrified of making a mistake because someone might die because of it. Everyone around you is also terrified of making a mistake, and in fact the best way for them to feel good is to catch you making a mistake. It's pretty easy to fall into a pattern of avoiding mistakes at all costs, avoiding blame for mistakes when they do occur, and catching others' mistakes in order to appear to be a better nurse, doctor or pharmacist than the others.

You're not likely even to be able to understand a consultant who suggests you put up a proposed design and let others criticize it. And if you do understand, you're not likely to want to go along with it. Every fibre in your being is about avoiding mistakes. And everyone you work with considers making a mistake to be the worst thing anyone can do. No consultant is going to convince you that you should publicly set yourself up to "make a mistake".

If you're running a project in a healthcare environment, you need to understand the depth of fear of making mistakes. To move the project forward in spite of this fear, try some of these ideas:
  • Let the people you're working with tell you what makes them comfortable. They won't necessarily tell you just because you ask them. You have to listen to how they want to do the project
  • Bring groups together and facilitate group decision making, rather than expecting one person to tell you an answer. It will take longer than if you could find one person to make the decision, but the reality is, you aren't going to find that one person
  • Use project staff if you can. Just let them know they're going to take a beating. The passion with which many people expose other people's mistakes in healthcare is unnerving
By the way, I'm really glad that healthcare providers have a phobia about mistakes. If I'm ever in the hospital I want to know that everyone there is doing everything they can to avoid mistakes. It's only difficult when you're trying to run a project.

Sunday, 13 April 2008

Bacula Catalog Job and MySQL

To make the Bacula catalog job work:
  1. Edit /etc/bacula/bacula-dir.conf on the backup server
  2. Change where it says -u -p to -u bacula
  3. Edit ~bacula/.my.cnf and put the database credentials in it
  4. chmod 400 ~bacula/.my.cnf ; chown bacula:bacula ~bacula/.my.cnf
Now it should work. Don't forget to do "reload" in bconsole.
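What goes into ~bacula/.my.cnf is an ordinary MySQL client options file; something along these lines (the user and password values are assumptions — use whatever your bacula database user actually is):

```
[client]
user     = bacula
password = your-bacula-db-password
```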

Bacula Notes

The Bacula documentation is good, but given the complex and interdependent nature of the backup problem, it's pretty overwhelming at first.

One thing that's not immediately obvious is where the configuration files are. The bacula-fd.conf file for Windows XP clients is at C:\Documents and Settings\All Users\bacula\bacula-fd.conf. On Ubuntu using the packages installed from universe, the configuration files are in /etc/bacula.

If you get errors that the server can't connect to the client, make sure the director definition in the client's bacula-fd.conf allows the director to connect, and that the client's password matches the server's password in the client resource of /etc/bacula/bacula-dir.conf. There's a helpful picture of what you need to do in the Bacula documentation.

Friday, 11 April 2008

Accessing a SCSI Tape Drive from a VM

I ordered my Dell SC440 with an internal DAT tape drive. lsscsi reports it as a Seagate DAT72-052. I'm pretty sure that the Ubuntu 6.06 installation picked it up automatically -- I flailed around a bit to get this working but I don't think at the end of the day that I did anything on the host to get the tape drive working.

I'm creating a VM to run my backup. For large installations you won't want to do this, but for me I see no reason not to. And a big part of the reason I'm doing this is to see what's possible, so onward.

To enable the tape on a VM, you have to shut down the VM. Then, in the VMWare Console select VM > Settings > Hardware > Generic SCSI, and specify the physical device to connect to. In my case it was /dev/sg0. You also have to specify the controller and target for the tape drive.

I had no idea what the controller and target were, so on the VMWare host, I did:
sudo apt-get install lsscsi
lsscsi -c
and got:
Attached devices:
Host: scsi1 Channel: 00 Target: 06 Lun: 00
Vendor: SEAGATE Model: DAT DAT72-052 Rev: A16E
Type: Sequential-Access ANSI SCSI revision: 03
Host: scsi2 Channel: 00 Target: 00 Lun: 00
Vendor: ATA Model: WDC WD1600YS-18S Rev: 20.0
Type: Direct-Access ANSI SCSI revision: 05
I took the channel as the controller: 0, and the target: 6. I entered all that into the VMWare Console and clicked enough okays to get out of the configuration. (I couldn't find the link in VMWare's on-line documentation for configuring generic SCSI devices, but if you type "SCSI" in the "Index" tab of the VMWare Console's help window you can find slightly more detailed instructions.)

When I started the VM, I got a message that said, among other things: "Insufficient permissions to access the file." Since it looked like everything else was right, I did ls -l /dev/sg0 on the VMWare host (not the VM) and got:
crw-rw---- 1 root tape 21, 0 2008-03-23 17:23 /dev/sg0
Since VMWare was running as user vmware, I added the vmware user to the tape group:
sudo adduser vmware tape
Then I restarted the VM and it worked fine. It pays to read the error message closely.

Thursday, 10 April 2008

Another SCSI Package

This is useful:
sudo apt-get install lsscsi
It shows you what SCSI devices you have attached to a machine and some important values.

Wednesday, 9 April 2008

Installing Bacula

To install bacula with MySQL (after you do this):

sudo apt-get install mysql-server bacula-director-mysql

Then you have to set up exim4, the mail system. Choose:

mail sent by smarthost; no local mail

After you install the MySQL version of the bacula director, you can install the rest of bacula this way, and also install some recommended packages:

sudo apt-get install bacula
sudo apt-get install dds2tar scsitools sg3-utils

I had these notes from an earlier set-up of exim4:
Look into setting up /etc/aliases later to redirect mail to more useful places. Also, make sure the domain of the outgoing address is one known to the outside world, or the SMTP server will probably reject the message.

Bacula: Backups

To install bacula on Ubuntu, you need to add the universe repositories to /etc/apt/sources.list. It's just a matter of uncommenting four lines:
deb dapper universe 
deb-src dapper universe
deb dapper-security universe
deb-src dapper-security universe
Then refresh the package lists:
sudo apt-get update
The standard install of bacula uses SQLite, which the Bacula author reports as having problems...

Tuesday, 8 April 2008

Copying VMs

I tried copying my tiny Ubuntu VM, and it ran, except eth0 wouldn't come up, and of course the host name was wrong.

To fix eth0, you have to update /etc/iftab with the new VMWare-generated MAC address for the Ethernet interface. I added a script to the base VM in /usr/local/sbin/changemac to make it easier:

sudo vi /usr/local/sbin/changemac

And add:

mac=`ifconfig -a | grep "HWaddr" | cut -d " " -f 11`

echo "eth0 mac $mac arp 1" > /etc/iftab

Then do:

sudo chmod u+x /usr/local/sbin/changemac

Note that you're adding the script to the "template" VM, so you'll only have to create the script once for each template you create, not each time you create a new VM.

Now you can copy the "template" VM. Make sure the "template" VM isn't running. Log in to the VMWare host, change to the directory where you have the VMs, and copy the VM:

cd /usr/local/vmware/Virtual\ Machines
sudo cp -R --preserve=permissions,owner old_VM_directory new_VM_directory

Now in the VMWare console:
  1. Import the new VM and start it.
  2. Log in at the console and run /usr/local/sbin/changemac.
  3. Change /etc/hostname, /etc/dhcp3/dhclient.conf, and /etc/hosts to have the host name you want for the new machine.
  4. Reboot.
I'm sure you should be able to do this without a reboot, but I don't know which startup scripts do what needs to be done. Also, I had some problem with sudo not working after changing /etc/hosts.
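Steps 2 and 3 above can be sketched as one small script (the script name and the sed approach are my own illustration, not part of the original process):

```shell
#!/bin/sh
# Sketch: change the VM's host name in the three files mentioned above.
# Usage (hypothetical): sudo ./rename-vm newname
NEW="$1"
OLD=$(cat /etc/hostname)
echo "$NEW" > /etc/hostname
# Swap the old name for the new one in hosts and the DHCP client config.
sed -i "s/\b$OLD\b/$NEW/g" /etc/hosts /etc/dhcp3/dhclient.conf
```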

If you forget to change the host name in /etc/dhcp3/dhclient.conf the first time around:
  1. Change it
  2. Type sudo date and then enter your password. This is just to make sure that sudo isn't going to prompt you for passwords
  3. Type sudo ifdown eth0 && sudo ifup eth0
The above process will work even if you're on a remote ssh session (e.g. PuTTY), because the network will go down and up before your terminal times out.

Monday, 7 April 2008

Firewall on the VM Quick Reference

Here's how to set up the firewall. Here's my /etc/iptables.rules:

*filter
:INPUT ACCEPT [273:55355]
:FORWARD ACCEPT [0:0]
:LOGNDROP - [0:0]
:OUTPUT ACCEPT [92376:20668252]
# Accept SSH so we can manage the VM.
-A INPUT -i eth0 -p tcp -m tcp --dport 22 -j ACCEPT
# Accept everything on the loopback interface.
-A INPUT -i lo -j ACCEPT
# Allow ping (Zenoss uses it to see if you're up).
-A INPUT -p icmp --icmp-type echo-request -j ACCEPT
# Allow SNMP.
-A INPUT -p udp -s 0/0 --sport 1024:65535 --dport 161:162 -j ACCEPT
# Silently block NetBIOS because we don't want to hear about Windows.
-A INPUT -p udp --dport 137:139 -j DROP
# Log and drop the rest.
-A INPUT -j LOGNDROP
-A LOGNDROP -p tcp -m limit --limit 5/min -j LOG --log-prefix "Denied TCP: " --log-level 7
-A LOGNDROP -p udp -m limit --limit 5/min -j LOG --log-prefix "Denied UDP: " --log-level 7
-A LOGNDROP -p icmp -m limit --limit 5/min -j LOG --log-prefix "Denied ICMP: " --log-level 7
-A LOGNDROP -j DROP
COMMIT
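
Rules in this format are loaded with sudo iptables-restore < /etc/iptables.rules (the file needs the *filter header and a trailing COMMIT line to load). To make the rules come back at boot, a common Ubuntu trick is an if-pre-up.d hook. Here's a sketch; it writes the hook to a local file for illustration, whereas on the real system you'd create /etc/network/if-pre-up.d/iptables as root:

```shell
# Sketch: a hook that restores the firewall rules before an interface
# comes up. On a real system this file would be
# /etc/network/if-pre-up.d/iptables, created as root; here we write it
# to a local path so you can inspect it.
HOOK=./if-pre-up-iptables
cat > "$HOOK" <<'EOF'
#!/bin/sh
iptables-restore < /etc/iptables.rules
EOF
chmod +x "$HOOK"
```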

More on this later.

ntp on the VM

Bringing up the firewall on the "template" VM, I noticed that I was getting more ntp traffic than I expected. I discovered that in my ignorance, I had set my local ntp server to broadcast, which I don't need. I commented out the broadcast line, and everything's still working.

I also found a good post on ntp that answered one of my long-time questions: what should I look at to see whether the ntp client is actually working? Run ntpq -p. In the resulting listing, "the delay and offset values should be non-zero and the jitter value should be under 100." (The post is Red Hat based, but the ntp-specific information is distro-agnostic.)
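
That check is easy to script. Here's a sketch that applies the rule of thumb to a sample ntpq -p line; the sample output below is fabricated for illustration, and in real use you'd pipe ntpq -p in instead:

```shell
# Fabricated sample of `ntpq -p` output (peer name and numbers made up).
sample='     remote           refid      st t when poll reach   delay   offset  jitter
==============================================================================
*ntp.example.com 192.168.1.1      3 u   42   64  377    1.234   -0.567   0.890'

# Rule of thumb: delay (field 8) and offset (field 9) non-zero,
# jitter (field 10) under 100.
status=$(echo "$sample" | awk 'NR > 2 {
  if ($8 != 0 && $9 != 0 && $10 < 100) print "ok"; else print "suspect"
}')
echo "$status"
```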

Sunday, 30 March 2008

SNMP on the VM

Setting up SNMP on a machine so it can be monitored by Zenoss seems to mess me up every time. This time the problem was the -i option of snmpconf. It's advertised to put the configuration file where the SNMP programs will find it, but it doesn't put it at the front of the list of paths where the programs look, at least not on Ubuntu 6.06.

The solution: don't use snmpconf -i. Run snmpconf to set the access. Make sure it matches what you've set up in Zenoss, particularly the version of SNMP and therefore the access model. When you're done, do sudo mv snmpd.conf /etc/snmp/.

Friday, 28 March 2008

SNMP Version in Zenoss


The basic VM needs to have SNMP running on it, because there's no point having a server if you're not monitoring it. I had Zenoss set up a year ago monitoring some of my computers, but I was getting "bad oid" messages on the new VM template I was setting up.

The solution: Zenoss had a default SNMP version of 1 for Linux systems. I had set up SNMP on the new VM for version 2c. In Zenoss 2.0 I navigated to /Devices/Server/Linux page and selected the zProperties tab, then scrolled down to zSnmpVer and set it to v2c.

Tuesday, 25 March 2008

Basic Tiny VM Part 1

The basic tiny VM needs:
  • Ubuntu 6.06.1 Server (the basic install, not LAMP)
  • VMTools
  • SNMP so you can monitor it (I'm using Zenoss)
  • ssh so you can administer it
  • ntp as a client so it keeps time. For now I'll sync to my existing ntp server
  • basic firewall rules that allow the above
Build an ISO library in /usr/local/vmware/ISOs. Put in the Ubuntu CD and type:

mount /dev/cdrom
sudo dd if=/dev/cdrom of=/usr/local/vmware/ISOs/Ubuntu-6.06.1.iso

The VMTools ISOs are in /tmp/vmware-server-distrib/lib/isoimages:

sudo cp /tmp/vmware-server-distrib/lib/isoimages/*.iso /usr/local/vmware/ISOs

Install VMTools. Here are some good instructions.

sudo apt-get install ssh ntp-simple snmpd snmp

(snmp is the package that contains snmpconf, which you need to set up snmp, and snmpwalk, which is useful for debugging.)

Configure ntp. I have a host named "ntp" in my DNS, so I set the "server" line in /etc/ntp.conf to the following:

server ntp

And then restart ntp:

sudo /etc/init.d/ntp-server restart

Run snmpconf to set up SNMP. That's probably a whole post in itself.

I'll do the firewall later. I've ignored my family for too long tonight.

Can't Connect to Console of VMs

I had everything built and VMWare Server running. Good. So I copied all the VMs I'd built when I was running VMWare on my desktop over to the new server. I started a few, and they were running fine. I could connect to the Zenoss console on one of them, and could ping both. However, all I got was a black screen when I tried to look at the console of a VM using VMWare Console.

The VMWare documentation recommended using the version of the VMWare Console program that matches the server you're running. I grumbled a bit and re-installed (which was actually quite easy), then tried viewing the console of my VMs again. I still got a black screen, but I also got an error message saying that the .vmx file had to have execute permission for the user running the VMWare Console. I checked the .vmx files and sure enough, because of the way I copied them, everything had 0644 permissions.

So I cd'd to the directory where all the VM directories were and typed:

find . -name "*.vmx" -exec chmod u+x {} \;

That worked because the user connecting with VMWare Console is the same one that owned all the files. You'll have to do something slightly different if that's not the case.
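
If you want to rehearse the find invocation on something disposable first, this sketch builds a scratch tree (directory and file names made up) and applies the same fix:

```shell
# Build a scratch tree that stands in for the VM directories.
mkdir -p demo-vms/vm1 demo-vms/vm2
touch demo-vms/vm1/a.vmx demo-vms/vm2/b.vmx
chmod 0644 demo-vms/vm1/a.vmx demo-vms/vm2/b.vmx

# Give the owner execute permission on every .vmx file.
find demo-vms -name "*.vmx" -exec chmod u+x {} \;
```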

Now they work fine.

Monday, 24 March 2008

VMWare Server on Ubuntu 6.06.1

The install went smoothly. I created a user "vmware" and added it to the admin group. Then I had to:

sudo apt-get install xinetd
sudo apt-get install libx11-6 libx11-dev libxtst6 xlibs-dev

The last line was thanks to this post. Without those packages, VMWare wouldn't validate my serial number (and I'm sure I would have run into other problems).

The only default I changed was to put my virtual machines in /usr/local/vmware/Virtual Machines, because /usr/local is the big partition I made for VMs.

Sunday, 23 March 2008

Virtualization So Far

As should be obvious from my recent posts, I've been trying to set up a host for virtual machines. I need to be able to try things out easily, and virtual machines are great for that. I'd also like to get rid of my old boxes that are running core network infrastructure. It's not so much that I want to get rid of them, but the risk of continuing to use them is a problem. I have an 11-year-old Macintosh Performa that's my DHCP and DNS server for my whole network. If it breaks, I'm scrambling to replace it unless I get something new built. Obviously if it runs on a computer with a 1 GB hard drive and 32 MB of memory, I should be able to run it on a VM.

Anyway, being cheap I wasn't sure I wanted to pay for VMWare. Both companies have free versions of course, but upgrading from XenSource's free version to the paid one is just a license-key change, whereas going from VMWare Server to Virtual Infrastructure (AKA ESX) is a complete software replacement. So I thought I'd try XenSource, especially since they seemed to be saying they could run any OS if you bought a CPU with virtualization support.

So I carefully researched the chips I was looking for and bought a Dell SC440 with an Intel Xeon 3050. A low-price server but with the right parts, or so I thought.

The install of XenSource was easy, as was the install of XenCenter, the control program on Windows. Unfortunately, there was a problem with the shortcut to install XenCenter. I posted a question on the Xen community boards and got no help. I found the solution myself a few days later, but not before noticing that there was very, very little activity on the community boards. I wonder if anyone is using Xen, or at least, is anyone using it without paying Citrix for support?

Also, it turns out you can't run anything you want as a VM. I tried to run Ubuntu Server 6.06.1 and it got disk errors. This is a known problem, apparently. Okay, I know it's hard to support every Linux distro, but Ubuntu should be one you support. Look at the numbers.

Anyway, worse than not supporting Ubuntu is that Citrix's answer seemed to be, "use one of our supported distros." They'll always be niche with that approach. The market for virtualization is the world of heterogeneous data centres that need to shrink their power and A/C footprint. You're not going to win that market unless you can run anything an off-the-shelf PC can run. So I decided to try VMWare.

Installing a 60-day evaluation copy of ESX 3i didn't work. Neither did installing an evaluation copy of ESX 3.5, but at least that one told me the network card wasn't supported. Then I tried Ubuntu 6.06.1, and the network card wasn't supported there, either. Broadcom, what are you doing releasing a NIC that doesn't work with older drivers? I found out how to get Ubuntu installed, so I'll continue with installing the free version of VMWare Server. This is not what I wanted to be doing.

I guess the lesson is you really have to check the hardware compatibility list, but I didn't even know I was going to go this path. I'm interested in how many other problems I'm going to have.

Even though I'm not up and running with VMWare Server yet, I have to say it's the preferable approach. At least you have an underlying OS you can work with, and my experience with VMWare elsewhere says it will run whatever I put on it. Too bad the thinner versions (ESX) don't work on my hardware.

Ubuntu 6.06.1 on Dell SC440

The Dell SC440 has a Broadcom BCM5754 NIC, which isn't supported by the Ubuntu 6.06.1 server CD. You have to build the server without a network interface, then copy the new driver source onto it using a USB drive and build and install the driver.

I'm building Ubuntu on this server to run VMWare Server, so I was also particular about the disk partitioning. I created a 4 GB partition for root, then let the installer partition the rest itself. It made partition 5 the swap with 6.1 GB (I have 2 GB of RAM), and put the rest of the disk in partition 2 (143.7 GB), which I mounted on /usr/local. I changed the usage type of the file system to "largefile4" to get one inode for every 4 MB. I don't really know what that's going to do to performance, but it seems to make sense given that I'm going to be creating VMs there.
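
As a rough sanity check on what "largefile4" means: one inode per 4 MB works out to only about 36,000 inodes on that 143.7 GB partition, which is plenty when the partition holds a handful of multi-gigabyte virtual disks. The arithmetic, as a sketch:

```shell
# largefile4 gives one inode per 4 MB of space.
part_mb=143700                # the 143.7 GB partition, in MB
inodes=$((part_mb / 4))       # inodes the format will create
echo "$inodes"
```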

Next I followed the instructions here to build and install the driver. The instructions worked perfectly (with the usual 50 percent "forget to sudo" rate).

Then I edited /etc/dhcp3/dhclient.conf to send the hostname:

send host-name "vmhost01";

And restarted the network:

sudo ifdown eth0 && sudo ifup eth0

Then the next time my DNS refreshed, it picked up the server name.

Mount a USB Drive on Ubuntu 6.06.1

If this is the first time, make yourself a mount point for USB devices, like:

sudo mkdir /mnt/usb

Then mount the drive:

sudo mount /dev/sdb1 /mnt/usb

If mount can't figure out the filesystem type, you'll have to figure it out yourself and specify the type to mount with the -t option.

Tuesday, 18 March 2008

Installing the Xen Guest Agent on an Ubuntu VM

To install the guest agent on an Ubuntu VM, I tried:
  1. Start the Ubuntu VM if it isn't already running
  2. In XenCenter, select the VM in the left panel, and select the Console tab in the right panel
  3. Select "xs-tools.iso" from the drop down list just above the console window
  4. Click on the console window, log in if necessary, and type sudo mount /dev/hdd /mnt
So in other words, where the Xen documentation says /dev/xvdd, use /dev/hdd on Ubuntu.

If you just type sudo /mnt/Linux/, it tells you you're running an unsupported distribution. I thought I'd be clever and try to force it to use the Debian 3 tools:
  1. Type: sudo /mnt/Linux/ -d debian -m 3
  2. Reboot: sudo shutdown -r now
That left me with an unbootable kernel. I booted from the original Ubuntu kernel, then edited /boot/grub/menu.lst to change the default kernel to 2 so it would boot the good kernel.
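
For reference, the change is to the default line in /boot/grub/menu.lst. GRUB legacy counts title entries from zero, so a sketch of the relevant line looks like this (the surrounding comment is typical of the Ubuntu-generated file):

```
## default num
# Boot the third "title" stanza (counting from 0).
default		2
```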

Monday, 17 March 2008

Installing Ubuntu on Xen

I was able to install Ubuntu 6.06.1 on Xen by putting the Ubuntu install CD in the CD drive of the Xen server. I got lost for a while because I didn't create the storage when initially setting up the VM. Once I actually made a disk, the install went okay. I've found a post saying that Ubuntu will start having errors and switch the disk to read-only, so I'll watch for that.

I still need to install the Xen tools...

Sunday, 16 March 2008

Ubuntu on Xen

I wanted to continue to use Ubuntu Server in my VMs. I like their approach to long-term support, the server install, and the way they provide a pre-configured LAMP server.

As I read about how to build one, I stumbled across a post from someone else who was doing what I wanted to do, and had problems. That scared me off just blindly stumbling ahead, so now I have more research to do.

I'm pretty sure it should be relatively easy to have a kernel that works on top of Xen, and on which I can run an otherwise standard Ubuntu 6.06 install, but it's going to take some digging... This is harder than I wanted it to be.

First Guest on Xen

I created a Debian 3.1 VM from the templates using XenCenter. I just followed the XenCenter wizard. It was trivial, although before creating it I spent almost an hour and a half reading documentation and searching the Internet.

The only thing I had to do was set the time zone. Being old fashioned, I used the command line console and tzconfig(1).

Saturday, 15 March 2008

If IT Isn't Broken...

If NT provides what you need, who cares if the platform is a bit worn and some of the paint doesn't look so good?

XenServer Install

I bought a Dell SC440 for Jade Systems (that's me). I'm setting it up as a virtual host so I can create virtual computers at will. I've decided to try XenServer, I think mostly because the free version is just a license-key upgrade to the full version. Besides, I already have some experience with VMWare, albeit second hand.

I had to F2 into the BIOS on initial boot of the SC440 to turn on virtualization in the CPU. It was relatively easy to find on the menus, but not right at the top level (My apologies for the vagueness here. The servers are in another room, so I wasn't blogging while I was installing).

The install of XenServer 4.0.1 went exactly according to the instructions. I read through the four pages of documentation first, and so I had all the answers I needed -- the usual network set-up, which time servers to use, etc.

I just finished installing the security patch. First challenge: actually getting the file to the XenServer host. I put a DVD in the DVD drive, but there's no entry for a CD or DVD in /etc/fstab. The installation instructions recommend using a USB key. I flailed on that for a while, Googling for Linux-for-dummies level help on USB, before I said to myself, "Right. I guess I still have to be a Linux sysadmin," and found the appropriate device to mount the DVD. From there it was maybe a minute to install the patch.

Total time: About half an hour to install XenServer the other night, mostly unattended after the usual initial Linux questions, plus 45 minutes now, mostly Googling.

Thursday, 13 March 2008

Why Open Source is Better for Us All

I just read a very interesting way of looking at how open source creates a fundamental shift in the factors that drive software acquisition. Given that much of what's wrong in IT is about the relationship that customers and software developers are driven into by the economics of developing software, I really like the ideas in the post.

Friday, 1 February 2008

AJAX Security

It seems like every time I get to the point where I think I might get excited about writing a web application, I listen to a podcast about the abysmal state of security on the Wild West Web. The technology has to be made secure. We can't rely on programmers to do the right thing because we're human and we'll mess it up from time to time.

Saturday, 5 January 2008

Task vs. Service

At my current client, a large health care organization, I needed to dispose of some old equipment that had personal health information about patients on it. I got directed to a front-line employee who operates a machine to degauss disk drives.

Knowing the organization, I knew that wasn't all I needed to do. And fortunately I knew how to track down the financial, inventory and other people who would be interested in reselling the machine if possible, and then getting rid of it all the way to the dump and removing it from the financial books. In total, I'll have to manage the disposal myself through three or four departments, and at least that many individuals.

What I really wanted was a single phone number I could call and say, "In April, get rid of this thing for me," and be done with it.

I think that's why we hear so much about "aligning IT with the business" these days. It's not just the big-picture, find-a-way-to-put-your-business-on-the-web-and-make-yourself-rich alignment. It's also because we confuse an IT task with a business service. To the business, there's value in an internal 1-800-got-junk number for information assets. There's very marginal value in having someone in a room who can degauss disk drives (and who only gets called if someone is technologically savvy enough to know to call them anyway).

How can you tell if something is an IT task or a business service? Start by really getting into the head of the person who would use your service. Ask them. If you can't sell the service, or at least get someone excited in about five minutes, then you'd better rethink your service.

By the way, my remarks about tasks shouldn't be taken as disparaging the people who do the real work. The internal 1-800-got-junk model needs someone to run the degausser, and their work is critical to making the whole model work. IT is sufficiently complicated that in medium to large organizations almost any business service will require multiple tasks carried out by multiple individuals. The shift to service thinking has to happen at the management level. The people doing the tasks are usually doing the right thing.