Saturday 29 December 2012

Fixing a Crash in Team Fortress 2

My son wanted Team Fortress 2 for Christmas. So far we've been mostly blessed with not having to feed a relentless video game appetite (aside from Minecraft). But I looked into it, and the game was free, with a very recently released Linux version. So I thought, "What the heck. It would probably only take me a few hours of fooling around to make it work."

Well, it was more than a few hours, but mostly because of my insistence on doing things "right".

TF2 is an interesting game. It runs in an environment, or framework, or something, called Steam. Steam supports many other games. In fact, it appears to be a whole ecosystem of games and communities around the games. There's a .deb to install Steam on Debian-derived Linuxes, and that's the first thing I installed.

I followed the Ubuntu forum thread for the installation, specifically the part about using the experimental nVidia driver. I have a 9300, which is below the minimum the forum says you need (9600 and above). Using the experimental driver allowed me to get Steam to run.

You have to sign up to the Steam community to use it. You can do so from within Steam.
 
To install TF2, I started Steam and found it in the on-line store. It's a long download. I think it took five or six hours on my reasonably fast ADSL. (I usually get 250-300 KB/s).

Finally, I could run the game under my user.

Here's where my insistence on doing it "right" first caused issues. My son has his own Linux user on his computer, which is not the user that installed Linux. His account was created as an ordinary non-admin user, so he doesn't have any special privileges, which is how I want it at his age. I don't want him to be able to mess up the configuration of his computer.

TF2 gets installed under the user's home directory, so I had to download it again for my son. (You could probably just copy the appropriate directories from one user to the other, but that would have made the problem of getting the game running even harder if it didn't work the first time -- which it didn't.)

Trying to run the game from my son's account caused some disk activity and a few progress dialogues to appear, but after a few minutes I'd just end up staring at the Steam home page. Running Steam from the command line let me see all sorts of output, including a report of a "Segmentation fault" right at the time the disk activity stopped.
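
By the way, if you want to capture that command-line output to a file for later googling, plain shell redirection does the trick (nothing Steam-specific, just a habit of mine):

steam 2>&1 | tee steam.log
# tee shows the output on screen and also writes a copy to steam.log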

Many hours of thrashing about and googling followed. Finally, it dawned on me that the only real difference between the users (mine and my son's) had to be the groups they were in. (The Linux security model allocates some privileges to "groups" rather than directly to users. You then assign a user to a group to give them the group's privileges.)
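
A quick way to check is to compare group memberships from a terminal. The user names here are placeholders for mine and my son's, and the output is only an illustration of the sort of difference you'd see:

groups dad
# dad : dad adm cdrom sudo dip plugdev lpadmin sambashare
groups son
# son : son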

Some trial and error fairly quickly determined that the user running TF2 has to be in the "sambashare" group. I logged in as me, the user who installed Linux. Then, in a Terminal, I could have typed (substituting my son's user name for user):

sudo adduser user sambashare

However, I was intrigued that I couldn't find a GUI to manage users and groups. It turns out one doesn't come installed by default on Linux Mint 13. So I installed the Gnome system tools:

sudo apt-get install gnome-system-tools

With the Gnome system tools installed, I:
  1. Went to Menu-> Administration-> Users and Groups
  2. Selected my son's user name
  3. Clicked "Advanced Settings"
  4. Entered my password
  5. Clicked the "User Privileges" tab
  6. Checked the box beside "Share files with the local network"
  7. Clicked OK all the way out again.

Note that I did all the above as myself, the user who installed Linux, not as my son.

Now I logged out of my session, logged in as my son, and TF2 ran. Woo hoo!

Note that the Linux version of Steam and/or TF2 is very new right now (end of December, 2012). I found that a lot of the info on the net no longer applied, because of how quickly the game and the platform are evolving. Even the contents of the Ubuntu forum thread for Steam changed drastically in the few days I was working off and on to get the game running.

Off topic, but of interest to my geek friends: Here's a blog post about how the Steam effort is contributing to better graphics support in the Linux world.

Sunday 30 September 2012

Cinnamon Performance -- It was Chrome's Fault

I've written lately about my struggles with sluggish Ubuntu and Mint desktops. Finally, I discovered that Chrome was the problem. At one point in my ramblings, I recommended using Mate instead of Cinnamon. Well, I'm happy to report that my slow Dell Vostro 1440 runs Cinnamon just fine, as long as I'm not running Chrome.

Long Fat Networks

Long fat networks are high bandwidth, high latency networks. "High latency" is relative, meaning high latency compared to a LAN.

I ran into the LFN phenomenon on my last data centre relocation. We moved the data centre to a site 400 km from head office, for a round-trip latency of 6 ms. We had a 1 Gbps link. We struggled to get more than a few hundred Mbps out of large file transfers, and one application had to be kept back at head office because it transferred large files back and forth between the client machines at head office and its servers in the data centre.

I learned that you can calculate the maximum throughput you can expect over such a network. The key number is called the "bandwidth-delay product" (BDP), and it's calculated as the bandwidth times the round-trip latency. One way to interpret the BDP is as the maximum useful window size for sending data; beyond that size, a bigger window gives no performance improvement.

For our 1 Gbps network with 6 ms latency, the BDP was 750 KB. Most TCP stacks in the Linux world implement TCP window scaling (RFC 1323) and would quickly auto-tune to send and receive 750 KB at a time (given enough memory on both sides for such send and receive buffers).
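
To make the arithmetic concrete, here's the calculation for our link, done in the shell (our numbers; substitute your own bandwidth and latency):

# BDP = bandwidth (bits/s) x round-trip time (s), converted to bytes
# 1 Gbps x 6 ms = 10^9 x 0.006 = 6,000,000 bits = 750,000 bytes
echo $(( 1000000000 * 6 / 1000 / 8 ))   # prints 750000, i.e. ~750 KB

# And to check that RFC 1323 window scaling is on (Linux; 1 means enabled):
sysctl net.ipv4.tcp_window_scaling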

The SMB 1.0 protocol, used by pretty much anything you'd be doing on pre-Vista Windows, is limited to 64 KB blocks, which is way sub-optimal for an LFN. Vista and later versions of Windows use SMB 2.0, which can use larger block sizes when talking to each other. Samba 3.6 is the first version of Samba to support SMB 2.0.
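
If you're running Samba 3.6, my understanding is that SMB 2.0 support is present but switched off by default; enabling it is a one-line change in the [global] section of smb.conf:

[global]
    max protocol = SMB2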

We were a typical corporate network in late 2011 (read: one with lots of Windows machines), so our file transfers were likely to suffer the effects of an LFN.

Note that there's not much you can do about it if your source and destination machines can't both do large window sizes. The key factor is the latency, and the latency ultimately comes down to the speed of light. You can't speed that up.

We had all sorts of fancy WAN acceleration technology, and we couldn't get it to help. In fact, it made things worse in some situations, and we never could explain why. Compression might help in some cases, if it gets more bytes through the window size you have, but that depends on how compressible your data is.

(Sidebar: If you're calculating latency because you can't yet measure it, remember that the speed of light in fibre is only about 60 percent of the speed of light in a vacuum, which is 3 x 10^8 m/s.)
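
For example, here's the back-of-the-envelope version of the calculation for our 400 km link (approximate figures):

# round trip in ms: (2 x 400,000 m x 1000) / (0.6 x 3 x 10^8 m/s)
echo "scale=2; 2 * 400000 * 1000 / (0.6 * 3 * 10^8)" | bc   # prints 4.44
# so ~4.4 ms of our measured 6 ms is light in fibre; the rest would be
# switching and routing gear along the way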

There are a couple of good posts that give more detail here and here.

Sunday 16 September 2012

Ubuntu and Mint Very Slow

I've been struggling for some time with poor performance of Ubuntu, and now Mint, on my Dell Vostro 1440. Admittedly it's a cheap laptop, but in this day and age a Linux desktop should run decently on pretty much anything, as long as you're not using a lot of fancy desktop effects.

Running top, I saw a lot of wait time; when performance was really bad, I'd see over 90 percent wait. Typically I'd be dipping into swap space when performance was bad, but it could be bad even without swapping (I "only" have 2 GB of RAM). I would see this when running nothing but Thunderbird and Chrome, albeit Chrome with a lot of tabs open.
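
If you want to watch for the same pattern yourself, vmstat gives a compact view (generic Linux commands, nothing Mint- or Chrome-specific):

vmstat 5
# watch the "wa" column (CPU time spent waiting on I/O) and the "si"/"so"
# columns (swap-in/swap-out); high wa plus non-zero si/so means you're disk-bound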

I spent many frustrating hours Googling for performance issues on Ubuntu or Mint and didn't find anything really promising.

Finally, last weekend I was dropping off some old computer gear for recycling at our local Free Geek and saw a pretty sweet Dell laptop for sale. I started playing with it, partly to see how it performed. They sell used computers with Ubuntu installed, and Ubuntu comes with Firefox. Firefox was snappy as all get out, and on a lower-powered CPU than mine at home.

So I went home and tried Firefox. It worked great. So I started Googling for performance problems with Chrome on Linux and got all sorts of hits. This one looks like it's turning into a bit of an omnibus bug report, but it has some good info and links to other places.

It looks like one factor is that Google has built its own Flash player into Chrome, since Adobe is no longer supporting new versions of Flash on Linux. Many people report that disabling the built-in Flash player helps, but it didn't work for me.

Others report that it is indeed due to Chrome's memory usage with many tabs open. Others say it has something to do with hardware graphics rendering, and that the hardware path is actually slower. Still others report issues with Chrome scanning for devices, particularly webcams.

My gut says it's a combination of things -- perhaps all of the above are involved, but you only see the performance problem when two or more of the factors coincide.

I haven't found a solution that works for me yet, so I'm somewhat reluctantly using Firefox. It's certainly a lot faster than it was two years ago. However, I miss the combined link and search field in Chrome, amongst other things. It does seem like Firefox has stolen most of Chrome's good ideas, so it's not as hard as I thought it might be to readjust.

Installing Ruby on Rails on Linux Mint 13

It's only a few months since my last post about installing Ruby on Rails, and much has changed. There was an issue with zlib, so I had to flail around a bit. The following instructions are what I think I should have done:
  1. Run "sudo apt-get install zlib1g zlib1g-dev" in a terminal window
  2. Install the Ruby Version Manager (rvm) from these instructions
  3. Run "rvm requirements" in a terminal window
  4. Install all the packages the output of "rvm requirements" tells you to install (apt-get install...). You must do this before you start installing any rubies with rvm. If you don't, you may have all sorts of problems crop up later, like weird messages from irb ("Readline was unable to be required, if you need completion or history install readline then reinstall the ruby.")
  5. Do the following in a terminal window:
rvm install 1.9.3-p194
rvm --default 1.9.3-p194
gem install rails 
sudo apt-get install sqlite 
sudo apt-get install libsqlite3-dev libmysqlclient-dev
sudo apt-get install nodejs 
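
At this point a quick sanity check confirms the toolchain before you create an app (my habit, not part of any official instructions):

ruby -v    # should report 1.9.3p194
rails -v   # should report whatever Rails version "gem install rails" picked up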

Now create an application to test:

rails new testapp 
cd testapp 
rails server 

Browse to localhost:3000 and you should see the Rails default page.

Sunday 2 September 2012

Use Mate with Cheap Hardware

[Update: This whole post is a lie. See how Chrome is the culprit here, and see my confirmation that Cinnamon is fine here.]

A few weeks ago I switched to Linux Mint and chose the Cinnamon desktop. That turned out to be a disaster with my cheap Dell laptop (Vostro 1440). The performance was excruciatingly slow -- apps would take tens of seconds to open, the mouse would freeze, etc. Switching to Mate has made it bearable. Unfortunately, switching desktops, while doable (Google it), doesn't work perfectly. I have duplicate items in some menus, or multiple ways to do the same thing, and I never know whether I'm going to get Nautilus or Caja when I open a folder.

I also suspect, based on what I saw while trying to debug the performance problem, that a 5400 RPM drive just doesn't cut it for a desktop user. The minute I dip into swap space, performance starts to fall off. I guess what I really should say is that a 5400 RPM disk plus 2 GB of RAM doesn't cut it; more RAM would solve the swapping. However, I'd expect the slow drive would still drag out initial program start-up.
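
For what it's worth, confirming that you're dipping into swap takes only a couple of generic commands:

free -m      # the "Swap:" row shows how many MB of swap are in use
swapon -s    # swap usage broken down by device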

Sunday 19 August 2012

Goodbye Ubuntu Desktop

As of about three weeks ago I'm using Linux Mint 13 with the Cinnamon desktop. I don't even know what half of that means. I'm just doing what comes up by default. It's nice to be back to a desktop that works for content producers -- not that I produce that much content.

[Edit: Cinnamon runs fine on a low-powered computer like a Dell Vostro 1440, but don't try to use Chrome.]

Sunday 22 July 2012

Upgrading Android Phone With Linux

I've had a Samsung Galaxy S II with Android 2.3.3 from Virgin Mobile in Canada since this time last year. Part of the reason I went Android was that I wanted to get away from needing a Windows VM just to manage my phone. When I got the Galaxy I asked how upgrades worked, and I was told it was a stand-alone upgrade.

Time went by, and I rather enjoyed not having my phone's behaviour change every time I plugged it into my computer. For that matter, I enjoyed not having to plug it into a computer at all. Lately, however, some of the things I didn't like about my phone bugged me enough that I thought I should do something about them. And before doing that, I thought I'd better upgrade Android first.

So it turned out I needed a PC or a Mac to upgrade, using the Samsung Kies software. No stand-alone upgrade like they told me. Crap. Well, I thought, I'm a Linux user, so suck it up and Google for the solution. Many Android phones do indeed have a stand-alone upgrade right on the phone, and people report it works quite well, although it's best to be connected to WiFi first. I suspect, therefore, that either it's something Virgin/Bell did to their version of Android, or it's an artefact of the old Android version.

In other words, these instructions only apply if you can't find the stand-alone upgrade on your phone. Look under Settings-> About Phone.

Without the stand-alone upgrade, here's what I did:
  1. I went to samfirmware.com to find the firmware for my exact model of phone, a GT-I9100M, and downloaded it. The version I got was 4.0.3 (Ice Cream Sandwich)
  2. I uncompressed it
  3. Heimdall is the Linux program for flashing Samsung firmware. I downloaded Heimdall 1.3.1, both the frontend and the base package, from the Heimdall GitHub site. At the time I wrote this, Heimdall 1.3.2 was the most recent, but apparently it had an issue and the Internet recommended using 1.3.1
  4. I installed each of the .deb files by double-clicking on them
  5. I ran "sudo heimdall-frontend" in a terminal. I had to run it with sudo or I would get a "libusb error: -3" [Update: On another system I got "heimdall-frontend: error while loading shared libraries: libz.so.1: cannot open shared object file: No such file or directory" when I installed the 32-bit Heimdall on a 64-bit Linux Mint.]
  6. Somewhere around here you may want to back up your phone. I didn't, but I was quite confident that most of what I have on the phone is also in the cloud somewhere. I copied all my photos to my computer before starting the upgrade
  7. The phone has to be in "download mode" before connecting to it with heimdall. To put it in download mode, turn off the phone, disconnect the USB cable, and hold down Volume-Down, Power, and Home all at the same time for a few seconds. The phone shows a display that it's in download mode within a couple of seconds. To quit without downloading anything, just hold down the power button for five seconds or so (http://forum.xda-developers.com/wiki/Samsung_Galaxy_S_II_Series#Download_Mode)
  8. I prepared the firmware to flash according to the Heimdall instructions: https://github.com/Benjamin-Dobell/Heimdall/tree/master/Linux, under the heading "Performing a Custom Flash with Heimdall Frontend". Note that the instructions say to get the PIT file from the phone first (there's a command-line sketch of that just after this list)
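
If you prefer the command line to the frontend, something like the following should confirm that Heimdall sees the phone and pull the PIT file. (I used the frontend myself; the output file name here is just an example, and it's worth running "heimdall help" to confirm the flags in your version.)

sudo heimdall detect                               # reports whether a phone in download mode is found
sudo heimdall download-pit --output GT-I9100M.pit  # saves the phone's partition table to a file
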
It took a couple of minutes to download all the files to the phone. For the larger files, the progress feedback would pause every once in a while; I waited patiently, and it always continued after a brief pause.

Once it was done, the phone rebooted and set about doing a bunch of post-install updating. That took maybe ten minutes tops, and the phone was ready to go. My memos were still there. I lost all my playlists. As far as I can tell, that's all I lost (but then I don't have a lot of stuff on my phone that isn't in the cloud).

Finally, an annoying irony: one of the first things I noticed about the new version is that it does indeed support a stand-alone upgrade, so from now on all I need to do is connect to WiFi and upgrade.

Wednesday 23 May 2012

Getting to Xubuntu


I don't want to start a flame war, but I've been unable to adjust to Unity. So I've been using Xubuntu for some months now. I just installed the Xubuntu packages over my Ubuntu 11.10, following the instructions here: http://www.psychocats.net/ubuntu/xfce.

After installing Xubuntu, some things still seemed weird. I imagined much of it had to do with the fact that I had a lot of now-unnecessary Ubuntu stuff hanging around my system. So yesterday I finally followed the excellent instructions at http://www.psychocats.net/ubuntu/purexfceoneiric to get rid of the unnecessary stuff.

I had to reinstall LibreOffice and a few other packages after following the instructions. It's worthwhile to capture the output of the command given at the link above, so you can see what was removed.

I had one obscure problem that gave me many hours of angst. When I restarted my computer, it wouldn't boot. By booting to a live CD, checking /var/log/syslog, and Googling, I discovered that /etc/lightdm/lightdm.conf was sending me to the unity-greeter. I had to change /etc/lightdm/lightdm.conf to look like this, and then I could happily boot again:


[SeatDefaults]
greeter-session=lightdm-gtk-greeter
user-session=

Even after switching to Xubuntu, but before following the above instructions, my start-up still went to the Ubuntu login screen rather than the Xubuntu one. Part of the reason for going "pure" Xubuntu was to get rid of the Ubuntu login and have a pure Xubuntu experience.

I can't say enough good things about psychocats. She has done an excellent job of documenting a number of tricky topics.

Thursday 16 February 2012

Sluggish Ubuntu Video

Since I got my Dell Vostro 1440, I felt that it wasn't quite as responsive as it should have been. Yesterday I spent a little time trying to figure out why that might be. (A little time -- like all morning.) I stumbled across what seemed like good instructions on troubleshooting Linux video. At the start of all the instructions was a warning to make sure the user was in the video group. If not, the user's desktop wouldn't be able to use the graphics hardware.

Well, I checked my groups, and sure enough I wasn't in the video group. I added myself to the video group, and after logging out and back in, and a day of use, I'm confident in saying the desktop is much more responsive.
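
For the record, the check and the fix are both one-liners (substitute your own user name for user):

groups user                # look for "video" in the list
sudo adduser user video    # then log out and back in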

More Googling turned up a useful command: 'glxinfo | grep "direct rendering"'. It will tell you "yes" if you're going straight to the graphics hardware.

Friday 27 January 2012

Installing Ruby on Rails on Ubuntu 11.10

[I've made an important change to this post -- steps 3 and 4 below are new. Apologies to anyone I've led astray.]

I'm back to playing with Rails a bit. NetBeans for Ruby is gone, so I'm going to do things the macho Rails way and just develop with an editor and a bunch of terminal windows. (One of my open source rules is "do whatever everyone else is doing." Trying to use an IDE with Rails was always a violation of that rule.)

rvm is a great idea. I found it really helpful to read about named gemsets early on (there's a quick example at the end of this post). I had to install rvm, then install Rails and a few other packages.
  1. Install the Ruby Version Manager (rvm) from these instructions
  2. Put this line at the end of your .bashrc: "[[ -s "$HOME/.rvm/scripts/rvm" ]] && . "$HOME/.rvm/scripts/rvm" # Load RVM function"
  3. Run "rvm requirements" in a terminal window
  4. Install all the packages the output of "rvm requirements" tells you to install (apt-get install...). You must do this before you start installing any rubies with rvm. If you don't, you may have all sorts of problems crop up later, like weird messages from irb ("Readline was unable to be required, if you need completion or history install readline then reinstall the ruby.")
  5. Do the following in a terminal window:
rvm 1.9.3 
rvm --default 1.9.3 
gem install rails 
sudo apt-get install sqlite 
sudo apt-get install libsqlite3-dev 
sudo apt-get install nodejs 

Now create an application to test:

rails new testapp 
cd testapp 
rails server 

Browse to localhost:3000 and you should see the Rails default page.
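
Since I mentioned named gemsets above: here's a minimal sketch of the idea, with made-up names, in case you want to keep each project's gems separate:

rvm gemset create myproject    # create a named gemset under the current ruby
rvm use 1.9.3@myproject        # switch to it (the ruby@gemset notation)
gem install rails              # gems now install into this gemset only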


Sunday 22 January 2012

Know What You're Building

"Know what you're building" seems like an obvious thing to say, but I don't think we do it that well in IT. For my recent data centre relocation project, we applied that principle successfully to a couple of areas. The network lead wrote up exactly what he was building, and the storage lead listed out every device he needed. But we never did a complete "final state" description of the new data centre.

It all worked pretty well, although we needed a number of meetings during the design phase of our new data centre -- laying out the racks, the non-rack equipment, the power, and the cabling for the networks. I think we needed so many meetings because there isn't a commonly accepted way to draw a plan of a data centre that covers the requirements of all the people in the room.

I'm running into the issue again in a smaller way now that we're designing the new central communication room for the equipment that used to be in the old data centre but needs to remain behind for local operations (mostly the network gear to service a large office building).

Just as a refresher, here are all the people you need to involve:

  • The server team(s) know the physical dimensions of the servers, their weight, how many network ports they have and how those ports need to be configured, whether they need SAN-attached storage, their backup requirements, and how much power and cooling each server needs
  • The network team(s) know the network devices (which have most of the same requirements as servers), the approach for connecting everything (which defines the need for cables and patch panels), and the cabling itself (which may affect the weight of cable trays or floor loading)
  • The storage team(s) know the switching devices, which have most of the same requirements as the network devices
  • The electrical engineer or consultant needs to know all the power requirements and placement of all the equipment
  • The mechanical engineer or consultant needs to know the cooling requirements and placement of all the equipment
  • The structural engineer or consultant needs to know the weight and placement of all the equipment
  • The trades who actually build it all need to know exactly what they're building
  • There's likely some other poor person, maybe a building architect, who has to pull this all together

Add to all that the fact that the technology in a data centre is constantly changing, at least in terms of the number and type of servers in the room. Also, the requirements and constraints tend to be circular: for example, the number of network ports on a server affects the amount of network gear you need, which affects how many servers you can have (through either port capacity or rack space), which in turn affects how much power and cooling you need, and also how many network ports you need.

You also have to worry about other details that can seriously derail an otherwise great plan. For example, when running fibre, you need to make sure it's the right kind of fibre and that it has the right connectors. Power cables in a data centre can be quite varied, so again you need to make sure the power distribution units (PDUs) in the racks can actually be connected to your servers.


With all this, it can be hard for people to come to an agreement on what to build. We don't have well-established ways of describing what's going to be built in a way that everyone understands. There's software to help do this, but it tends to be unreasonably expensive for a medium-sized enterprise.

Regardless of how hard or expensive it is, there's a lot of value in figuring out what you're going to build before you build it. We were successful using Excel and Word documents, plus floor-plan drawings, to describe what to build. We had to be extremely careful about versions and about keeping the different documents in sync. In the end, happily, it all worked out.