Tuesday 11 November 2014

Movies at Home

I keep hearing about how we don’t need cable any more to watch movies or TV. All the talk convinced me I should try. Well, talk about some serious time wasting…

I wanted to:

  • Put my DVDs onto a file server and play them, without having to load them in the DVD player
  • Play on the TV anything I can see on my computer through my browser. In particular, I wanted to play TV from Guatemala. Some of the channels there stream a lot of their programming straight to the Internet
  • Play Netflix, and possibly other services, with a decent selection of material. I had Netflix for a few months a couple of years ago but, living in Canada, we quickly ran out of material to watch
  • Do everything in such a way that everyone else in the family can use the technology, once I get it set up

What have I managed to do?

  • I can play my videos on my TVs, via a Roku 3, and I’m optimistic that I can get my WDTV Live working too. It took a lot of research, mostly because I had to convert the DVDs to a different format and buy a big new storage device, a Synology DS412+ NAS
  • I can play some stuff on my TV that I can play in a browser, but not everything. To be more accurate, I can play stuff from YouTube, but not anything else. This is quite useful, but not all that I wanted
  • I haven’t tried Netflix with the VPN yet, but I don’t expect any issues. I have a VPN from PureVPN. Setting up the VPN the way I wanted it was a true adventure, not covered in this post
  • The younger members of the family can use it, but I’m frequently frustrated by the number of hoops I have to jump through. It’s sure not like just turning on the TV and flipping through the channels

Some of this was surprisingly easy, and some required the typical technology flailing that I get into. Overall, it’s a solution that requires a certain amount of comfort with technical topics. I’m starting to get my head around digital video, but I’m nowhere near an expert. I also know a lot about Linux, and enough about networking to have an idea of what I wanted to do.

This post will only talk about the process of getting my DVDs onto my network and playing them from the TV. I’ll cover:

  • The storage device for movies
  • How to play a movie from the storage device on a TV
  • How to put your DVDs on the storage device
  • What to consider if you want to do something different from what I did

Storage

Video work requires lots of disk space. A non-HD movie from a DVD takes more than a gigabyte; in my experience, a typical movie DVD has more than 4 GB on it. And the software for playing movies on a TV, at least the software I found, doesn’t play the ISO file (a direct copy of a DVD), so you have to convert it. In the process of converting, you may need even more space.
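
For a sense of scale, here’s some rough arithmetic for a hypothetical 200-disc collection (4.7 GB is the capacity of a single-layer DVD):

$ echo "200 * 4.7" | bc
940.0

That’s close to a terabyte before counting any converted copies.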

The need for storage space was what made me buy the Synology DS412+ NAS, which runs DSM 5.1, a BusyBox-based Linux distribution.

The Synology doesn’t actually come with disks (so, for example, don’t get excited about how cheap it is when you look up the price). You buy the disks you want to put in it. That gives you the freedom to decide how much storage to buy.

I bought the maximum of four disks, 3.5 in, with 3 TB capacity each, and used the default formatting option, which is a type of RAID-5. The result is that I have just under 8 TB of usable storage space, plus the ability to replace any single disk that fails with no loss of data.
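
The capacity arithmetic, roughly: RAID-5 keeps one disk’s worth of parity, so four 3 TB disks leave three disks of data, and the disk vendors’ decimal terabytes shrink a little when the operating system reports binary units:

$ echo "(4 - 1) * 3 * 10^12 / 2^40" | bc -l    # ≈ 8.19, hence "just under 8 TB"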

I ordered from NCIX, which has a big presence where I live, so they delivered in less than 36 hours. I had it running on my network in 48 hours after ordering. Total cost was around C$200 per usable TB.

(You could never get storage that fast and that cheap in the enterprise IT world. I know it’s a bit unfair to compare, as it’s not completely apples-to-apples, but seriously, CFOs need to ask their CIOs what benefit the corporation is getting from overpaying for storage from EMC, HP, NetApp, or Hitachi. They don’t get responsiveness or agility. They sure don’t get cheap storage: at work I pay $9,000 per TB. That’s right. 45 times as much.)

I thought about putting together my own storage box using an old computer from FreeGeek. I’m sure it would have been a lot of fun for a geek like me. The reality is that it wasn’t going to be much cheaper, and it would have taken a lot more time.

Note: The DS412+ doesn’t appear on Synology’s site any more, so perhaps there’s a newer equivalent.

Playing DVDs

Once I got the Synology running, I was pleasantly surprised to discover that I already had something that could serve up videos to a Roku 3. The Synology comes with built-in software to act as a media server.

The Roku 3 has an app called DS Media that works with the Synology media server. I had to get it from Roku’s channel store, but that’s pretty easy. It was under the “Audio and Video” category, and was free.

Once I had the DS Media channel on the Roku, all I had to do was upload my movies, in the right format, to the “video/movie” folder on the Synology. Getting them in the right format was the next trick – see the next section.

I haven’t got the WDTV working with the Synology media server, but it seems to recognize and connect to it, so I’m hoping…

I had started to play around with Plex on my home-built file server, just enough that my free trial period had run out. Since the Synology came with its own thing, I haven’t pursued Plex. A lot of people like Plex.

Ripping DVDs

I figure if I buy a DVD, I can make a copy of it and watch it on my TV. (I guess that’s my disclaimer that I’m not encouraging you to make illegal copies of your videos.)

I already had a lot of DVDs copied to ISO images, by using:

dd if=/dev/cdrom of=movie-name.iso

That’s a Linux terminal command. Mac users can do something similar in a terminal. Windows users: you’ll have to figure it out for yourself. Sorry.
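
If the copy is slow or stalls, specifying the DVD sector size as the block size sometimes helps (a minor variation on the same command; 2048 bytes is the standard DVD sector size):

dd if=/dev/cdrom of=movie-name.iso bs=2048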

It turns out, in this fancy modern world, video players don’t play ISO files. It sort of makes sense. You don’t want to have to go through a DVD’s menu if you’re watching on your phone or tablet.

It turns out that converting an ISO to a file playable by a phone, tablet, or TV (like the Roku or WDTV) can be a savage journey into the morass of video encoding. The morass includes open-source telenovelas about competing projects (this seems to be a relatively unbiased summary), patent-encumbered video formats, lossy video formats, and differences in Linux distributions.

You can avoid most of that trip by doing this:

  1. Install VLC media player and Handbrake from your distribution’s repository. You don’t need to use VLC directly. VLC installs software that enables Handbrake to rip some, but not all, copy-protected DVDs
  2. Review this link for how to optimize the Handbrake conversion for the Roku. Standard DVDs don’t have HD video, so 480p is as good as it’s going to get
  3. Use Handbrake to rip your DVDs or ISO files to the open Matroska container format (.mkv). Matroska is now well-supported on Android and TVs/TV boxes like the Roku
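
If you’d rather script the conversion than click through the GUI, Handbrake also ships a command-line version, HandBrakeCLI. A minimal sketch, with placeholder file names (flags vary between versions, so check HandBrakeCLI --help first):

HandBrakeCLI -i movie-name.iso -o movie-name.mkv -f mkv -e x264 -q 20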

If you want to play your videos on an Apple device, it’s more complicated. In fact, I haven’t got it working yet. The version of Handbrake on distributions derived from Ubuntu 14.04, like Linux Mint 17, doesn’t support output to the MP4 container format, for software patent reasons, and MP4 is the only container format that Apple products support.

There are suggestions that I could build my own version of Handbrake that would work, but one set of instructions I followed didn’t work, and I haven’t pursued it further.

Doing Something Different

Most of the time I spent on this was the research and learning. If you want to try exactly what I did, and you’re comfortable Googling for advice on technology topics, it’s not that hard.

However, there’s a good chance that you won’t want to, or won’t be able to, do exactly what I did. Here are some things to watch for:

  • The Linux video world is constantly in flux. If you’re using a version newer than Ubuntu 14.04, or a distribution not derived from Ubuntu, you should definitely confirm that you can rip your DVDs before you spend a bunch of time and money on hardware for storage or playback
  • If you’re not using Linux, confirm that Handbrake and VLC work on your version of Windows or Mac OS, and can do what you need
  • If you have anything other than a Roku 3 for playing Internet TV, you need to find evidence on the Internet that your device can work with the Synology media server. Look for the evidence by Googling the name of your device and the model of Synology you plan to buy
  • If you want to use a different storage device, you have to figure out whether it has a media server, and whether the media server is compatible with your TV device

Summary

With a Synology NAS storage device, a Roku 3 with the DS Media channel, my own DVDs, and Handbrake, I was able to convert DVDs to movie files, store them on the storage device, and play them on a TV through the Roku.

Monday 3 November 2014

There's No Such Thing as a Dry Run When You're Moving a Data Centre

There's no such thing as a dry run when you're moving a data centre. That may not seem sensible, but I think it's easiest to explain in one sentence:

If you do a dry run of moving a computer to a different data centre, and it works, why would you move it back?

If that still doesn't make sense, think back to the days when moving a computer was a physical activity: unplugging it, putting it on a truck, and shipping it to your new data centre. Would you really propose doing a dry run of that and then, if the dry run succeeds, putting the computer back on a truck, moving it back to the old data centre, and getting it running again, only to do the whole thing "for real" some time later?

Granted, in the world of virtual computers, you don't have to physically move the computer back. However, there is still a list of activities you have to do to move a virtual computer, and after a dry run you have to undo them all. There's just as much chance that you'll screw up the undoing of those steps as there is that you'll screw up doing them in the first place. A dry run actually increases the overall risk of the relocation.

Monday 29 September 2014

Definitive Guide to Recovering from a Full Disk

Cheap, stingy guy that I am, I allocate really small system partitions to my Ubuntu servers. This means that periodically my disk fills up: every kernel upgrade takes a fair amount of space, and old kernels aren’t cleaned out automatically. Unfortunately, the disk usually fills up in the middle of an upgrade, so apt-get fails and terminates with a partially installed package. You’ll know that has happened when any apt command produces a message like this:

E: Unmet dependencies. Try using -f.

Once that happens, you can’t use any apt command.
There’s lots of advice out there about what to do, but the pages I’ve found always seem to leave something out, or assume knowledge of apt or dpkg that I don’t have.

So, based on the last time this happened, here’s how I plan to recover the next time I run out of space. Warning: lots of Terminal commands coming up. I do everything in the Terminal for a few reasons:

  • This happens to me most often on servers, since that’s where I’m trying to save space, especially for virtual machines, and my servers don’t have a GUI
  • Terminal commands work on both desktop and server machines
  • It’s easier to document commands for the Terminal

First, I make sure the problem really is that I’m out of space, looking for 0 in the “Avail” column on the line that has “/” under the “Mounted on” column:

$ df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/sda1       3.7G  3.7G     0 100% /
...

Then I find out what version of the kernel I’m running:

$ uname -a
Linux ixmucane 3.13.0-24-generic #47-Ubuntu SMP Fri May 2 23:30:00 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux

Next, I find out what kernels are installed:

$ dpkg --list | grep linux-image

If I have at least two versions older than the one I’m currently running, I can remove the oldest one (replacing the “n.nn.n-nn” with the version number I want to remove):

$ sudo dpkg --purge linux-headers-n.nn.n-nn-generic
$ sudo dpkg --purge linux-headers-n.nn.n-nn
$ sudo dpkg --purge linux-image-extra-n.nn.n-nn-generic
$ sudo dpkg --purge linux-image-n.nn.n-nn-generic
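
To see the installed kernel image packages in version order, oldest first, something like this one-liner should work (adjust the grep pattern to taste):

$ dpkg --list | grep '^ii.*linux-image-[0-9]' | awk '{print $2}' | sort -V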

This should free up lots of space, but I check again with df -h. Then I run:

$ sudo apt-get -f install

If the amount of space it needs is less than what’s available according to df -h, I go ahead and finish the install. To be safe, I also do:

$ sudo apt-get update

If there’s not enough space, and I have more old versions of the kernel installed, I just repeat the above dpkg commands until I have enough space to finish the install.
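
Once apt is working again, a single command should clean out most remaining old kernels, at least on recent Ubuntu releases (it’s no help while apt itself is broken, which is why the dpkg commands come first):

$ sudo apt-get autoremove --purge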

The above is the happy path. If I didn’t have two versions older than the currently running kernel, I would try to remove the partially installed packages instead. Looking again at the output of:

$ dpkg --list | grep linux-image

if the newest version there is newer than the kernel currently running, I would try the above dpkg commands to remove the partially installed packages. Some of them won’t work, of course, since the package isn’t fully installed, but once all the partially installed packages are removed, presumably there would be more space and I could try:

$ sudo apt-get -f install

The reason I want two versions older than the current one is so that, if for some reason the current kernel doesn’t work, I can go back to a previous version. This is a cautious approach. If I were really stuck, I would remove all versions except the current one, though I’d probably make sure I could boot the current kernel first. I haven’t had to do this and I hope I never do, but…

Monday 26 May 2014

IT Lottery

So today I had to participate in one of the many little rituals of an enterprise IT project manager: Get someone signed up to my project time charge code.

I submitted the usual paperwork like I've done a number of times before, but this time I was told that the role I selected wasn't a role in the timesheet software, and neither was the individual's official job title. The individual is an employee of the client's company.

I felt somewhat smug that I was able to completely ignore the absurdity that an employee's job title isn't acceptable to the company's timesheet software. But then I realized what I was being asked to do: I had been told two titles that weren't acceptable to the timesheet system, but I had not been told what would be acceptable. I guess I'm supposed to randomly guess until I get the right answer.

Definitely a Dilbert moment. But then I asked myself, "Why would someone respond to me this way?" The person I was dealing with is a very nice, dedicated worker. They weren't just trying to make my life difficult.

I think it's because, in the enterprise IT world, there's no upside to providing service. An IT manager has too many demands and not enough people to meet them. In addition, the manager's path to promotion is through more responsibility, and the way to get that is by having more staff and a bigger budget. If you provide good service at the same staffing level, you're not meeting your boss's needs.

Also, there's no upside to taking responsibility. If you take responsibility, you can be blamed if, sometimes, you don't achieve the desired result. Better to leave all decisions to someone else, and give them no help, in case they blame you if your help turns out not to be helpful.

This culture so permeates our business that it's not absurd to just tell someone, "You got it wrong," without giving so much as a hint as to what the right answer might be.

No wonder people have such low expectations of IT.

Monday 12 May 2014

Gender Imbalance in IT -- It Wasn't Always Like This

I was in a meeting a couple of weeks ago at a client's office. This client still uses an IBM mainframe, and the meeting was about some mainframe activities for my project. I realized that four of the seven people in the room were women.

I think the whole mainframe department has slightly more men than women, but it's much closer to gender parity than most IT gatherings.

Then I thought back to my graduating class in Computer Science at the University of Saskatchewan in 1980. We were also close to gender parity.

This certainly shows that there's nothing preventing women from getting into, and being successful in, IT. It also shows that, during decades when women's numbers were increasing in most professions, IT was going in the other direction: women were being driven out of the field.

Sunday 20 April 2014

Mounting Windows Shares with Nautilus

From time to time I need to access a Windows share on a domain (not in a local workgroup) via Nautilus (the Linux Mint file browser I use). It always takes me too long to figure out the trick...

The trick: the domain has to be specified in upper case (or perhaps it's simply case sensitive, and it depends on how the domain administrator specified the domain). So in Nautilus, I do File -> Connect to Server..., then fill in the appropriate values in the dialogue that appears, with the domain in upper case.
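
For example, with hypothetical names (domain ACME, user alice, server files01, share public), the URL form of the same connection, the form used with gvfs-mount below, would be:

smb://ACME;alice@files01/public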

To debug issues when connecting to Windows shares, I open a Terminal window and type:

smbclient -U domain\\windows_user_name //windows_server_name/share

and follow the prompts.

If that works, I try:

gvfs-mount "smb://domain;windows_user_name@windows_server_name/share"

and follow the prompts. (The quotes are required because the semi-colon is a special character to the shell.) I have seen cases where the mount takes a minute or so to connect, so I have to be patient.

If the above works, I unmount the drive with:

gvfs-mount -u "smb://domain;windows_user_name@windows_server_name/share"

Thursday 27 February 2014

Relocating Another Data Centre

I recently took part in another data centre relocation project. I was one of a number of project managers moving some of the servers in a 1,300-server data centre. I moved about 200, and decommissioned another 50. I was directly planning and executing moves, so my role was different from the one I had on my previous project. It was good to experience a move from another position.

The project was successful in the end. I have to say that there were a number of lessons learned, which goes to prove that no matter how many times you do something, there's always something more to learn.

Unlike my previous experiences, there were three major organizations working together on this relocation: the customer and two IT service providers to the customer. All organizations had good, dedicated, capable people, but we all had, at a minimum, a couple of reporting paths. That in itself was enough to add complication and effort to the project.

The senior project manager identified this right from the start, and he made many good attempts to compensate for and mitigate it. We did a number of sessions to get everyone on the same page with respect to methodology. Our core team acted as a cohesive unit, and we all adhered to the methodology. In fact, across the project I think it's safe to say that the front-line people did as much as they could to push toward the project goals.

Despite our best efforts, we all, across the three organizations, had to devote significant effort to satisfying our own organization's needs. It's worth noting that much of this is simply necessary: organizational governance is a big issue in the modern economy, and appearing to have management control is a business reality.

So if you're planning a relocation, take a look at the organizational structures that will be involved, and take them into account when planning your data centre relocation project.

Friday 24 January 2014

Time Zone in Rails

There’s pretty good info out there about using time zones in Rails, and Rails itself does a lot of the heavy lifting. The Railscast pretty much covers it. It’s only missing a discussion of using JavaScript to figure out the client browser’s time zone.

Time Zone from the Browser

To get the time zone from the browser, use the detect_timezone_rails gem. The instructions give you what you need to know to set up a form with an input field that will return the time zone that the browser figured out. That would work perfectly if you were implementing a traditional web site sign-up/sign-in form.

However, I needed to do something different. Since I’m using third-party identity providers (Google, Twitter, Facebook, etc.) via the excellent Omniauth gems, I needed to be able to put the time zone as a parameter on the URL of the identity provider’s authorization request. Omniauth arranges for that parameter to come back from the identity provider, so it’s available to my app’s controller when I set up the session.

To add the parameter, I added this jQuery script to the head of the welcome page:

<script type="text/javascript">
  $(document).ready(function() {
    $('a.time_zone').each(function() {
      this.href = this.href + "?time_zone=" +
        encodeURIComponent($().get_timezone());
    });
  });
</script>

This added the time zone, appropriately escaped, to the URL for the identity provider (the href of the <a> elements). This worked because I had set each of the links to the identity providers to have class="time_zone", like this:

<div class="idp">
  <%= link_to image_tag("sign-in-with-twitter-link.png", alt: "Twitter"), 
    "/auth/twitter", 
    class: "time_zone" %></div>

In the controller, I did this (along with all the other logging-in stuff):

if env["omniauth.params"] &&
  env["omniauth.params"]["time_zone"]
  tz = Rack::Utils.unescape(env["omniauth.params"]["time_zone"])
  if user.time_zone.blank? 
    user.time_zone = tz
    user.save!
    flash.notice = "Your time zone has been set to #{user.time_zone}." +
      " If this is wrong," +
      " please click #{view_context.link_to('here', edit_user_path(user))}" +
      " to change your profile."
  elsif user.time_zone != tz
    flash.notice = "It appears you are now in the #{tz} time zone. " +
      "Please click #{view_context.link_to(edit_user_path(user), 'here')}" +
      " if you want to change your time zone."
  end
else
  logger.error("#{user.name} (id: #{user.id}) logged in with no time zone from browser.")
end

Of course, you may want to do something different in your controller.

Testing Time Zones

However you get your time zones, you need to test your app to see how it works with different time zones. YAML, at least in a Rails fixture, interprets anything that looks like a date or time as UTC, so by default that’s what you’re testing with. But that might not be the best thing.

I had read that a good trick for testing is to pick a time zone that isn’t the one your computer is in. Finding such a time zone might be hard if you have contributors around the world. I like the Samoa time zone for testing: it’s far from UTC, not many people live in it, and it has DST.

If you want a particular time zone in your fixtures, you have to use ERB. For example, in my fixtures I might put this:

created_at: <%= Time.find_zone('Samoa').parse('2014-01-30T12:59:43.1') %>

And in the test files, something like this:

test "routines layout" do
  Time.zone = 'Samoa'
  correct_hash = {
    routines(:routine_index_one)=> {
      Time.zone.local(2014, 01, 30)=> [
        completed_routines(:routine_index_one_one)
      ],
      ...

Gotchas

I found a few gotchas that I hadn’t seen mentioned elsewhere:
  • Rails applies the time zone magic when it queries the database, so if you change your time zone after you retrieve the data, then you have to force a requery, or the cached times will still be in the model. Shouldn’t be a problem when running tests, but is when using the console to figure things out
  • You can’t use database functions to turn times into dates, as these won’t use the time zone. No group by to_date(...) or anything like that

Tuesday 7 January 2014

Self-Referential, Polymorphic, STI, Decorated, Many-to-Many Relationship in Rails 4

Preamble

I wanted to model connections à la LinkedIn or Facebook in a Rails application. This means a many-to-many association between instances of the same class. That caused me some grief trying to get it hooked up right, because you can’t rely on Rails to figure everything out.

The other trick in my application is that the people involved in the connections might be users who have registered to use the application, or they might be people created by a registered user but who aren’t registered to use the application.

Concretely, and hopefully more clearly, I have “users”, who have registered, and I have people who can be involved in connections. In my app, the people who aren’t registered users are “patients”.

In the course of trying to get this all to work, I stumbled across three approaches to this type of problem:

  1. Polymorphic classes
  2. Single Table Inheritance (STI)
  3. Decorator pattern

The combination of the many-to-many association and the two classes took a lot of work to get straight. The Rails Guides were a great starting point, but I find that specifying Rails associations can be tricky when things aren’t completely straightforward, especially when you start chaining them together.

In the end, I decided to go with the Decorator pattern. But I’ll start with the one I threw out first: polymorphic.

Polymorphic

I got pretty far with polymorphic associations, but I couldn’t figure out how I was going to get a list of all people (patients and users) connected to another person. The methods that the Rails associations gave me could return all patients or all users, but not a combined list.

I realized in writing the preamble above that I probably should have recognized that what I was trying to model wasn’t really a polymorphic situation. In the examples I saw, polymorphic associations connect an object to another object from any one of a number of unrelated classes. Of course, hindsight is 20/20.

This post convinced me that a list of all people wasn’t going to come naturally from a polymorphic approach, so I stopped pursuing it.

Single Table Inheritance

I got fired up about single table inheritance (STI) as I was reading about how to make the polymorphic approach work. A good brief write-up is here: http://blog.thirst.co/post/14885390861/rails-single-table-inheritance. The Railscast is here: http://railscasts.com/episodes/394-sti-and-polymorphic-associations (sorry, it’s a pro Railscast so it’s behind a paywall).

Others say I shouldn’t use STI because it can cause problems. One problem arises if an object’s type has to change, and change because of user input: the view and controller are tied to a particular class, so you can’t change the object’s type based on user input.

So here’s the code. First, create the models:

rails g model person name:string type:string provider:string uid:string
rails g model link person_a:references person_b:references b_is:string

person.rb:

class Person < ActiveRecord::Base
  has_many :links, foreign_key: "person_a_id"
  has_many :people, through: :links, source: :person_b
  scope :patients, -> { where(type: "Patient") }
  scope :users, -> { where(type: "User") }
end

user.rb (obviously there will be functionality here, but this is what I needed to get the associations to work):

class User < Person
end

patient.rb (as with user.rb, functionality will come later):

class Patient < Person
end

link.rb:

class Link < ActiveRecord::Base
  belongs_to :person_a, class_name: "Person"
  belongs_to :person_b, class_name: "Person"
end

It was a little hard to get the associations to work. The key to making the has_many :links,... in person.rb work was the class_name: "Person" on the associations in link.rb.

With the above, I can do things like:

person = Person.find(1)
person.people.first.name
person.people.patients.first.name
person.people.users.first.name

That’s all pretty sweet, and I really considered using this approach. In fact, I may return to it. There’s a lot left to do with my application. However, I’m pretty sure that I will need to deal with cases like a registered user corresponding to multiple patients (e.g. people get created under different names). Eventually I need a way to consolidate them.

Decorator

In the end, perhaps the simplest was the best. I just decorated a person with an instance of a user when the person is a registered user. (This allows multiple people for a user, which might be useful for consolidating duplicate people.)

Here’s what I did. First, generate the models:

rails g model link person_a:references person_b:references b_is:string
rails g model person user:references name:string
rails g model user uid:string name:string provider:string

person.rb:

require 'person_helper'

class Person < ActiveRecord::Base
  belongs_to :user
  has_many :links, foreign_key: :person_a_id
  has_many :people, through: :links, source: :person_b

  include PersonHelper
end

I thought the person model should have has_one instead of belongs_to, but that would put the foreign key in the wrong model.

user.rb:

require 'person_helper'

class User < ActiveRecord::Base
  has_many :identities, class_name: "Person"
  has_many :links, through: :identities
  has_many :people, through: :links, source: :person_b

  include PersonHelper
end

lib/person_helper.rb:

module PersonHelper
  def users
    people.select { |person| !person.user_id.nil? }
  end

  def patients
    people.select { |person| person.user_id.nil? }
  end
end

link.rb:

class Link < ActiveRecord::Base
  belongs_to :person_a, class_name: "Person"
  belongs_to :person_b, class_name: "Person"
end

With the above, I can do things like:

person = Person.find(1)
person.people.first.name
person.patients.first.name
person.users.first.name
user = User.find(2)
user.users.first.name

Again, sweet. Same number of files as the STI version. Instead of subclassing, common functionality is handled by a mixin module.

Postscript

Another thing people don’t seem to like about STI is that it’s easy to end up with a big table full of columns that are used in only a few places. Most modern database management systems don’t waste a significant amount of space on unused columns, so I’m not sure what the problem is.

However, it got me wondering whether there’s a way in Rails to have more than one table under a model. Or, more to the point, could you have a table for the base model class and a different table for each of the subclasses, and have Rails manage all the saving and retrieving?

I’m sure I’m not the first person to think of this. But I’m not going to go looking for it right now.

Other Resources

Rails 4 guides on associations: http://guides.rubyonrails.org/association_basics.html and migrations: http://guides.rubyonrails.org/migrations.html.

Ryan Bates’ Railscasts on self-referential associations: http://railscasts.com/episodes/163-self-referential-association, and on polymorphic associations: http://railscasts.com/episodes/154-polymorphic-association.

Thursday 2 January 2014

Moving to rbenv and Installing Rails on Linux Mint 13

I'm back to doing a bit of Rails. As always, the world has moved on. Rails is at 4.0.2, and Ruby 2.0 is out. The Rails folks are recommending rbenv to manage different Ruby versions and their gems. I knew I still had some learning to do to use rvm properly, so I decided to invest the learning time in rbenv instead, since that's what the mainstream was using.

First, I had to remove the rvm lines at the end of my ~/.bashrc, ~/.profile, and ~/.bash_profile, and restart all my terminal windows.
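
For reference, the rvm lines in question usually look something like this (yours may differ):

export PATH="$PATH:$HOME/.rvm/bin"
[[ -s "$HOME/.rvm/scripts/rvm" ]] && source "$HOME/.rvm/scripts/rvm"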

I followed the rbenv installation instructions here: https://github.com/sstephenson/rbenv#installation, including the optional ruby-build installation.

Then, I did:

rbenv install -l

which showed 2.0.0-p353 as the newest production version of MRI. So I did:

rbenv install 2.0.0-p353
rbenv rehash # Either this or the next was necessary to avoid trying to install Rails in the system gem directories.
rbenv global 2.0.0-p353
gem install rails
rbenv rehash # Don't forget this again
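
A quick sanity check confirms that the rbenv shims, not the system Ruby, are now first in line:

rbenv version
ruby -v
rails -v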

Now I was ready to test a new application:

rails new example
cd example
rails server

Then I pointed a browser to: http://localhost:3000, and voilà.

I'm not sure I want to leave the rbenv global in place...