SEO for Ruby on Rails
Now that the NotSleepy camp is actively switching to mostly-Rails development for our web apps, I’ve been exploring how to accomplish the typical on-page SEO tasks commonly handled with PHP and JSP to get Googlebot and Slurp to snuggle up all cozy with your site. The following tips are just ways to use Rails to accomplish common tasks you should already be comfortable with in your current web app language of choice.
Search Engine Friendly URLs
NO MORE MOD_REWRITE! Boy, that feels good to scream. I’m sure many an SEO will agree that Mr. Ralf S. Engelschall’s creation was a beautiful one when he came up with the ‘Swiss Army knife of URL manipulation’, but damn, it can be so difficult to debug. I also find it cumbersome to have to manage my URL functionality outside of my application code. This is especially difficult when you have multiple developers working on different operating systems, filepaths, and httpd.confs.
Let’s say you had a list of companies you wanted to display, but instead of a dynamic URL like /company/show/12 you wanted to show a nice static URL like /company/12/acme-widgets.
In your company list view you would create a link to a company like so:
<a href="<%= url_for(:controller => 'company', :action => 'show', :id => company.id,
    :name => company.name.downcase.gsub(/ /, '-')) %>">
<%= company.name %>
</a>
Then in your config/routes.rb file you will add just one line:
map.companyshow 'company/:id/:name', :controller => 'company', :action => 'show'
Now your clean URLs will work no matter where your app is deployed, whether it’s a naive developer still stuck on Windows or a test server running SuSE.
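If you end up building those name slugs in more than one view, the gsub can be pulled into a helper. A minimal sketch, assuming a helper in app/helpers/company_helper.rb (the company_slug name and the extra punctuation handling are mine, not something from the route above):

```ruby
# Hypothetical helper: turn a company name into a URL-friendly slug.
def company_slug(name)
  slug = name.downcase.gsub(/[^a-z0-9]+/, '-') # collapse runs of non-alphanumerics into hyphens
  slug.gsub(/\A-+|-+\z/, '')                   # trim leading/trailing hyphens
end
```

The link then becomes :name => company_slug(company.name), and commas or ampersands in company names won’t leak into your URLs.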
UPDATE: This is not a good solution for clean URLs. Better answer:
Newer post for search engine friendly URLs
301 Redirects
The permanent 301 redirect is the hammer in the SEO toolbox; a must for moving a nasty site written in ASP to a slick new CMS, or for simply making sure that all non-www requests get redirected to www.yourdomain.com. Based on recent conversations with a friend from the darker side of the aisle, it appears that the big GOOG still isn’t doing a great job of dealing with 302 redirects, so make sure you get it right.
First generate a controller just for handling your redirects:
ruby script/generate controller Redir
Edit that controller so it looks like so:
class RedirController < ApplicationController
  def index
    headers["Status"] = "301 Moved Permanently"
    redirect_to params[:newurl]  # :newurl comes from the route below
  end
end
Finally create a new route for your old URL to the new URL in the config/routes.rb
map.connect '/someoldcrap.asp', :controller => 'redir', :action => 'index', :newurl => '/sweet-new-url'
This is just a simple one-to-one redirect, but you could easily extend it to something dynamic, like we did at the end of the SEOBook 301 redirects post, by adding a function that determines where to redirect to and placing it in helpers/redir_helper.rb.
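As a sketch of that dynamic version, the lookup could be a simple hash in helpers/redir_helper.rb. The LEGACY_URLS constant, the second example path, and the fall-back-to-homepage behavior here are my own assumptions, not details from the SEOBook post:

```ruby
# Hypothetical mapping of retired ASP paths to their new homes.
LEGACY_URLS = {
  '/someoldcrap.asp' => '/sweet-new-url',
  '/contact.asp'     => '/contact'
}

# Return the new URL for an old path; unknown legacy links
# fall back to the homepage instead of 404ing.
def new_url_for(old_path)
  LEGACY_URLS[old_path] || '/'
end
```

The redirect action would then call something like redirect_to new_url_for(@request.path) instead of reading :newurl from the route.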
Boost Your Page Load Speed with Page Caching
This one isn’t so much an SEO thing as it is a general user-experience nicety, but I don’t think it hurts you in the SERPs to have fast-loading pages, and it will definitely help your conversions if your visitors can see the full page before their super-short-American-attention-span decides to go look for Lindsay Lohan pics.
How simple is this!?
class SomeController < ApplicationController
  caches_page :index, :view, :list
end
In the above example Rails will cache the view for index, view, and list by creating a flat file in public and serving that file up until you explicitly invalidate the cache in your controller during an action such as an update.
Problems with Ruby on Rails page caching:
Page caching does not work with pages that have dynamic query strings, but of course you shouldn’t need query strings if you use the static URLs I detailed in the first segment of this post. Page caching also doesn’t work for pages that require authentication or rights management, but then you can’t cache information in any environment that requires such checking.
Don’t Worry About Session IDs and URL Rewriting
With PHP, one of the many things an SEO has to remember to check for is nasty URL rewriting in which session IDs are appended to the URL via standard URL rewriting and look something like this:
index.php?PHPSESSID=9f2c8d41e5a7b3
You often see it in the URLs of apps such as phpBB, and those session IDs on the URL mean that the Google, Y!, and MSN bots see an infinite number of indexable URLs on your site and either bail on any attempt to index the relevant content or simply dampen your rankings.
According to Lee Nussbaum, Rails handles sessions as follows:
- creates a cookie using an MD5 hash of data with tolerable amounts of entropy, though more would be desirable.
- seems to avoid session fixation attacks by taking session IDs only in cookies (which are basically site-specific) and not in links (which can be used to communicate information cross-site).
- makes a store for session state (e.g., @session['user']) available on the server, where it is found by session ID and not subject to manipulation by the user.
Show Bot-Specific Content with Partials
In your controller, grab the user agent from the request:
@useragent = @request.user_agent
Then in your view, pick a partial based on whether it matches Googlebot:
<% if @useragent.downcase =~ /googlebot/ %>
<%= render :partial => 'bot' %>
<% else %>
<%= render :partial => 'notabot' %>
<% end %>
A partial is a beautiful feature built into Rails that lets you stick with the DRY principle even at the presentation level by allowing you to create chunks of HTML and Ruby that can be reused in multiple places (such as a contact form). In the above code, we choose a partial to show based on whether or not the request came from Googlebot. I am sure there is a way to do this without the <%= and so many open/close Rails tags, but I don’t know it. If you do, please leave a comment.
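One way to trim the tags is to push the decision into a helper and render its result directly; a sketch, assuming a helper in app/helpers/application_helper.rb (the bot_partial name is mine):

```ruby
# Hypothetical helper: return the name of the partial to render
# for a given user-agent string.
def bot_partial(user_agent)
  user_agent.to_s.downcase =~ /googlebot/ ? 'bot' : 'notabot'
end
```

The view then needs only a single tag: <%= render :partial => bot_partial(@useragent) %>.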
Have some questions about Ruby on Rails and SEO? Post them in a comment and we’ll try to work them out.