Wednesday, August 14, 2013

Learning Ruby, Rails and Git on OSX

Every so often folks ask me what they need to know to learn ruby. I have a variety of resources I send them and thought it might be helpful to put those all into one blogpost that I can point to in the future. If anyone has any additional thoughts please let me know in the comments. Thanks!

There are a wide range of tutorials / books and online resources for learning the "typical" Ruby on Rails stack. Listed below are a number of educational resources as well as common tools used by members of the ruby community. 

Learning Ruby

Online Tutorials
     http://rubykoans.com/
     - Built by NEO Columbus, a technology consulting company in Columbus, OH, the Ruby Koans take an iterative, "zen-like" approach to learning the language. Each tutorial takes the form of a step on the path to "enlightenment" while hitting the major aspects of the Ruby language. Despite the gimmicky setup, this is one of the best tutorials out there. 

     http://tryruby.org/levels/1/challenges/0 
          - Produced by codeschool.com, Try Ruby is a walkthrough of the language borrowing from "_Why's Poignant Guide to Ruby." _Why's book is a bit… odd, but this tutorial still hits the major points of the language. 

     http://www.ruby-lang.org/en/documentation/quickstart/
          - ruby-lang.org is the official site of the language. This is their quickstart guide. It's okay, but a bit simple, and it doesn't cover many of the most useful libraries. 

Books
     http://pragprog.com/book/ruby4/programming-ruby-1-9-2-0
          - Referred to as "The Pickaxe Book", Programming Ruby is generally regarded as the authoritative print resource for the language.

     http://shop.oreilly.com/product/9780596516178.do
          - O'Reilly's version of a reference book. This may not have been updated for Ruby 2.0 yet.


Learning Rails   

Online Tutorials
     http://railsforzombies.org/
          - Also produced by codeschool.com, this is a great primer on Rails' take on MVC and the core of the framework.
     
     http://ruby.railstutorial.org/
          - I've not actually run through this tutorial, but it comes recommended by DHH, the original creator of the Rails framework.

     http://guides.rubyonrails.org/
          - Not really a tutorial, but the documentation on the official site is some of the best language documentation I've ever read. The guides are a primer on the major aspects of the framework and are worth reading through. 

Books
     http://pragprog.com/book/rails4/agile-web-development-with-rails-4
          - Keeping with the theme of The Pragmatic Bookshelf publishing the definitive references in the Ruby world, Agile Web Development with Rails is the most commonly recommended book on Ruby on Rails. 

Learning Git
Most Rails projects use Git as their source control system. Specifically, most projects host their repositories on github.com. Here are a few resources for someone either learning SCM with Git for the first time or transitioning from another SCM solution. 

Online Tutorials
     http://gitimmersion.com/
          - Another online tutorial by NEO, Git Immersion walks through the most important features of Git including places where it differs from traditional SCM. 

     http://try.github.io/
          - Also from Code School, this interactive online tutorial covers the basics of Git.

     http://www.youtube.com/GitHubGuides   
          - I haven't walked through these videos, but they come from GitHub, which lends them a bit of credibility.

Books

     http://pragprog.com/book/tsgit/pragmatic-version-control-using-git
          - The Git installment of Prag Prog's "Pragmatic Version Control" series. If you are used to the previous books in the series on CVS and SVN, this should feel familiar. 


Testing
The Ruby world tends to be pretty strongly behind the notion of unit testing as a means for building and debugging your application. Further, a very large contingent of the community is focused on a subset of unit testing called BDD, or Behavior Driven Development. The main difference between BDD and traditional testing is intent: BDD is focused on writing tests in a language oriented toward the business requirements of an application rather than the technical requirements. In Ruby, a number of tools have emerged to make this easier. The following tools are commonly used in Rails projects but are not part of the core Rails stack and will not be covered by the tutorials listed above.

RSpec - The most commonly used BDD tool; most Rails projects substitute RSpec for the baked-in Test::Unit.
     Github Repo: https://github.com/rspec/rspec
     Online Tutorial: http://www.codeschool.com/courses/testing-with-rspec
     Book: http://pragprog.com/book/achbd/the-rspec-book

Cucumber - Cucumber's style of testing is sometimes referred to as Acceptance Test Driven Development. The main distinction is that Cucumber tests are usually full stack, including browser interaction, where RSpec tests are at a unit level. 
     Github Repo: https://github.com/cucumber/cucumber
     Book: http://pragprog.com/book/hwcuc/the-cucumber-book
     Related Tools: Capybara (https://github.com/jnicklas/capybara), Watir-Webdriver (https://github.com/watir/watir-webdriver)  


Text Editors and Development Environments
There's a pretty wide range of environments people use to develop with Ruby. The most common is a very terminal-focused workflow involving a good understanding of Bash, ZSH or some other common Unix shell. It's recommended that anyone learning Ruby in a Unix-style environment for the first time spend some time familiarizing themselves with the basics of the Bash shell and get comfortable working in the terminal. 

http://shop.oreilly.com/product/9780596009656.do - a great book for learning Bash.

There's also an assortment of text editors people tend to work with, ranging from in-console editors like Vim and Emacs to full-fledged IDEs like RubyMine. Here are some of the better options. 

VIM - http://www.vim.org/ - Over 20 years old, Vim is still regarded as one of the most customizable programming editors out there. There are thousands of plugins to improve your workflow and infinite ways to modify the editor to fit your needs. The downside of Vim is that it employs a somewhat distinct "modal" approach to editing, requiring a significant learning curve. Vim also puts a lot of emphasis on keyboard efficiency, requiring the user to master some somewhat esoteric keyboard commands to accomplish common tasks. Platform-specific wrappers for Vim are available for any major OS.

EMACS - http://www.gnu.org/software/emacs/ - Clocking in at around 35 years old, Emacs has also withstood the test of time. Emacs is as customizable as Vim and just as universal, running on any major OS. One advantage of Emacs is that it is built on top of, and allows the execution of, Lisp scripts. This makes the editor extremely customizable while using a "real" programming language, unlike Vim's homegrown Vimscript. Also worth noting is that a number of default Emacs shortcuts work throughout OSX. Platform-specific wrappers for Emacs are available for any major OS.

Sublime Text 2 - http://www.sublimetext.com/2 - Sublime Text 2 has quickly become the most popular programmer's editor in the Ruby world. It runs outside of the console and doesn't depend on a complex and specific set of shortcuts and commands, so the learning curve is relatively low. It also supports the old TextMate bundle (plugin) architecture, allowing users to import any of the bundles written for the previous king of Ruby editors - TextMate. 

RubyMine - http://www.jetbrains.com/ruby/ - For those looking for a more traditional IDE, RubyMine is the best on the market. It offers integrated source management, test execution, refactoring support, auto-completion and other common IDE features. One downside: it has (or at least had when I last used it) a tendency to run a bit slowly. That said, JetBrains makes great development tools and RubyMine is always improving. 

Ruby Development on OSX
I'm most familiar with building Ruby / Rails applications, so I'm going to include a quick run-through of the stages involved in setting up an OSX development environment. I'm not actually setting up a machine right now and documenting the steps, so there may be a few mistakes here; I'll update this the next time I have a chance to run through the process. 


  1. Install XCode and the Command Line Development Tools
    You can get XCode from the App Store or by visiting https://developer.apple.com/xcode/ and clicking the link to the app store.

    Once installed you want to open XCode and visit the XCode -> Preferences -> Downloads screen, then click "Install" next to Command Line Tools.
  2. Install Homebrew - http://brew.sh/
    Homebrew is an excellent package management tool for console/daemon-style apps on OSX. You will use it to install any number of needed servers (MySQL, Postgres, Redis, MongoDB, etc.).
  3. Install RVM - https://rvm.io/
    The Ruby Version Manager, RVM, allows you to have different versions of Ruby on your machine at the same time. This is useful when working on multiple projects, and it also ensures that the pre-installed system Ruby is left unmodified.

    Make sure you follow the instructions when installation completes to add RVM to your path. 
  4. Install Ruby 2.0 and make default.
    You may need a different version of ruby for a specific project but it's a good idea to install the latest version for general use.
    > rvm install ruby-2.0.0
    > rvm use ruby-2.0.0 --default
  5. Install bundler gem
    Bundler will be used to download every other important rubygem for your project. You need to install it up front so you can later download the rest of the necessary libraries.
    > gem install bundler
  6. Brew install the necessary databases
    > brew install mysql
    > brew install postgresql
    > brew install mongodb

    Make sure you follow the instructions after installation to add the databases to system startup.
  7. Create an SSH Key for use with Github (and other services)
    > ssh-keygen
    Follow the on-screen prompts; you can optionally enter a passphrase or just press enter. This will create a folder, ~/.ssh/, with two files: id_rsa and id_rsa.pub. id_rsa is your private key - don't hand it out to anyone; treat it like a password. id_rsa.pub is your public key. You can upload that to your GitHub account under Account Settings -> SSH Keys.
  8. Setup git global configuration options.
    > git config --global user.name "Your Name"
    > git config --global user.email "youremail@domain.com"
  9. git clone your project
    Create a folder for your projects, (I use ~/Projects) and clone the repository there. The command will look something like this.
    ~/Projects> git clone git@github.com:GHAccount/ProjectName.git
  10. cd into the directory and follow any instructions from RVM.
    ~/Projects> cd ProjectName

    If RVM is set up in the project you'll see a big wall of warning text, and possibly an error about not having the proper Ruby version / gemset. To fix the Ruby version problem, just run rvm install <version>. Once that's done you can cd .. back to ~/Projects, then cd ProjectName to try again.
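
That wall of text typically comes from an .rvmrc file committed at the project root (newer RVM versions also support a .ruby-version / .ruby-gemset pair). A sketch, with the Ruby version and gemset name as placeholders:

```
# .rvmrc in the project root - RVM runs this automatically on cd
rvm use ruby-2.0.0@ProjectName --create
```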
At this point directions will vary based on your project; it's best to consult your team for further setup instructions.
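
One note on step 5: Bundler works off a Gemfile checked into the project root. A minimal sketch - the gem names and versions here are illustrative, and your project's Gemfile will differ:

```ruby
# Gemfile
source "https://rubygems.org"

gem "rails", "4.0.0"   # illustrative version
gem "mysql2"           # or pg, mongoid, etc., depending on your database

group :test do
  gem "rspec-rails"    # BDD testing; see the Testing section above
end
```

Running bundle install from the project directory then resolves and downloads everything listed.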


Sunday, February 3, 2013

Specing an initializer block in a Rails Engine

I'm currently writing a Ruby gem that acts as a plugin CMS for an existing Rails application. To do this I'm learning a lot about Rails Engines.

I may write more about my experiences with Rails Engines, but for those who haven't used this (awesome) feature yet: a Rails Engine is basically a gem (or plugin... for now) that not only interacts with a Rails application, but can inject its own controllers, models, views and other standard Rails components into your application.

As part of this CMS I want to dynamically map routes to the controller path in the gem that handles displaying a page. So if the user creates a page called Foo with a "slug" of foo/bar I want to dynamically bind a route like:

get "/foo/bar", :to => "cmsgem/pages#display"

This should happen when a page is published (if the route doesn't already exist). It should also happen when the application spins up, looking at all published pages and binding them.

To do this you can define an initializer in your main rails engine class. It looks something like:


module Cmsgem
  class Engine < ::Rails::Engine

    initializer 'cmsgem.bind_dynamic_routes', :after => :disable_dependency_loading do |app|
      app.routes.draw do
        Cmsgem::Page.where(:published => true).each do |p|
          get "/#{p.slug}", :to => "cmsgem/pages#display"
        end
      end
    end

  end
end

(Note: if you want to know why I have the :after => :disable_dependency_loading option, check this article.)

The problem I ran into was: how do I test this? This code executes when Rails initializes. RSpec spins up its access to Rails prior to running your tests, which means I can't just create a Page and test for the existence of its route - the page would be created after initialization and never get picked up when the code block above runs.

So I dug into the routing API a bit, and came up with this:


describe Cmsgem::Engine do
  it "dynamically adds routes from pages at initialization" do
    page = Cmsgem::Page.create(:slug => "rspec/test", :title => "Rspec Test", :content => "This is an rspec test")

    initializer = Cmsgem::Engine.initializers.select { |i| i.name == "cmsgem.bind_dynamic_routes" }.first
    initializer.run(Rails.application)

    route = Rails.application.routes.recognize_path("/rspec/test", :method => :get)
    route.should_not be_nil
  end
end

Basically what's happening is that I'm finding my custom initializer in the collection of initializers that my gem runs, then explicitly rerunning it. You have to pass in the rails application as it is used to bind the routes in the gem. 


I'd be curious if there's a better solution to this problem - maybe a way to set up a record in RSpec pre-initialization, or a better way to reinitialize Rails. If anyone has any suggestions, let me know. Hopefully I'll post more about this gem as I go along. 



Saturday, March 3, 2012

Best Buy Needs to Change

12 years ago I got my first job. It was part time selling computers at my local Best Buy in Toledo, OH. I was an incredibly shy kid and as a result I almost didn't get hired. The sales manager at the time asked me to sell him a pen in an interview, a task I found incredibly difficult. I didn't know how to sell, all I knew was that I loved computers and I loved browsing the aisles at Best Buy looking at the newest stuff. I wanted to work there, partly because it looked cool, and partly because, well, there was a great employee discount.

I got lucky. That store actually had two sales managers and Larry Neal, the other manager, saw something in me that I honestly didn't see in myself. Larry taught me that even if you're a shy, introverted person you can put on a persona of a salesman. I went on to be a pretty successful part timer at that store. I made MVP a couple times and consistently hit my numbers.

So you could say that I owe Larry and Best Buy a debt of gratitude. That job was my first chance to learn how to interact with strangers. It taught me how to make a convincing argument, how to present myself as helpful and sincere when working with people. I've learned a lot at every job I've ever had but I probably grew more as a person working for Best Buy than any job I've had since.

I tell this story to make it clear that I'm not a Best Buy "hater." Quite the opposite, in fact. I used to love opening up their Sunday flier and seeing what new cool stuff was on sale. I still shop there every Black Friday, and I have camped on their sidewalk on Thanksgiving night more times than I can remember (seriously, the cold has wiped out my memories). I am a fan through and through, but like many fans of a franchise on the downswing, my hope is dwindling.

In the past 3 years I cannot recall one situation where I've had a good experience shopping at a Best Buy. That isn't to say they are overtly bad; we're not talking consumerist.com-style interactions with employees or pseudo-scams. My problems have been much more subtle, but I believe the issues I'm encountering will do more to hurt their brand than even the biggest of PR disasters. Theirs will be a death by apathy, as their customer base realizes that amazon.com and other online competitors are simply more convenient and offer a better shopping experience than walking into a Best Buy.

As an example I'll talk about today's trip. It's one of many similar experiences but highlights some of my frustrations.

My wife and I walked into the store today, a particularly busy one out in Mentor, OH. I was there to spend a Reward Zone certificate and a couple gift cards on a Kindle. The first problem we ran into was that neither of us knew where to find the Kindles. This is partly because they are either frequently moved or located differently depending on which store you're in. First I thought maybe the mobile department would have them; I swear I've seen e-readers over there before. No. OK, what about media? I remember they used to stock hardcover books a few years back, and an e-reader isn't much different. Nope. Last stop is computers, where we finally find what we came for.

We walk up to the display. I'm not quite sure yet if I want the normal entry-level Kindle or the Kindle Touch. This is a problem a brick-and-mortar store is uniquely qualified to help with: I can't pick up and play with something on amazon.com, but I can at Best Buy. So first I walk over to the entry-level Kindle. I pick it up and immediately an alarm goes off. Great, now everyone in the department is looking at me like I'm some kind of petty thief, all because I wanted to play with something on display. Let me be clear: this is not the first time this has happened to me. I've set off alarms on tablet computers, digital cameras, just about anything you pick up and hold in order to try it out. The salesman comes over to turn off the alarm and even confirms that this particular one is prone to frequent false activations.

So while the alarm is being deactivated I pick up the Kindle Touch and play around. I'm not too impressed, so I switch back to the classic Kindle. I find out this unit has been damaged by customers, so it was reset to factory settings and now doesn't have any books on it.

I walk away to head to car audio to set up an appointment for my wife's stereo install. This goes pretty well; it's a little slow to get the attention of a salesperson, but all told everything is handled well.

I've had time to think now, so we head back to computers and I stand around for ten minutes while all the salespeople are busy. I want the $79 Kindle but there are none to be seen on the shelves, which leads me to believe my only option is to ask for one, meaning a lengthy wait for an available sales clerk and then another wait to check out. All for a product not much more expensive than the $60 video games I can just pick up and carry to the counter.

Finally we're rung out, which goes well, and I'm on my way out the door. No receipt check this time (another practice that gives retail a bad name, in my opinion).

Compare this to a trip to another retail electronics store - the Apple store.

When I bought the original iPad my trip to Apple's store went like this -

I walked in and an employee quickly asked my name and what I was there for that day. I told her I came to look at the iPad. She pointed me to a large display off to the right where I was able to pick one up and play with a number of apps that had been preloaded, and carefully selected, to show off the device's best uses. After a couple minutes another employee walked up and said "Hi Josh, I hear you're interested in the iPad." I had a couple of questions, all of which he was able to answer, then I told him I'd like to buy one. He pressed a button on his headset and requested someone bring up the product. While we waited he explained their AppleCare plan, which I politely declined, and then I was able to check out right there as the employee ran my credit card on his iPhone, which doubles as a point-of-sale device, and emailed me my receipt. Meanwhile a third employee came from the back with my new iPad in a bag and ready to go. I walked out and the greeter thanked me for shopping.

The point of all of this is simple. The success of retail today is all about the experience, and the experience of shopping at Best Buy is frustrating. Terrible ambience, employees who tend to be uneducated about the products, a confusing store layout where it's difficult to find anything, product displays that don't work, long waits for help when you need it, and the feeling that you've been labeled a shoplifter the moment you walk in the store.

I love going to the Apple Store, and clearly I'm not alone. They get it, and guess what - they don't give up anything to get there. You know those guys at the front of the Best Buy who check your receipts? The security guys (called LPs, or Loss Prevention, when I worked there)? Apple has that too, but she's a greeter who helps you get what you need and makes sure your name is in the queue for help as soon as you walk in the door. She's also looking out for someone walking out the door with an iPod in their pocket, but you don't realize that because she's genuinely helpful.

What about those dreaded add-ons, the extended warranties and accessory sales? Apple does that too, but they convince you it's good by making it part of the experience of playing with the product. They place the awesome 27" monitor with the Mac mini and the Magic Trackpad. They put information about AppleCare right next to the product and treat it not as an upsell but as a great product in its own right.

I've talked long enough and I could write a book on the things I think Best Buy could do to improve their experience but I'm going to wrap up with a list of a few heavy hitters.

1) Change your LP team into a customer care team. Give them an in-store intercom, and when someone walks in with a purpose make sure they get to the right place and there's someone waiting there to help them with their buying experience.

2) Stop putting out demo units that don't work. This is such a pet peeve of mine. If you walk into any Best Buy in the country, their mobile department will be made entirely of plastic phones with stickers made to look like the phone you really want. They don't work at all; they are completely fake. Stop that. Spend the money, open the box, work with your partners to get demo units, whatever you have to do. If I'm spending time to leave my home and go to your store, I want the opportunity to use the product, not look at a plastic brick of a phone. The same applies to any product anywhere in the store. This means no more locked screen savers on the computers, no more glued-down remotes for TVs and no excuses. Customers will break things, they will steal them... deal with it.

3) Stop treating the people who are still showing up as criminals. We both know most theft in retail comes from the employees, not the customers. I'm not saying you have to put out untethered demo units, but pay attention to how you tether. I should be able to comfortably use the product without your security getting in the way. For a phone that means a 4-6 foot cord, not a 1 foot cord. For a Kindle that means at least 2'. And for the love of all things holy, ditch the ridiculously embarrassing alarms and come up with a way to draw attention to an untethered product without your customers getting a stare-down. Remember that intercom thing in #1? Maybe you can put out a targeted announcement in your employees' earpieces that a product is disconnected.

4) Make the checkout process better. Seriously just steal this one from Apple because they do it right.

5) Stop putting low cost items in cages. Sorry, but $79 is far too low a price to be locked up. I should be able to walk in, pick up a Kindle and take it to a checkout counter (or someone to check me out, see #4). I understand the $1500 laptops and $700 iPads.

6) If something is locked up, have dedicated runners instead of making customers wait for a sales guy who is constantly busy. Maybe your Geek Squad people could double as this on lighter days.

7) Your store layouts feel dated and confusing. There shouldn't be three places where an e-reader could reasonably live. I have no real suggestions here other than to look at what Ikea does for appliance sales and maybe borrow a bit of their "small homes" model for your bigger departments. You actually bought Magnolia a decade or so ago, and they did this well.

8) Get rid of the furniture departments. It's always been half-assed and it makes you look bad. If you must keep it around, then take some pride in it and come up with a way to make it work. Maybe use the Ikea-like homes from #7. This one is kind of random, but it's always bugged me.

9) Display the accessories for the product with the product. Put them at eye level and make them relevant. That means Kindle cases should be next to the Kindle. The ink for the printer, even if it's on an end cap, should be next to the printer. Never, ever let the wrong accessory sit next to an incompatible product (this happened to me once, and I almost walked out with 60 dollars' worth of the wrong ink for a printer). This is your bread and butter; your customers shouldn't have to hunt to help you sell to them.

That's just a few ideas. I'm no retail expert and these could all be terrible from a business perspective. Clearly some of them will cost money to implement, but I really do believe that in order to compete with online retailers, Best Buy has to make the most of the advantages of being a brick-and-mortar store. That means the convenience of getting your product the day you want it, the ability to see related products and get those too, and the ability to hold, feel, touch and play with the thing you're thinking about buying.








Sunday, January 15, 2012

Mapping (Multiple) Legacy Databases With Datamapper

We're starting in on a new project at work and it comes with some unique requirements (don't they all?)

To get things going so the rest of the team can take over and run with it, I spent a couple days figuring out DataMapper, the Ruby ORM, and mapping a number of legacy data tables.

This application is unique in that it will run against 3 separate databases. Two of them come from an older version of the application that the client built several years ago, we'll call these databases db1 and db2. The third database is where any custom data will be held for the web interface itself. We'll call it web1.

The first trick to getting DataMapper to work in a Rails application is knowing what to actually install. DataMapper is a general-purpose ORM and can be used quite effectively outside of Rails; as such, the documentation on datamapper.org doesn't really focus on Rails. No problem - take a look at the dm-rails gem.

dm-rails overrides a number of Ruby on Rails' default generators, patches rake db:migrate to work with DataMapper's upgrades and automigrates, and even integrates well with RSpec and Cucumber. Also worth noting: it worked fine on the latest and greatest (as of this blog post) - Rails 3.1.3 with Ruby 1.9.3.

Part of setting up dm-rails is getting your database.yml configured correctly. The documentation on the github page is great but as an added example here is a version of my config (with secure info removed.)

db1_defaults: &db1_defaults
  adapter: mysql
  username: [USERNAME]
  password: [PASSWORD]
  host: localhost
  database: db1

db2_defaults: &db2_defaults
  adapter: mysql
  username: [USERNAME]
  password: [PASSWORD]
  host: localhost
  database: db2

web1_defaults: &web1_defaults
  adapter: mysql
  username: [USERNAME]
  password: [PASSWORD]
  host: localhost
  database: web1

development:
  database: web1
  <<: *web1_defaults
  repositories: 
    db1: 
      <<: *db1_defaults
    db2: 
      <<: *db2_defaults

test: &test
  database: web1_test
  <<: *web1_defaults
  repositories: 
    db1: 
      database: db1_test
      <<: *db1_defaults
    db2: 
      database: db2_test
      <<: *db2_defaults

Once your database is configured you should be able to run rake db:migrate and see success messages. At this point you have no models configured but you can see that it can connect correctly.

The next step is mapping your existing tables.

db1 and db2 have quite a bit of existing data, their own data formats and a somewhat inconsistent naming structure. This resulted in a lot of custom mapping, and eventually I just decided to explicitly spell out every mapping for consistency (even where the DataMapper conventions would have applied).

Here's an example model file. Again I've removed business context by renaming some things. I'll call out some pitfalls after.

class Procedure
  include DataMapper::Resource

  def self.default_repository_name
    :db1
  end
  storage_names[:db1] = "procedures"

  property :id, Serial, :field => "id"
  property :name, String, :field => "procnam"
  property :length, Integer, :field => "avgminlen"

  validates_length_of :name, :max => 25
end

The first important thing to note is this method declaration.

  def self.default_repository_name
    :db1
  end


This is how you tell Datamapper which database this model's table is in. In this case Procedures come from the db1 database.

Next thing worth noting is:

  storage_names[:db1] = "procedures"

Which is how you set the table name. In this case the table name was pluralized. If you find that all of your tables follow a naming standard (something like tblProcedure) there are ways to override the naming globally. In this application naming standards were fairly inconsistent, so I ended up adding this line in every model even when it was technically unnecessary.
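
As a sketch of that global override (hedged - double-check the constant names against your DataMapper version), you can set a naming convention on a repository's adapter once, e.g. in an initializer, instead of repeating storage_names[] in every model:

```ruby
# Use underscored, pluralized table names for everything in :db1...
DataMapper.repository(:db1).adapter.resource_naming_convention =
  DataMapper::NamingConventions::Resource::UnderscoredAndPluralized

# ...or supply a custom convention for a "tblProcedure"-style scheme.
DataMapper.repository(:db2).adapter.resource_naming_convention =
  lambda { |name| "tbl#{name}" }
```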

Also, for each varchar-type column I added a length validator. The varchar lengths varied wildly in my application, from 1-character fields up to 50. The best way I could come up with to deal with that was to simply validate that the value never overflows. I'd love suggestions on any other config flags I could add so that DataMapper knows what the column is structurally.
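
One flag that may help here (a sketch - verify against your DataMapper version): String properties accept a :length option, which records the column size on the property itself, and with dm-validations loaded the matching length validation is generated automatically:

```ruby
# Instead of property + a separate validates_length_of:
property :name, String, :field => "procnam", :length => 25
```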

One last thing to note. The override to default_repository_name MUST BE BEFORE THE PROPERTY DECLARATIONS. Sorry for the all caps; I spent about 2 hours fighting DataMapper before figuring that one out. I even made an SO post on it before I figured it out.



Saturday, January 14, 2012

Codemash 2.0.1.2

2012 marks my fourth year at Codemash. I've written about this conference before, but in short: if you don't go to Codemash, you should. For your (or your company's) dollar it is the best value you will get. If you run a company you should send your people to Codemash, or sponsor it, maybe even host a bacon bar (thanks DI).

So the promotion is out of the way, what I really want to do with this year's recap is talk about each of the sessions I attended because they were all phenomenal.

The Scala Koans - Dianne Marsh & Daniel Hinojosa


The first precompiler was a half day session that covered the Scala Koans. The Scala Koans are a Scala implementation of the Ruby Koans.

Koans, in the context of computer programming, are "short, enigmatic stories that usually involve some kind of epiphany" (Jim Weirich). The idea of a language koan exercise is that by completing a series of short questions you will learn the core of a language. Jim Weirich and Joe O'Brien really started this approach to learning programming languages with the aforementioned Ruby Koans, but now implementations exist in Scala, Objective-C, C# and many other languages.

Scala is a pseudo-functional, pseudo-object-oriented language with static typing and a dynamic-feeling syntax. So... it's a mutt, but a very interesting mutt. As a Ruby / C# dev I saw a lot of promise in having the power of a runtime like the JVM, static typing, and the comfort of Ruby-style syntax.

If nothing else it's worth checking out the Scala Koans to get a really quick overview of another language.

Advanced Patterns with Ruby on Rails - Jeff Casimir

This precompiler was targeted at existing Ruby developers with a few projects under their belt who wanted to learn more about applying various design patterns and concepts to their Ruby (and Rails) applications.

Among the topics covered were isolating business logic and limiting Law of Demeter violations with Draper, which uses the decorator pattern, and slimming down controllers by refactoring ActiveRecord calls into a single method and limiting the number of instance variables used in the view.
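Draper's API has shifted between versions, so rather than quote it from memory, here is a gem-free sketch of the underlying decorator pattern using Ruby's stdlib SimpleDelegator (the model and method names are made up):

```ruby
require 'delegate'

# A hand-rolled decorator, the pattern Draper packages up for Rails:
# wrap the model and keep presentation logic out of both the model
# and the view. Unknown methods fall through to the wrapped object.
class ArticleDecorator < SimpleDelegator
  # Avoids a Law of Demeter violation in the view (article.author.name).
  def author_name
    author ? author.name : 'Anonymous'
  end

  def formatted_published_at
    published_at.strftime('%B %e, %Y')
  end
end

# decorated = ArticleDecorator.new(article)
# decorated.title        # delegated straight through to the article
# decorated.author_name  # presentation logic lives on the decorator
```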

Keynote: Rethinking Enterprise - Ted Neward


Thursday's morning keynote focused on how enterprise software needs to reevaluate what it values: how reuse is no longer as significant as it was in the past, and how we should not only look at the tools we are comfortable with but look for the best tool even if it isn't part of our "enterprise stack." It warned against being a technist: "This platform is always better than that platform."

It's the Little Things - Brad Colbow


This talk focused on the little issues in UX design and how, put together, they can add up to larger problems with your applications. He used the example of the Cleveland Public Library (and really OverDrive). He had previously drawn a comic strip to express his frustrations with their interface for downloading audiobooks, which became quite popular. Taken individually, the issues he ran into were minor, but as a collection they were completely infuriating. He advocated small changes, like remembering to turn off autocorrect and autocapitalize on a login form for iPhone users.


Brad also talked a bit about his experience building an iPad magazine and how so many of the other iDevice magazines tend to make bad assumptions about their audience and use custom gestures that are confusing and easily forgotten.

Building Windows 8 Applications with HTML5 and jQuery - Rich Dudley


Rich covered the new Windows 8 developer preview and how to use HTML5 to craft a Metro-style application that can be deployed via the Windows Store.

Regardless of what you think about Windows 8, I'm really excited about the prospect of using HTML5 and CSS to lay out the UI of a client application. I've always thought Microsoft's previous attempts at UI layout were poorly implemented. HTML, in my mind, is a more intuitive way to build a UI, and I prefer working at the markup level to using a designer (yes, I know you could lay out XAML in a text editor too).

It looks like Microsoft is serious about not breaking the standards-based nature of the markup, which is encouraging. Looking at the examples shown, they make liberal use of data- attributes to bind data and actions, but all of that is standards-compliant. I saw no custom Metro-only tags, only attributes and data- attributes. WinRT is exposed through a JavaScript layer that can be granted system-level access.

A number of precautions are taken with the applications, including making the developer select one of two modes, essentially net or client. Net mode sandboxes the application and its data and prevents access to most system resources. Client mode provides system access but isolates most network calls. The apps run like mobile apps: they do not remain in memory when switched away from, but instead must save their state and recover it when focus changes.

In short, Metro looks promising, especially for non-.NET devs who want to build something on Windows but don't want to get too far out of the comfort zone of web development.

Mastering Change with Mercurial - Kevin Berridge

To be perfectly honest, I wasn't going to go to this talk, except that Kevin's an old college friend of mine and I wanted to be there to support his first talk at a larger conference like Codemash. By the time I left, though, I was completely blown away and realized I had very much been a technist, just like Ted Neward described.

See, there's a bit of a git / mercurial rivalry, kind of like vim / emacs or other opposing tools. Developers love rivalries, and I have very clearly fallen on the git side. Truth be told, though, that has more to do with GitHub being awesome than git itself. So I spent most of Thursday (jokingly) harassing Kevin about Mercurial and asking if he was going to admit at the end of his talk that git was simply better. Instead he came out swinging and knocked everything I knew about distributed version control on its proverbial ass.

Mercurial is awesome... dammit... it pains me to admit it and then publicly post it in this blog but it really is.

First off, Kevin did the best "intro to the thing I'm about to talk about" that I've ever seen. Most "advanced" talks end up spending 30 minutes on an intro and barely get into advanced content. Kevin played a minute-long video and narrated it, and if you had any experience with distributed version control (including git) you were now up to speed. Seriously, I'd basically never used hg, but the video was all I needed to map things onto my normal workflow in git.

He then covered things like viewing and searching history, simple aliases and tricks that can make your life better, and ended with some of the coolest history-management work I've ever seen using hg's mq (queues) extension. I may also be convinced that hg's way of labeling branches (by default) is better than how git does it.

The plan now is to get Kevin's slides, convert them to git, and then jointly present the topic at Burning River Devs as how to do these things in both worlds.

CoffeeScript is for Closers - Brandon Satrom

Brandon's talk was an honest evaluation of why CoffeeScript is worth looking at. He made every attempt to avoid the CoffeeScript > JavaScript false debate people tend to have online. Instead he made a number of points as to why, even if you stick with normal JS for your daily work, you should give CoffeeScript a shot. The argument that best resonated with me was that by using CS and then reading through the JS it compiles into, you can get an idea of how to be a better JS programmer. CS doesn't compile into impossible-to-read minified JS; it compiles into very readable code that makes every attempt to follow best practices. If you're a bit of a JS noob, it might be worth rewriting some of your stuff in CS, compiling it, and seeing how it turns out.

Dealing with Information Overload - Scott Hanselman

I went to this talk because Scott Hanselman was speaking; I really didn't even read the title or abstract. It was added kind of last minute because of some other cancellations.

Scott is a freaking great presenter, which makes sense, as that's what he does. This talk could have been a keynote. It focused on real-world hacks to make your life easier to handle, things like not answering email until a set time so that you establish the precedent that you work in the morning, when you're most "fresh." One hack I adopted right away was setting up an inbox rule to redirect email where I am only CC'd into another folder, meaning I focus on the email that addresses me directly.

Be The Input - Kinect With Your Computer - Ben Barefield

Ben's talk was on the Microsoft Kinect API for Windows. Ben was lead dev on a small racing game that uses the Kinect sensor to control the carts on the screen. His company, SRT Solutions, had the game on display at their booth.

The thing that most surprised me about this talk was how easy it is to get up and running with both Kinect and the XNA platform for game development. Ben was able to build a small test app that tracked a user's hand on the screen using Kinect in about 20 minutes during the talk. A really impressive demo, and now I need to learn a bit more about this platform.

Everything Else


Some of the best parts of a conference like this don't happen in the sessions but rather in the side conversations you have with friends, peers, and strangers in the hallways. I learned about dynamic vs static typing on a more fundamental level, closures, JVM-backed languages, the logistics of spinning up dynamic servers with Chef and Vagrant, and a million more things meeting people in the halls and chatting over beers at the evening parties.

Codemash is worth it for just the in-betweens: the 20 minutes between sessions and the events that happen between Thursday and Friday. You can't possibly make it to everything, but the joy of Codemash is you can always go next year.




Edit: Left off the speakers for the precompilers. Sorry.

Friday, December 30, 2011

Yet another move

So my year with Squarespace is up and they've come knocking to renew. I don't really want to pay for a blog I update maybe three times a year, so I'm moving this all over to Blogspot.

I migrated all the posts manually (the export thingy didn't work). I even moved over the comments that weren't spam. You'll notice a lot of "Originally from ...." if you look at the comments; there seems to be no way to add a comment on someone else's behalf, so I just added them all myself.

Hopefully I'll have more to say soon.

Wednesday, February 16, 2011

Mocking Out Omniauth for Cucumber Testing

A few months ago some friends and I got together for Cleveland Startup Weekend and started building our "next big thing," DreamKumo.

We had a couple of challenges. First, we knew from the get-go that we wanted the site to be 100% Facebook integrated. That meant implementing Facebook Connect and the Facebook Graph API. We are also a group of people who thoroughly believe in ATDD, TDD, and agile development principles (and practices) in general. This led us to building cucumber features for every area of the site.

We hit a snag, though, when we tried to cuke our login process. We're asking our users to log in through Facebook Connect, which means when they click the login button they are redirected to facebook.com and then sent back to us. Implementing the requirement was pretty simple using the omniauth gem, with a little help from the folks at Rails Rumble. Cuking it is another story. Since Facebook Connect actually redirects the user, we can't mock out the request using fakeweb, and we don't want to cuke what happens on facebook.com since 1) we can't control it, 2) it isn't consistent (sometimes you get a screen to accept the permissions your app is requesting, sometimes you get the Facebook login screen, and sometimes you get redirected straight through), and 3) we don't want to set up a dummy Facebook account for our cukes to use.

So we needed to prevent the site from redirecting to Facebook and dummy up the site cookie so it looked like we were authenticated. We also needed a failure case to test what would happen if our user was sent to Facebook and decided not to grant us permission to their account.

Enter monkey patching. 

Ruby has this wonderfully powerful ability to override the native functionality of a method or class at runtime. It's part of being a dynamic language, and it lets you do some really cool things, like manipulate the functionality of any class, including library classes like those in omniauth, at any time. So we dug through omniauth's code to find the point where it did the redirect to Facebook and mocked out a response. We then had it immediately redirect back to the callback URL, the URL in our application that responds when Facebook redirects back to us. The result is we never hit Facebook, and omniauth sends along what we need to determine whether the authentication was successful.
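For anyone who hasn't seen monkey patching before, here's the idea in miniature, reopening a core class (the shout method is obviously made up):

```ruby
# Reopen Ruby's core String class at runtime and add behavior to it.
# Every string in the process, even ones created before this patch
# was loaded, now responds to #shout.
class String
  def shout
    upcase + "!"
  end
end

puts "hello".shout  # => HELLO!
```

The same trick works on any class, which is exactly what the steps below do to omniauth's OAuth2 strategy.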

Ok enough talk - here's the code. 

These steps live in a generic_context_steps.rb file in our step_definitions folder.

 

Given /^I have valid facebook credentials$/ do
  # Reopen omniauth's OAuth2 strategy at runtime so the cuke never
  # actually leaves our app.
  module OmniAuth
    module Strategies
      class OAuth2
        # Skip the redirect to facebook.com and go straight to our callback.
        def request_phase
          redirect callback_url
        end
        # Stuff a canned auth hash into the env, as if Facebook had
        # authenticated the user, then hand control back to the app.
        def callback_phase
          @env['rack.auth'] = {"provider"=>"facebook", "uid"=>"123", "user_info"=>{"first_name"=>"John", "last_name"=>"Doe", "nickname"=>"JohnDoe", "email"=>"john.doe@test.com"}, "credentials"=>{"token"=>"abc123"}}
          call_app!
        end
      end
    end
  end
end

Given /^I have invalid facebook credentials$/ do
  module OmniAuth
    module Strategies
      class OAuth2
        def request_phase
          redirect callback_url
        end
        # Simulate the user denying our app access on Facebook.
        def callback_phase
          fail!(:invalid_credentials, nil)
        end
      end
    end
  end
end
We simply call one of those steps in any feature where we want to authenticate the user, and the test continues on as intended.

There is a slight ATDD tradeoff: you aren't testing everything exactly as your user would see it, since you're skipping the Facebook part. But in this case we borrow a bit from our TDD book and justify it by saying we only test our own code, and we didn't write facebook.com.
Hope this helps.