Learning Guitar One Handed and in the Dark

In my ill-advised never-ending quest to acquire new hobbies, I’ve adopted a thought technology that I learned from Merlin Mann.

Merlin talks about how he needs “important” things to be able to be done “one handed, and in the dark.” He uses the specific example of being able to change a blown fuse this way: the bag of fuses is in a known place, and the bag contains a torch. He can change a blown fuse one handed, and in the dark.

The general application of “one handed, and in the dark” is that you should have everything you need where you’re most likely to need it, and so accessible that you don’t have to think about it. I’ve used this advice to great effect in the placement of phone chargers, letter openers, allergy medication, measuring tape, and everything else I’m likely to need to use.

But I’ve taken it a step further to help develop new skills, and I want to share how it’s turned me from a perpetual beginner guitar player into a fool who can play a few of his favourite songs at a moment’s notice (poorly) for other people.

I’ve been a beginner guitar player since I was a teenager. I’ve had one guitar or another in my life for many years. And I’ve always had a copy of the Mel Bay Guitar Method Grade 1 close at hand during that time. Periodically, I’d decide that now was the time I was going to start to learn guitar. For real this time. And I would play through the first half of the book, before losing steam.

For many years, I could play the first half of the Grade 1 book with only a few minutes of preparation. I’d open the bag, take the guitar out, play from page one all the way through to where I struggled. I’d do a bit of practice, get a little further, put the guitar away, and pick it up and do it all over again the next day. Until one day I didn’t. And then I didn’t for months.

This continual cycle meant that I could do the basics, but I couldn’t really play a song. After a few days, it wasn’t fun enough to get the guitar out of the bag any more.

When we moved into our new house, and I configured my home office, I hung the guitar I played most often next to my chair. Now I can reach my guitar at any time: one handed, and in the dark (if necessary).

It’s changed my life.

Waiting for a build? Pull the guitar down and play a few chords. Ten minute warning for an online meeting? Connect, pull the guitar down and play until the next person shows up. Brain stuck on a problem? Pull the guitar down and play until brain gets unstuck.

In the past three or four months I went from a beginner who knows the parts of the guitar, and can play individual notes above (below? I don’t know, I’m a beginner) the third fret, to a (still beginner) guitar player who has played songs (poorly) for other people!

All of this, largely, because I have a guitar that I can reach without lifting my arse out of my chair, and a browser with chord charts in the background. It takes me ten seconds or less to go from not playing to playing. I estimate that I am getting a solid hour of practice time in every day, in five to ten minute increments, because I’ve removed the barrier to going from not playing to playing.

(I actually have three guitars and a mandolin I can reach without moving my chair or lifting my arse; and three more guitars and another mandolin that I can reach by moving slightly but not standing, but that’s a story for another day.)

There are other things that have helped immensely as well: I have met some actual musicians who are incredibly encouraging and helpful, and I’ve learned to pick the easy songs out of my favourites. But these things don’t help without continual practice.

Having a guitar accessible one handed and in the dark has made all the difference, and I highly recommend adding this bit of thought technology to your toolbox for whatever skill you want to learn.

Joplin Word Counter

Joplin Word Counter counts the words in the currently selected note. It might evolve into more statistics, but for now word count is all I really need.

Source Code

Static Typing Makes (Some) Tests Unnecessary

There is a common claim that by using a statically-typed language in place of a dynamically-typed language, you reduce the number of tests you have to write. So I wanted to help out by showing you the tests that you might write in a dynamic language, that you don’t have to write in a statically-typed language.

To demonstrate this, let’s use the simple example that is always thrown around: a method that adds two numbers. It’s small, simple, and easy to understand. You probably won’t ever write this method because your language already provides this functionality, but it’s enough to demonstrate all of the tests you don’t have to write when you use a static language.

In a static language (let’s use Java) the add2 method might look like this:

public int add2(int a, int b) {
    return a + b;
}

Yes, you in the corner waving your hand. I see you. I know Java is not the best example of a static language and all of the things a type system can provide. I know. But it’s well known, widely used, and it is sufficient to demonstrate the point. Don’t you have a paper to write about something you don’t understand?

In a dynamic language (let’s use Ruby) the add2 method looks like this:

def add2 a, b
  a + b
end

Here are all of the tests that you might write for the dynamic version, that you didn’t have to write for the Java version.

def test_that_add2_returns_an_integer
  assert_equal(Integer, add2(1, 2).class)
end

That’s it. That’s all of the tests you wouldn’t write when using a dynamic language instead of a static language.

The rest of the tests you have to write in both languages are the same: the simple case of adding a and b, the edge cases (around the boundaries of the size of supported numbers, for instance), that it supports negative numbers, etc. This is the only test you were saved from writing because the compiler would’ve caught that one for you.
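Concretely, the shared behavioural tests might look something like this in Ruby (a minimal sketch using bare assertions so it stands alone; `add2` as defined above):

```ruby
# A sketch of the behavioural tests that both the static and dynamic
# versions need; the compiler saves you none of these.
def add2(a, b)
  a + b
end

# The simple case.
raise "simple case failed" unless add2(1, 2) == 3
# Negative numbers.
raise "negatives failed" unless add2(-1, -2) == -3
# A boundary case: Ruby integers are arbitrary precision, so large values still work.
raise "large values failed" unless add2(2**62, 2**62) == 2**63
```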

But let’s make something clear: you shouldn’t be writing that test in a dynamic language, either. It has nothing to do with the behaviour that the software will have, and the purpose of writing tests is to demonstrate that your program behaves in the expected way.

Now, if you’re a type supremacist, I’m sure you’re frothing at the mouth right now. So I’ll try to answer the things you’re thinking about.

What if I have code that passes a string to add2?

I love this argument. Because it demonstrates a complete lack of understanding about how developing software systems works, and an even deeper misunderstanding of the purpose of automated testing, and driving the design of your system by describing the behaviour it will have.

First you need to think about why you’d ever do this. Since a lot of applications today are written for the web, you’ll deal with a lot more strings than you otherwise would. You can make every caller sanitize their data first, or you can (and your test suite will let you know if this is a problem for you) get the number you’re looking for out of the string in the first place. In Ruby, you can use to_i, to_f, to_r, and a variety of similarly named methods to get the kind of thing you’re looking for.
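For example, those conversions are one-liners in Ruby (a minimal sketch; nothing here is specific to add2):

```ruby
# Pull the number out of the string before it ever reaches add2.
raise unless "42".to_i == 42               # Integer
raise unless "3.5".to_f == 3.5             # Float
raise unless "2/3".to_r == Rational(2, 3)  # Rational
raise unless "42abc".to_i == 42            # to_i is lenient about trailing junk
```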

If your add2 method needs to accept strings as input, that’s behaviour that your add2 must accommodate, and you’ll have a test for that. Regardless of the type system you’re working with.

That’s fine in an application with a good test suite, but most test suites are bad.

This is absolutely true. But it’s incomplete.

Many test suites are bad. That’s a problem with the test suite, and represents deficiencies in the development of the entire system. If the test suite doesn’t cover all of the behaviour that the system exhibits, then it’s incomplete. And having a test suite that doesn’t exercise the behaviour of the system is irresponsible.

The minimum responsible test suite will catch this kind of error.

It might catch it slightly later in the process, but this is a tooling problem. We have tools and methods that address this, and as an added bonus they add to the reliability of your system and the joy of developing in it.

What about libraries?

Apparently we’re supposed to treat libraries differently than non-library software. I don’t understand why: if the library dictates that a string should not be passed to this method, then there should be a test that covers this. If the library dictates that you can pass a string to this method, then the test suite should cover that case too.

The same arguments apply here: the minimum responsible test suite should cover the possible use cases. The documentation (which can be generated from the test suite!) should cover this case, too. This makes your library more robust, and provides a better experience for its users.

What about nil?

Discussions with type supremacists often come back to our old friend nil, and this is another case that I don’t find useful. Why are these folks passing around nil so often, and why is some other test not failing when they do so?

If you pass a nil where you didn’t expect to, your test suite will catch this. If it didn’t, your test suite is incomplete.

Static Types Enable Code Inspection and Refactoring

So many favourites to pick from! This is a strong contender.

Code inspection and refactoring are not only easy to do (easier in many systems, because the reflective capacity of dynamic languages is often built-in, and more robust) but they were invented in a dynamic language. They were later ported to other languages, but the fact remains that they were in dynamic languages first, and they were (and are!) rich and useful.

Yes, when you have substandard tools, static typing makes inspection and refactoring easier. But shouldn’t we be asking for better tools?


My general feeling is that a lot of the “type supremacists” don’t feel that tests are valuable, and are looking for excuses to not write them.

I think most people are treating the test suite as optional, and practising that way. They’re backfilling tests after the fact, and missing a lot of valuable cases. They aren’t “not writing tests that are obviated by the type checker”; they’re leaving out valuable behavioural checks that demonstrate that the software does, in fact, work.

If you have a test suite, and you’re running it anyway, regardless of whether you’re using static types or dynamic types, this error will be caught. Perhaps a little later in the cycle, but how much time was saved by not having to appease the overly-strict type system on matters that don’t make any difference to the behaviour of your software?

What’s my preference? Well, I prefer Smalltalk and Lisp. In that order. But, sadly, I’m not getting paid to write Smalltalk or Lisp most of the time. So I’ll use whatever language the team is using, and adjust my designs accordingly. We, as programmers, like programming languages, and for some reason we want to pretend that our irrational and emotional choices on the matter are rational and based in reason. They aren’t.

Developing software of a high quality, that behaves as expected, can be done in many environments. But if you think static type checking prevents you from writing tests, you have not demonstrated knowledge of programming, you have demonstrated a lack of knowledge of automated testing.

Use whichever languages and tools make you happy (or whichever you’re forced to) but please stop pretending that using one kind of language means you get to do less work to prove that your software behaves the way it was intended.

My First Programming Job

In 1996, I attended a computer show called VIEX, the Vancouver Island Expo in Victoria, BC. I was 13 years old. My Dad dropped me off at the venue, had a quick walk around himself to look at the neat gadgets, and then left me there for the rest of the day with instructions to call him when I wanted to be picked up. My Dad was always supportive of my interest in computers, and did what he could to encourage it.

VIEX was pivotal for me in so many ways, but it’s really where my career in software started. I had been hacking on various computers since the tender age of eight, and developing software. I had a weird fascination with what we used to call “business software.” While the other kids were making games (I did that, too) in BASIC, I was writing PIMs, scheduling software, accounting tools, and various other utilities that I thought would improve life.

The largest booth at the Expo was at the front door, operated by Shaw Cable. They were demonstrating The Wave, which was their cable internet offering. I think they were offering 2 Mbit/s, delivered to your home via the same cable that brought television. I was intrigued: I was lucky to have early access to the internet via dialup modem, and paid accounts (courtesy of my gadget-freak father) on all of the local BBSes. But we all knew high speed internet was the future.

I walked up to one of the demo terminals, and loaded some web pages. I had recently been fiddling with GNU/Linux and knew that downloading new versions would be something I wanted to do: so I downloaded the latest Slackware on the cable internet demo computer to see how long it took. It was so fast, I missed it. I was hooked.

I walked around the Expo and looked at all of the booths. I ran into the Linux nerds, and they had a stack of burnt CDs for the taking. They were organizing “Install Fests” where you would bring your PC and all of your peripherals, and they would help you install it. I signed up to help with this immediately, and made many fast friends with similar interests.

But I kept circling back to the Shaw booth and playing with the fast internet. At one point, my friend Pavel (who I met in the computer books section at the local Chapters when we both reached for the same C++ book) showed up and we hacked together a Python script to see how much speed we could squeeze out of the internet connection.

Pavel went home, but I kept at it. I typed furiously, made progress bars (dots, let’s be honest) appear on the screen. And was amazed at the speed. Periodically, someone would walk up and ask me questions about the technology, which I happily answered when I knew the answer, and helped them look it up when I didn’t, because I wanted to know too.

At one point in the middle of the afternoon, I looked up from my furious typing, looked around, and noticed that there were twenty people standing around watching me. I felt a bit embarrassed that I was hogging the machine, and apologized and walked away to talk to the Linux nerds again. When I left, the crowd dispersed.

I didn’t realize it at the time, but the crowd had gathered to watch a very young-looking 13-year-old boy typing 100 WPM into a computer and making it do things with the new internet connection.

Later, I circled back around to the Shaw booth and found that my favourite demo computer was available, and my programs were still there on the desktop. There wasn’t a queue, so I picked up writing programs and playing with the connection. I kept answering questions when asked. At one point, I was told that the sales guy was directing technical questions to me because he didn’t have the answers, and I seemed to enjoy answering them.

At the end of the first day of the Expo, the sales guy (who I’ll call Tim, because that’s what I remember his name being) asked me if I wanted a job. I told him “Oh, I’m not really a professional. I just like computers, and this is really fast and fun to play with.”

Tim told me that I knew more than enough, and the job would be simple: wear a Shaw t-shirt, draw a crowd by playing with the computer, and answer technical questions when they came up. They would pay me $15/hr for this, for the rest of the Expo. When my Dad arrived to pick me up, Tim explained this to my Dad and asked if he could bring me back every day for the rest of the Expo.

To put this in perspective, I was 13 years old. Up to that point, my only job had been the three paper routes I kept (so I could buy computer stuff), and selling newspaper subscriptions. $15/hr, my Dad told me, was more money than I could expect to make doing literally any other job, and agreed to bring me back the next day. My adult cousins who worked in construction at the time didn’t make $15/hr.

So that was it: the next day, I came back in my Shaw t-shirt (several sizes too big, because they didn’t have children’s sizes) and played with the fastest internet connection I had ever experienced, while answering occasional technical questions. It only seemed a little weird that I was able to do this, but I felt very lucky to be getting paid to do what I would have been doing anyhow.

On the last day of the Expo, I was asked if I would prepare a “technical” demo to show just how fast the internet was compared to dial-up. Shaw had booked the main stage, but didn’t really have anything to show. I modified my Python script to run in two modes: full-speed, and throttled. And I stood on a stage, 13 years old, wearing a t-shirt made for a grown-ass man that hung to my knees, in front of 400 people describing how cable internet would change their lives.

Tim gave me my pay cheque, which was more money than I’d ever seen in one place, and asked if I would be willing to work the following shows. I worked a dozen Expos of all shapes and sizes all over southern Vancouver Island that year. Just a kid, getting paid a ridiculous amount of money, to play with computers.

I was in heaven.

This job was pivotal for me in an unexpected way: nobody told me that speaking in public was scary. I never learned to be afraid of speaking in front of a room full of people. It wasn’t until many years later, when I started to recognize the faces of my idols in the audience that I first felt a knot in my stomach before introducing myself.

It was many months before we qualified to have the fast internet installed in our own home. I got a discount, on account of being an employee, and my Dad (always supportive of my computering habits) was very happy to pay the monthly cost.

I remember the day the technician showed up to install The Wave. I insisted on installing the network card myself: I had deliberately bought one that I knew worked with Linux, and I didn’t let anyone else touch my computer. When the installer finished, he made me boot my computer into Windows (which I, dutifully, referred to as Micro$oft Windoze in those days) and he installed an ancient, branded version of Netscape Navigator on my computer from a CD-ROM.

Some of you might not remember, but the mid-to-late 90s were the middle of what we refer to as “the browser wars.” Netscape Navigator was released, and Microsoft proceeded to abuse their monopoly in operating systems to try and put them out of business. New browser versions were released rapidly (for the times), and the version that the technician had installed was what I considered to be a positively ancient version of Netscape Navigator with Shaw branding.

I asked the technician what the difference between Netscape Navigator and Shaw Navigator was, and why it was so old. He didn’t know, but he suggested that maybe it was because they changed the “throbber” to be a Shaw Wave icon, and that they had pressed CDs with this version included along with the network card driver for the default card they provided (which didn’t work with Linux.) He reminded me that I could use it to download whatever version I wanted, in seconds, as soon as he left.

I was offended by this. Shaw had pressed thousands of CDs with an old version of the browser, to install software on a computer that was now connected to the internet with the fastest available internet connection of the time. The only change was the throbber, and it was so old.

I wouldn’t let this go. So when the technician left, I inspected the Shaw branded Navigator and compared it to the version of Netscape Navigator it was based on. I realized quickly that indeed the only difference was the throbber, and that it was easy to replace the throbber with a custom image.

I spent several frantic days writing a program in Visual Basic to put the Shaw Wave throbber in the latest version of Netscape Navigator. I could then replicate the Shaw Wave browser in the latest version of Navigator. I decided that I should do the same for Internet Explorer (obviously I called this Internet Exploder.)

After a few days, I had a small Visual Basic program which would ask you which browser you wanted, download it, install it, and replace the throbber with the Shaw Wave version. In less than a minute, from the comfort of your amazingly fast internet connection.

I had an idea for a complete product: my goal was to put this program on a floppy disk, along with the network card driver. A technician could then install the network card driver, and then run my program which would grab the latest version of the browser. No CD needed. No old-ass version of a web browser for your fancy new internet connection.

I was excited. I showed this to Tim at the next Expo. He was impressed. He asked me “Do you know how much it costs to press a CD?” I did not. I don’t know if he did, either, but he knew they had to order many thousands of them on a long lead time. And you couldn’t fit a network card driver and a web browser on a single floppy disk.

A few days later I was asked to go to the Shaw office in Victoria and demo this thing that I had built for some technicians. So I took the bus to the office with a stack of floppy disks, and showed them what I had built. I brought the documentation I had written, and the source code in case they had a real programmer there to tell me what I had done wrong.

Tim’s boss was there in the meeting. He was impressed, too. He knew how much it cost to press CDs. With it, they could use the speed of the new internet connection to install the branded browser, and avoid the cost of pressing CDs. In those days, all computers still had floppy disk drives, and not all had CD-ROM drives, so the floppy-based solution was superior in multiple ways.

Tim’s boss asked me how much money I wanted for my software. I wasn’t expecting them to buy it: I thought I had just done a cool thing that would make their lives easier. But I could tell that this was valuable. I really wanted a Pentium upgrade for my computer, so I told him the largest number I could think of, expecting that it was ridiculous. I told him I wanted a thousand dollars. Tim’s boss didn’t even blink. He said “No problem. It’s ours now.”

I didn’t realize it until many years later, but I probably could have charged far more for this software, by at least an order of magnitude. Pressing CDs cost more than $2 apiece at the time, the lead time was long, and there were minimum order requirements. A thousand dollars to some barely pubescent kid to eliminate the next round of CD pressing was a bargain.

I gave Tim’s boss the floppy disk with the Visual Basic source code, a printout of the documentation I had written, and was instructed to delete any copies I had of it when I got home. There wasn’t a contract: I was 13, I couldn’t sign one. But I was an employee, so they would just add it to my next pay cheque.

When my next pay cheque arrived, I had enough money to upgrade to a Pentium 90 with 32MB of RAM and a hard drive with more than a gigabyte of storage.

More importantly, I was now a real software developer. I was already living out my career ambitions, and I couldn’t even drive a car yet.

For Xmas that year, my Dad bought me Symantec C++. Someone at his work told him that I had to learn C++ in order to be a “real programmer,” and that Symantec C++ was his favourite compiler suite. So my Dad bought it for me, and printed the manuals at his work during lunch breaks.

My parents had a massive fight over it that Xmas. Symantec C++ cost almost a thousand dollars, and my mum was furious that my Dad would spend that much money so that I could spend even more time “playing with computers.” This was a common refrain in my house: my Dad encouraged me to explore technology and learn about computers. My mum wanted me to focus on getting what she considered to be a “real job.”

I continued getting programming and technical jobs of all sorts after that. It really was the start of my career. By the time I was old enough to drive, I was working full-time hours doing programming and technical jobs. My school work suffered as a result, but it didn’t matter to me, I was going to be a real programmer.

That experience taught me a few things that are still with me today. Most of my near-25 year career has been spent working from home in various capacities. I never learned that you had to go into an office every day in order to get paid to develop software, and I always found it weird when companies insisted on you keeping their chairs warmed in addition to developing software.

The work in the exhibitions taught me that talking in front of a room full of people just wasn’t that big of a deal, that it was part of the job, and nothing to be concerned about.

It also taught me that a small drive to solve problems that exist is an entrepreneurial skill that is hard to teach.

I guess you could say that in a small way, I started my career as an exhibitionist.

History of RSpec

In 2001 I started teaching Test-Driven Development to teams. Back then it was still a fairly new concept. Few teams had automated tests of any kind, and fewer still had heard of XP and TDD. Writing tests first and using the practice as a design activity was a completely foreign concept, and folks had a really hard time grasping it. (This fact hasn’t changed entirely, two decades on.)

It was a tough sell in those days. I worked hard to present the concepts in the best way possible, and to bring teams around. But it was really difficult, and I struggled. More importantly, and the problem I aimed to solve: the people I was teaching struggled.

One of the common problems I was experiencing in those days was people having trouble with the word “test.” Not just the word “test” but all of the vocabulary around TDD was, understandably, very test-centric. People were very confused. “You can’t test something that doesn’t exist,” they used to say, and often they’d be smug and add, “therefore your entire premise is flawed.” The word “test” was giving them an excuse to disregard a concept and method that I found very valuable, and I sought to change that.

In searching for solutions, I learned about the Sapir-Whorf hypothesis of linguistic relativity, which suggested to me that by changing the words used to communicate the ideas, I could influence how the idea was received and interpreted. Others around me had discovered this as well. Aslak Hellesøy, Dan North, and Liz Keogh, to name a few, were early on this train and provided a lot of insight and help.

So I started creating tools to use in my training. I wanted to remove the testing-centric vocabulary. To slightly paraphrase a prophet, “Testing is not the preferred nomenclature, dude.” By 2003, I had started teaching TDD using Ruby. In those days, Ruby was still a niche language. Ruby on Rails hadn’t yet taken the world by storm, but I appreciated that it was a proper Object-Oriented language that I could use to implement these ideas.

When pairing with others of similar mind, I found their tests all had names like test_should_do_this_thing. The word test was still there, but they were using a better word to communicate intent: should.

How do we get rid of the word test? In Ruby, we can easily remove it. So I had a plan: create tools for teaching TDD as a way of describing the behaviour that the completed software should have. This idea was still new at the time, but represented what the creators of TDD had been saying all along: automated testing communicates the behaviour that the completed software should have.

The first tool I created was a small Ruby file called describer.rb that simply aimed to remove the word test. It looked something like this:

require 'test/unit'

module Description
  Context = Test::Unit::TestCase
end

# redefine Test::Unit::TestCase.suite to use methods starting with "should" instead of "test"

With this tool, your “test” would look like this:

require 'describer'

class SimpleAdditionDescription < Description::Context
  def should_add_one
    assert_equal 2, add_one(1)
  end
end

Now we can write tests without ever using the word test! This was a beautifully simple solution: we were still using all of the wonderful tooling in Ruby’s built-in standard library, including the robust set of runners, while enabling new practitioners to use different words, avoiding the cognitive dissonance seemingly caused by the constant use of the word “test.”
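The redefinition elided in the comment above amounted to collecting methods whose names start with should_ instead of test_. Here’s a self-contained sketch of that lookup (without test/unit, so it runs on its own; the class names are just for illustration):

```ruby
# Find "should_*" methods the same way test/unit finds "test_*" ones.
class Context
  def self.behaviour_methods
    public_instance_methods(false).map(&:to_s).grep(/\Ashould_/).sort
  end
end

class SimpleAdditionDescription < Context
  def should_add_one; end
  def should_add_two; end
  def helper; end # ignored: doesn't start with "should_"
end

SimpleAdditionDescription.behaviour_methods
# => ["should_add_one", "should_add_two"]
```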

I incorporated this into my TDD training. In my workshops, we would use this single file to write our software descriptions. I stopped using the word test in my training. Suddenly the hand-wringing around the word “test” disappeared.

I kept this up in each of my training sessions until the last day. On the last day, I would tell everyone: “These things you’re writing that describe your software. They actually represent an automated test suite that can be run to test the software and ensure that it acts the correct way.” I would show my classes, here’s how you actually write your tests in Ruby when you leave here:

require 'test/unit'

class SimpleAdditionTest < Test::Unit::TestCase
  def test_add_one
    assert_equal 2, add_one(1)
  end
end

And it worked! I wanted my attendees to use the “standard tooling”, but I wanted them to approach it with a new frame of mind. The introduction of the testing vocabulary at the end allowed my attendees to use standard tooling, but to approach the problem from the intended perspective. Finally my training was providing the value that it needed to, and reaching a wider variety of people. People who might have been very resistant, previously.

The remaining problem in my classes, and one I didn’t notice at first, was attendees reversing the parameters to the assertions. It was common for my attendees to reverse the expected and actual values in their assertions, leading to confusing error messages. I’ll admit that I made this mistake occasionally, but was familiar enough with it that I recognized the error quickly. I tried to fix this by writing the expectations in a way that read more like plain language. I added a method assert_that which took a block, and you could use the assertions on the returned object.

assert_that { actual }.equals expected
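A self-contained sketch of what that might have looked like (names assumed; the classroom version delegated to test/unit’s assertions rather than raising directly):

```ruby
# assert_that wraps the block's value in a small object that exposes
# readable assertion methods, so the reading order is fixed.
class Assertion
  def initialize(value)
    @value = value
  end

  def equals(expected)
    raise "expected #{expected.inspect}, got #{@value.inspect}" unless @value == expected
    true
  end
end

def assert_that(&block)
  Assertion.new(block.call)
end

assert_that { 1 + 1 }.equals 2
```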

This solution was a lot of code for a single-file library that I only intended for use in the classroom. So I looked for a simpler way. With the power of Ruby, we can fix this! So I dipped my toe in the metaprogramming waters, and mixed the assertions in to Object. (This was a mistake in the released library, but it was really great for teaching.)

class Object
  def should_equal expected
    assert_equal expected, self
  end

  # every other assertion
end

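In use, the mixed-in assertion reads left to right, with the value under test first (a runnable sketch using a plain raise in place of test/unit’s assert_equal, so it stands alone; add_one is just for illustration):

```ruby
# Mixing should_equal into Object means every value can assert on itself.
class Object
  def should_equal(expected)
    raise "expected #{expected.inspect}, got #{inspect}" unless self == expected
    true
  end
end

def add_one(n)
  n + 1
end

add_one(1).should_equal 2 # reads as plain language; no argument order to confuse
```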
And with that change, the teaching tool was complete. But it wasn’t released. It only existed in my workshops. I never intended anyone to use this outside of my workshops: I thought they should be using the tools that existed in the standard library. The xUnit pattern was well understood, available in nearly all languages, and very well supported. I thought it would be a huge mistake to add a new library to the mix, and endure the maintenance overhead of this.

There was no need, I thought, for new tools that only changed some words around. In hindsight, it was silly of me to think that this was a valuable activity for the classroom but not a valuable activity for practice.

I never wanted to release this tool. xUnit existed, and was fit for purpose. To me, the thing that became RSpec was nothing more than a teaching tool, and I didn’t think it should live outside the classroom. But, I was coerced.

RSpec is Born

In the mid-2000s, Bob Martin was trying to make the same impression while introducing TDD. He was saying the same thing others were saying, but in a different way: “Testing is about specification not verification.” And he talked about the tests as “executable specifications of the software.”

I liked this framing, but I always had problems with the word “specification.” I still do. To me, an elder of the Internet who had implemented many things from RFCs and specifications, “specification” was a reserved word that should not be used for new purposes. I thought we should find a new word.

But Bob Martin was a hero of mine at the time, and he had developed a following around the specification-centric vocabulary. He was shown a demo of the tooling I had created for my workshops, and contacted me to encourage me to release it. He strongly encouraged me to focus on the specification-centric vocabulary.

I still think RSpec was the wrong name for RSpec. When I first gave in to the idea of releasing it, I was still calling it “describer” and I thought I might do the popular thing of the day: drop the e and call it DescribR. But I wasn’t really happy with this name either, and I didn’t feel I was qualified to argue with Bob Martin on what it should be called. So RSpec was born.

RSpec was released in 2005.

I was very happy with RSpec at that point, as a teaching tool, but I still didn’t think anyone should be using it. People fell in love with the idea, and they really wanted to write their tests in this way. It amazes me that it caught on so widely, and so quickly. There have been surveys on the matter, and by some reports, up to 90% of new Ruby on Rails projects go out of their way to use it instead of the built-in xUnit-style frameworks. Very quickly, “RSpec clones” started popping up in nearly every other language.

RSpec as a DSL

Around the time RSpec was released, the idea of Domain-Specific Languages was also becoming popular. As a fan of using language to change the way people think about software, I wanted to see if we could make RSpec even better by creating a complete DSL for specifying the behaviour of your software.

Joe O’Brien and I paired on the first version of this at RubyConf 2005. The words we used are different from the words used now, but the idea was solid. It ended up looking something like this:

specify("add numbers") {
  it_should("add one to the provided number") {
    add_one(1).should_equal 2
  }
}

This change, we hoped, would cause people to stop thinking about code at all, and start focusing on the desired behaviour of the software instead. I am still pretty happy with the fact that we turned RSpec into a DSL.
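
The prototype itself is long gone, but the wiring behind a DSL like that is small enough to sketch. Under my own assumptions (the method names specify and it_should are from the example above; add_one and the reporting are mine), it could be as little as:

```ruby
# A minimal, hypothetical reconstruction of a specify/it_should DSL.
# specify names a group of examples; it_should runs one example block
# and reports whether it passed.

def add_one(n)
  n + 1
end

class Object
  # Mixed-in assertion in the spirit of the early RSpec experiments.
  def should_equal(expected)
    raise "expected #{expected.inspect}, got #{self.inspect}" unless self == expected
  end
end

def specify(description, &group)
  puts "specify: #{description}"
  group.call
end

def it_should(description, &example)
  example.call
  puts "  it should #{description} -- ok"
rescue RuntimeError => e
  puts "  it should #{description} -- FAILED: #{e.message}"
end

specify("add numbers") {
  it_should("add one to the provided number") {
    add_one(1).should_equal 2
  }
}
```

The interesting property is that the only “code-shaped” line left is the expectation itself; everything else reads as a description of behaviour.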

Hand-off and Lessons Learned

Making RSpec into a complete DSL was, I think, my last real code contribution to the project. I still wasn’t using RSpec myself, because I considered it a teaching tool, and preferred to use the “standard” utilities. And since I was (and still am) working in many languages, I wanted to use similar tools across my work. I didn’t understand why people, after learning the appropriate way to think about TDD, would choose to use a non-standard library whose DSL caused awkward interactions with the object model.

Since I didn’t understand why people used RSpec, and I didn’t use it myself (to date, I’ve only ever used it on projects that had already started using it; I never choose it myself), I didn’t feel that I would be an appropriate steward of the project. There were many others who were more excited about the future of the project than me. So at some point in 2006 or 2007, I gave the project to its most prolific and important contributor to date: David Chelimsky.

Chelimsky was an amazing steward of the project. I could not have made a better choice. Handing RSpec over to the more-than-capable (more capable than me, especially) hands of David Chelimsky is possibly my most positive contribution to the project. More positive, even, than creating it.

I also thought (naïvely, as the historical record has so far shown) that RSpec wasn’t going to be the only lasting contribution I made to the software industry as a whole. I was insistent, in 2007, that I wasn’t going to be a One Trick Pony. I still like to hope that I have many tricks left in me, but I’m less confident that I will be able to move the needle again in a similar way. I’ll be honest: this fact has caused me a lot of sadness over the years.

Lessons Learned and Apologies

I made a lot of mistakes. The main one is a soft one: I built a thing that people enjoyed, that changed their way of thinking, and I strongly resisted releasing it and allowing them to use it in their own way. I didn’t realise the harm in that until many years later, when I started getting “thank you” messages from people telling me that they had been unable to grok TDD and BDD, or testing at all, until they came across RSpec. I am humbled by and appreciative of such messages, and I’m sorry I tried to stand in your way.

On the technical front: it was a mistake to mix the assertions into Object. It caused a lot of pain for the early adopters, and in the early days of the project we spent a lot of time doubling down on that decision and working around the problems it caused, instead of re-thinking and changing the expectation format. This was rectified in later versions with the expect(actual).to eq(expected) syntax, which is the most perfect way to write an expectation, in my opinion, and is closer to the original experiments I had before making the mistake of mixing into Object. If you spent time annoyed by this in the early days, I am sorry for the pain I caused by doubling down on a bad decision, and for hampering your efforts at building more robust software.
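
To see why the wrapper style avoids the mixin problem, here is a sketch of the idea, under my own assumptions — this is an illustration of the pattern, not RSpec’s actual implementation (the ExpectationTarget and Matcher names are mine):

```ruby
# Sketch: an expect(...) wrapper. The value under test is handed to a
# plain wrapper object, so no methods need to be mixed into Object.
class ExpectationTarget
  def initialize(actual)
    @actual = actual
  end

  def to(matcher)
    raise "expected #{matcher.expected.inspect}, got #{@actual.inspect}" unless matcher.matches?(@actual)
    true
  end
end

# A matcher carries the expected value and decides whether it matches.
Matcher = Struct.new(:expected) do
  def matches?(actual)
    actual == expected
  end
end

def expect(actual)
  ExpectationTarget.new(actual)
end

def eq(expected)
  Matcher.new(expected)
end

expect(1 + 1).to eq(2)
```

Only two small top-level methods exist, and every object in the system is left untouched, which is exactly what monkey-patching Object could not offer.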

I also regret letting others take credit for my work. An early mentor of mine, whom I do not name here because he has proven himself a self-centred and generally abhorrent individual, and in hindsight a terrible mentor, tells others that he created RSpec: that the things he built after RSpec existed are what RSpec was based on and inspired by. RSpec is based on contributions of code and knowledge from many people, but this individual’s actual contributions to the project were minimal, and in the early days limited to stealing someone else’s code and throwing away the license headers, which I find reprehensible.

And finally, I regret putting distance between myself and the thing I’d created. I think I could have built more useful tools in the space, and helped more people in bigger ways if I was a bit more accepting of the fact that people were using a thing I created for purposes other than what I intended it for. I am sorry for not helping more people.

Gratitude and Future

I am grateful for the opportunity to have contributed to the software community in a small way, and to have helped people realise their own potential. And I hope to be able to help even more people realise their own potential in the future. I am eternally grateful for the appreciation of my creation and its continued use in the world. I am grateful for the many beers that people have bought me as thanks for a thing I never wanted to release.

I’m hoping there are more tricks left in this pony, but for now I’m focusing on helping individuals and teams do their best work. I will continue to be happy with the many small contributions I can make, and grateful for the opportunity to work with people to help me make those contributions.

And I’m grateful for the future contributions I’ll make alongside people who will help me help them, and I hope I can make the development of software just a little bit nicer for everyone involved.

Pen Power

Very early in my career, I worked in Economic Development at the Western Valley Development Authority (now defunct) in Cornwallis, Nova Scotia. I got to do a lot of really cool things there, and work with people who loved their work and wanted to make a difference in the region. It was easily the most rewarding work I’ve ever done, and the artifacts of that work still benefit the region more than two decades later.

After a year or so of being overworked, I was excited to get a new colleague. A freshly minted Computer Science graduate named Graham (not Graham Lee, my current partner in multiple crimes) was joining to help out. I was really excited because I didn’t get to study at University, and was really self-conscious about this fact. I really thought I’d have a lot to learn from Graham.

Graham showed up first to work every day in a button-down shirt, and sometimes a tie. And he shared an office with a high-school dropout, foul-mouthed, blue-haired, atheist who wasn’t yet old enough to legally drink alcohol, but did anyway. Graham was everything I wasn’t: he did well in school, went to university, got married, bought a house, had kids, saved for retirement, all in the correct order. We respected each other, despite being polar opposites.

I learned a lot of things from Graham, but none of them were related to the CS education he’d had that I so desperately wanted. This is one of those things.

Every time we went to a meeting, Graham would bring a pen and a pad of paper. He was always first to volunteer to take the minutes. I found this weird: I always avoided being nominated to take the minutes, because I found it a boring bit of busywork that I didn’t get much satisfaction from. I teased Graham for being a try-hard and assumed he was doing it to curry favour with the bosses. It annoyed me that he always volunteered to take the meeting notes.

During a drinking lunch one day, I told Graham that it annoyed me that he always volunteered to do things like this. Graham let me in on the secret. He told me “Steven, he who holds the pen, holds the power.” I snickered. I’d heard the phrase before, but I don’t think I’d ever really thought about it. He explained, “The minutes of the meeting are the official record. By taking the minutes, I can decide what to highlight and draw attention to, what to … the opposite of highlight, and perhaps in some cases what to forget to add.” In addition, he explained that when there was a meeting that didn’t have “minutes” he always took notes, and would send around a summary email to the participants after. This served the same purpose.

It was an almost too-cynical take for the do-everything-right goodie-two-shoes yes-man I’d assumed he was. I was impressed.

It has come up a lot in the decades that have passed. Not the sinister message manipulation bit, necessarily, but I have often found value in writing and sending a summary of the important bits of a meeting that I participated in. These summaries can come in handy, as a reminder of the things I want to remember, but also as a record of the decisions that were made and why they were made long after everyone has forgotten.

They’ve come in incredibly handy with problematic clients who insist on decisions that were never discussed, promises that were never made, or who claim deliverables were late when they were not. When I’ve not taken these notes (and shared them), I’ve often come to regret it later.

I haven’t talked to Graham in over a decade. I hope he’s doing well. Last I heard, he had given up on the software industry and is a teacher now. I hope his students learn something like this that sticks with them for decades.

Scribble Nearly Anywhere. Sort of.

I watched WWDC this year, and as with most years, didn’t really care about most of what I saw. The usual suspects made an appearance: required re-work that you won’t get paid for and that only serves Apple’s purposes, a new look that nobody asked for that walks back decades of UI research and norms (many of which Apple themselves pioneered), and anti-features that make it harder to use the things you paid for in the way you see fit.

But there was one, possibly obscure, feature announced that I was immediately in love with: Scribble.

Scribble is iOS 14’s handwriting recognition feature. It’s not exactly new: it’s the welcome return of a beloved feature I lost when I parted ways with my MP2000 nearly twenty years ago. If Scribble in iOS 14 were only as good as the MP2000, I would be overjoyed to use it.

Scribble has been added to all native text fields. This means that you can Scribble in any text field that uses native components. If you have made a todo list application, and you used native widgets for this, your app is “automatically upgraded.” You don’t have to do any work. Your app on iOS 13 didn’t have handwriting recognition, your app on iOS 14 does. Apple did the work for you by improving the native text field widget.

This is the kind of user-positive “rising tide raises all ships” behaviour that I want to see from a platform vendor. It is Apple at its best, and it’s an unfortunately uncommon look for them these days.

I have had iOS 14 on my iPad Pro for a little while, and I’ve used Scribble a lot. I was surprised to find that the handwriting recognition isn’t actually better than it was on my Newton MessagePad. And it can’t seem to understand any of the Swedish words I write. But it’s still a welcome feature for when it does work, and I hope it will improve.

I had assumed, based on this lovely implementation detail, that Scribble would be available in _most_ of the text entry fields on my iPad: not just in the apps provided by Apple. I was disappointed to find that Scribble only works in about half of the places I expect it to. There are a lot more non-native text field widgets out there than I realised, and I think this is one of the major problems afflicting modern software development.

Text field widgets are something computers have had since the dawn of graphical user interfaces in the 70s and 80s. They are a very simple, standard widget provided by pretty much every platform. You wouldn’t think there was much room for improvement or innovation, but Apple found some in Scribble. Unfortunately, a bunch of other folks who haven’t found room for improvement or innovation decided to spend time making their own text fields from scratch, which behave ever so slightly differently and don’t benefit from the positive changes that Apple was able to provide.

If Scribble isn’t available basically everywhere then its utility is compromised. We can’t become proficient at using this feature, and have it improve our lives if it’s not nearly ubiquitous.

The old adage “if it ain’t broke, don’t fix it” comes to mind here. By making custom, from-scratch text field widgets, something that wasn’t broken wasn’t fixed. But with the release of iOS 14, it was broken.

We talk a lot in the software industry about the pitfalls of re-inventing the wheel, but the wheel is being re-invented constantly, everywhere. And it’s getting in the way of what little innovation we might find, and what small improvements there are left to make.

Dos Amigans

Dos Amigans is a project that Graham Lee and I started where we explore developing software for the Amiga, and broadcast our experiences by streaming. You can visit the Dos Amigans website, or the stream site for more information.

Marketing Avoidance

Graham and I have started a new project. We are Dos Amigans: two Amiga nerds who are broadcasting our foray into software development on the Amiga platform. We’re having a lot of fun doing it, too. You can tune in every Thursday at 18:00 UTC. We hope you’ll join us.

Wow, that felt weird.

Like many folks around me, I struggle with marketing. It always makes me feel … a bit dirty to talk about the things I make or services I provide. I think a lot more people would find Dos Amigans interesting if they only knew it existed, but we haven’t really done any marketing yet. The audience we have now is already bigger than I thought we’d have, mostly because they’re our friends, or exist on a discord where “stream announcements” are welcomed.

A lot of the things I do would have far and wide appeal if people only knew I was doing them, but I often avoid talking about them or drawing attention to them. This is a post about that.

I posted a link in a stream chat to Dos Amigans the other night because it was asked for, but it felt weird. Graham messaged me and said “I feel like the amount of promo we could do without being cunts is significantly higher than the amount we do.” And he’s right. We are the dudes making the thing, so why do we feel like cunts for telling people about the thing we’re making? Even when they might enjoy it? Even when they ask for it? Why does it feel so weird to talk about the thing that we’re doing?

We aren’t the only ones who feel this way. Eric Diven, of Long Walk Woodworking, recently wrote in a thread on [a popular “news” website for “hackers”] about his move from software into woodworking: “It feels a bit weird to mention it, but [Cabinet on Stand] is also for sale through the museum” (emphasis mine.) Why does that feel weird? The entire thread was about Eric’s departure from software, and the whole audience was there to congratulate Eric and learn about what he’s doing next. I was very happy to learn where his work could be purchased.

Every marketing professional will tell you that marketing is “just letting people know that your thing exists,” but that’s just marketing as described. Marketing as practised is more often “trick people who don’t need your thing into thinking they must have your thing.”

I think marketing as practised is why some of us have such a strong negative reaction to marketing. I often go out of my way to make sure I’m not drawing too much attention to the valuable things that I’m doing that someone might be interested in, because I don’t want to be seen as a sketchy marketer who’s tricking people into buying my stuff or giving me their attention.

In the last week, I’ve noticed that a person (who was off my radar, because I had blocked them for being a smarmy prick and lacking substance) has made a considerable amount of money and gained a lot of attention because they effectively marketed a programming practice that I taught them. Because I’m not telling anyone what I’m thinking, or doing, or solving, other people are free to take it as their own. “It’s almost like we’ve been conditioned not to compete with the alpha capitalists,” Graham pointed out.

If I’m avoiding marketing, nobody will know that my stuff exists. If nobody knows it exists, then I’m more likely to be discouraged and give up because it won’t look valuable. If I market my things myself, at least I know that I’m being honest about what I can provide. Perhaps it’s time for nerds like me to learn how to do a bit of marketing. Maybe, to paraphrase a former boss of mine, “Marketing is too important a job to be left to marketing professionals.”

I’m going to try talking more about the things I’m thinking, the things I’m creating, and how you might benefit from them. I hope I don’t come off as the sleazy kind of marketer; I really just want to share the things that I’ve enjoyed learning and creating, so that I can help folks and be encouraged to do more.

My question to you, dear readers, is: where are the marketing books and materials for people who want to be honest and up front, and not try to trick people into buying things they don’t need? Do these materials exist?


CocoaFIBS is a Mac client for the First Internet Backgammon Server originally written by Adam Gerson.

After a number of years without updates, I forked CocoaFIBS to modernize and improve it. I am also using it as a testbed for ideas about how to modernize and improve legacy code.

My work on it is sporadic, but I am preparing it for release.

Source Code