Raspberry Pi wifi+ssh connection disconnecting (Ourlink rtl8192cu)

I started playing around with a new Raspberry Pi 2 that arrived yesterday. I had used a Beaglebone before (always connected to ethernet), so I knew what to expect in terms of memory and CPU limitations. What surprised me was how flaky ssh connections were when connected over wifi using this USB wifi dongle (an Ourlink with the RTL8188CUS/RTL8192cu chipset) from Adafruit. I could connect for a second, but if I paused for even a moment the connection would drop and I would get a “Host is down” response when I tried to reconnect. Usually I would need to power cycle the Pi to be able to ssh in again.

Googling for similar issues I found the following advice:

  • Some 5V power sources actually deliver less than 5 volts. Also some chargers don’t deliver enough current (A) to supply both the Raspberry Pi and the wifi dongle. To guarantee that this wasn’t the issue I added a powered USB hub to run the wifi dongle. This didn’t solve the issue.
  • You can enable an ssh option to send a “null packet” every X seconds. You can set this on the server side or on the client side. On the server in /etc/ssh/sshd_config:
    ClientAliveInterval 60

    On the client in ~/.ssh/config:

    ServerAliveInterval 60

    This helped out considerably. This made sense since I could stay connected by just running ping in the ssh session or typing ls constantly. I would still lose connection after a while and would have to wait for something in the Pi to reset.

  • The real solution (and the point of this post) is suggested here. Apparently the wifi dongle has a power conservation mode that is causing it to disconnect from wifi. In hindsight, this is kind of obvious (I could see the Pi disconnecting from the router) but it took me forever to find this tip. Anyway this fixed everything:
    echo "options 8192cu rtw_power_mgnt=0 rtw_enusbss=0" | sudo tee --append /etc/modprobe.d/8192cu.conf
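If you want to verify that the fix took, the running driver’s parameters should be visible in sysfs. This is a hint rather than gospel: it assumes the 8192cu module exposes its parameters there, which may vary by driver build.

```shell
# Reload the driver so the new options take effect (a reboot also works)
sudo modprobe -r 8192cu && sudo modprobe 8192cu
# Should print 0 once power management is disabled
cat /sys/module/8192cu/parameters/rtw_power_mgnt
```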

Hopefully this will save someone else some time.


I mean to write some longer pieces here soon. I’ve put some energy into various posts on forum.stupididea.com. I would especially recommend the links category, which is where I put interesting things that I find through my feed reader. I also write over at my company’s blog on software development topics, mostly Ruby-related: http://wegowise.github.io/. When I write over there of course it’s in a certain voice and I have to be slightly less opinionated, so I’ll still talk about software development here.

Review of It’s Complicated, by danah boyd


(This book is actually available for free download here: http://www.danah.org/itscomplicated/, though it would be good to support her work if you like it.)

The remarkable thing about It’s Complicated is that danah boyd actually talks to teenagers for her findings. As she describes the different popular attitudes and beliefs about teens and social media it’s striking how little of it is actually based on sound data, and how much of it is based on the selling of fear by news media, barely concealed inter-generational prejudice and sheer intellectual laziness.

In one telling story, boyd (like bell hooks, she doesn’t capitalize her name) talks about a girl who killed her mother and who had also been active on Myspace, and describes how the media translated this event into “Girl with Myspace kills mother”. boyd identifies this as a common theme: stories that would otherwise be about broken homes or social dysfunction become stories about social media, with the implication that social media creates the pathology. She argues that technology is such a locus for our hopes and anxieties that the realities of how technology is actually used by teenagers become distorted, resulting in bad public policy that often causes real harm.

For her own research she simply sits down with a wide array of kids and asks them questions about how and why they use social media. She uses a qualitative, ethnographic method, meaning that there’s no attempt to statistically verify the universality of her findings with surveys containing standardized questions and answers. Instead she identifies common themes from hundreds of interviews conducted over the last decade. This style of research strikes me as a good way to open up a field of social research, because it starts from reported phenomena rather than from aggregated data that already embeds assumptions and is subject to multiple interpretations. Many of her findings run directly counter to conventional wisdom, and others seem completely overlooked by popular media.

One common finding is that many kids either have little unstructured social time, because of over-programming by parents, or their parents don’t allow them to leave the house or meet with friends because of safety fears. Social media becomes more important to teenagers as a result (the teens keep telling boyd that they would always rather hang out with their friends in person). Another finding is that teenagers use the different social networks in different ways (just like adults do) and struggle to maintain their privacy in the face of confusing settings and social circle boundary collapses (just like adults do). She talks about teens who are exasperated with parents who jump into Facebook conversations that are clearly not intended for them. She also talks about teens who employ clever tricks to avoid “drama” stemming from social networks, like deleting all comments on their wall every day (“whitewalling”) and a teen who temporarily suspends her account every day instead of logging out.

A more general theme is the conflict between parents’ concerns about teenagers and teenagers’ own perceived interests. Adults are concerned with ensuring that teenagers are doing their homework, not getting involved with bad kids, and keeping their digital records clean for future employers. Teenagers on the other hand are interested in entering into public life wherever it can be found. Since other kinds of ‘publics’ are withheld from teenagers, even to the extent that many teenagers are prevented from gathering, ‘networked publics’ are elevated in importance for teenagers, and they are willing to make more trade-offs in terms of privacy or potentially negative representations.

What adults see as irresponsible or even “obsessive” interest in social media is for boyd a rational response to the developmental stage of being a teenager given existing social conditions. She attributes the special anxiety that parents have towards the Internet and social media to the fact that it allows teenagers to enter into various ‘publics’. boyd argues that teenagers need to be able to step out and operate in these publics in order to do their ‘identity work’, to develop into social adults. The conflict between teenagers and adults is thus a disagreement about how valuable that online social engagement is. To teenagers it feels very important but adults tend to discount it, often without compassion.

Another theme that runs throughout the book is the idea that technology by itself generally does not create social problems, nor does it offer solutions on its own. In one chapter boyd looks at the fears about sexual predators on the Internet. She finds that the risk is extremely low for teenagers overall, but that the teenagers who do get involved with adults through the Internet tend to come from troubled households and engage in risky behaviors in real life as well. Likewise, in a chapter on whether teenagers are “digital natives”, boyd points out what should be obvious: like other forms of literacy technological literacy tracks with income and social class. Similarly social media doesn’t necessarily promote equality or erase racism, but largely reproduces existing social networks and attitudes. In a chapter on online bullying, she allows that technological ‘affordances’ can help spread harassing messages farther and wider, but she reports that her informants claim that bullying is not a big issue for them (instead they talk about “drama”, which doesn’t involve a power differential). Her message is that while technology can amplify or alter behaviors online, it does not necessarily create the behaviors or the underlying non-technological conditions behind them.

The book is written in an accessible style with a minimum of jargon that clearly enunciates its arguments and findings. These points are so counter to popular views on teenagers and social media that the exposition can be forgiven for being somewhat cautious and repetitive. boyd does a good job of not assuming any academic background on the part of the reader, and gives a clear explanation of the few theoretical constructs that she needs to make her case (she does have the annoying-to-me academic tic of referring to things as “problematic”). Overall, this is a great model for how to communicate challenging ideas to a wide audience. I would recommend this book to parents, educators, policy makers, journalists–anyone who would like to understand teenagers, rather than just demonize them.


Somebody asked me to make a list of webcomics that I subscribe to (using Feedly these days) and I started writing little descriptions next to each one. I figured I’d just place it here in case anyone else is interested. These are the webcomics that have made the cut after having tried out and abandoned many more. Here they are, in no particular order (well, the order of the tabs in Chrome at the moment). I read all of these but some I recommend more whole-heartedly than others (recommended comics have a * next to them).

  • * Hark! A Vagrant — The panels require basic knowledge of literature and history, but it’s Kate Beaton’s art style that usually clinches the joke. She draws insanity and idiocy extremely well.
  • * Left Handed Toons — The conceit is that the right-handed artist draws with his left hand, but he’s been doing it so long that it doesn’t seem like much of a hindrance anymore. The comic relies on brutally dumb puns and literalisms, as well as a recurring cast of characters like Fridaynosaur, Whale, and General Twobabies.
  • Penny Arcade — Mainly of interest to people who play video games. Obscure references to recent video games and lots of gross-out humor.
  • PhD — Only occasionally funny. This is basically Cathy for academics. The jokes are all: graduate students are overworked; dissertations are stressful; advisors are clueless.
  • * Pictures for Sad Children — Grim strips that follow a depressive logic where events often take a surreal turn but nobody acts surprised. People end up inside dogs and idly discuss what to do, etc. Pretty great if you have the fortitude for this kind of thing. Not updated these days.
  • * Poorly Drawn Lines — Just started reading this, but so far the gags are good, absurd but not particularly dark.
  • * Saturday Morning Breakfast Cereal — SMBC’s Zach Weiner is on par with Gary Larson and xkcd’s Randall Munroe in terms of raw creativity. He does both one-panel gags and longer multi-page stories. The comics generally have a science and science fiction bent with a solid grasp on related philosophical issues. Weiner is also a significantly better illustrator than most webcomic artists.
  • Sheldon — This is more of a traditional syndicated comic, but it occasionally has some interesting storylines.
  • * Subnormality — These are irregularly updated bizarrely dense comics that typically take up several screens. The comics are extremely dialogue-heavy, to the point where I frequently skip them not because I dislike the stories but simply because I don’t have the time to read them all. The stories typically take the form of externalized inner dialogues about insecurity, projection, prejudgment, etc.
  • * Gutters — One-off comics about the comic book industry by comic book artists.
  • The Trenches — Episodic strip about game testers.
  • * Wondermark — Hilarious comic assembled from old illustrations to create absurd hybrid 19th- and 21st-century jokes. Has an alt text joke.
  • * xkcd — Should need no introduction. Alt text joke.
  • * Girls with Slingshots — A well-drawn sitcom about two girls who do polyamory. Sexy but not explicit.
  • * Cyanide & Happiness — Incredibly crass jokes that could be fairly criticized as promoting all kinds of bad culture, but I would be lying if I said I didn’t find many of them to be amusing.
  • * Dinosaur Comics — Dinosaur Comics is what it is: goofy (secretly rather intelligent) rambling about language and culture overlaid on an invariant set of six panels.
  • * Dilbert — Just because Dilbert cartoons are a cubicle cliche doesn’t mean that they don’t generally speak a certain truth. It’s the same stuff over and over about pointy haired bosses and lazy coworkers, but the punchline is usually pretty fresh and often surprisingly edgy.
  • Diesel Sweeties — People seem to love this comic. It’s sufficiently interesting but doesn’t blow me away. It’s mainly jokes about killer robots and coffee.
  • * Crocodile in water tiger on land — Commentary about Indian society, delivered in a self-satirizing manner by a cast of Indian archetypes that I have to partially construct from context (e.g., the hipster, the religious zealot, the business fat cat, etc.). I don’t totally follow the issues being discussed, but it gives me some insight into Indian society and provides a valuable example of cultural self-criticism.
  • * Cat and Girl — I can’t say that Cat and Girl is usually or even often laugh out loud funny. In fact I can’t say that I generally 100% understand Dorothy’s comics. The strips tend to reward taking time to do an analysis of the words and symbolism to derive something like a thesis statement. The thesis statement is often about the nature of authenticity as an ineffable criterion for value in the context of postindustrial society and the age of digital reproduction and social networks. This will certainly not be everyone’s cup of tea, but it definitely is mine.
  • * Blaster Nation — This is another geeky narrative sitcom. Digging the story so far.
  • * Bad Machinery — Scooby-doo-style mystery stories set in England. The fun though is in the hilarious banter between the kids.
  • * Abstruse Goose — Similar to xkcd, jokes about being geeky with a focus on math and science.
  • * A Softer World — Three-image strips with some text that is generally a funny-sad statement about love and loss.
  • * Hijinks Ensue — Violent-gross jokes about geek dude culture.
  • * Scenes from a Multiverse — Strips rotate through several universes. The most popular ones get to come back.
  • * Perry Bible Fellowship — Wonderfully evil, beautifully drawn comic. Every punchline is designed to disturb you. Not updated these days.
  • * Achewood — I’m not going to bother describing this comic. It’s about some dog people, and the writer Chris Onstad has a wonderful ear for spoken English. Sadly not updated for more than a year.

Helpful commands

As a follow-up to my last post here are some commands that I use throughout the day. They are admittedly nothing special but they help me out.

.bashrc aliases:

alias grc='git rebase --continue'
alias gca='git commit -a --amend'
alias gs='git status -sb'
alias gd='git diff'
alias gsn='git show --name-only'

The one worth explaining is gca. It stages and commits everything into the previous commit, and I use it constantly to keep adding work to my WIP commits. One thing to watch out for: using gca to commit a conflict fix in the middle of a rebase will fold everything into the commit before the conflict instead of continuing the rebase. In that situation you want grc instead.
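To see the amend pattern in action, here’s a throwaway-repo sketch (the file names are made up, and --no-edit is added so the demo stays non-interactive, whereas the alias will open your editor):

```shell
# Scratch repo demonstrating the amend-into-WIP pattern; runs in a temp dir
set -e
cd "$(mktemp -d)"
git init -q
git config user.email demo@example.com
git config user.name Demo
echo start > file.txt
git add file.txt
git commit -qm "WIP: new feature"
echo more >> file.txt
# gca expands to `git commit -a --amend`: stage everything and fold it
# into the previous commit (--no-edit skips the message editor)
git commit -qa --amend --no-edit
git log --oneline        # still a single commit containing both changes
```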


force_push — I use this to automate the process of updating my remote branch and most importantly to prevent me from force pushing the one branch that I must NEVER force push.

#!/usr/bin/env bash
CURRENT_BRANCH=$(git rev-parse --abbrev-ref HEAD)
if [ "$CURRENT_BRANCH" == 'master' ]; then
  echo "Not force pushing master!"
  exit 1
fi
echo "git push origin $CURRENT_BRANCH --force"
read -p "Are you sure? [Yn] "
if [ "$REPLY" != "n" ]; then
  git push origin "$CURRENT_BRANCH" --force
fi

rebase_branch — There’s not really a lot to this, but I use it reflexively before I do anything.

#!/usr/bin/env bash
git fetch
git rebase -i origin/master

merge_to_master — I do this when I’m done with a branch. This makes sure that there will be a clean fast-forward push. Notice how it reuses rebase_branch.

#!/usr/bin/env bash
CURRENT_BRANCH=$(git rev-parse --abbrev-ref HEAD)
rebase_branch
echo "git checkout master"
git checkout master
echo "git pull origin master"
git pull origin master
echo "git merge $CURRENT_BRANCH"
git merge "$CURRENT_BRANCH"

git-vim — this one is still a bit of a work in progress, but the idea is to grab the files you’ve changed in Git and open them in separate tabs inside Vim. You can then run it with git vim, which I alias as gv.

#!/usr/bin/env ruby
# uncommitted files
files = `git diff HEAD --name-only`.split("\n")
if files.empty?
  # WIP files (the files named in the most recent commit)
  files = `git show --name-only --pretty=format:`.split("\n").reject(&:empty?)
end
system("vim -p #{files.join(" ")}")

Of course, all these scripts need to be put somewhere in your executable path. I put them in ~/bin and include this location in my path.
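For completeness, the PATH addition in ~/.bashrc would look something like this (assuming ~/bin is where the scripts live):

```shell
# Prepend ~/bin so personal scripts shadow system commands of the same name
export PATH="$HOME/bin:$PATH"
```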

So my workflow would look like this:

git checkout -b new_branch
# hack hack hack
git commit -a
# hack hack hack
gca
# hack hack hack
gca
# all done now
rebase_branch
# whoops a merge conflict
# resolve it
git add .
grc
force_push
# Time to get this code reviewed on Github
# Code accepted, gonna merge this
merge_to_master

Git workflow

In my last post I described how at my work we use code review feedback to iteratively improve code. I want to describe how Git fits into this process, because this is probably the biggest change I had to make to my preexisting workflow. Basically I had to relearn how to use Git. The new way of using it (that is, it was new to me) is extremely powerful and in a strange way extremely satisfying, but it does take a while to get used to.

Importance of rebasing

I would describe my old approach and understanding as “subversion, but with better merging”1. I was also aware of the concept of rebasing from having submitted a pull request to an open source project at one point, but I didn’t use it very often for reasons I’ll discuss later. As it turns out understanding git rebase is the key to learning how to use Git as more than a ‘better subversion’.

For those who aren’t familiar with this command, git rebase <branch> takes the commits that are unique to your branch and places them “on top” of another branch. You typically want to do this with master, so that all your commits for your feature branch will appear together as the most recent commits when the feature branch is merged into master.

Here’s a short demonstration. Let’s say this is your feature branch, which you’ve been developing while other unrelated commits are being added to master:

Feature branch with ongoing mainline activity

If you merge without rebasing you’ll end up with a history like this:

History is all jacked up!

Here is the process with rebasing:

# We're on `feature_branch`
git rebase master # Put feature_branch's commits 'on top of' master's
git checkout master
git merge feature_branch

This results in a clean history:

Feature branch commits on top

Another benefit of having done a rebase before merging is that there’s no need for an explicit merge commit like you see at the top of the original history. This is because — and this is a key insight — the feature branch is exactly like the master branch but with more commits added on. In other words, when you merge it’s as though you had never branched in the first place. Because Git doesn’t have to ‘think’ about what it’s doing when it merges a rebased branch it performs what is called a fast forward. In this case it moved the HEAD2 from 899bdb (More mainline activity) to 5b475e (Finished feature branch).
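The whole rebase-then-merge sequence is easy to reproduce in a scratch repository. This is just a sketch with made-up file names and the commit messages from the diagrams; git merge --ff-only is an extra safety net I’d suggest, since it refuses to do anything other than a fast forward:

```shell
# Scratch-repo reproduction of rebase followed by a fast-forward merge
set -e
cd "$(mktemp -d)"
git init -q
git config user.email demo@example.com
git config user.name Demo
echo base > file.txt
git add file.txt && git commit -qm "Initial commit"
git branch -M master                 # normalize the branch name
git checkout -qb feature_branch
echo feature > feature.txt
git add feature.txt && git commit -qm "Finished feature branch"
git checkout -q master
echo more >> file.txt
git commit -qam "More mainline activity"
git checkout -q feature_branch
git rebase -q master                 # put feature_branch's commits on top
git checkout -q master
git merge --ff-only feature_branch   # succeeds only as a fast forward
git log --oneline
```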

The above is the basic use case for git rebase. It’s a nice feature that keeps your commit history clean. The greater significance of git rebase is the way it makes you think about your commits, especially as you start to use the interactive rebase features discussed below.

Time travel

When you call git rebase with the interactive flag, e.g. git rebase -i master, git will open up a text file that you can edit to achieve certain effects:

Interactive rebase menu

As you can see there are several options besides just performing the rebase operation described above. Delete a line and you are telling Git to disappear that commit from your branch’s history. Change the order of the commit lines and you are asking Git to attempt to reorder the commits themselves. Change the word ‘pick’ to ‘squash’ and Git will squash that commit together with the commit on the preceding line. Most importantly, change the word ‘pick’ to ‘edit’ and Git will pause the rebase just after applying the selected commit.
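As a concrete example, a todo buffer edited to squash one commit and stop at another might look like this (the shas are made up):

```
pick 1a2b3c Implemented new feature
squash 4d5e6f Tests for new feature
pick 7a8b9c Add requirement x to new feature
edit 0d1e2f Changed code for new feature
```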

I think of these abilities as time travel. They enable you to go back in the history of your branch and make code changes as well as reorganize code into different configuration of commits.

Let’s say you have a branch with several commits. When you started the branch out you thought you understood the feature well and created a bunch of code to implement it. When you opened up the pull request the first feedback you received was that the code should have tests, so you added another commit with the tests. The next round of feedback suggested that the implementation could benefit from a new requirement, so you added new code and tests in a third commit. Finally, you received feedback about the underlying software design that required you to create some new classes and rename some methods. So now you have 4 commits with commit messages like this:

A messy commit history
  1. Implemented new feature
  2. Tests for new feature
  3. Add requirement x to new feature
  4. Changed code for new feature

This history is filled with useless information. Nobody is going to care in the future that the code had to be changed from the initial implementation in commit 4 and it’s just noise to have a separate commit for tests in commit 2. On the other hand it might be valuable to have a separate commit for the added requirement.

To get rid of the tests commit all you have to do is squash commit 2 into commit 1, resulting in:

  1. Implemented new feature
  2. Add requirement x to new feature
  3. Changed code for new feature

New commit 3 has some code that belongs in commit 1 and some code that belongs with commit 2. To keep things simple, the feature introduced in commit 1 was added to file1.rb and the new requirement was added to file2.rb. To handle this situation we’re going to have to do a little transplant surgery. First we need to extract the part of commit 3 that belongs in commit 1. Here is how I would do this:

# We are on HEAD, i.e. commit 3
git reset HEAD^ file1.rb
git commit --amend
git stash
git rebase -i master
# ... select commit 1 to edit
git stash apply
git commit -a --amend
git rebase --continue

It’s just that easy! But seriously, let’s go through each command to understand what’s happening.

  1. The first command, git reset, is notoriously hard to explain, especially because there’s another command, git checkout, which seems to do something similar. The diagram at the top of this Stack Overflow page is actually extremely helpful. The thing about Git to repeat like a mantra is that it has a two-step commit process: staging file changes and then actually committing. Basically, when you run git reset REF on a file it stages the file as it was at that ref. In the case of the first command, git reset HEAD^ file1.rb, we’re saying “stage the file as it looked before HEAD’s change”; in other words, revert the changes we made in the last commit.
  2. The second command, git commit --amend commits what we’ve staged into HEAD (commit 3). The two commands together (a reset followed by an amend) have the effect of uncommitting the part of HEAD’s commit that changed file1.rb.
  3. The changes that were made to file1.rb aren’t lost, however. They were merely uncommitted and unstaged. They are now sitting in the working directory as an unstaged diff, as if they’d never been part of HEAD. So just as you could do with any diff you can use git stash to store away the diff.
  4. Now I use interactive rebase to travel back in time to commit 1. Rebase drops me right after commit 1 (in other words, the temporary rebase HEAD is commit 1).
  5. I use git stash apply to get my diff back (you might get a merge conflict at this point depending on the code).
  6. Now I add the diff back into commit 1 with git commit -a --amend (-a automatically stages any modified files, skipping the git add . step).

This is the basic procedure for revising your git history (at least the way I do it). There are a couple of other tricks that I’m not going to go into detail about here, but I’ll leave some hints. Let’s say the changes for the feature and the new requirement were both on the same file. Then you would need to use git add --patch file1.rb before step 2. What if you wanted to introduce a completely new commit after commit 1? Then you would use interactive rebase to travel to commit 1 and then add your commits as normal, and then run git rebase --continue to have the new commits inserted into the history.
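That last trick can even be exercised non-interactively, which makes for a self-contained sketch. Everything below is made up for the demo; GIT_SEQUENCE_EDITOR is a real Git mechanism for scripting the todo buffer, used here to rewrite the first todo line from pick to edit (GNU sed syntax):

```shell
# Scratch-repo walkthrough: insert a brand new commit after "Commit 1"
set -e
cd "$(mktemp -d)"
git init -q
git config user.email demo@example.com
git config user.name Demo
echo base > base.txt && git add . && git commit -qm "Initial commit"
git branch -M master                 # normalize the branch name
git checkout -qb feature_branch
echo one > one.rb && git add . && git commit -qm "Commit 1"
echo two > two.rb && git add . && git commit -qm "Commit 2"
# Mark "Commit 1" as 'edit' in the todo list without opening an editor
GIT_SEQUENCE_EDITOR="sed -i '1s/^pick/edit/'" git rebase -i master
# The rebase is now paused just after "Commit 1"; commit as normal here
echo extra > extra.rb && git add extra.rb && git commit -qm "Inserted commit"
GIT_EDITOR=true git rebase --continue
git log --oneline master..HEAD       # Commit 2, Inserted commit, Commit 1
```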


One of the reasons I wasn’t used to this workflow before this job was that I thought rebasing was only useful for the narrow case of making sure that the branch commits are grouped together after a merge to master. My understanding was that other kinds of history revision were to be avoided because of the problems they cause for collaborators who pull from your public repos. I don’t remember the specific blog post or mailing list message, but I took away the message that once you’ve pushed something to a public repo (as opposed to what’s on your local machine) you are no longer able to touch that history.

Yes and no. Rebasing and changing the history of a branch that others are pulling from can cause a lot of problems. Basically, any time you amend a commit message, reorder commits, or alter a commit, you actually create a new object with a new sha reference. If someone else naively pulls from your branch after having pulled the pre-revised history, they will get a weird set of duplicate code changes and things will get worse from there. In general, if other people are pulling from your public (remote) repository you should not change the history out from under them without telling them. Linus’ guidelines about rebasing here are generally applicable.

On the other hand, in many Git workflows it’s not normal for other people to be pulling from your feature branch, and if they are they shouldn’t be that surprised if the history changes. In the Github-style workflow you will typically develop a feature branch on your personal repository and then submit that branch as a pull request to the canonical repository. You would probably be rebasing your branch on the canonical repository’s master anyway. In that sense even though your branch is public it’s not really intended for public consumption. If you have a collaborator on your branch you would just shoot them a message when you rebase and they would do a “hard reset” on their branch (sync their history to yours) using git reset --hard remote_repo/feature_branch. In practice, in my limited experience with a particular kind of workflow, it’s really not that big a deal.

Don’t worry

Some people are wary of rebase because it really does alter history. If you remove a commit you won’t see a note in the commit history that so and so removed that commit. The commit just disappears. Rebase seems like a really good way to destroy your own and other people’s work. In fact you can’t screw up too badly using rebase, because every Git repository keeps a log of the changes that have been made to the repository’s history, called the reflog. Using git reflog you can always back out misguided recent history changes by returning to a point before you made them.
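Here’s a tiny scratch-repo sketch of that safety net (names made up): a hard reset “loses” a commit, and the reflog entry from just before the reset brings it back:

```shell
# Scratch repo: lose a commit with reset --hard, recover it via the reflog
set -e
cd "$(mktemp -d)"
git init -q
git config user.email demo@example.com
git config user.name Demo
echo one > f.txt && git add f.txt && git commit -qm "First"
echo two >> f.txt && git commit -qam "Second"
git reset -q --hard HEAD^      # whoops: "Second" vanishes from the history
git reflog -n 2                # ...but the reflog still knows about it
git reset -q --hard 'HEAD@{1}' # jump back to the state before the reset
git log -1 --pretty=%s         # "Second" is restored
```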

Hope this was helpful!


  1. Not an insignificant improvement, since merging in Subversion sucks.[back]
  2. Which I always think about as a hard drive head, which I in turn think about as a record player needle[back]


My new work is great1. The main thing I like about it is the consummate professionalism of the team. Everyone I work with is interested in improving their craft and is eager to engage in discussions about software principles, code quality, development processes and tools. In general there is a noticeable group ethos that seems guided by the following set of principles:

Take the time to make your work as high-quality as possible given the scope of the task at hand. This means writing tests and undertaking refactorings to manage complexity in the normal course of development. It also means often having to partially or fully revisit a design based on new findings that come out during development. This may seem like a recipe for low productivity but our team is pretty consistent in its output: to use a decidedly fake and wobbly measurement, each team member seems capable of producing about two significant user-facing features per week. I know from personal experience that the alternative, programming to hard deadlines and arbitrary deployment cycles2, may at times produce the superficial appearance of a greater amount of output (“With Monday’s release we closed ten stories and five bugs!”) but this is just a loan taken out against the day when you have to devote entire “sprints” to fixing the earlier rushed implementations. Taking the requisite time with each task is consonant with the recognition that growth in complexity in a system has to be managed, either all at once in a great “redesign” crusade, or in small thoughtful efforts as you go.

Even sufficiently good work can be improved dramatically through constructive criticism and iterative development. This was something I’ve had considerable trouble with at times, as a person who avoids conflict and criticism at all cost. The key to being able to accept the criticism (which normally comes in the form of code reviews on Github), I’ve found, is to focus on the improvement to the final product brought about by incorporating the criticism. During each round of code review feedback instead of fighting the suggestions I try to let go of my ego (not at all easy) and visualize how the code will look after I implement the changes; I’m generally much happier with the final code than I am with the initial submission. I also try to approach criticism as containing information about what I should do and what I should avoid in the future, so that even when I am receiving brutal criticism about the quality of my code I receive some comfort from the fact that I am learning how to avoid similar criticism in the future3.

There’s more than one way to do it, but you should be able to articulate the reason for why you’re doing something a certain way, even if the articulation of that reason is “because that’s the way I learned it initially and there’s no compelling reason to change”. This of course opens you up to the possibility that somebody else has a positive argument for a certain practice, in which case it’s hard to argue for the original practice purely on the basis of apathy. Often the process of asking oneself why one prefers one practice to another ends up extracting motivations and values that are more interesting than the actual issue of practice in question. As a team we discuss many issues great and small, from high-level questions about design patterns and object composition to nitty-gritty questions about code block indentation. Sometimes these discussions result in some binding decision regarding style or best practice, but on other occasions we’ve articulated dueling reasons and concluded that the arguments for both are strong enough that both practices or styles are valid for different contexts.

Work to improve the product and support your teammates. I know this one sounds like a cheerlead-y cliché, but all I mean by this is that the primary motivator for working hard is to make the application and its codebase better, both because doing so supports one’s self-interest in having the company succeed and remaining employed, and because you see your teammates working hard and want to make a similar contribution. The codebase and app are a communal creation, and as one works with it one begins to feel a sense of loyalty to the whole, as well as a sense of pride in the sections of code that you’ve played a major role in shaping. This is worth mentioning mainly because it stands in contrast to other motivational systems, like those that depend on fear of punitive action or explicit intra-team competition. There are still elements of fear and competition at work in our system, but they are sublimated into cooperative and high-productivity behaviors4.

This ethos doesn’t come from nowhere. It is the product of specific values that are held and promoted by my boss, and it is sustained through a combination of the personalities of the people he chose to hire and the practices (often enforced/encouraged by technological tools) he put in place. It is an impressive accomplishment that is easier to describe than it would be to replicate.

Anyway, it’s a pretty cool job.


  1. New as in I’ve been there for ~6 months [back]
  2. These deadlines and cycles often end up being Procrustean beds for feature requirements.[back]
  3. Did I mention that I don’t like criticism?[back]
  4. That is, people compete to be the most helpful![back]

My actual workflow

I should update that my current testing workflow is the one described here. At my new job the testing suite is so large (a good problem to have!) that it’s not really feasible to rerun all, or even a large subset, of the tests on every write. I also realized that with that setup you end up mentally waiting for the full suite’s completion notification before continuing your work. Now I test just the file or just the test I’m working on, using the rails.vim commands and the turbux plugin to run the test in a small tmux pane. Then I run the full test suite on CI, which can take 20-40 minutes.
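For reference, the turbux part of this workflow amounts to a couple of lines in .vimrc. This is a sketch from memory; the <Plug> mapping names are the ones the plugin documents, but check the turbux README for your version:

```vim
" Run the current spec file in the adjacent tmux pane
nmap <leader>t <Plug>SendTestToTmux
" Run only the test under the cursor
nmap <leader>T <Plug>SendFocusedTestToTmux
```

With a tmux pane open next to vim, saving and hitting the mapping gives near-instant feedback on just the test you care about.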

Local Memes

I have meant to undertake this project for a long time: the compilation of a list of “local memes” in my household. As I get older (and older and older) I become more and more aware of this condition of being composed out of bits and pieces that originated in other people. I am also aware of various recurring gags, puns, patterned exchanges, etc. (memes) that gradually died out through disuse or replacement. Every once in a while my wife and I will recall some old joke that we used to do and we’ll momentarily resurrect it, like the dinosaurs in Jurassic Park. We flatter ourselves that we have an unusual number of these active or recalled memes stored up, but probably most couples have equally rich archives. Anyways, ours are special to us, and I’d like to partially preserve them by sharing them now. I’ll just update this post as I think of more.

I’ll be right Barack Obama. Pretty self-explanatory. Also, Barack to the future, etc.

Thass nice. Thass real nice. – Slurring lecher #1 in the bar scene in the terrible movie Eve of Destruction. Denotes lustful appreciation of something (in the meme, usually something innocent like food or drink).

snoring/narcolepsy — This is where you decide that the point you’re trying to make is too boring, obscure and convoluted to actually finish, and you stop in mid-sentence and nod off. The use of this has decreased because it was really off-putting for my wife and she basically forbade it.

“Oh God” – this was my sort of Joey Lawrence woah during college. I also briefly tried to promote my own pre-packaged meme: embrace the chaos/channel the void.

revolution in your stomach — when you mix foods causing gastrointestinal distress. From a Salvadoran family member. Also the phrase Don’t be lazy.

“What? What did you say?” — I tried to find a relevant video clip from the Ozu movie Good Morning to demonstrate this very simple and satisfying gag, but looks like you’ll have to check out the movie yourself.

cat names — We have two cats, Mackerel/Mack and Ruby. Ruby is variously Poofy, Miss Fluffy-shanks and Rubifer Jenkins. Mackerel is usually just Mack, but his friendly, chill demeanor often conjures up declarations that Mack is a buddy and that Mack is a mackimal.

monkey dancing — A monkey dance is a disrespectful display where you show your interlocutor that you are not paying attention to what they are saying because you are no longer a rational being with language. Traditionally you literally pretend to be a monkey by putting your arms above your head and bouncing back and forth on your feet while sticking your tongue out, but any elaborate discourse-destroying dance qualifies.

hot socks — This one is based on the Lords of Acid song “Rough Sex”, where the refrain is a litany of “deep sex, hard sex”, etc. So you replace the word sex with socks and try to come up with descriptions of socks that sound dirty but still make sense in the context of socks. “wet socks, hot socks, smelly socks.”, etc. I actually can’t remember how we kept this one going long enough to be interesting, but we came up with quite a few.

“Bob” Damon – If you’re watching a movie where an actor looks like a more famous actor, you come up with a fake first name and assert that the actor is the more famous actor’s sibling struggling in obscurity.

“And it looked just like a checkerboard!” – The punchline to a ridiculous true story once told to me by a friend. When uttered (in a high-pitched, incredulous voice) it denotes the discovery of an absurd and wonderful fact.

haspiration — This is where you append h’s before all beginning vowels in words and all silent h’s, as in “That’s hannoying” and “hu-wat hare hu-you doing?”

Y’see… Rudy — apocryphal quoting from the Bill Cosby Rap, which doesn’t actually include the name Rudy. Denotes when you’re saying something self-consciously condescending.

Stompy McStomperson — This is apparently me. Related to Messy McMesserson, which is a pretty universal meme.

“_ Joe, everyone’s favorite Joe.” – e.g., Hey look it’s Self-pitying Joe, everyone’s favorite Joe. This came from my roommate in college.

Sometimes a man has to do things that don’t make any sense, nevertheless he must do them, because he is a man. – half-remembered paraphrase from the great movie Fighting Elegy.

“I look good, I smell good, I feel good … I’m a cat!” — half-remembered line from Red Dwarf. Also, “The only thing that can kill a vindaloo: a lager.” Also, from The IT Crowd: “You’re making it go back in!”

deathwork — Term from an eccentric book by the cranky sociologist Phillip Rieff, which he uses to describe everything from Piss Christ to Ulysses to this image of a person made out of vegetables. A way to flippantly dismiss something. Related to the phrase The death of meaning.

You’re probably going to die. Used by a friend a lot, the joke being that instead of trying to reassure someone that their neurotic fears aren’t real, you just agree with them that they’re probably right in their catastrophic estimates. This friend also says Happy family, happy family whenever there’s slight social conflict, which is probably from something. Also, the whole concept of icecream cake being inherently more desirable than regular cake, which is apparently from Modern Family.

basic portalology — Recent meme from playing through Portal 2 with same friend, as in That’s basic portalology!, exclaimed when you realize you’ve been overthinking a level.

I think there was something wrong with the beer. The joke being that the beer being skunked is what made you sick, which is why you should always drink out of heinecans.

It’s important. — Said of things that aren’t important. See note.

an all too possible future — a reference to the offensively ridiculous Heinlein book Farnham’s Freehold, read by my friend and me and thankfully few other living people. Also with the same friend the principle that only one person can take off his shirt in a room at a time.

I’m sorry that happened to you — tepid expression of sympathy for an unfortunate event that is either extremely mild or entirely self-induced and preventable.

“Pretty. Pretty. Pretty good” – From Curb Your Enthusiasm, of course.

Uuuuuuuuuuuuumm — Open-ended contemplation sound that a friend’s five-year-old daughter would make. Also from same little girl: “Poop on your head!” And from that friend: “Wake up!”

“Train.” We live next to a railroad, so when the train comes by we translate what its warning horn is saying, which is clearly: “Train. Train. Traiiiiiin. Train.”, etc.  

Goodbye, Autotest

Update (10/15/2012): This isn’t how I’m doing it now. See this aside for my current workflow.

(That Latour article is in a half-finished limbo state, but I’ll get around to posting it eventually).

I wrote in a recent post about how easy it is to configure Autotest these days. And Autotest has been an essential part of my toolset as I’ve developed my humble todo app, Method Todo. I’ve only been using it to run my Rspec stories (and not Cucumber features, since they take too long). Now that I’m adding another testing discipline, testing javascript code with Jasmine, I find that there’s an even simpler way to run all your tests — specs, features and Jasmine specs — continuously and in the background.

I decided to use Guard because I couldn’t get Jasmine to access the necessary asset pipeline files to run Javascript tests, I guess because those files needed to be precompiled. Once I started using guard-rails-assets I decided to try out the other plugins for Rails, Rspec and Cucumber. It seems to all just work and the ecosystem of plugins seems like it will provide quick solutions to obscure problems down the road.

So after deleting the Autotest dependencies out of my Gemfile, I have:

# other sections for rails, rspec, spork, etc., etc.
group :guard do
  gem 'guard'
  gem 'guard-rails'
  gem 'guard-spork'
  gem 'guard-rspec'
  gem 'guard-cucumber'
  gem 'guard-rails-assets'
  gem 'guard-jasmine-headless-webkit'
end

Then I run

bundle exec guard init

to create the Guardfile with sensible defaults.
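The generated Guardfile varies by plugin and version, but its entries look roughly like this (a sketch, not the exact generated output; the watch patterns here are illustrative):

```ruby
# Guardfile (excerpt) - each guard block watches file patterns
# and maps matches to the tests that should be re-run
guard 'rspec' do
  # a spec file re-runs itself
  watch(%r{^spec/.+_spec\.rb$})
  # an app file re-runs its corresponding spec
  watch(%r{^app/(.+)\.rb$}) { |m| "spec/#{m[1]}_spec.rb" }
  # a change to the helper re-runs the whole suite
  watch('spec/spec_helper.rb') { 'spec' }
end
```

The nice thing about the DSL is that the blocks just return test paths, so customizing the mappings is straightforward.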

Then running

bundle exec guard

will start running all of these tasks together. The default Cucumber guard task seems to watch only feature files before re-running Cucumber, so I’ll leave this guard in as long as it doesn’t delay the re-running of specs. A bonus is that the Rails guard starts up the development server and restarts it when you change major configuration files.
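If you did want app code changes to trigger features as well, one option is to extend the Cucumber block in the Guardfile. This is an untested sketch, and the extra watch pattern is my own assumption, not part of the generated defaults:

```ruby
guard 'cucumber' do
  # default: a changed feature file re-runs itself
  watch(%r{^features/.+\.feature$})
  # assumption: re-run all features when application code changes
  watch(%r{^app/(.+)\.rb$}) { 'features' }
end
```

The obvious tradeoff is that every save in app/ would kick off the full (slow) feature run, which is exactly what I’m trying to avoid with specs.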

Update (7/25/2012):

I found that guard-rails-assets didn’t seem to be necessary for guard-jasmine-headless-webkit to process assets. I removed it to see if it would fix a strange issue with the bootstrap-modal library that only occurs in development mode (it didn’t, but it also didn’t seem to break anything).

I also found that guard-rails was loading into the test environment when paired up with the other testing guard tasks. Changing the initial generated line to

guard 'rails', :environment => 'development' do

seemed to fix things.