Coding Horror

programming and human factors

Because Reading is Fundamental

Most discussion platforms show a bit of information next to each user – most prominently, a running post count.

What message does this send?

  • The only number you can control printed next to your name is post count.
  • Everyone who reads this will see your current post count.
  • The more you post, the bigger that number next to your name gets.

If I have learned anything from the Internet, it is this: be very, very careful when you put a number next to someone's name. Because people will do whatever it takes to make that number go up.

If you don't think deeply about exactly what you're encouraging, why you're encouraging it, and all the things that may happen as a result of that encouragement, you may end up with … something darker. A lot darker.

Printing a post count number next to every user's name implies that the more you post, the better things are. The more you talk, the better the conversations become. Is this the right message to send to everyone in a discussion? More fundamentally, is this even true?

I find that the value of conversations has little to do with how much people are talking. I find that too much talking has a negative effect on conversations. Nobody has time to listen to the resulting massive stream of conversation, they end up just waiting for their turn to pile on and talk, too. The best conversations are with people who spend most of their time listening. The number of times you've posted in a given topic is not a leaderboard; it's a record of failing to communicate.

Consider the difference between a chat room and a discussion. Chat is a never-ending flow of disconnected, stream of consciousness sentences that you can occasionally dip your toes in to get the temperature of the water, and that's about it. Discussion is the process of lobbing paragraphs back and forth that results in an evolution of positions as your mutual understanding becomes more nuanced. We hope.

The Ars Banana Experiment

Ars Technica ran a little experiment in 2011. When they posted "Guns at home more likely to be used stupidly than in self defense," embedded in the last sentence of the seventh paragraph of the article was this text:

If you have read this far, please mention Bananas in your comment below. We're pretty sure 90% of the respondents to this story won't even read it first.

The first person to do this is on page 3 of the resulting discussion, comment number 93 – as helpfully visualized by Brandon Gorrell.

Plenty of talking, but how many people actually read up to paragraph 7 (of 11) of the source article before they rushed to comment on it?

The Slate Experiment

In "You Won't Finish This Article," Farhad Manjoo dares us to read to the end.

Only a small number of you are reading all the way through articles on the Web. I’ve long suspected this, because so many smart-alecks jump in to the comments to make points that get mentioned later in the piece.

But most of us won't.

He collected a bunch of analytics data based on real usage to prove his point.

These experiments demonstrate that we don't need to incentivize talking. There's far too much talking already. We badly need to incentivize listening.

And online, listening = reading. That old school program from my childhood was right, so deeply fundamentally right. Reading. Reading Is Fundamental.

Let's say you're interested in World War II. Who would you rather have a discussion with about that? The guy who just skimmed the Wikipedia article, or the gal who read the entirety of The Rise and Fall of the Third Reich?

This emphasis on talking and post count also unnecessarily penalizes lurkers. If you've posted five times in the last 10 years, but you've read every single thing your community has ever written, I can guarantee that you, Mr. or Mrs. Lurker, are a far more important part of that community's culture and social norms than someone who posted 100 times in the last two weeks. Value to a community should be measured as much by how much you've read as by how much you've talked.

So how do we encourage reading, exactly?

You could do crazy stuff like require commenters to enter some fact from the article, or pass a basic quiz about what the article contained, before allowing them to comment on that article. On some sites, I think this would result in a huge improvement in the quality of the comments. It'd add friction to talking, which isn't necessarily a bad thing, but it's a negative, indirect way of forcing reading by denying talking. Not ideal.
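
If you wanted to prototype that kind of gate, a minimal sketch might look like the following (TypeScript; the ArticleQuiz shape and the submitComment handler are invented purely for illustration, not any real site's API):

    // A deliberately crude "prove you read it" gate, in the spirit of the
    // Ars banana experiment. All names here are hypothetical.
    interface ArticleQuiz {
      question: string;
      acceptedAnswers: string[]; // lowercase keywords that count as correct
    }

    function passesQuiz(quiz: ArticleQuiz, answer: string): boolean {
      const normalized = answer.trim().toLowerCase();
      return quiz.acceptedAnswers.some((a) => normalized.includes(a));
    }

    function submitComment(quiz: ArticleQuiz, quizAnswer: string, comment: string): string {
      if (!passesQuiz(quiz, quizAnswer)) {
        return "Please read the article first – your quiz answer didn't match.";
      }
      return `Comment accepted: ${comment}`;
    }

    // Usage:
    const bananaCheck: ArticleQuiz = {
      question: "What fruit were you asked to mention in your comment?",
      acceptedAnswers: ["banana"],
    };
    console.log(submitComment(bananaCheck, "Bananas!", "Great article."));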

I have some better ideas.

  1. Remove interruptions to reading, primarily pagination.

    Here's a radical idea: when you get to the bottom of the page, load the next damn page automatically. Isn't that the most natural thing to want when you reach the end of the page, to read the next one? Is there any time that you've ever been on the Internet reading an article, reached the bottom of page 1, and didn't want to continue reading? Pagination is nothing more than an arbitrary barrier to reading, and it needs to die a horrible death. (A rough sketch of one way to wire this up, together with idea 2's read-time measurement, follows this list.)

    There are sites that go even further here, such as The Daily Beast, which actually loads the next article when you reach the end of the one you are currently reading. Try it out and see what you think. I don't know that I'd go that far (I like to pick the next thing I read, thanks very much), but it's interesting.

  2. Measure read times and display them.

    What I do not measure, I cannot display as a number next to someone's name, and cannot properly encourage. In Discourse we measure how long each post has been visible in the browser for every (registered) user who encounters that post. Read time is a key metric we use to determine whom to trust, and which posts people actually read. If you aren't willing to visit a number of topics and spend time actually listening to us, why should we talk to you – or trust you?

    Forget clicks, forget page loads, measure read time! We've been measuring read times extensively since launch in 2013 and it turns out we're in good company: Medium and Upworthy both recently acknowledged the intrinsic power of this metric.

  3. Give rewards for reading.

    I know, that old saw, gamification, but if you're going to reward someone, do it for the right things and the right reasons. For example, we created a badge for reading to the end of a long 100+ post topic. And our trust levels are based heavily on how often people are returning and how much they are reading, and virtually not at all on how much they post.

    To see live reading rewards in action, try this classic New York Times article. There's even a badge for reading half the article!

  4. Update in real time.

    Online we tend to read these conversations as they're being written, as people are engaging in live conversations. So if new content arrives, figure out a way to dynamically rez it in without interrupting people's read position. Preserve the back and forth, real time dynamic of an actual conversation. Show votes and kudos and likes as they arrive. If someone edits their post, bring that in too. All of this goes a long way toward making a stuffy old debate feel like a living, evolving thing versus a long distance email correspondence.
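
For the curious, here's a minimal browser-side sketch (TypeScript) of how ideas 1 and 2 above might be wired together using the standard IntersectionObserver API. The .post selector, the #bottom-sentinel element, and the loadNextPage / reportReadTimes helpers are assumptions for the sake of illustration – this is not Discourse's actual implementation.

    // Hypothetical stubs: wire these to your own backend.
    async function loadNextPage(): Promise<void> { /* fetch and append the next page of posts */ }
    function reportReadTimes(times: Map<string, number>): void { /* send accumulated read times to the server */ }

    const readMillis = new Map<string, number>();    // post id -> total visible milliseconds
    const visibleSince = new Map<string, number>();  // post id -> timestamp it became visible

    // Idea 2: count a post as "being read" while at least half of it is on screen.
    const readTimer = new IntersectionObserver((entries) => {
      const now = performance.now();
      for (const entry of entries) {
        const id = (entry.target as HTMLElement).dataset.postId ?? "";
        if (entry.isIntersecting) {
          visibleSince.set(id, now);
        } else if (visibleSince.has(id)) {
          readMillis.set(id, (readMillis.get(id) ?? 0) + (now - visibleSince.get(id)!));
          visibleSince.delete(id);
        }
      }
    }, { threshold: 0.5 });

    // Observe every post we haven't seen yet (assumes posts carry a data-post-id attribute).
    function observePosts(): void {
      document.querySelectorAll<HTMLElement>(".post:not([data-observed])").forEach((el) => {
        el.dataset.observed = "true";
        readTimer.observe(el);
      });
    }
    observePosts();

    // Idea 1: when the bottom-of-page sentinel scrolls into view, load the next page automatically.
    const loader = new IntersectionObserver(async (entries) => {
      if (entries.some((e) => e.isIntersecting)) {
        await loadNextPage();
        observePosts(); // start timing the newly loaded posts too
      }
    });
    loader.observe(document.querySelector("#bottom-sentinel")!); // assumes the sentinel element exists

    // Flush read times periodically so they can be shown next to each user's name.
    setInterval(() => reportReadTimes(readMillis), 30_000);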

These are strategies I pursued with Discourse, because I believe Reading Is Fundamental. Not just in grade school, but in your life, in my life, in every aspect of online community. To the extent that Discourse can help people learn to be better listeners and better readers – not just more talkative – we are succeeding.

If you want to become a true radical, if you want to have deeper insights and better conversations, spend less time talking and more time reading.

Update: There's a CBC interview with me on the themes covered in this article.


The Tablet Turning Point

Remember how people in the year 2000 used to say how crazy and ridiculous it was, the idea that Anyone Would Ever Run Photoshop in a Web Browser? I mean come on.

Oops.

One of my big bets with Discourse is that eventually, all computers will be tablets of varying size, with performance basically indistinguishable from a two year old desktop or laptop.

Apps are great and all, but there has to be some place for this year's obscene bumper crop of computing superpower to go. I like to use history as my guide, and I believe it's going exactly the same place it did on desktops and laptops — that no-installing-anything friend of every lazy user on the planet, the inevitable path of least resistance, the mobile web browser.

For the last few years, I've been buying every significant tablet device in the run up to the big holiday sales season, and testing them all, to see how many years are left until mobile devices catch up to desktops on general web and JavaScript performance.

How are we doing? Let's benchmark some Discourse client-side Ember JavaScript code:

Device         Released     Benchmark
iPhone 4       June 2011    2031ms
iPhone 5       Sept 2012     600ms
iPhone 5s      Sept 2013     300ms
iPhone 6       Sept 2014     250ms
iPad Air 2     Oct 2014      225ms

My Core i7-4770K desktop machine scores 180ms in the same benchmark on the latest version of Chrome x64. I'd say we're solidly within striking distance this year.
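
For context, the benchmark itself is nothing exotic: it times how long a chunk of client-side Ember rendering takes. A stripped-down stand-in might look like this (TypeScript; renderTopicList here is just a placeholder for whatever work you actually want to measure, not the real Discourse benchmark):

    // Minimal timing harness. renderTopicList() stands in for real client-side
    // work – in our case, Ember rendering a topic list.
    function renderTopicList(): void {
      let x = 0;
      for (let i = 0; i < 1_000_000; i++) x += Math.sqrt(i); // placeholder workload
    }

    // Run the workload several times and report the median, to smooth out
    // GC pauses and thermal throttling on mobile devices.
    function benchmark(fn: () => void, runs = 5): number {
      const times: number[] = [];
      for (let i = 0; i < runs; i++) {
        const start = performance.now();
        fn();
        times.push(performance.now() - start);
      }
      times.sort((a, b) => a - b);
      return times[Math.floor(runs / 2)];
    }

    console.log(`median: ${benchmark(renderTopicList).toFixed(0)}ms`);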

I don't like to spend a lot of time talking about news and gadgets here, since the commentary will be irrelevant within a few years. But this year marks a key turning point for mobile and tablet performance, and I've lived with every iteration of these devices for the last couple of years, so I'll make an exception.

Look at this performance rampage the iPad Air 2 goes on:

Just look at it! All the graphs are like this!

It's hard to believe we now live in a world where the Apple "Premium" is no longer about aesthetics, but raw, unbridled, class-leading performance. And you know what? That's something I can totally get behind.

Anyone who tells you the iPad Air 2 is some kind of incremental update must not actually use theirs. As someone who does regularly use his iPad, I can say without hesitation that this is a massively upgraded device. I grew to hate my old iPad Air because of its memory restrictions; I could barely have three tabs open in Mobile Safari without one of them paging out of memory. Thanks, 64-bit and iOS 7!

The bonded screen, Touch ID, the now-adequate-for-64-bit 2GB of RAM, the amazingly fast triple-core CPU, the GPU, and yeah, it's a little thinner. For performance, nothing else even comes close.

It's so fast I sometimes forget I'm not using my Surface Pro 3 with its 4GB RAM and Core i5 CPU. I get hassled when I bring my Surface to meetings, but I patiently explain that it's a very nice third gen hardware design with a fully integrated keyboard cover, IE11 is a great touch browser, and that I'm mostly using the device as a tablet, as a sneak preview of what iPad 8 performance will look like. Based on today's benchmarks with the iPad Air 2 – chronologically, the iPad "6" – I believe that's about right.

I also purchased a Nexus 9. It's the first device to ship with Android 5 and the vaunted Nvidia Tegra K1.

I'm very impressed with Android 5.0; aesthetically I think it's superior to iOS 8 in a lot of ways, and it is a clear step forward over Android 4. Anyone on older Android devices should definitely upgrade to Android 5 at their first opportunity.

Performance-wise, it is what I've come to expect from Android: erratic. In our Discourse benchmarks, on the latest Chrome for Android beta, it scores about 750ms, putting it somewhere between the 2011 iPhone 4S and the 2012 iPhone 5. That said, this is the fastest Android device I have ever laid hands on. I just wish it was consistently faster. A lot faster.

To that end, I'd like to ask for your help. We've identified some deep bugs in the Android Chrome V8 engine that cause fairly severe performance issues with JavaScript frameworks like Angular and Ember. (Desktop Chrome performance remains class leading; this is highly specific to the Android version of Chrome.) If you know anyone at Google, please ping them about this and see if it can be escalated. I'd love it if more Android users – including me – could have a better browser experience when using large JavaScript apps.

I hope over the next year the remaining Android 5 performance bumps can be ironed out. I still like the Nexus 9; if you're a big fan of Google services like GMail, Docs, and Maps like I am, I definitely recommend it. The one I have will be a gift to my mom.


What If We Could Weaponize Empathy?

One of my favorite insights on the subject of online community is from Tom Chick:

Here is something I've never articulated because I thought, perhaps naively, it was understood:

The priority for participating on this forum is not the quality of the content. I ultimately don't care how smart or funny or observant you are. Those are plusses, but they're never prerequisites. The priority is on how you treat each other. I expect spats, arguments, occasional insults, and even inevitable grudges. We've all done that. But in the end, I expect you to act like a group of friends who care about each other, no matter how dumb some of us might be, no matter what political opinions some of us hold, no matter what games some of us like or dislike. This community is small enough, intimate enough, that I feel it's a reasonable expectation.

Indeed, disagreement and arguments are inevitable and even healthy parts of any community. The difference between a sane community and a terrifying warzone is the degree to which disagreement is pursued in the community, gated by the level of respect community members have for each other. Contrast that with the opposite philosophy, fighting nasty:

In other words, if a fight is important to you, fight nasty. If that means lying, lie. If that means insults, insult. If that means silencing people, silence.

I may be a fan of the smackdown learning model and kayfabe, but I am definitely not a fan of fighting nasty.

I expect you to act like a group of friends who care about each other, no matter how dumb some of us might be, no matter what political opinions some of us hold, no matter what games some of us like or dislike.

There's a word for this: empathy.

One of the first things I learned when I began researching discussion platforms two years ago is the importance of empathy as the fundamental basis of all stable long term communities. The goal of discussion software shouldn't be to teach you how to click the reply button, and how to make bold text, but how to engage in civilized online discussion with other human beings without that discussion inevitably breaking down into the collective howling of wolves.

That's what the discussion software should be teaching you: Empathy.

You. Me. Us. We can all occasionally use a gentle reminder that there is a real human being on the other side of our screen, a person remarkably like us.

I've been immersed in the world of social discussion for two years now, and I keep going back to the well of empathy, time and time again. The first thing we did was start with a solid set of community guidelines on civilized discussion, and I'm proud to say that we ship and prominently feature those guidelines with every copy of Discourse. They are bedrock. But these guidelines only work to the extent that they are understood, and the community helps enforce them.

In Your Community Door, I described the danger of allowing cruel and hateful behavior in your community – behavior so obviously corrosive that it should never be tolerated in any quantity. If your community isn't capable of regularly exorcising the most toxic content, and the people responsible for that kind of content, it's in trouble. Those rare bad apples are group poison.

Hate is easy to recognize. Cruelty is easy to recognize. You do not tolerate these in your community, full stop.

But what about behavior that isn't so obviously corrosive? What about behavior patterns that seem sort of vaguely negative, but … nobody can show you exactly how this behavior is directly hurting anyone? What am I talking about? Take a look at the Flamewarriors Online Discussion Archetypes, a bunch of discussion behaviors that never quite run afoul of the rules, per se, but result in discussions that degenerate, go in circles, or make people not want to be around them.

What we're getting into is shades of grey, the really difficult part of community moderation. I've been working on Discourse long enough to identify some subtle dark patterns of community discussion that – while nowhere near as dangerous as hate and cruelty – are still harmful enough to the overall empathy level of a community that they should be actively recognized when they emerge, and interventions staged.

1. Endless Contrarianism

Disagreement is fine, even expected, provided people can disagree in an agreeable way. But when someone joins your community for the sole purpose of disagreeing, that's Endless Contrarianism.

Example: As an atheist, Edward shows up in a religion discussion area to educate everyone there about the futility of religion. Is that really the purpose of the community? Does anyone in the community expect to defend the very concept of religion while participating there?

If all a community member seems to contribute is endlessly pointing out how wrong everyone else is, and how everything about the community is headed in the wrong direction, that's not building constructive discussion – or the community. Edward is just arguing for the sake of argument. Take it to debate school.

2. Axe-Grinding

Part of what makes discussion fun is that it's flexible; a variety of topics will be discussed, and those discussions may naturally meander a bit within the context defined by the site and whatever categories of discussion are allowed there. Axe-Grinding is when a user keeps constantly gravitating back to the same pet issue or theme for weeks or months on end.

Example: Sara finds any opportunity to stir up a GMO debate, no matter what the actual topic is. Viewing Sara's post history, GMO and Monsanto are constant, repeated themes in any context. Sara's negative review of a movie will mention eating GMO popcorn, because it's not really about the movie – it's always about her pet issue.

This kind of inflexible, overbearing single-issue focus tends to drag discussion into strange, unwanted directions, and rapidly becomes tiresome to other participants who have probably heard everything this person has to say on that topic multiple times already. Either Sara needs to let that topic go, or she needs to find a dedicated place (e.g. GMO discussion areas) where others want to discuss it as much as she does, and take it there.

3. Griefing

In discussion, griefing is when someone goes out of their way to bait a particular person for weeks or months on end. By that I mean they pointedly follow them around, choosing to engage on whatever topic that person appears in, and needle the other person in any way they can, but always strictly by the book and not in violation of any rules… technically.

Example: Whenever Joe sees George in a discussion topic, Joe now pops in to represent the opposing position, or point out flaws in George's reasoning. Joe also takes any opportunity to remind people of previous mistakes George made, or times when George was rude.

When the discussion becomes more about the person than the topic, you're in deep trouble. It's not supposed to be about the participants, but the topic at hand. When griefing occurs, the discussion becomes a stage for personal conflict rather than a way to honestly explore topics and have an entertaining discussion. Ideally the root of the conflict between Joe and George can be addressed and resolved, or Joe can be encouraged to move on and leave the conflict behind. Otherwise, one of these users needs to find another place to go.

4. Persistent Negativity

Nobody expects discussions to be all sweetness and light, but neverending vitriol and negativity are giant wet blankets. It's hard to enjoy anything when someone's constantly reminding you how terrible the world is. Persistent negativity is when someone's negative contributions to the discussion far outweigh their positive contributions.

Example: Even long after the game shipped, Fred mentions that the game took far too long to ship, and that it shipped with bugs. He paid a lot of money for this game, and feels he didn't get the enjoyment from the game that was promised for the price. He warns people away from buying expansions because this game has a bad track record and will probably fail. Nobody will be playing it online soon because of all the problems, so why bother even trying? Wherever topics happen to go, Fred is there to tell everyone this game is worse than they knew.

If Fred doesn't have anything positive to contribute, what exactly is the purpose of his participation in that community? What does he hope to achieve? Criticism is welcome, but it shouldn't be the sum total of everything Fred contributes, and he should be reasonably constructive in his criticism. People join communities to build things and celebrate the enjoyment of those things, not to have other people dump all over them and constantly describe how much those things suck and disappoint. If there isn't any silver lining in Fred's cloud, and he can't be encouraged to find one, he should be asked to find other places to haunt.

5. Ranting

Discussions are social, and thus emotional. You should feel something. But prolonged, extreme appeal to emotion is fatiguing and incites arguments. Nobody wants to join a dry, technical session at the Harvard Debate Club, because that'd be boring, but there is a big difference between a persuasive post and a straight-up rant.

Example: Holly posts at the extremes – either something is the worst thing that ever happened, or the best thing that ever happened. She will post 6 to 10 times in a topic and state her position as forcefully as possible, for as long and as loud as it takes, to as many individual people in the discussion as it takes, to get her point across. The stronger the language in the post, the better she likes it.

If Holly can't make her point in a reasonable way in one post and a followup, perhaps she should rethink her approach. Yelling at people, turning the volume to 11, and describing the situation in the most emotional, extreme terms possible to elicit a response – unless this really is the worst or best thing to happen in years – is a bit like yelling fire in a crowded theater. It's irresponsible. Either tone it down, or take it somewhere that everyone talks that way.

6. Grudges

In any discussion, there is a general expectation that everyone there is participating in good faith – that they have an open mind, no particular agenda, and no bias against the participants or the topic. While short term disagreement is fine, it's important that the people in your community have the ability to reset and approach each new topic with a clean(ish) slate. When you don't do that, when people carry ill will from previous discussions toward the participants or topic into new discussions, that's a grudge.

Example: Tad strongly disagrees with a decision the community made about not creating a new category to house some discussion he finds problematic. So he now views the other leaders in the community, and the moderators, with great distrust. Tad feels like the community has turned on him, and so he has soured on the community. But he has too much invested here to leave, so Tad now likes to point out all the consequences of this "bad" decision often, and cite it as an example of how the community is going wrong. He also follows another moderator, Steve, around because he views him as the ringleader of the original decision, and continually writes long, critical replies to his posts.

Grudges can easily lead to every other dark community pattern on this list. I cannot emphasize enough how important it is to recognize grudges when they emerge so the community can intervene and point out what's happening, and all the negative consequences of a grudge. It's important in the broadest general life sense not to hold grudges; as the famous quote goes (as near as I can tell, attributed to Alcoholics Anonymous):

Holding a grudge is like drinking poison and expecting the other person to die.

So your community should be educating itself about the danger of grudges, the root of so many other community problems. But it is critically important that moderators never, and I mean never ever, hold grudges. That'd be disastrous.

What can you do?

I made a joke in the title of this post about weaponizing empathy. I'm not sure that's even possible. But you can start by having clear community guidelines, teaching your community to close the door on overt hate, and watching out for any overall empathy erosion caused by the six dark community behavior patterns I outlined above.

At the risk of sounding aspirational, here's one thing I know to be true, and I advise every community to take to heart: I expect you to act like a group of friends who care about each other, no matter how dumb some of us might be, no matter what political opinions some of us hold, no matter what things some of us like or dislike.


Your Community Door

What are the real world consequences to signing up for a Twitter or Facebook account through Tor and spewing hate toward other human beings?

As far as I can tell, nothing. There are barely any online consequences, even if the content is reported.

But there should be.

The problem is that Twitter and Facebook aim to be discussion platforms for "everyone", where every person, no matter how hateful and crazy they may be, gets a turn on the microphone. They get to be heard.

The hover text for xkcd's Free Speech comic is so good it deserves escalation:

I can't remember where I heard this, but someone once said that defending a position by citing free speech is sort of the ultimate concession; you're saying that the most compelling thing you can say for your position is that it's not literally illegal to express.

If the discussion platform you're using aims to be a public platform for the whole world, there are some pretty terrible things people can do and say to other people there with no real consequences, under the noble banner of free speech.

It can be challenging.

How do we show people like this the door? You can block, you can hide, you can mute. But what you can't do is show them the door, because it's not your house. It's Facebook's house. It's their door, and the rules say the whole world has to be accommodated within the Facebook community. So mute and block and so forth are the only options available. But they are anemic, barely workable options.

As we build Discourse, I've discovered that I am deeply opposed to mute and block functions. I think that's because the whole concept of Discourse is that it is your house. And mute and ignore, while arguably unavoidable for large worldwide communities, are actively dangerous for smaller communities. Here's why.

  • It allows you to ignore bad behavior. If someone is hateful or harassing, why complain? Just mute. No more problem. Except everyone else still gets to see a person being hateful or harassing to another human being in public. Which means you are now sending a message to all other readers that this is behavior that is OK and accepted in your house.

  • It puts the burden on the user. A kind of victim blaming — if someone is rude to you, then "why didn't you just mute / block them?" The solution is right there in front of you, why didn't you learn to use the software right? Why don't you take some responsibility and take action to stop the person abusing you? Every single time it happens, over and over again?

  • It does not address the problematic behavior. A mute is invisible to everyone. So the person who is getting muted by 10 other users is getting zero feedback that their behavior is causing problems. It's also giving zero feedback to moderators that this person should probably get an intervention at the very least, if not outright suspended. It's so bad that people are building their own crowdsourced block lists for Twitter.

  • It causes discussions to break down. Fine, you mute someone, so you "never" see that person's posts. But then another user you like quotes the muted user in their post, or references their @name, or replies to their post. Do you then suppress just the quoted section? Suppress the @name? Suppress all replies to their posts, too? This leaves big holes in the conversation and presents many hairy technical challenges. Given enough personal mutes and blocks and ignores, all conversation becomes a weird patchwork of partially visible statements. (A toy sketch of this patchwork effect follows this list.)

  • This is your house and your rules. This isn't Twitter or Facebook or some other giant public website with an expectation that "everyone" will be welcome. This is your house, with your rules, and your community. If someone can't behave themselves to the point that they are consistently rude and obnoxious and unkind to others, you don't ask the other people in the house to please ignore it – you ask them to leave your house. Engendering some weird expectation of "everyone is allowed here" sends the wrong message. Otherwise your house no longer belongs to you, and that's a very bad place to be.
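
To make that "patchwork" problem concrete, here's a toy sketch (TypeScript; the Post shape and the renderTopic function are invented for illustration, not any real platform's data model) of what a naive per-user mute does to a thread:

    interface Post {
      id: number;
      author: string;
      body: string;
      quotes?: number;   // id of a post this one quotes
      replyTo?: number;  // id of a post this one replies to
    }

    function renderTopic(posts: Post[], muted: Set<string>): string[] {
      const hiddenIds = new Set(posts.filter((p) => muted.has(p.author)).map((p) => p.id));
      return posts
        .filter((p) => !muted.has(p.author)) // the muted author simply vanishes...
        .map((p) => {
          let line = `${p.author}: ${p.body}`;
          if (p.quotes !== undefined && hiddenIds.has(p.quotes))
            line += " [quotes a post you can't see]";   // ...but everyone else still reacts to them
          if (p.replyTo !== undefined && hiddenIds.has(p.replyTo))
            line += " [reply to a post you can't see]";
          return line;
        });
    }

    // Example: mute "troll" and the thread becomes a patchwork of dangling references.
    const topic: Post[] = [
      { id: 1, author: "troll", body: "You're all idiots." },
      { id: 2, author: "alice", body: "That's uncalled for.", replyTo: 1 },
      { id: 3, author: "bob", body: "Quoting for posterity.", quotes: 1 },
    ];
    console.log(renderTopic(topic, new Set(["troll"])));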

I worry that people are learning the wrong lessons from the way Twitter and Facebook poorly handle these situations. Their hands are tied because they aspire to be these global communities where free speech trumps basic human decency and empathy.

The greatest power of online discussion communities, in my experience, is that they don't aspire to be global. You set up a clubhouse with reasonable rules your community agrees upon, and anyone who can't abide by those rules needs to be gently shown the door.

Don't pull this wishy washy non-committal stuff that Twitter and Facebook do. Community rules are only meaningful if they are actively enforced. You need to be willing to say this to people, at times:

No, your behavior is not acceptable in our community; "free speech" doesn't mean we are obliged to host your content, or listen to you being a jerk to people. This is our house, and our rules.

If they don't like it, fortunately there's a whole Internet of other communities out there. They can go try a different house. Or build their own.

The goal isn't to slam the door in people's faces – visitors should always be greeted in good faith, with a hearty smile – but simply to acknowledge that in those rare but inevitable cases where good faith breaks down, a well-oiled front door will save your community.


Level One: The Intro Stage

Way back in 2007, before Stack Overflow was a glint in anyone's eye, I called software development a collaborative game. And perhaps Stack Overflow was the natural outcome of that initial thought – recasting online software development discussion into a collaborative game where the only way to "win" is to learn from each other.

That was before the word gamification existed. But gamification is no longer the cool, hip concept it was back in 2011. Still, whether you call yourself a "gamer" or not, whether you believe in "gamification" or not, five years later you're still playing the world's largest multiplayer game.

In fact, you're playing it right now.

One of the most timeless aspects of games is how egalitarian they are, how easy it is for anyone to get started. Men, women, children — people love games because everyone can play along. You don't have to take classes or go to college or be certified: you just play. And this is, not so incidentally, how many of the programmers I know came to be programmers.

Do you know anyone who bought the video game Halo, or Myst, then proceeded to open the box and read the manual before playing the game? Whoa there guys, we can't play the game yet, we gotta read these instructions first! No, they stopped making manuals for games a long time ago, unless you count the thin sheet of paper that describes how to download / install the game on your device. Because they found out nobody reads the manual.

The project I’m working on is critical, but it has only about 3 to 4 users, most of whom are already familiar with the application. One of the users even drives the design. The manual I’m writing, which is nearly 200 pages, is mostly a safety measure for business continuity planning. I don’t expect anyone will ever read it.

It’s a project I managed to procrastinate for months, working on other projects, even outside the scope of my regular assignments. The main deterrent, I believe, was my perception that no one needed the manual. The users seemed to be getting along fine without it.

And so as the year ticked to a close, instead of learning more about Mediawiki and screencasting and After Effects, I spent my time updating a 200-page manual that I don’t think anyone will ever read. It will be printed out, three-hole punched, and placed in a binder to collect dust on a shelf.

I guess that's not surprising for games. Games are supposed to be fun, and reading manuals isn't fun; it's pretty much the opposite of fun. But the same is true for software in general. Reading the manual isn't the work, at least, it isn't whatever specific thing you set out to do when you fired up that bit of software on your phone, tablet, or laptop.

Games have another clever trick up their sleeve, though. Have you ever noticed that in most of today's games, the first level is kind of easy? Like… suspiciously easy?

That's because level one, the intro stage, isn't really part of the game. It's the manual.

As MegaMan X illustrates, manuals are pointless when we can learn about the game in the best and most natural way imaginable: by playing the actual game. You learn by doing, provided you have a well designed sandbox that lets you safely experiment as you're starting out in the game.

(The video does contain some slightly NSFW language, as well as abuse of cartoon female avatars that I don't endorse, but it is utterly brilliant, applies to every app, software and website anyone has ever built, and I strongly recommend watching it all.)

This same philosophy applies to today's software and websites. Don't bother with all the manuals, video introductions, tutorials, and pop-up help dialogs. Nobody's going to read that stuff, at least, not the people who need it.

Instead, follow the lesson of MegaMan: if you want to teach people about your software, consider how you can build a great intro stage and let them start playing with it immediately.
