Coding Horror

programming and human factors

Fail Early, Fail Often

Scott Hanselman thinks signing your name with a bunch of certifications is gauche:

If it's silly to suggest putting my SATs on my resume, why is …

Scott Hanselman, MCSD, MCT, MCP, MC*.*

… reasonable? Having a cert means you have a capacity to hold lots of technical stuff in your head. Full stop. I propose we sign our names like this:

Scott Hanselman, 11 Successful Large Projects, 3 Open Source Applications, 1 Colossal Failure

Wouldn't that be nice?

I agree. Your credentials should be the sum of the projects you've worked on. But I think Scott has this backwards: you should emphasize the number of failed projects you've worked on.

How do we define "success", anyway? What were the goals? Did the project make money? Did users like the software? Is the software still in use? It's a thorny problem. I used to work in an environment where every project was judged a success. Nobody wanted to own up to the limitations, compromises, and problems in the software they ended up shipping. And the managers in charge of the projects desperately wanted to be perceived as successful. So what we got was the Special Olympics of software: every project was a winner. The users, on the other hand, were not so lucky.

Success is relative and ephemeral. But failure is a near-constant. If you really want to know if someone is competent at their profession, ask them about their failures. Last year I cited an article on predicting the success or failure of surgeons:

Charles Bosk, a sociologist at the University of Pennsylvania, once conducted a set of interviews with young doctors who had either resigned or been fired from neurosurgery-training programs, in an effort to figure out what separated the unsuccessful surgeons from their successful counterparts.

He concluded that, far more than technical skills or intelligence, what was necessary for success was the sort of attitude that Quest has – a practical-minded obsession with the possibility and the consequences of failure.

"When I interviewed the surgeons who were fired, I used to leave the interview shaking," Bosk said. "I would hear these horrible stories about what they did wrong, but the thing was that they didn't know that what they did was wrong. In my interviewing, I began to develop what I thought was an indicator of whether someone was going to be a good surgeon or not. It was a couple of simple questions: Have you ever made a mistake? And, if so, what was your worst mistake? The people who said, 'Gee, I haven't really had one,' or, 'I've had a couple of bad outcomes but they were due to things outside my control' – invariably those were the worst candidates. And the residents who said, 'I make mistakes all the time. There was this horrible thing that happened just yesterday and here's what it was.' They were the best. They had the ability to rethink everything that they'd done and imagine how they might have done it differently."

The best software developers embrace failure – in fact, they're obsessed with failure. If you forget how easy it is to make critical mistakes, you're likely to fail. And that should concern you.

Michael Hunter takes this concept one step beyond mere vigilance. He encourages us to fail early and often:

If you're lucky, however, your family encourages you to fail early and often. If you're really lucky your teachers do as well. It takes a lot of courage to fight against this, but the rewards are great. Learning doesn't happen from failure itself but rather from analyzing the failure, making a change, and then trying again. Over time this gives you a deep understanding of the problem domain (be that programming or combining colors or whatever) - you are learning. Exercising your brain is good in its own right ("That which is not exercised atrophies", my trainer likes to say), plus this knowledge improves your chances at functioning successfully in new situations.

I say the more failed projects in your portfolio, the better. If you're not failing some of the time, you're not trying hard enough. You need to overreach to find your limits and grow. But do make sure you fail in spectacular new ways on each subsequent project.

Discussion

Why Do We Have So Many Screwdrivers?

Jon Raynor added this comment to my previous post about keeping up with the pace of change in software development:

The IT field is basically a quagmire. It's better to accept that fact right away or move on to a different field. I guess someday I wish for Utopia where I won't be obsoleted when I get out of bed each and every morning.

The industry needs to stop running around like a chicken with its head cut off trying to find the next big thing. The tools constantly change, yet they do the same thing: create code to run on machines. First we get a screwdriver and learn how to use it. Then out comes the newdriver, different from the screwdriver, but it does the same thing. Then out comes the phewdriver, which is totally different from both the screwdriver and the newdriver but performs the same function as both previous tools.

It's an interesting observation. I'm far from a handyman, but even I own many different screwdrivers: different sizes, different tips, different lengths. They're all performing the same job-- screwing*-- but each one is uniquely useful in the right scenario. I'd hate to throw out all the screwdrivers I own and opt for a one-size-fits-all approach. Sure, I may choose the standard screwdriver 90 percent of the time, but what about that other 10 percent?

A collection of different screwdrivers

So a case can be made for having multiple languages and multiple tools, redundancies and all.

However, software developers are awfully eager to throw out existing tools for new ones. Unfortunately, these decisions are often based on myth and wishful thinking, and the decisions are typically made in favor of whatever the hot new thing of the moment is. Here are two mistakes that I see a lot:

1. Let's buy this whiz-bang power screwdriver that will double our productivity.

A silver bullet brand screwdriver, if you will. Just replace the word "Ada" with "Ruby", below:

One of the most touted recent developments is Ada, a general-purpose high-level language of the 1980's. Ada not only reflects evolutionary improvements in language concepts, but indeed embodies features to encourage modern design and modularization. Perhaps the Ada philosophy is more of an advance than the Ada language, for it is the philosophy of modularization, of abstract data types, of hierarchical structuring. Ada is over-rich, a natural result of the process by which requirements were laid on its design. That is not fatal, for subsetted working vocabularies can solve the learning problem, and hardware advances will give us the cheap MIPS to pay for the compiling costs. Advancing the structuring of software systems is indeed a very good use for the increased MIPS our dollars will buy. Operating systems, loudly decried in the 1960's for their memory and cycle costs, have proved to be an excellent form in which to use some of the MIPS and cheap memory bytes of the past hardware surge.

Nevertheless, Ada will not prove to be the silver bullet that slays the software productivity monster. It is, after all, just another high-level language, and the biggest payoff from such languages came from the first transition -- the transition up from the accidental complexities of the machine into the more abstract statement of step-by-step solutions. Once those accidents have been removed, the remaining ones will be smaller, and the payoff from their removal will surely be less.

I predict that a decade from now, when the effectiveness of Ada is assessed, it will be seen to have made a substantial difference, but not because of any particular language feature, nor indeed because of all of them combined. Neither will the new Ada environments prove to be the cause of the improvements. Ada's greatest contribution will be that switching to it occasioned training programmers in modern software-design techniques.

2. This screwdriver is for amateurs and hacks. We should buy a newer, more professional screwdriver.

David Megginson notes the self-perpetuating cycle of successful programming languages:

  • Elite (guru) developers notice too many riff-raff using their current programming language, and start looking for something that will distinguish them better from their mediocre colleagues.
  • Elite developers take their shopping list of current annoyances and look for a new, little-known language that apparently has fewer of them.
  • Elite developers start to drive the development of the new language, contributing code, writing libraries, etc., then evangelize the new language.
  • Sub-elite (senior) developers follow the elite developers to the new language, creating a market for books, training, etc., and also accelerating the development and testing of the language.
  • Sub-elite developers, who have huge influence (elite developers tend to work in isolation on research projects rather than on production development teams), begin pushing for the new language in the workplace.
  • The huge mass of regular developers realize that they have to start buying books and taking courses to learn a new language.
  • Elite developers notice too many riff-raff using their current programming language, and start looking for something that will distinguish them better from their mediocre colleagues.

It's OK to add a new screwdriver to your toolkit every few years. But make sure you're adding it for the right reasons.

* Yes, it's still funny.

Discussion

Keeping Up and "Just In Time" Learning

Do you ever feel like you're buried under umpteen zillion backlogged emails, feeds, books, articles, journals, magazines, and printouts? Do you ever feel that you're hopelessly behind, with so much new stuff created every day that you can never possibly hope to keep up?

Well, you're not alone.

Via SecretGeek:

  • You do NOT have to refactor all your code.
  • You do NOT have to keep up with the latest news from microsoft, and know everything there is to know about longhorn, whidbey, avalon, XAML, indigo and star wars III.
  • You do not have to have perfectly de-coupled tiers in your technology independent SOA software.
  • You do not have to comply with every standard, or achieve the perfect balance between maintainability and performance, usability and familiarity.
  • You don't have to do "first things first every day".
  • You DO NOT have to memorize and understand every pattern the Gang of Four have catalogued.
  • You do NOT have to read every technical blog, print out every technical article and learn every technical thing there is to learn.
  • You are beautiful just the way you are.
  • You are brilliant, interesting, wise and fun to be around.
  • You rock.

Via Kathy Sierra:

I remember when the first public release of Java came out, and it had 200 classes. You could fit the entire class library in the same space as Miss January. But then 1.1 came out and the API more than doubled to 500 classes. It no longer fit on a centerfold – but you could get it on a wall poster. With 200 classes, you really could master the entire API. With 500, it took some effort, but you could at least be familiar with just about everything, given enough time. By Java 1.4, the library had swelled to 2300 classes. And today? It's something like 3500 classes just in the Standard Edition – not including the mobile and enterprise extensions. You could wallpaper an entire room with the class library.

By the year 2000, it had become impossible for even a Sun Java engineer – someone creating the API – to be familiar with everything in the standard library. Yet the rest of us were feeling guilty. Like we were falling behind. Like we weren't hardcore Java programmers.

It's time to let that go. You're not keeping up. I'm not keeping up. And neither is anyone else. At least not in everything.

Kathy has a few suggestions to combat Information Anxiety:

  • Find the best aggregators
  • Get summaries
  • Cut redundancy
  • Unsubscribe from as many things as possible
  • Recognize black holes (gaming, Slashdot, etc.)
  • Pick categories for balance; include some from outside your main field
  • Be more realistic about what you're likely to get to; throw the rest out
  • In anything you need to learn, find someone who can tell you what is
    • Need to know
    • Should know
    • Nice to know
    • Edge case
    • Useless

I don't worry about keeping up with the Joneses; I focus on the specific problem at hand. I take a "Just In Time" attitude to learning new technology. I can't possibly learn everything. But I do try to learn enough to know what the new thing is, and when I might need it. Most of the time, I don't need it. And when I do, I can learn it Just In Time to help me solve the current problem I'm working on.

Discussion

Of Spaces, Underscores and Dashes

I try to avoid using spaces in filenames and URLs. They're great for human readability, but they're remarkably inconvenient in computer resource locators:

  1. A filename with spaces has to be surrounded by quotes when referenced at the command line:

     XCOPY "c:\test files\reference data.doc" d:
     XCOPY c:\test-files\reference-data.doc d:
    
  2. Any spaces in URLs are converted to the encoded space character by the web browser:

     http://domain.com/test%20files/reference%20data.html
     http://domain.com/test-files/reference-data.html
    

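You can see the URL case for yourself with a few lines of Python -- a quick sketch using the standard urllib module, with the example paths from above:

```python
from urllib.parse import quote

# A space must be percent-encoded in a URL; a dash passes through untouched.
print(quote("/test files/reference data.html"))
# /test%20files/reference%20data.html

print(quote("/test-files/reference-data.html"))
# /test-files/reference-data.html
```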
So it behooves us to use something other than a space in file and folder names. Historically, I've used underscores, but I recently discovered that the correct character to substitute for a space is the dash. Why?

The short answer is, that's what Google expects:

If you use an underscore '_' character, then Google will combine the two words on either side into one word. So bla.com/kw1_kw2.html wouldn't show up by itself for kw1 or kw2. You'd have to search for kw1_kw2 as a query term to bring up that page.

The slightly longer answer is that the underscore is traditionally considered a word character by the \w regex operator.

Here's RegexBuddy matching the \w operator against multiple ASCII character sets:

Result of a regex match for \w (word characters)

As you can see, the dash is not matched, but the underscore is. This_is_a_single_word, but this-is-multiple-words.
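You don't need RegexBuddy to confirm this; any regex engine will do. A quick sketch with Python's re module:

```python
import re

# \w matches [A-Za-z0-9_]: the underscore glues words together,
# while the dash splits them apart.
print(re.findall(r"\w+", "this_is_a_single_word"))
# ['this_is_a_single_word']

print(re.findall(r"\w+", "this-is-multiple-words"))
# ['this', 'is', 'multiple', 'words']
```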

Like NutraSweet and Splenda, neither is really an acceptable substitute for a space, but we might as well follow the established convention instead of inventing our own. That's how we ended up with the backslash as a path separator.
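If you want to adopt the dash convention wholesale, a tiny helper is enough to retrofit existing names. This is just an illustrative sketch -- slugify is my name for it, not any standard function:

```python
import re

def slugify(name: str) -> str:
    # Collapse runs of whitespace and underscores into a single dash,
    # then lower-case the result for consistency.
    return re.sub(r"[\s_]+", "-", name.strip()).lower()

print(slugify("Reference Data.doc"))
# reference-data.doc

print(slugify("My_Old Notes.txt"))
# my-old-notes.txt
```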

Discussion

A Blog Without Comments Is Not a Blog

James Bach responded to my recent post, Are You Following the Instructions on the Paint Can?, with Studying Jeff Atwood's Paint Can. I didn't realize how many assumptions I made in that post until I read Mr. Bach's pointed response. The most amusing assumption I made – and I had no idea I was doing this – was that I ran a painting business in college! The paint can instructions make sense to me because of that prior experience. Pity the would-be handyman who has never painted anything before and has only a few paragraphs of text on the back of a can to refer to.

But I'll reserve a complete response to Mr. Bach for later. My immediate frustration is that James has comments disabled on his blog, so I can't form a public reply to James without creating a post on my own blog.

I firmly maintain that a blog without comments enabled is not a blog. It's more like a church pulpit. You preach the word, and the audience passively receives your evangelical message. Straight from God's lips to their ears. When the sermon is over, the audience shuffles out of the church, inspired for another week. And there's definitely no question and answer period afterward.

the church pulpit

Of course, I'm exaggerating for comedic effect. Maybe a blog with comments disabled is more analogous to a newspaper editorial. But even with a newspaper editorial, readers can make public comments by sending a letter to the editor, which may be published in a later edition of the paper.

However we slice it, with comments disabled the reader's hands are tied. If readers want to have a public dialog with you, then your readers must have blogs of their own. This strikes me as awfully elitist.

  • It's unreasonable to expect people that disagree with the tenets of your religion to build a church and start their own religion.

  • It's unreasonable to expect people that disagree with your newspaper editorial to buy a printing press and start their own newspaper.

  • And it's unreasonable to expect people to start their own blogs to make a public reply to your post.

Yes, the barriers to entry for blogging are radically lower, but not everyone has the time or inclination to become a full-bore blogger. Are you really comfortable saying, in effect, unless you have a blog I am not interested in what you have to say? Because I'm not.

That said, I realize that comments aren't appropriate for every blog in the world:

Blogs are susceptible to the same problems as social software sites (as well as having to deal with comment spamming scum). The more popular the blog, the bigger the problem. Just ask Heather or Jason.

Most blogs allow comments. There's no doubt about it; having comments enabled is likely to increase the popularity of your blog.

But that, in and of itself, is not a good justification. It assumes that popularity is desirable. The truth is that, when it comes to personal publishing, it's not the number of people who visit that counts; it's who those people are and why they're visiting that's important.

Comments are a shortcut to a Pyrrhic victory of popularity at the cost of having your pages cluttered with pointless remarks (by pointless, I don't just mean the negative stuff: "me too!" and "great post!" achieve as little as "you suck!"). If popularity is your aim, it's better in the long run to claw your way towards that goal on the strength of your writing or design skills.

But comments can add value. They are particularly useful on sites that have a narrow, focused scope. The focused nature of the subject matter ensures that visitors share a common interest – otherwise, they wouldn't be there.

The more general a site's focus, the less chance there is of it receiving quality comments. A site that covers everything from politics (Republican vs. Democrat) to computing (Mac vs. PC) is going to be flame-war central.

Jeremy's observation about the effect of topic on comments is interesting. It's something I wouldn't have anticipated, based on my experience running a very focused blog.

However, to deny public conversation by disabling comments right out of the gate – based on the presumption that the comments will be of low quality – is, again, awfully elitist. Have some respect for your audience. Enable comments and experiment before making the assumption that 90 percent of the comments will be crap! Personally, I've found that the comments can be the best, most informative part of a blog. Anyone who has visited Amazon and skipped directly to the user reviews will know exactly what I'm talking about.

Some people refuse to enable comments because they don't want to deal with the spam problem. I can appreciate this concern, but a simple CAPTCHA is extremely effective at blocking machine spam. And a simple daily browse of your comments will catch those rare manually entered spam comments. Why in the world would you enable comments if you're not planning to read them at some point?

I am sympathetic to issues of scale. Comments don't scale worth a damn. If you have thousands of readers and hundreds of comments for every post, you should disable comments and switch to forums, probably moderated forums at that. But the number of bloggers who have that level of readership is so small as to be practically nil. And when you get there, believe me, you'll know. Until then, you should enable comments.

The more I think about this, the more I keep coming back to my original position: a blog without comments enabled is not a blog. I'm not sure what it is, exactly, but it definitely isn't a blog.

Comments?

Discussion