Coding Horror

programming and human factors

Tending Your Software Garden

Software: do you write it like a book, grow it like a plant, accrete it like a pearl, or construct it like a building? As Steve McConnell notes in Code Complete 2, there's no shortage of software development metaphors:

A confusing abundance of metaphors has grown up around software development. David Gries says writing software is a science (1981). Donald Knuth says it's an art (1998). Watts Humphrey says it's a process (1989). P. J. Plauger and Kent Beck say it's like driving a car, although they draw nearly opposite conclusions (Plauger 1993, Beck 2000). Alistair Cockburn says it's a game (2002). Eric Raymond says it's like a bazaar (2000). Andy Hunt and Dave Thomas say it's like gardening. Paul Heckel says it's like filming Snow White and the Seven Dwarfs (1994). Fred Brooks says that it's like farming, hunting werewolves, or drowning with dinosaurs in a tar pit (1995). Which are the best metaphors?

I think we're leaving one metaphor on the table which more accurately reflects the way software is built in the real world: flail around randomly and pray you succeed by force of pure dumb luck. Sometimes it even works. Not very often, but just enough to confuse people who should know better into thinking they're smart, when what they really were is lucky.

The answer, of course, is whichever metaphor helps you and your team get to the end of the project. Personally, I see them as more of a battle cry, a way for a team to communicate a shared vision and a set of values. They're heavy on imagery and metaphor, and light on specific, concrete advice.

Even as Steve McConnell argues that most software development metaphors come up short, he quite clearly picks a favorite, and spends quite a bit of time defending his choice. It's not exactly a secret, as it's in the subtitle for the book: Code Complete: A Practical Handbook of Software Construction.

As much as I respect Steve, my software project experience to date doesn't match the controlled construction metaphor. I agree with Thomas Guest; software is soft; buildings aren't. I'm more partial to the model that Andy Hunt and Dave Thomas promote, what I call tending your software garden.

American Gothic, a painting by Grant Wood

Programmers as farmers, if you will.

All the best software projects I've worked on were, for lack of a better word, alive. I don't mean that literally, of course. But the software was constantly and quite visibly growing. There were regular, frequent release schedules defining its evolution. There was a long term project commitment to a year out, five years out, ten years out.

To me, the parallels between farming and software development are strong and evocative. Steve disagrees.

The weakness in the software-farming metaphor is its suggestion that you don't have any direct control over how the software develops. You plant the code seeds in the spring. Farmer's Almanac and the Great Pumpkin willing, you'll have a bumper crop of code in the fall.

To be clear, all these metaphors are abstract and therefore heavily subject to interpretation (and/or useless, take your pick), so I don't want to get too wrapped up in defending one.

That said, I disagree with Steve's dismissal. The strength of the farming metaphor is the implied commitment to the craft. Farming is hard, unforgiving work, but there's a yearly and seasonal ritual to it, a deep underlying appreciation of sustainable and controlled growth, that I believe software developers would do well to emulate. I also think Steve was a bit unfair in characterizing farming as "no direct control". There's plenty of control, but lots of acknowledged variables, as well -- which I think more accurately represents the shifting sands of software development. Farmers do their best to control those variables, of course, but most of all they must adapt to whatever conditions they're dealt. Next season, next year, they know they'll be back with a renewed sense of purpose to try it all again and do better. Not so coincidentally, these are also traits shared by the best software developers I've known.

In particular, the rise of the web software development model has made the farming model more relevant. Where traditional software like Office might go through a bunch of monolithic, giant construction project updates every two to three years -- from Office XP, to Office 2003, to Office 2007 -- websites can be deployed far more often. Seasonally, if you will. Some websites even "harvest" monthly, organically growing new features and bugfixes each time. The guys at 37Signals apparently noticed this, too.

It recently dawned on me that software grows much in the same way that plants grow. New features are the flowers of the software world. And just as most plants aren't flowering all year long, software isn't sprouting features all year long. There's flowering season. There's new feature season. There's infrastructure season.

Sometimes software is working on its roots. Bolstering its infrastructure. It's growing underground where the public can't see it. It looks like nothing's happening, but there's really a lot going on. Without those roots new features can't sprout.

And sometimes it's rest time. Plants rest in the winter. Software often rests in the summer (it's too nice to work too hard in the summer). Everything can benefit from a deep breath, relaxation, and sleep. Chaotic constant growth and change doesn't make room for order and organization. Growth requires new energy and new energy requires rest.

Another thing I've noticed is that tending to websites, which usually have community features and user-generated content at the forefront, feels a heck of a lot like weeding your garden. You grow a lot of content, but not all of it is exactly what you had in mind.

I scrutinize every comment, and I remove a tiny percentage of them: they might be outright spam, patently off-topic, or just plain mean. I like to refer to this as weeding my web garden. It's a productivity tax you pay if you want to grow a bumper crop of comments, which, despite what Joel says, often bear such wonderful fruit. The labor can be minimized with improved equipment, but it's always there in some form. And I'm OK with that. The myriad benefits of a robust comment ecosystem outweigh the minor maintenance effort.

And when you don't weed your garden? The weeds threaten to choke out your crops. Eventually, your software garden looks neglected, and then abandoned.

web weeds

As Steve says, some software development metaphors are better than others. But when it comes to web development, at least, you could certainly do a lot worse than tending to your software garden.

Discussion

Is Email = Efail?

While I've always practiced reasonable email hygiene, for the last 6 months I've been in near-constant email bankruptcy mode. This concerns me.

Yes, it's partly my fault for being a world champion procrastinator, but I'm not sure it's entirely my fault. There are forces at work here, factors that easily outstrip the efforts of any one measly human being, no matter how tenacious and dogged. Or, as in my case, no matter how lazy.

I've always liked Merlin Mann's explanation of this phenomenon:

Email is such a funny thing. People hand you these single little messages that are no heavier than a river pebble.

river pebbles

But it doesn't take long until you have acquired a pile of pebbles that's taller than you and heavier than you could ever hope to move, even if you wanted to do it over a few dozen trips. For the person who took the time to hand you their pebble, it seems outrageous that you can't handle that one tiny thing. "What 'pile'? It's just a f**ing pebble!"

The underlying problem is that individual human beings don't scale.

The net number of requests for my attention exceeds my ability to provide that attention by at least an order of magnitude. And the disparity around my ability to thoughtfully respond to my pile may be ten or more times worse still. The scale is insanely out of whack.

Email is certainly the backbone of the information economy, but it's also fundamentally and perhaps even fatally flawed. Tantek Çelik captured my thoughts perfectly with this post:

Last year when I posted The Three Hypotheses, they helped me explain why I found email so much less useful/usable than instant messaging (IM) and Twitter. Since then, I find that while I can keep up with more people contacting me over IM and following more people on Twitter, email has simply become less and less usable. But not for reasons of interface; I'm using the email application now as I was a year ago.

I'm probably responding to less than 1 in 10 emails that are sent directly to me, and even fewer that were sent to a set of people or a list. The usability of email for me has deteriorated so much that I exclaimed on Twitter: EMAIL shall henceforth be known as EFAIL.

The blanket equation of email with failure is strong language indeed, but it's a serious problem. The intrinsically low effort-to-reward ratio of private email is not necessarily a new idea; as I said in When In Doubt, Make It Public, it's almost never in anyone's best interest to keep their communications locked into private silos of any kind, email or otherwise. Why answer one person's email directly when I could potentially answer a thousand different people's email with a single blog post?

I urge you to read the full text of Tantek's article. He cuts to the heart of the email problem: size, in both the mental and physical dimensions.

Email requires more of an interface cognitive load tax than instant messaging. People naturally put much more into an email, perhaps in an unconscious effort to amortize that email interface tax overhead across more content. People feel that since they are already "bothering" to write an email, they might as well take the time to go into all kinds of detail, perhaps even adding a few more things that they're thinking about.

Such natural message bloat places additional load on the recipient, both in terms of the raw length of the message, and in terms of the depth and variety of topics covered in the email. This results in a direct increase in processing time per email, making it even harder for people to process and respond. I know I've let numerous emails grow stale because there were simply too many different things in the email that required a response. I didn't want to send a response without responding to everything in the email because then I would inevitably receive yet another email response without being able to file the original as being processed and thus have the situation worsen!

What can we do to combat the email = efail problem? Take Tantek's advice: whenever possible, avoid sending email. Not because we don't want to communicate with our peers. Quite the contrary. We should avoid sending email out of a deep respect for our peers -- so that they are free to communicate as effectively and as often as possible with us.

  1. Channel that private email effort into a public outlet. Discussion boards, blog entries, comments, wikis, you name it. If it can be indexed by a web search engine, you're in the right place -- and many more people can potentially find, answer, and benefit from that information.

  2. If you must send email, make it as short as possible. Think of it as Strunk and White on speed. Can you reduce your email into a single paragraph? How about two sentences? How about just the title field with no body, even?

  3. Remember the theory of communication escalation. Email is just one communication tool in our toolkit; that doesn't mean it is always the right one for whatever situation is at hand. Take advantage of phone calls, instant messaging, text messages, and so forth, as appropriate. Scale your choice of communication method to the type of conversation you're having, and don't be afraid to escalate it (or demote it!) as the ebb and flow of the conversation shifts.

So if you've emailed me, and I haven't responded in a timely fashion, I apologize. I know it may sound crazy, but I've been desperately clawing my way out from under this mountain of pebbles.

p.s. Email me if you agree with this.

Discussion

Can You Really Rent a Coder?

I've been a fan of Dan Appleman for about as long as I've been a professional programmer. He is one of my heroes. Unfortunately, Dan only blogs rarely, so I was heartened to see a spate of recent blog updates from him. One of the entries asks a question I've often wondered myself: can you really rent a coder?

Over the past year or two I've kept an eye on the various online consulting sites - Elance, guru.com, RentACoder, oDesk. I've actually used RentACoder once (as a buyer on a very small project) and was satisfied with the results -- though I suspect I spent more time writing the spec and managing the programmers than I would if I had done the work myself.

I'm surprised Dan opens with such a sunny outlook on these services, because I've heard almost universally negative things about them. As professional programmers, I think we're all naturally inclined to see these sorts of low-bid contract sites as cannibalizing and cheapening our craft. It's roughly analogous to the No-Spec movement for designers.

The odd thing is that, despite the sunny outlook, the article Dan wrote on this topic comes across as quite cautionary:

  • You'll be competing with people around the world. In fact, you'll be amazed at how little people in some parts of the world will bid. That's because a few dollars an hour can work well in a country where the average wage is a couple of hundred dollars a month.

  • Many of the projects posted are unrealistic. For example, people asking for a clone of ebay for under $500. What ends up happening in these cases is that usually somebody ends up getting ripped off (either the client or the consultant who underbid or fails to deliver).

  • A lot of projects go bad. They get cancelled. Or the consultant who bid on the work never delivered, or delivered poor results. Or the client has unreasonable expectations, or doesn't actually know what he wants.

Maybe it's just my natural bias talking, but these sites seem awfully impractical to me.

Simply sorting out the DailyWTF project pitches from things you could actually deliver -- at ultra-competitive offshore programming rates, no less -- would require the patience of a saint and the endurance of an Olympic athlete. Specification documents are hard enough to write when everyone involved is a coworker sitting in the same room. I can't even imagine the difficulty of agreeing on what it is you're building when the participants are thousands of miles away and have never met. But then again, I thought Amazon's Mechanical Turk was a failure, and it seems to be enjoying a moderate level of success.

Dan has a small chart comparing the services of these online freelance/consulting sites. It's too easy to write these sites off as an affront to software engineering. I guess they're sort of like dating sites -- they might be one way to find a client relationship, but I'd be highly suspicious of any professional developer who can't find a stable, long term relationship with a client eventually.

If nothing else, we should be looking at them for research purposes, as a baseline. Surely you can demonstrate better value to your employer than the random, anonymous programmers on Elance, guru.com, RentACoder, or oDesk. And I'd certainly hope that the projects you're working on are more sensible and rewarding (in both senses of the word) than the stuff that appears on those sites.

Discussion

That's Not a Bug, It's a Feature Request

For as long as I've been a software developer and used bug tracking systems, we have struggled with the same fundamental problem in every single project we've worked on: how do you tell bugs from feature requests?

Sure, there are some obvious crashes that are clearly bugs. But that's maybe 10% of what you deal with on a daily basis, and the real killer showstopper bugs -- the ones that prevent normal usage of the system -- are eradicated quickly, lest the entire project fail. The rest of the entries in your bug tracking system, the vast majority, exist in an uncertain gray no-man's land. Did users report a bug? Not quite. Are users asking for a new or enhanced feature? Not quite. Well, which is it?

It's an insoluble problem. Furthermore, I think most bug tracking systems fail us because they make us ask the wrong questions. They force you to pick a side. Hatfields vs. McCoys. Coke vs. Pepsi. Bug vs. Feature Request. It's a painful and arbitrary decision, because most of the time, it's both. There's no difference between a bug and a feature request from the user's perspective. If you want to do something with an application (or website) and you can't do it because that feature isn't implemented -- how is that any different than not being able to do something due to an error message?

Consider an example: Visual Studio doesn't use the correct font when building Windows applications. Is this a bug or a feature request?

Personally, I consider this a bug. I guess Microsoft does too, at least in theory, because it's been in Microsoft's Connect bug tracking system for over four years now. When you build a Windows application, wouldn't you expect it to use the default font of the underlying operating system you're running it on, unless you've explicitly told it otherwise? Well, guess what happens when you create a new form in Visual Studio 2008 and instantiate a label control.

Windows Forms, Visual Studio 2008 default font

Party like it's 1996, folks, because you'll get MS Sans Serif, and you'll like it. That is the default for each new form. Never mind that every new application you build will look like -- let me put this as delicately as I can -- ass.

Here's a comparison of a label with the default font, versus one that was explicitly set to the default GUI font.

windows-forms-sans-serif-vs-segoe-ui.png

Judging by the applications I've used, most Windows developers couldn't care less about design. That's bad. What's even worse is learning that same design carelessness has shipped in the box with every copy of Visual Studio since 2002.

Of course, matters of design are so subjective. If only there were some definitive source we could refer to on the matter of proper Windows GUI font usage. Some sort of reference standard, as it were. Like, say, the top rules for Windows Vista User Experience from Microsoft:

  1. Use the Aero Theme and System Font (Segoe UI)
  2. Use common controls and common dialogs
  3. Use the standard window frame, use glass judiciously

There are 12 rules in total, but the rule I'm looking for is right at the top -- applications should use the system font.

The hilarity of this list is already sort of self evident, given that I've written an entire post bemoaning the general lack of fit and finish in Windows Vista. I couldn't help but laugh at rule number 12: Reserve time for "fit and finish"! Now there's a rule Microsoft should have taken to heart while developing Windows Vista. Understand this is all coming from a guy who likes Vista.

But I digress.

Despite the Windows Forms font behavior in Visual Studio 2008 contradicting rule number one of Microsoft's own design guidelines, this "bug" has gone unfixed for over four years. It has been silently reclassified as a "feature request" and effectively ignored. Nothing's broken, after all: using the wrong font hasn't caused any application crashes or lost productivity. On the other hand, imagine how many BigCorpCo apps have been built since then that violate Microsoft's own design rules for their platform. Either because the developers didn't realize that the app font didn't match the operating system, or because they didn't have the time to write the workaround code necessary to make it do the right thing.
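For what it's worth, the workaround itself is tiny; that's what makes the four years of inaction so galling. Here's a minimal sketch of the usual approach -- a base form class that asks the operating system for its UI font instead of accepting the designer's hard-coded default (the SystemFontForm name is mine, but SystemFonts.MessageBoxFont is the standard .NET way to query the system dialog font):

```csharp
using System.Drawing;
using System.Windows.Forms;

// A base form that opts in to the operating system's UI font
// (Segoe UI on Vista, Tahoma on XP) rather than the hard-coded
// MS Sans Serif default.
public class SystemFontForm : Form
{
    public SystemFontForm()
    {
        // Set the font before any controls are added, so child
        // controls pick it up via Windows Forms' ambient Font
        // property instead of falling back to the 1996-era default.
        Font = SystemFonts.MessageBoxFont;
    }
}
```

Derive your forms from something like this and every label, button, and textbox inherits the right font for free. That it takes a custom base class to comply with rule number one of Microsoft's own guidelines is exactly the problem.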

Yes, this is a small thing. And I'm sure fixing it wouldn't result in selling an additional umpteen thousand Visual Studio licenses to BigCorpCo, which is why it hasn't happened yet.

But the question remains: is this a bug, or a feature request?

One of my favorite things about UserVoice -- which we use for Stack Overflow -- is the way it intentionally blurs the line between bugs and feature requests. Users never understand the difference anyway, and what's worse, developers tend to use that division as a wedge against users. Nudge things you don't want to do into that "feature request" bucket, and proceed to ignore them forever. Argue strongly and loudly enough that something reported as a "bug" clearly isn't, and you may not have to do any work to fix it. Stop dividing the world into Bugs and Feature Requests, and both of these project pathologies go away.

I wish we could, as an industry, spend less time fighting tooth and nail over definitions, painstakingly placing feedback in the "bug" or "feature request" buckets -- and more time doing something constructive with our users' feedback.

Discussion

We Are Typists First, Programmers Second

Remember last week when I said coding was just writing?

I was wrong. As one commenter noted, it's even simpler than that.

[This] reminds me of a true "Dilbert moment" a few years ago, when my (obviously non-technical) boss commented that he never understood why it took months to develop software. "After all", he said, "it's just typing."

Like broken clocks, even pointy-haired managers are right twice a day. Coding is just typing.

keyright keyboard

So if you want to become a great programmer, start by becoming a great typist. Just ask Steve Yegge.

I can't understand why professional programmers out there allow themselves to have a career without teaching themselves to type. It doesn't make any sense. It's like being, I dunno, an actor without knowing how to put your clothes on. It's showing up to the game unprepared. It's coming to a meeting without your slides. Going to class without your homework. Swimming in the Olympics wearing a pair of Eddie Bauer Adventurer Shorts.

Let's face it: it's lazy.

There's just no excuse for it. There are no excuses. I have a friend, John, who can only use one of his hands. He types 70 wpm. He invented his own technique for it. He's not making excuses; he's typing circles around people who are making excuses.

I had a brief email exchange with Steve back in March 2007, after I wrote Put Down The Mouse, where he laid that very same Reservoir Dogs quote on me. Steve's followup blog post was a very long time in coming. I hope Steve doesn't mind, but I'd like to pull two choice quotes directly from his email responses:

I was trying to figure out which is the most important computer science course a CS student could ever take, and eventually realized it's Typing 101.

The really great engineers I know, the ones who build great things, they can type.

Strong statements indeed. I concur. We are typists first, and programmers second. It's very difficult for me to take another programmer seriously when I see them using the hunt-and-peck typing technique. Like Steve, I've seen this far too often.

First, a bit of honesty is in order. Unlike Steve, I am a completely self-taught typist. I didn't take any typing classes in high school. Before I wrote this blog post, I realized I should check to make sure I'm not a total hypocrite. So I went to the first search result for typing test and gave it a shot.

typing test speed (WPM) results

I am by no means the world's fastest typist, though I do play a mean game of Typing of the Dead. Let me emphasize that this isn't a typing contest. I just wanted to make sure I wasn't full of crap before I posted this. Yes, there's a first time for everything. Maybe this'll be the start of a trend. Doubtful, but you never know.

Steve and I believe there is nothing more fundamental in programming than the ability to efficiently express yourself through typing. Note that I said "efficiently" not "perfectly". This is about reasonable competency at a core programming discipline.

Maybe you're not convinced that typing is a core programming discipline. I don't blame you, although I do reserve the right to wonder how you manage to program without using your keyboard.

Instead of answering directly, let me share one of my (many) personal foibles with you. At least four times a day, I walk into a room having no idea why I entered that room. I mean no idea whatsoever. It's as if I have somehow been teleported into that room by an alien civilization. Sadly, the truth is much less thrilling. Here's what happened: in the brief time it took for me to get up and move from point A to point B, I have totally forgotten whatever it was that motivated me to get up at all. Oh sure, I'll rack my brain for a bit, trying to remember what I needed to do in that room. Sometimes I remember, sometimes I don't. In the end, I usually end up making multiple trips back and forth, remembering something else I should have done while I was in that room after I've already left it.

It's all quite sad. Hopefully your brain has a more efficient task stack than mine. But I don't fault my brain – I fault my body. It can't keep up. If I had arrived faster, I wouldn't have had time to forget.

What I'm trying to say is this: speed matters. When you're a fast, efficient typist, you spend less time between thinking that thought and expressing it in code. Which means, if you're me at least, that you might actually get some of your ideas committed to screen before you completely lose your train of thought. Again.

Yes, you should think about what you're doing, obviously. Don't just type random gibberish as fast as you can on the screen, unless you're a Perl programmer. But all other things being equal – and they never are – the touch typist will have an advantage. The best way to become a touch typist is through typing, and lots of it. A little research and structured practice couldn't hurt either. Here are some links that might be of interest to the aspiring touch typist:

(But this is a meager and incomplete list. What tools do you recommend for becoming a better typist?)

There's precious little a programmer can do without touching the keyboard; it is the primary tool of our trade. I believe in practicing the fundamentals, and typing skills are as fundamental as it gets for programmers.

Hail to the typists!

Discussion