Coding Horror

programming and human factors

Is Eeyore Designing Your Software?

This classic Eric Lippert post describes, in excruciating, painful detail, exactly how much work it takes to add a single ChangeLightBulbWindowHandleEx function to a codebase at Microsoft:

  • One dev to spend five minutes implementing ChangeLightBulbWindowHandleEx.
  • One program manager to write the specification.
  • One localization expert to review the specification for localizability issues.
  • One usability expert to review the specification for accessibility and usability issues.
  • At least one dev, tester and PM to brainstorm security vulnerabilities.
  • One PM to add the security model to the specification.
  • One tester to write the test plan.
  • One test lead to update the test schedule.
  • One tester to write the test cases and add them to the nightly automation.
  • Three or four testers to participate in an ad hoc bug bash.
  • One technical writer to write the documentation.
  • One technical reviewer to proofread the documentation.
  • One copy editor to proofread the documentation.
  • One documentation manager to integrate the new documentation into the existing body of text, update tables of contents, indexes, etc.
  • Twenty-five translators to translate the documentation and error messages into all the languages supported by Windows. The managers for the translators live in Ireland (European languages) and Japan (Asian languages), which are both severely time-shifted from Redmond, so dealing with them can be a fairly complex logistical problem.
  • A team of senior managers to coordinate all these people, write the cheques, and justify the costs to their Vice President.
I think sometimes programmers forget how much work it is to create software at large companies. What may seem like a no-brainer five-line code change to us on the outside is perhaps five man-weeks of work once you factor in all the required process overhead. We're picking on Microsoft here, but this is by no means limited to Microsoft; it's a simple function of scale and audience for all commercial software.

    So then, the obvious question: who does all those things for non-commercial, open source software? The answer, per a Raymond Chen comment on the same post, is "nobody":

    Who develops the test plans for open source software? Who updates the screenshots in the user's guide and online help? And who translates the documentation into Polish and Turkish? Who verifies that the feature doesn't violate the Americans with Disabilities Act or German privacy laws? Back when I worked on Linux, the answer was "Nobody. There is no test plan, there is no printed user's guide, what little documentation there is exists only in English, and nobody cares about complying with the ADA or German privacy laws." Maybe things have changed since then.

    Here's my honest question: does open source software need all that process to be successful? Is the radical lack of process baggage in open source software development not a weakness, but in fact an evolutionary advantage? What open source software lacks in formal process it makes up ten times over in ubiquity and community. In other words, if the Elbonians feel so strongly about localization, they can take that effort on themselves. Meanwhile, the developers have more time to implement features that delight the largest base of customers, instead of plowing through mountains of process for every minuscule five-line code change.

    Are large commercial software companies crippled by their own process?

    If you openly reward and promote people for killing work by bemoaning the risk, the testing cost, and the localization impact of each feature, and interrogating a design change request as if it were Dan Brown shackled in front of a wild-eyed, hot-poker-wielding Pope, well, everyone is going to grab pitchforks and jump on that "No can do! No can ship!" bandwagon.

    It makes me think of how many feature meetings I've had and what a small percentage of those features have actually ever shipped. Not that every feature is a good idea, but it's damn near wake-worthy sometimes for a feature to actually get out into shipping bits. Cue Eeyore: "Oh no. Now we have to support it. I suppose a hotfix request will come in any moment now..."

    All too often, it really does feel like Microsoft's software was designed by Eeyore.

    Eeyore, from Winnie-The-Pooh

    In this case, the bird represents features that delight customers.


    The Sierra Network II

    You may remember Sierra's ImagiNation network from the earliest days of dial-up networking:

    The ImagiNation Network (INN), aka The Sierra Network (TSN), was the first online multiplayer gaming system. Developed by Sierra On-Line in 1989, and first available to the public in 1991, the ImagiNation Network was a unique online gaming network that gave subscribers from all over the United States of America a place where they could "play games, make friends and have fun". With a wide variety of games including RPGs, WWI aeroplane simulations, live trivia, and card and board games, almost every user could find something enjoyable to play. INN also featured an electronic post office, many bulletin boards, chat rooms, and the company boasted of having "more than 200 groups, clubs and special events online."

    I had an account on The Sierra Network for a while. The graphics were incredible for that era, at least compared to the text-only BBS games that passed for online multiplayer gaming at the time. Still, it wasn't quite my cup of tea, so I didn't last long there. I finally achieved online multiplayer satisfaction a few years later with Doom, Dwango, and Kali.

    The Sierra Network (ImagiNation Network) screenshot

    The Sierra Network is an interesting bit of computer history trivia at best. But it's particularly relevant when you compare it to the recently launched Mytopia gaming service. Mytopia allows you to play common internet games (think hearts, sudoku, chess, etcetera) across several popular walled garden social networking sites including MySpace and Facebook.

    mytopia screenshot

    The resemblance, indeed, is astonishing. It has to be some sign of the coming internet apocalypse when a startup has essentially rebuilt The Sierra Network in Web 2.0 fashion.

    (via rei on QuarterToThree)


    Paul Graham's Participatory Narcissism

    I have tremendous respect for Paul Graham. His essays – repackaged in the book Hackers and Painters – are among the best writing I've found on software engineering. Not all of them are so great, of course, but the majority are well worth your time. That's more than I can say for 99.9-infinitely-repeating-percent of the content on the web. He's certainly a better and more authoritative writer than I.

    But lately I've begun to wonder whether Mr. Graham, like Joel Spolsky before him, has devolved into self-absorption and irrelevance. Consider his latest essay, You Weren't Meant to Have a Boss, which opens with this distasteful anecdote:

    A few days ago I was sitting in a cafe in Palo Alto and a group of programmers came in on some kind of scavenger hunt. It was obviously one of those corporate "team-building" exercises.

    They looked familiar. I spend nearly all my time working with programmers in their twenties and early thirties. But something seemed wrong about these. There was something missing.

    And yet the company they worked for is considered a good one, and from what I overheard of their conversation, they seemed smart enough. In fact, they seemed to be from one of the more prestigious groups within the company. So why did it seem there was something odd about them?

    The guys on the scavenger hunt looked like the programmers I was used to, but they were employees instead of founders. And it was startling how different they seemed.

    So what, you may say. So I happen to know a subset of programmers who are especially ambitious. Of course less ambitious people will seem different. But the difference between the programmers I saw in the cafe and the ones I was used to wasn't just a difference of degree. Something seemed wrong.

    I think it's not so much that there's something special about founders as that there's something missing in the lives of employees. I think startup founders, though statistically outliers, are actually living in a way that's more natural for humans.

    I was in Africa last year and saw a lot of animals in the wild that I'd only seen in zoos before. It was remarkable how different they seemed. Particularly lions. Lions in the wild seem about ten times more alive. They're like different animals. And seeing those guys on their scavenger hunt was like seeing lions in a zoo after spending several years watching them in the wild.

    I'm not sure why Mr. Graham felt the need to draw this incredibly condescending parallel between company employees and caged animals in the zoo.

    I've actually taken Mr. Graham's advice. I recently quit my job to blog and participate in a micro startup. Even though I'm now one of the anointed founders in Mr. Graham's book, I still found this comparison retroactively offensive to all those years I worked as an employee for various companies and had perfectly enriching, rewarding – dare I say even enjoyable – experiences. Or at least as happy as a caged animal in a zoo can ever be, I suppose.

    caged orangutan

    Mr. Graham's essay does contain some fair points, if you can suppress your gag reflex long enough to get to them. If you don't have time to read it, lex99 posted this succinct summary that captured its flavor perfectly:

    I work with young startup founders in their twenties. They're geniuses, and play by their own rules. Oh... you haven't founded a company? You suck.

    Small businesses are the backbone of the American economy. And Mr. Graham is absolutely right to encourage young people to take risks early in life, to join small business startups with potentially limitless upside while they have nothing to lose – no children, no mortgage, no significant other. I believe in this so strongly I included it as a slide in my presentation to graduating Canadian computer science students.

    Slide from my CUSEC presentation: "Your early twenties are exactly the time to take insane career risks." (Paul Graham)

    Indeed, you should take insane career risks while you're young.

    And there are lots of large corporate soul-sucking programming jobs that are, quite literally, Dilbert cartoons brought to life.

    The problem with this particular essay is the way Mr. Graham implies the only path to true happiness as a young programmer lies in founding a startup. If you aren't a founder, or one of the first 10 employees, then, well... enjoy your life at the zoo. We'll be sure to visit when we aren't busy loping free on the plains, working the way people were meant to. I'm not paraphrasing here; he actually wrote that: working the way people were meant to. The sense of disdain, the dismissiveness, is nearly palpable.

    He acknowledges that his perspective is warped because "nearly all the programmers [he knows] are startup founders." Therein lies the problem. These essays are no longer about software engineering; they're about Paul Graham. They've become participatory narcissism:

    After a while, you begin to notice that all the essays are an elaborate set of mirrors set up to reflect different facets of the author, in a big distributed act of participatory narcissism.

    Naturally, every young software programmer worth a damn forms a startup. Because that's what Mr. Graham's company, Y Combinator, does. They fund startups with young software programmers. He projects his reality outward, reflecting it against the rest of us so brightly and so strongly that we're temporarily blinded. We stop seeing our own reality and trade it for his, in a form of participatory narcissism – we believe in the one true path to success, exactly the way Mr. Graham has laid it before us. Traditional employment? That's for suckers. Real go-getters start their own companies.

    On the whole, I think I preferred Paul Graham's essays when they were more about software engineering and less about Paul Graham.

    Update: Paul Graham posted two essays that partially respond to this post: You Weren't Meant to Have a Boss: The Cliffs Notes and How to Disagree. The latter is, as far as I can tell, a sort of EULA for disagreeing with Paul Graham. Based on the conversation this post initiated, I attended a Y Combinator dinner and got to meet Mr. Graham in person. That is, to me, the point of posts like this – some initial disagreement ultimately leading to deeper, more satisfying communication. A net positive all around.


    The First Rule of Programming: It's Always Your Fault

    You know the feeling. It's happened to all of us at some point: you've pored over the code a dozen times and still can't find a problem with it. But there's some bug or error you can't seem to get rid of. There just has to be something wrong with the machine you're coding on, with the operating system you're running under, with the tools and libraries you're using. There just has to be!

    No matter how desperate you get, don't choose that path. Down that path lies voodoo computing and programming by coincidence. In short, madness.

    It's frustrating to repeatedly bang your head against difficult, obscure bugs, but don't let desperation lead you astray. An essential part of being a humble programmer is realizing that whenever there's a problem with the code you've written, it's always your fault. This is aptly summarized in The Pragmatic Programmer as "Select Isn't Broken":

    In most projects, the code you are debugging may be a mixture of application code written by you and others on your project team, third-party products (database, connectivity, graphical libraries, specialized communications or algorithms, and so on) and the platform environment (operating system, system libraries, and compilers).

    It is possible that a bug exists in the OS, the compiler, or a third-party product-- but this should not be your first thought. It is much more likely that the bug exists in the application code under development. It is generally more profitable to assume that the application code is incorrectly calling into a library than to assume that the library itself is broken. Even if the problem does lie with a third party, you'll still have to eliminate your code before submitting the bug report.

    We worked on a project where a senior engineer was convinced that the select system call was broken on Solaris. No amount of persuasion or logic could change his mind (the fact that every other networking application on the box worked fine was irrelevant). He spent weeks writing workarounds, which, for some odd reason, didn't seem to fix the problem. When finally forced to sit down and read the documentation on select, he discovered the problem and corrected it in a matter of minutes. We now use the phrase "select is broken" as a gentle reminder whenever one of us starts blaming the system for a fault that is likely to be our own.
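
    The Pragmatic Programmer never says exactly what that engineer got wrong, but there's a classic way select "breaks" for C programmers: select modifies the fd_set arguments you pass in, clearing every descriptor that isn't ready. Build the set once outside your loop, and later iterations quietly wait on an empty set. Here's a minimal sketch of the trap and the fix-- sock_fd is a hypothetical, already-connected socket created elsewhere:

        #include <stdio.h>
        #include <sys/select.h>

        void watch_socket(int sock_fd)
        {
            fd_set read_fds;

            for (;;) {
                /* The fix: rebuild the set on every iteration. Hoisting
                   these two lines above the loop is the classic bug--
                   select() overwrites read_fds each time it returns. */
                FD_ZERO(&read_fds);
                FD_SET(sock_fd, &read_fds);

                if (select(sock_fd + 1, &read_fds, NULL, NULL, NULL) < 0) {
                    perror("select");  /* a real error-- and still probably ours */
                    return;
                }
                if (FD_ISSET(sock_fd, &read_fds)) {
                    /* sock_fd is readable; recv() and handle the data here */
                }
            }
        }

    Nothing in select is broken in either version; the bug, as always, lives in the calling code.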

    The flip side of code ownership is code responsibility. No matter what the problem is with your software-- maybe it's not even your code in the first place-- always assume the problem is in your code and act accordingly. If you're going to subject the world to your software, take full responsibility for its failures. Even if, technically speaking, you don't have to. That's how you earn respect and credibility. You certainly don't earn respect or credibility by endlessly pawning off errors and problems on other people, other companies, other sources.

    Statistically, you understand, it is incredibly rare for any bugs or errors in your software not to be your fault. In Code Complete, Steve McConnell cited two studies that proved it:

    A pair of studies performed [in 1973 and 1984] found that, of total errors reported, roughly 95% are caused by programmers, 2% by systems software (the compiler and the operating system), 2% by some other software, and 1% by the hardware. Systems software and development tools are used by many more people today than they were in the 1970s and 1980s, and so my best guess is that, today, an even higher percentage of errors are the programmers' fault.

    Whatever the problem with your software is, take ownership. Start with your code, and investigate further and further outward until you have definitive evidence of where the problem lies. If the problem lies in some other bit of code that you don't control, you'll not only have learned essential troubleshooting and diagnostic skills, you'll also have an audit trail of evidence to back up your claims. This is certainly a lot more work than shrugging your shoulders and pointing your finger at the OS, the tools, or the framework-- but it also engenders a sense of trust and respect you're unlikely to achieve through fingerpointing and evasion.
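
    In practice, "investigating outward" means shrinking the problem until only the suspect call remains. A standalone repro makes your evidence trivially checkable-- here's a sketch, with strtol standing in for whatever library call you've started to doubt:

        /* Minimal repro: hardcoded input, one library call, printed result.
           Either the library is broken, or your assumption about it is. */
        #include <errno.h>
        #include <stdio.h>
        #include <stdlib.h>

        int main(void)
        {
            const char *input = "0x1F";  /* the exact input that "fails" */
            char *rest = NULL;

            errno = 0;
            long value = strtol(input, &rest, 10);

            printf("value=%ld errno=%d rest=\"%s\"\n", value, errno, rest);
            /* Prints: value=0 errno=0 rest="x1F". strtol isn't broken;
               base 10 was the wrong assumption-- pass 16 (or 0) instead. */
            return 0;
        }

    Attach something like that to a bug report and nobody has to take your word for it. More often than not, writing the repro finds your own mistake first.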

    If you truly aspire to being a humble programmer, you should have no qualms about saying "hey, this is my fault-- and I'll get to the bottom of it."


    Adventures in Rechargeable Batteries

    Every self-respecting geek loves gadgets. I'm no exception. And so many of my favorite gadgets have a voracious appetite for batteries. I don't know why all the other battery types fell so far out of favor, but between AA and AAA, I could probably power 95% of my household gadget needs.

    battery set: C, D, 9 volt, AAA and AA

    I've been a rechargeable battery user for years. It seems the frugal thing to do in the long run, and it's also healthier for the planet when we aren't discarding mountains of single-use batteries into landfills. I remember switching over to the then-new NiMH battery type based on a late-'90s John Dvorak column touting their availability and power. Miraculously, that very article is still available on the internet:

    The calculation of cost for nickel hydride batteries in the table is for 100 recharges. Hawk says the industry knows that nickel hydride batteries can easily last through 500 recharges. I've seen data indicating that 1,000 charges are possible. This drops the cost per 10,000 pictures to 70 cents! I'm convinced that the industry doesn't want people to know about these batteries. I seriously doubt you'll be seeing them on a rack in the grocery store anytime soon. Do the math: It's like buying 1,000 alkaline batteries for less than 10 bucks. Imagine what this does to the lucrative disposable-battery business.

    So now I wonder where the D, C, and AAA nickel hydride batteries are? Mostly in Japan. As far back as January 1996, Toshiba rolled out the first complete line of standard cells and other Japanese battery makers have followed. This event was essentially hushed up in the U.S. market. The big-name American battery companies have avoided this market-killing technology for obvious reasons.
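
    Dvorak's arithmetic is easy to check. Here's a quick sketch using his recharge counts and an assumed $10 price per NiMH cell-- roughly his late-90s figure:

        #include <stdio.h>

        int main(void)
        {
            const double nimh_price = 10.0;  /* assumed price per NiMH cell */
            const int counts[] = { 100, 500, 1000 };

            for (int i = 0; i < (int)(sizeof counts / sizeof counts[0]); i++)
                printf("%4d recharges: $%.3f per charge\n",
                       counts[i], nimh_price / counts[i]);
            return 0;
        }

    At 1,000 charges, a ten-dollar cell costs a penny per use-- effectively his "1,000 alkaline batteries for less than 10 bucks."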

    I immediately rushed out and bought a bunch of the batteries and the charger from the importer that Mr. Dvorak recommended. In fact, I still have some of those original models. Let's compare these ten-year-old 1998 NiMH batteries to their 2008 cousins:

    AA battery comparison: 1998 vs 2008

    The picture can be a little hard to read, so I've reprinted the technical details from each AA battery below:

    1998 NiMH GP Rechargeable 1.2v, 1300 mAh
    2008 NiMH Energizer Rechargeable 1.2v, 2500 mAh

    Is it really true that AA battery capacity has almost doubled in the last ten years? That's pretty amazing. But as I found out, it's not the entire story.

    For one thing, there's the issue of self-discharge rate. It turns out that massive 2500mAh capacity of the Energizer rechargeable battery doesn't mean much when the battery drains itself within a month. Take it from Mr. Lee:

    All rechargeable battery manufacturers love to boast about their product's current capacity (mAh). But there is a dirty little secret that they don't want you to hear: self-discharge rate. Simply put: a fully charged NiCd or NiMH cell will gradually lose its stored energy over time. Technical papers I have researched typically put the self-discharge rate at 10-20% per month for NiCd cells, and 20-30% per month for NiMH cells. This kind of self-discharge rate is usually acceptable in applications such as digital cameras.

    I bought 8 of those Energizer 2500mAh rechargeable NiMH batteries over one year ago. At first, I was very happy about the large current capacity offered by those batteries. But within a few months, I started to notice that they die very quickly in my digital camera. In fact, a set of Sony 2000mAh NiMH batteries I bought one year earlier seems to last much longer when used in the same camera.

    So putting a larger number on the box is ultimately a method of fooling consumers with marketing. Where have we seen that before? Oh right, everywhere. Caveat emptor. Mr. Lee recommends batteries with much saner self-discharge rates instead; I've since bought a few batches of both the Eneloop and the Hybrid cells.

    In general you want the "hybrid" or "pre-charged" varieties, and should ignore ridiculous claims about capacity.
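
    The reason the headline number misleads is that self-discharge compounds monthly: the charge remaining after n months on the shelf is roughly capacity * (1 - rate)^n. Here's a sketch comparing the two kinds of cells-- the 25% and 2% monthly rates are assumptions drawn from Mr. Lee's ranges and typical low self-discharge claims:

        #include <stdio.h>
        #include <math.h>

        int main(void)
        {
            const double std_cap = 2500.0, std_rate = 0.25;  /* ordinary NiMH, ~20-30%/month */
            const double lsd_cap = 2000.0, lsd_rate = 0.02;  /* low self-discharge, assumed ~2%/month */

            /* remaining charge after n months: capacity * (1 - rate)^n */
            for (int month = 0; month <= 3; month++)
                printf("month %d: ordinary %4.0f mAh, low self-discharge %4.0f mAh\n",
                       month,
                       std_cap * pow(1.0 - std_rate, month),
                       lsd_cap * pow(1.0 - lsd_rate, month));
            return 0;
        }

    After three months in a drawer, the "bigger" 2500mAh cell is down to roughly 1050mAh, while the nominally smaller cell still holds around 1880mAh-- which matches the behavior Mr. Lee describes in his camera.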

    The other pitfall of rechargeable batteries lies in the recharging process itself. Even if you buy the very best rechargeable batteries, if you charge them improperly, you'll get poor results.

    Charging NiMH batteries is the result of a compromise. A low current is gentle on the battery and maximizes its lifespan, but a full charge takes hours. A high current will recharge the battery much faster, but put more strain on it, causing it to wear out prematurely. It also requires careful monitoring of the battery's electrical characteristics to prevent damage.

    Most of the chargers on the market today use one or the other of these methods. The fast chargers, especially the cheap ones, excel at one thing: destroying perfectly good batteries, because they lack the monitoring circuitry to control the charge current and detect when the battery is full. The slow chargers are usually better, mainly because it's harder to design a really bad slow charger. Unfortunately... they're slow.
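
    The "slow" half of that compromise is simple arithmetic: charge time is roughly capacity divided by charging current, padded for losses in the process. A sketch-- the 1.4 inefficiency factor here is a rule-of-thumb assumption, not a datasheet number:

        #include <stdio.h>

        int main(void)
        {
            const double capacity_mah = 2500.0;  /* the Energizer cell from above */
            const double inefficiency = 1.4;     /* assumed charging-loss factor */
            const int rates_ma[] = { 200, 500, 700, 1000 };

            /* hours = (capacity / current) * inefficiency */
            for (int i = 0; i < (int)(sizeof rates_ma / sizeof rates_ma[0]); i++)
                printf("%5d mA: about %4.1f hours\n",
                       rates_ma[i], capacity_mah / rates_ma[i] * inefficiency);
            return 0;
        }

    Gentle 200mA charging takes the better part of a day; a 1000mA fast charge is done in three and a half hours, at the cost of extra wear and the need for careful monitoring.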

    Most bundled battery chargers are junk. Given the inherent compromises of charging, you need something smart. That's why I ended up tossing my generic "rapid" chargers in favor of the majestic, glorious, and surprisingly inexpensive La Crosse Technology BC-900 AlphaPower battery charger.

    La Crosse Technology BC-900 AlphaPower battery charger

    Seriously, just look at this thing. It's a geek's dream. Each battery can be controlled individually, with its own real-time LCD readout, in four modes:

    1. Charge at various rates: 200, 500, 700, or 1000 mA
    2. Discharge at 1/2 the charging rate
    3. Test to determine true battery capacity
    4. Refresh to "revitalize" older batteries

    You can also switch between four different readouts after the mode is engaged: time elapsed, voltage, mAh charge/discharge rate, and current mAh capacity. That refresh mode is incredibly slow-- it's basically discharging and recharging over and over-- but it really works. It can take marginal batteries from the brink of death and give them new life.

    But you don't have to care about any of that; if you just drop 4 AA or AAA batteries in the device, it will charge them fine. I spent several hours after I got it, plugging various batteries into it, trying different modes, and watching it work. I'm not sure what the exact definition of geek is, but I think "enjoys recharging batteries" has to be very high on that list.

    I can't recommend the BC-900 highly enough. Did I mention it comes packaged with a starter set of 4 rechargeable AA and AAA batteries, D-cell adapter shells, and a nifty nylon carrying case, too? But don't take my word for it. Read the Amazon reviews; they're positively glowing.

    The gadget world may run on AA and AAA cells, but armed with a basic knowledge of NiMH battery technology and a great recharger, you too can be more than prepared to meet that challenge.

    Gentlemen, start your chargers.
