Coding Horror

programming and human factors

Real Ultimate Programming Power

A common response to The Ferengi Programmer:

From what I can see, the problem of "overly-rule-bound developers" is nowhere near the magnitude of the problem of "developers who don't really have a clue."
The majority of developers do not suffer from too much design patterns, or too much SOLID, or agile, or waterfall for that matter. They suffer from whipping out cowboy code in a pure chaos environment, using simplistic drag & drop, data driven, vb-like techniques.

Absolutely.

But here's the paradox: the types of programmers who would most benefit from these guidelines, rules, principles, and checklists are the least likely to read and follow them. Throwing a book of rules at a terrible programmer just creates a terrible programmer with a bruise on their head where the book bounced off. This is something I discussed previously in Mort, Elvis, Einstein, and You:

Thus, if you read the article, you are most assuredly in the twenty percent category. The other eighty percent are not actively thinking about the craft of software development. They would never find that piece, much less read it. They simply don't read programming blogs – other than as the result of web searches to find quick-fix answers to a specific problem they're having. Nor have they read any of the books in my recommended reading list. The defining characteristic of the vast majority of these so-called "vocational" programmers is that they are unreachable. It doesn't matter what you, I or anyone else writes here – they'll never see it.

In the absence of mentoring and apprenticeship, the dissemination of better programming practices is often conveniently packaged into processes and methodologies. How many of these do you know? How many have you practiced?

1969 Structured programming
1975 Jackson Structured Programming
1980 Structured Systems Analysis and Design Methodology
1980 Structured Analysis and Design Technique
1981 Information Engineering
1990 Object-oriented programming
1991 Rapid Application Development
1990 Virtual finite state machine
1995 Dynamic Systems Development Method
1998 Scrum
1999 Extreme Programming
2002 Enterprise Unified Process
2003 Rational Unified Process
2004 Constructionist Design Methodology
2005 Agile Unified Process

And how do we expect the average developer to find out about these? In a word, marketing. (I could have substituted religion here without much change in meaning.) It's no coincidence that a lot of the proponents of these methodologies make their living consulting and teaching about them. And they have their work cut out for them, too, because most programmers are unreachable:

I was sitting in my office chatting with my coworker Jeremy Sheeley. Jeremy leads the dev team for Vault and Fortress. In the course of our discussion, I suddenly realized that none of our marketing efforts would reach Jeremy. He doesn't go to trade shows or conferences. He doesn't read magazines. He doesn't read blogs. He doesn't go to user group meetings.
Jeremy is a decision-maker for the version control tool used by his team, and nothing we are doing would make him aware of our product. How many more Jeremies are out there?

Millions! As Seth Godin notes, the unreachable are now truly unreachable – at least not through marketing.

So, if we know the programmers who would benefit most from these rules and principles and guidelines are:

  • highly unlikely to ever read them of their own volition
  • almost impossible to reach through traditional marketing (or religion)

Remind me again – who, exactly, are we writing these principles, rules, guidelines, and methodologies for? If we're only reaching the programmers who are thoughtful enough to care about their work in the first place, what have we truly accomplished? I agree with Jeff R., who left this comment:

There's nothing wrong with the SOLID principles; they make sense to me. But I've been programming since the days of card readers and teletypes. They won't make sense to those with little experience. They don't know when or how to apply them appropriately. They get bogged down in the attempt.

So trying to follow them changes the focus from result to process. And that's deadly.

It's the job of the lead programmer or manager to see that good principles are followed, perhaps by guiding others invisibly, without explicitly mandating or even mentioning those principles.

In my effort to suck less every year, I've read hundreds of programming books. I've researched every modern programming methodology. I'm even a Certified Scrum Master™. All of it, to me, seems like endlessly restated versions of four core fundamentals. But "four core fundamentals"? That's awful marketing. Nobody will listen in rapt, adoring attention to me as I pontificate, nor will they pay the exorbitant consulting fees I demand to support the lifestyle I have become accustomed to. It simply won't do. Not at all. So, I dub this:

The Atwood System of Real Ultimate Programming Power

All those incredibly detailed rules, guidelines, methodologies, and principles? YAGNI. If it can't be explained on a single double-spaced sheet of paper, it's a waste of your time. Go read and write some code! And if you can't grok these fundamentals in the first three or four years of your programming career, well – this slightly modified R. Lee Ermey quote comes to mind.

My name is Jeff, and I can't stop thinking about programming. And neither should you.


The Ferengi Programmer

There was a little brouhaha recently about some comments Joel Spolsky made on our podcast:

Last week I was listening to a podcast on Hanselminutes, with Robert Martin talking about the SOLID principles. (That's a real easy-to-Google term!) It's object-oriented design, and they're calling it agile design, which it really, really isn't. It's principles for how to design your classes, and how they should work. And, when I was listening to them, they all sounded to me like extremely bureaucratic programming that came from the mind of somebody that has not written a lot of code, frankly.

There's nothing really objectionable about Bob's object-oriented design principles, on the face of it. (Note that all links in the below table are PDFs, so click accordingly.)

The Single Responsibility Principle: A class should have one, and only one, reason to change.
The Open Closed Principle: You should be able to extend a class's behavior without modifying it.
The Liskov Substitution Principle: Derived classes must be substitutable for their base classes.
The Dependency Inversion Principle: Depend on abstractions, not on concretions.
The Interface Segregation Principle: Make fine-grained interfaces that are client specific.
The Release Reuse Equivalency Principle: The granule of reuse is the granule of release.
The Common Closure Principle: Classes that change together are packaged together.
The Common Reuse Principle: Classes that are used together are packaged together.
The Acyclic Dependencies Principle: The dependency graph of packages must have no cycles.
The Stable Dependencies Principle: Depend in the direction of stability.
The Stable Abstractions Principle: Abstractness increases with stability.
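
To make the first couple of entries concrete, here's a minimal sketch of my own (the type names are invented for illustration, not taken from Bob's papers). The Report class has exactly one reason to change, and new output formats are added by writing new formatter classes rather than editing existing ones:

// Hypothetical illustration of the Single Responsibility and Open Closed
// principles; none of these types come from a real codebase.
public class Report
{
    // Holds report data and nothing else, so it has one reason to change.
    public string Title { get; set; }
    public string Body { get; set; }
}

public interface IReportFormatter
{
    // Callers depend on this abstraction; adding a new output format
    // means adding a class, not modifying the ones that already work.
    string Format(Report report);
}

public class PlainTextFormatter : IReportFormatter
{
    public string Format(Report report)
    {
        return report.Title + "\n\n" + report.Body;
    }
}

public class HtmlFormatter : IReportFormatter
{
    public string Format(Report report)
    {
        return "<h1>" + report.Title + "</h1><p>" + report.Body + "</p>";
    }
}

The same shape also hints at the Dependency Inversion Principle: whatever needs a formatted report asks for an IReportFormatter, never a specific formatter.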

While I do believe every software development team should endeavor to follow the instructions on the paint can, there's a limit to what you can fit on a paint can. It's the most basic, most critical information you need to proceed and not make a giant mess of the process. As brief as the instructions on a paint can are, they do represent the upper limit of what most people will realistically read, comprehend, and derive immediate benefit from.

Expanding from a few guidelines on a paint can into a detailed painting manual is far riskier. The bigger and more grandiose the set of rules you come up with, the more severe the danger. A few broad guidelines on a paint can beget thirty rules for painting, which beget a hundred detailed principles of painting.

Pretty soon you'll find yourself believing that every possible situation in software development can be prescribed, if only you could come up with a sufficiently detailed set of rules! And, of course, a critical mass of programmers patient enough to read Volumes I - XV of said rules. You'll also want to set up a few messageboards for these programmers to argue endlessly amongst themselves about the meaning and interpretation of the rules.

This strikes me as a bit like Ferengi programming.

Ferengi Rules of Acquisition, book cover

The Ferengi are a part of the Star Trek universe, primarily in Deep Space Nine. They're a race of ultra-capitalists whose every business transaction is governed by the 285 Rules of Acquisition. There's a rule for every possible business situation -- and, inevitably, an interpretation of those rules that gives the Ferengi license to cheat, steal, and bend the truth to suit their needs.

At what point do you stop having a set of basic, reasonable programming guidelines -- and start being a Ferengi programmer, an imperfect manifestation of the ruleset?

Like James Bach, I've found less and less use for rules in my career. Not because I'm a self-made genius who plays by my own rules, mind you, but because I value the skills, experience, and judgment of my team far more than any static set of rules.

When Ron says there is an "absolute minimum of practice" that must be in place for an agile project to succeed, I want to reply that I believe there is an absolute minimum of practice needed to have a competent opinion about things that are needed -- and that in his post he does not achieve that minimum. I think part of that minimum is to understand what words like "practice" and "agile" and "success" can mean (recognizing they are malleable ideas). Part of it is to recognize that people can and have behaved in agile ways without any concept of agile or ability to explain what they do.

My style of development and testing is highly agile. I am agile in that I am prepared to question and rethink anything. I change and develop my methods. I may learn from packaged ideas like Extreme Programming, but I never follow them. Following is for novices who are under active supervision. Instead, I craft methods on a project by project basis, and I encourage other people to do that, as well. I take responsibility for my choices. That's engineering for adults like us.

Guidelines, particularly in the absence of experts and mentors, are useful. But there's also a very real danger of hewing too slavishly to rulesets. Programmers are already quite systematic by disposition, so the idea that you can come up with a detailed enough set of rules, and sub-rules, and sub-sub-rules, that you can literally program yourself for success with a "system" of sufficient sophistication -- this, unfortunately, comes naturally to most software developers. If you're not careful, you might even slip and fall into a Methodology. Then you're in real trouble.

Don't become a Ferengi Programmer. Rules, guidelines, and principles are gems of distilled experience that should be studied and respected. But they're never a substitute for thinking critically about your work.


The Elephant in the Room: Google Monoculture

I was browsing the sessions at an upcoming Search Conference, which describes itself thusly:

The way to online success is through being easily found in search engines such as Google, Yahoo!, and Microsoft Live Search. While developers have historically thought of search as a marketing activity, technical architecture has now become critical for search success.

Anyone else see the elephant in the room, there? No?

Banksy: elephant in room

Just two weeks after we launched Stack Overflow, I mentioned that search engines already made up 50% of our traffic. Well, not so much search engines as search engine:

I try to be politically correct in discussing web search, avoiding the g-word whenever possible, desperately attempting to preserve the illusion that web search is actually a competitive market. But it's becoming a transparent and cruel joke at this point. When we say "web search" we mean one thing, and one thing only: Google. Rich Skrenta explains:

I'm not a professional analyst, and my approach here is pretty back-of-the-napkin. Still, it confirms what those of us in the search industry have known for a long time.

The New York Times, for instance, gets nearly 6 times as much traffic from Google as it does from Yahoo. Tripadvisor gets 8 times as much traffic from Google vs. Yahoo.

Even Yahoo's own sites are no different. While it receives a greater fraction of Yahoo search traffic than average, Yahoo's own flickr service gets 2.4 times as much traffic from Google as it does from Yahoo.

My favorite example: According to Hitwise, [ex] Yahoo blogger Jeremy Zawodny gets 92% of his inbound search traffic from Google, and only 2.7% from Yahoo.

That was written almost two years ago. Guess which way those numbers have gone since then?

Now that Stack Overflow has been chugging right along for almost six months, allow me to share the last month of our own data. Currently, 83% of our total traffic is from search engines, or rather, one particular search engine:

Search Engine    Visits
Google           3,417,919
Yahoo            9,779
Live             5,638
Search           2,961
AOL              1,274
Ask              1,186
MSN              1,177
Altavista        202
Yandex           191
Seznam           103

Those 6x and 8x numbers that Rich quoted two years ago seem awfully quaint now. Google delivers 350x the traffic to Stack Overflow that the next best so-called "search engine" does. Three hundred and fifty times!

Now, I don't claim that Stack Overflow is representative of every site on the internet -- obviously it isn't. It's a site for programmers. And let me be absolutely crystal clear that I have no problem at all with Google. That said, I find it profoundly disturbing that if every other search engine in the world shut down tomorrow, our website's traffic would be effectively unchanged. That's downright scary.

Yes, I like Google. Yes, Google works great and has been my homepage for about eight years now. Google nailed search, and they deserve the leadership position they've earned. But where's the healthy competition? Where's the incentive for Google to improve? All I see is a large and growing monoculture that acts as the start page for the internet.

I'm a little surprised all the people who were so up in arms about the Microsoft "monopoly" ten years ago aren't out in the streets today lighting torches and sharpening their pitchforks to go after Google. Does the fact that Google's products are mostly free and ad-supported somehow exempt it from the same scrutiny? Isn't anyone else concerned that Google, even with the best of "don't be evil" intentions, has become more master than servant?

Calling the current state of search engine competition a horse race is an insult to horse races. No, what we have here is a one-horse race where all the other horses were shipped off to glue factories years ago. Forget a "search conference"; you should be throwing a "Google conference", because there's no difference.

I don't know. Maybe that's OK. But it does mean that if Google, for whatever reason, decided to remove you from its search results, your website no longer exists. At least not as a viable business, anyway.


Don't Reinvent The Wheel, Unless You Plan on Learning More About Wheels

The introduction to Head First Design Patterns exhorts us not to reinvent the wheel:

You're not alone. At any given moment, somewhere in the world someone struggles with the same software design problems you have. You know you don't want to reinvent the wheel (or worse, a flat tire), so you look to Design Patterns – the lessons learned by those who've faced the same problems. With Design Patterns, you get to take advantage of the best practices and experience of others, so that you can spend your time on … something else. Something more challenging. Something more complex. Something more fun.

Avoiding the reinvention of the proverbial wheel is a standard bit of received wisdom in software development circles. There's certainly truth to it, but I think it's dangerous if taken too literally – if you categorically reject any attempt to solve a problem in code simply because an existing library already solves it.

square bike wheel

I'm not so sure. I think reinventing the wheel, if done properly, can be useful. For example, James Hart reinvented the wheel. And he liked it:

I reinvented the wheel last week. I sat down and deliberately coded something that I knew already existed, and had probably also been done by many many other people. In conventional programming terms, I wasted my time. But it was worthwhile, and what's more I would recommend almost any serious programmer do precisely the same thing.

But who's James Hart? Just another programmer. If that doesn't carry enough weight for you, how does it sound coming from Charles Moore, the creator of FORTH?

A second corollary was even more heretical: "Do it yourself!"

The conventional approach, enforced to a greater or lesser extent, is that you shall use a standard subroutine. I say that you should write your own subroutines.

Before you can write your own subroutines, you have to know how. This means, to be practical, that you have written it before; which makes it difficult to get started. But give it a try. After writing the same subroutine a dozen times on as many computers and languages, you'll be pretty good at it.

Moore followed this to an astounding extent. Throughout the 70's, as he implemented Forth on 18 different CPUs, he invariably wrote for each his own assembler, his own disk and terminal drivers, even his own multiply and divide subroutines (on machines that required them, as many did). When there were manufacturer-supplied routines for these functions, he read them for ideas, but never used them verbatim. By knowing exactly how Forth would use these resources, by omitting hooks and generalities, and by sheer skill and experience (he speculated that most multiply/divide subroutines were written by someone who had never done one before and never would again), his versions were invariably smaller and faster, usually significantly so.

Moreover, he was never satisfied with his own solutions to problems. Revisiting a computer or an application after a few years, he often re-wrote key code routines. He never re-used his own code without re-examining it for possible improvements. This later became a source of frustration to Rather, who, as the marketing arm of FORTH, Inc., often bid jobs on the assumption that since Moore had just done a similar project this one would be easy – only to watch helplessly as he tore up all his past code and started over.

And then there's Bob Lee, who leads the core library development on Android.

Depending on the context, you can almost always replace "Why reinvent the wheel?" with "Please don't compete with me," or "Please don't make me learn something new." Either way, the opponent doesn't have a real argument against building something newer and better, but they also don't want to admit their unhealthy motivations for trying to stop you.

More seeds, more blooms, I say. Don't build houses on kitchen sinks. Reinvent away. Most of our current technology sucks, and even if it didn't, who am I to try and stop you?

Indeed. If anything, "Don't Reinvent The Wheel" should be used as a call to arms for deeply educating yourself about all the existing solutions – not as a bludgeoning tool to undermine those who legitimately want to build something better or improve on what's already out there. In my experience, sadly, it's much more the latter than the former.

So, no, you shouldn't reinvent the wheel. Unless you plan on learning more about wheels, that is.


You're Doing It Wrong

In The Sad Tragedy of Micro-Optimization Theater we discussed the performance considerations of building a fragment of HTML.

string s =
@"<div class=""action-time"">{0}{1}</div>
<div class=""gravatar32"">{2}</div>
<div class=""details"">{3}<br/>{4}</div>";
return String.Format(s, st(), st(), st(), st(), st());

The second act of this particular theater was foreshadowed by Stephen Touset's comment:

The correct answer is that if you're concatenating HTML, you're doing it wrong in the first place. Use an HTML templating language. The people maintaining your code after you will thank you (currently, you risk anything from open mockery to significant property damage).

The question of how to efficiently build small string fragments isn't just a red herring -- no, it's far, far worse. The entire question is wrong. This is one of my favorite lessons from The Pragmatic Programmer.

When faced with an impossible problem, identify the real constraints. Ask yourself: "Does it have to be done this way? Does it have to be done at all?"

If our ultimate conclusion was that performance is secondary to readability of code, that's exactly what we should have asked, before doing anything else.

Let's express the same code sample using the standard ASP.NET MVC templating engine. And yes, we render stuff like this all over the place in Stack Overflow. It's the default method of rendering for a reason.

<div class="action-time"><%= User.ActionTime %></div>
<div class="gravatar32"><%= User.Gravatar %></div>
<div class="details"><%= User.Details %><br/><%= User.Stuff %></div>

We have an HTML file, through which we poke some holes and insert the data. Simple enough, and conceptually similar to the String.Replace version. Templating works reasonably well in the trivial cases when you have an object with obvious, basic data types in fields that you spit out.
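
For the trivial case, the model behind that view can be as plain as this (a hypothetical sketch; the property names come from the snippet above, not from Stack Overflow's actual code):

// Hypothetical view model for the template above. Property names are taken
// from the snippet, not from any real Stack Overflow class.
public class User
{
    public string ActionTime { get; set; }
    public string Gravatar { get; set; }
    public string Details { get; set; }
    public string Stuff { get; set; }
}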

But beyond those simple cases, it's shocking how hairy HTML templating gets. What if you need to do a bit of formatting or processing to get that data into shape before displaying it? What if you need to make decisions and display things differently depending on the contents of those fields? Your once-simple page templates get progressively more and more complex.

<% foreach (var User in Users) { %>
<div class="action-time"><%= ActionSpan(User) %></div>
<% if (!User.IsAnonymous) { %>
<div class="gravatar32"><%= RenderGravatar(User) %></div>
<div class="details"><%= RepSpan(User) %><br/><%= Flair(User) %></div>
<% } else { %>
<div class="anon">anonymous</div>
<% } %>
<% } %>

This is a fairly mild case, but you can see where templating naturally tends toward a frantic, unreadable mish-mash of code and template -- Web Development as Tag Soup. If your HTML templates can't be kept simple, they're not a heck of a lot better than the procedural string building code they're replacing. And this is not an easy thing to stay on top of, in my experience. The daily grind of struggling to keep the templates from devolving into tag soup starts to feel every bit as grotty as all that nasty string work we were theoretically replacing.
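
Those helper calls in the template above (ActionSpan, RenderGravatar, and friends) are the usual way to fight back: pull the formatting and decision logic out of the markup and into code. Here's a rough sketch of what one such helper might look like (my guess at the shape, not the actual Stack Overflow implementation):

// Hypothetical helper class, not the real one: it keeps formatting logic
// out of the template by returning finished HTML fragments.
public static class ViewHelpers
{
    public static string ActionSpan(User user)
    {
        if (string.IsNullOrEmpty(user.ActionTime))
        {
            return string.Empty;
        }
        return string.Format(@"<span class=""action-time"">{0}</span>", user.ActionTime);
    }
}

But notice what happened: the helper is right back to building HTML fragments in code, the very thing the template was supposed to save us from.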

Now it's my turn to ask -- why?

I think existing templating solutions are going about this completely backwards. Rather than poking holes in HTML to insert code, we should simply treat HTML as code.

Like so:

foreach (var User in Users)
{
    <div class="action-time">[ActionSpan(User)]</div>
    if (!User.IsAnonymous)
    {
        <div class="gravatar32">[RenderGravatar(User)]</div>
        <div class="details">[UserRepSpan(User)]<br/>[UserFlairSpan(User)]</div>
    }
    else
    {
        <div class="anon">anonymous</div>
    }
}

Seamlessly mixing code and HTML, using a minimum of those headache-inducing escape characters. Is this a programming language for a race of futuristic supermen? No. There are languages that can do this right now, today -- where you can stick HTML in the middle of your code. It's already possible using Visual Basic XML Literals, for example.

Visual Basic XML Literals used in an ASP.NET MVC view

Even the hilariously maligned X# has the right core idea. Templating tends to break down because it forces you to treat code and markup as two different and fundamentally incompatible things. We spend all our time awkwardly switching between markup-land and code-land using escape sequences. They're always fighting each other -- and us.

Seeing HTML and code get equal treatment in my IDE makes me realize one thing:

We've all been doing it wrong.
