Coding Horror

programming and human factors

Software Development: It's a Religion

It's Monday, and Steve Yegge still hates Agile software development. How much does he hate it? Approximately 11,000 words' worth. I think I could start a cottage industry producing Cliff's Notes versions of Steve Yegge posts. Here's my condensed version of Steve's latest:

  • Steve didn't intend to promote Google's software development process as the One True Method of software development. It's just an example of one possible alternative to Agile.
  • Agile software development methodologies only work because any software development methodology works if you have reasonably talented engineers trying hard enough.
  • There's no empirical, scientific way to prove that Agile is any better than any other software development methodology. Thus, the promotion of Agile is a superstition.

The repeated use of the word "superstition" is a key point. In his previous post, he referred to Agile as a religion:

So the consultants, now having lost their primary customer, were at a bar one day, and one of them (named L. Ron Hubbard) said: "This nickel-a-line-of-code gig is lame. You know where the real money is at? You start your own religion." And that's how both Extreme Programming and Scientology were born.

Steve says "religion" like that's a bad thing.

bar code Jesus

But software development is, and has always been, a religion. We band together into groups of people who believe the same things, with very little basis for proving any of those beliefs. Java versus .NET. Microsoft versus Google. Static languages versus dynamic languages. We may kid ourselves into believing we're "computer scientists", but when was the last time you used a hypothesis and a control to prove anything? We're too busy solving customer problems in our chosen tool, unbeliever!

The fundamental cultural problem with superstition arises when people don't know how to differentiate between reliable and unreliable data-verification sources, so they treat non-science as science. If they don't recognize or admit that they're being superstitious, then they'll feel no qualms at all about trying to propagate their beliefs to YOU.

Did Steve Yegge prove that Google's software development methodology is any better than big-A Agile? No, but he strongly believes it to be so in his heart of hearts. You might even say he's religious about it. And so what? There's nothing wrong with a religion that preaches solid engineering. If you're a true believer in the church of Google methodology, you'll become a better developer. When Steve evangelizes the glory of Google's methodology so eloquently, it gets other developers excited about what they're doing and makes them better developers, too. Before you know it, the whole team is juiced because everything is coming together on this totally awesome plan that they all believe in-- whether it's big-A Agile, Google, Scrum, or whatever.

In order for programmers to be effective, they have to believe in what they're doing. Whether that belief is scientifically and empirically provable is completely irrelevant. It's Peopleware 101. Steve is so hell-bent on proving his own methodological atheism that he undermines his own point. It's all religion, as Jef Raskin points out:

Computer systems exhibit all the behaviors best suited to create superstitious responses. You will try something, it won't work, so you try it again -- the exact same way -- and this time it works, or not. That's random reinforcement. The effectiveness of many programming and management practices thus is not measurable. Most of the principles of "extreme programming," for example, seem reasonable to me, and I was using many of them long before they had acquired their present absurd name. The people who promulgate the idea, however, are also those who created the paradigm. Most reported results aren't even single-blind, much less double-blind. We rarely understand, in any detail, the processes going on behind the tasks we do with computers. We're using megabytes of code written by others, code that is indifferently documented and inadequately tested, and which is being used in ways and in combinations unforeseen by its creators.

Software methodology religion, in and of itself, isn't dangerous. There's always the risk of religious persecution, but the only truly dangerous people are the religious nuts who don't realize they are religious nuts. I don't like resorting to name-calling, but I just can't see how Steve could otherwise be so glib:

But rather than enumerating all the non-Agile teams and companies as special cases, let's get straight to the rule: Most great software developers around the world don't use Agile. They just work hard, they stay lightweight, and they ship great stuff. Most developers still have no idea what Agile even is. Think of that!

  1. How exactly is "staying lightweight" different from Agile? Isn't Agile the Church of Lightweight? Steve may disagree with specific tenets of the church (pair programming, etc.), and that's fine. Nobody prays exactly the same way, and not everyone is a zealot. But you always have a set of shared, core beliefs. And lightweight development is clearly a core Agile belief.

  2. It's true that most developers don't use agile software development. But it's not by choice. They weren't given a choice. They're usually stuck in organizations that don't allow them the luxury of "staying lightweight". They're peons trapped in a sausage factory.

  3. Steve talks about "staying lightweight" as if it's the easiest thing in the world, like it's some natural state of grace that developers and organizations are born into. Telling developers to stay lightweight is like telling depressed people to cheer up.

I'm not sure why Steve is so uncomfortable with the idea of software development as a religion, as a set of shared beliefs with very little "basis [in] scientific experiment or pure deductive reasoning." Because he's surely one of the best evangelists I've ever read.


DEFCON: Shall We Play a Game?

Earlier this year I wrote about how much I loved Introversion Software's indie PC game Darwinia. Introversion just released their newest game, DEFCON.

DEFCON channels WarGames and Balance of Power..

Balance of Power screenshot

Balance of Power game over screenshot

.. but Defcon begins where Balance of Power ended:

DEFCON screenshot

It's positively strangelovian.

The developers nail the mood of cold war paranoia, as explained in this Eurogamer interview:

[Defcon] simulates Global Thermonuclear War.

Points are scored by successfully nuking the enemy civilian population into oblivion. This is an extremely difficult task because launching an attack on the enemy makes you very vulnerable - Ground Silos and Subs and Bombers all give away their positions the moment they launch nuclear weapons.

We're playing this game every day and people keep coming up with new strategies - but the bottom line is it's very difficult to win convincingly. Games often end with both sides obliterated. It's a fascinating and nervous game to play.

We've gone for a very minimal atmosphere, with some wonderful ambient music playing (written by Alistair Lindsay and Michael Maidment - the same guys that did the awesome Darwinia audio). There's very little in-game sound except deep rumbles when nukes hit. It's like you're ten miles underground in a bunker, bringing the world to an end one city at a time, completely detached from the millions of deaths you are causing.

DEFCON uses OpenGL, which is quite problematic in Vista at the moment. It works great under XP, of course.

The game is unusually multitasking friendly; it doesn't capture the mouse pointer, so you can run it on a secondary monitor and treat it just like any other window on your screen. It also continues to run in the background when you minimize it. Running in the background is essential for those inevitable office matches:

Yeah, we're very excited by Office Mode. The basic idea is that a group of work-mates can start the game up in the morning in Office Mode, playing over their local area network. The game takes place entirely in real-time (you can quite easily end the world with nuclear conflict in eight hours) and each player controls one territory, e.g. North America or Russia. You can hit the Panic key (press escape twice) which immediately removes the game from the screen and places a discreet icon in your system tray. That icon changes when important things happen - for example if you detect some nuclear launches the icon will flash as a Nuke for a few seconds. Because everything is taking place in real-time you've got at least 30 minutes before those nukes land, so you've got plenty of time to respond without interfering with your real work too much.

Although DEFCON offers four speeds from real-time to 20x, even on 20x it's still a relatively slow-paced real-time strategy game, with plenty of "think time". Battleships and bombers don't turn on a dime, and rapid clicking won't win games.

Unlike Darwinia, DEFCON is primarily a multiplayer experience. Although you can play against the computer AI-- our good friend the W.O.P.R.-- there's not much of a single player narrative to the game. The best way to conduct Global Thermonuclear War is with a couple of your closest friends. Download the demo version, which has functional LAN and internet multiplayer, and nuke your coworkers into the stone age.


Building and Overclocking a Core 2 Duo System

It's been over a year since I built my last PC, and all those killer Core 2 Duo benchmark and overclocking results were making me anxious. I just pulled the trigger on the following Core 2 Duo upgrade:

I'm not replacing my video card, hard drive, power supply, or case. This is a straight "drop in" replacement for my existing Athlon X2 4800+.

First, a few words on why I chose these specific parts. Computer hardware is one of my few indulgences, but I do a freakishly obsessive amount of research before buying anything. Allow me to share my freakish obsession with you, dear reader. After all, that's what the internet is for.

  1. Motherboard. The 965 Express was an editors' choice at Tech Report. It's the most modern chipset for the Core 2 Duo, too. ASUS is a well respected brand name, and I really like the fact that it has a silent heatpipe on the northbridge instead of a fan. Modern northbridges run very hot, and cooling them quietly can be a PITA because of their proximity to the CPU and video cards.

  2. Memory. Fast DDR2 memory ain't cheap. And I won't go below 2 gigabytes, which is what I consider a mainstream memory configuration these days. Have you priced 2 gigabytes of DDR2-1066 lately? Personally, I think buying extremely fast memory is overrated; by the time the system has to reach beyond the L1 and L2 cache into main memory, the performance penalty is already so severe that a few additional nanoseconds aren't going to matter in the big scheme of things. That's why I went with a nice midrange DDR2-667, specifically the AData Vitesta memory, which did quite well in a recent AnandTech value memory roundup. Even if you push the front side bus up to 400 MHz-- what I consider an extreme overclock-- that's still only (400 x 2) or DDR2-800 officially (there's a quick arithmetic sketch after this list). And all the value DDR2-533 memory AnandTech tested ran fine at 800 speeds, as long as you bumped up the voltage a bit.

  3. CPU. Core 2 Duo is clearly the benchmark champ at the moment. I've been a long time AMD enthusiast, but Intel finally abandoned the problematic Pentium 4 architecture and built a better mousetrap this time. The E6600 is the cheapest Core 2 Duo with 4 megabytes of level 2 cache. I'm a big believer in cache, so I'm not willing to drop down to the E6300 or E6400 which only have 2 megabytes of L2. This might be a little irrational if you actually compare the performance of both cache sizes on an apples-to-apples clock rate basis, but so be it. I loves me some L2 cache.

  4. Heatsink. If you want a quiet PC, buy the best CPU heatsink you can afford. That said, the Scythe Infinity is definitely overkill for a Core 2 Duo system, even an overclocked and overvolted one. But it's such beautiful, magnificent, glorious overkill. It barely fit in my case. That just made me love it all the more. This monster barely gets warm under dual Prime95 load. Running it completely passive is a no-brainer, but make sure you have proper case airflow.
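
To make the memory-speed arithmetic in item 2 concrete, here's a minimal sketch. The factor of 2 is simply DDR's two transfers per clock; the 1:1 memory ratio and the sample FSB values are my illustrative assumptions, not anything pulled from the motherboard manual.

    # Effective DDR2 rating when memory runs at a given ratio to the front side bus (FSB).
    # DDR = "double data rate": two transfers per clock cycle, hence the factor of 2.
    def ddr2_rating(fsb_mhz, ratio=1.0):
        # ratio is memory clock / FSB clock; 1.0 means running 1:1 with the FSB
        return fsb_mhz * ratio * 2

    print(ddr2_rating(266))  # stock E6600 FSB      -> ~DDR2-533
    print(ddr2_rating(333))  # 3.0 GHz overclock    -> ~DDR2-667
    print(ddr2_rating(400))  # "extreme" overclock  ->  DDR2-800

The point is just that even a 400 MHz FSB at a 1:1 ratio only asks DDR2-800 of your memory, which is why modestly overvolted value memory can keep up, per the AnandTech roundup mentioned above.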

My general strategery with computer upgrades is to buy upper midrange and overclock myself into high-end territory for extra value. The Core 2 Duo CPU makes this easy, because they're all incredible overclockers. I overclocked my $319 2.4 GHz E6600 chip beyond $999 Core 2 "extreme" X6800 territory with a few quick modifications in the BIOS:

Intel Core 2 Duo overclocking results

As a responsible overclocker, I also ensure the system is actually stable at these settings through hours and hours of Prime95 torture testing. I still have those two instances of Prime95 running in the background as I'm writing this post.

So how did I turn my 90-pound weakling of a $320 CPU into a fire-breathing $999 monster CPU? It's quite easy. Read on.

  1. Install the latest BIOS on the motherboard. This is standard operating procedure whenever I build a new system. On the P5B, the flash utility is built into the BIOS, and it even supports USB flash drives! Finally! I downloaded the latest P5B Deluxe BIOS from ASUS' web site, copied it onto a flash drive, and plugged it in. I then booted, pressed ALT+F2 during startup to access the flash utility, and it autodetected the new BIOS file. All I had to do was hit Enter to start the BIOS update, and I was done.

  2. Slowly increase the FSB speed in the BIOS. I have an E6600, which is a 2.4 GHz chip with a 9x multiplier. That means the FSB speed is 2400 / 9, or 266 MHz. As I increase the FSB speed, the CPU speed also increases. I first tried 333 MHz, which results in 333 * 9, or 3.0 GHz. As you can see in the screenshot, I've currently gone a bit further, to 3.15 GHz; there's a quick worked example of the multiplier math after this list. Remember, make small changes and test as you go. Don't immediately go for the highest possible overclock. Be conservative initially; you can always adjust upward later, once you develop confidence.

  3. Increase voltage to the CPU and memory in the BIOS. To goose that extra bit of performance out of your system, increase voltages in the BIOS across the board. Don't worry, I'm not talking about massive increases here-- just slight boosts. I'm using 1.425 volts for the CPU (up from 1.35v) and 2.1 volts (up from 1.8v) for the memory. If what you want to do doesn't work with these modest voltage boosts, it probably won't work at all.

  4. Boot and see what happens.
    • My computer won't boot. Don't worry. No harm, no foul. Unplug your system, find the clear CMOS jumper on your motherboard, and use it to clear the CMOS. You can also pop out the CMOS battery if you're impatient. Make sure you do this with the system unplugged, and give the system a full minute to clear the CMOS.
    • I can't boot into my operating system. Your overclocking settings are too aggressive. We already increased voltage, so you need to back down your overclocking settings in the BIOS.
    • It works! Maybe it does, maybe it doesn't. Don't get cocky. See next step.

  5. Burn your new settings in with Prime95. Assuming you booted and logged into your operating system without crashing, hanging, or bluescreening*, your next job is to run torture tests to see if things are really working. Prime95 is your new best friend. You'll run one instance for every core in your CPU-- create a copy of the Prime95 folder for each core, and run the executables from those folders. Use Options, Torture Test, "In place large FFTs" to start. If you can run Prime95 this way for an hour, it's very likely your system is stable. If you can run Prime95 this way overnight, your system is guaranteed stable.
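
Here's the multiplier arithmetic from step 2 as a minimal sketch. It assumes a locked 9x multiplier like the E6600's; the sample values are the ones used in this post, and the function names are purely illustrative.

    # Core clock = front side bus (FSB) speed x CPU multiplier.
    # The E6600's multiplier is locked at 9x, so overclocking means raising the FSB.
    MULTIPLIER = 9

    def core_clock_mhz(fsb_mhz):
        return fsb_mhz * MULTIPLIER

    def fsb_for_target_ghz(target_ghz):
        return target_ghz * 1000 / MULTIPLIER

    print(core_clock_mhz(266))               # ~2400 MHz: stock E6600
    print(core_clock_mhz(333))               # ~3000 MHz: the first overclock attempt
    print(round(fsb_for_target_ghz(3.15)))   # ~350 MHz FSB for the 3.15 GHz shown above

Run the numbers before you touch the BIOS and you'll know exactly which FSB setting corresponds to the clock speed you're shooting for.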

Now that I've gotten my Core 2 Duo system stable at 3+ GHz, I can bask in the glory of a system that's 50% faster than my old Athlon X2 4800+ -- at least according to SYSmark 2004. Not bad for under 800 bucks.

* Sounds traumatic, but if you want to make an omelette, you have to break some eggs. Don't be afraid to break stuff.


On Frameworkitis

Alex Gorbatchev, after a long hiatus, is blogging again. What was keeping him away? Frameworkitis.

This is the longest break in posting I've had in the last 2.5 years of blogging. Community Server is really bringing me down… I just don't like it.

So, I started working on my own blog engine… for like the 6th time. This time it's different. It's actually moving ahead. Not so long ago I read an article about how to stay productive on a project. One thing that I took away from it, was the most crippling problem in all my previous projects - procrastinating via developing frameworks. Every time I have started working on my own blog, I delved deep into creating frameworks and [needless] to say, none of my previous 5 attempts went even as far as being able to post a new entry.

It really is very hard trying to stay away from creating libraries, thinking about future uses and what extra things I can put in that might [be] used in the future. Simply creating code that works and only does what I need it to really helps to move things ahead.

Indeed.

Can you guess what the number one sin in Eric Gunnerson's Seven Deadly Sins of Programming is? I bet you can.

Although I'll admit I've been sorely tempted myself, I wonder if writing your own blog software isn't a form of procrastination in and of itself.


Is Software Development Like Manufacturing?

We've adopted Scrum for all of our software development at Vertigo. Although I'm totally in favor of Anything But Waterfall, Scrum is an unfortunate name:

  1. It's two additional characters away from a term for male genitalia.
  2. The term is derived from rugby, an extraordinarily violent sport. During my first year at college, a guy on our hall played rugby. His ongoing injuries, both small and large, became a running joke in the dorm. Eventually even he started to re-evaluate the merits of the sport. As Steve pointed out, Wikipedia defines scrum as ".. the most dangerous phase in rugby, since a collapse or improper engage can lead to a front row player damaging or even breaking his neck." My indirect experience with rugby leads me to agree. The most dangerous phase of a violent sport is not exactly the sort of thing you want to associate with your project.
  3. When you tell customers your software developers use the Scrum process, they have absolutely no idea what you're talking about.

We usually say "agile" to avoid all the weird connotations of the word Scrum.

To promote understanding of Scrum, and Agile software development in general, everyone at Vertigo got a copy of Mary and Tom Poppendieck's book, Lean Software Development: An Agile Toolkit. I was inclined to like the book, because I'm a big fan of Mary Poppendieck's article Team Compensation (pdf).*

Although the book is great, the Poppendiecks spend a lot of their time drawing parallels between software development and manufacturing. Every few pages, you'll find some example from a classic manufacturing company: Ford, L.L. Bean, GM, Dell, Toyota, etcetera. Although the examples do extend beyond the manufacturing sector, they're definitely dominated by it.

Perhaps this makes sense if you consider that Scrum originated in manufacturing:

Scrum was named as a project management style in auto and consumer product manufacturing companies by Takeuchi and Nonaka in "The New New Product Development Game" (Harvard Business Review, Jan-Feb 1986). Jeff Sutherland, John Scumniotales, and Jeff McKenna documented, conceived and implemented Scrum as it is described below at Easel Corporation in 1993, incorporating team management styles noted by Takeuchi and Nonaka. In 1995, Ken Schwaber formalized the definition of Scrum and helped deploy it worldwide in software development.

The manufacturing examples presented in the book don't resonate with me at all. I'm not convinced that manufacturing industries and software development have much, if anything, in common. Fortunately, the Poppendiecks address this criticism early in the book:

The origins of lean thinking lie in production, but lean principles are broadly applicable to other disciplines. However, lean production practices -- specific guidelines on what to do -- cannot be transplanted directly from a manufacturing plant to software development. Many attempts to apply lean production practices to software development have been unsuccessful because generating good software is not a production process; it is a development process.

Development is quite different than production. Think of development as creating a recipe and production as following the recipe. These are very different activities, and they should be carried out with different approaches. Developing a recipe is a learning process involving trial and error. You would not expect an expert chef's first attempt at a new dish to be the last attempt. In fact, the whole idea of developing a recipe is to try many variations on a theme and discover the best dish.

Once a chef has developed a recipe, preparing the dish means following the recipe. This is equivalent to manufacturing, where the objective is to faithfully and repeatedly reproduce a "recipe" with a minimum of variation.

In many ways, software development is the antithesis of manufacturing:

  • Variability is the enemy in manufacturing; in software, it's the reason we get up in the morning. Every worthwhile software development project is a custom one-off job for a unique problem.
  • Requirements are the bread and butter of manufacturing; in software, we rarely have meaningful requirements. Even if we do, the only measure of success that matters is whether our solution solves the customer's shifting idea of what their problem is.

I suppose the proof is in the pudding; if Scrum works for manufacturers and software development shops alike, then maybe the parallels between the two industries are valid. Still, I think Lean Software Development: An Agile Toolkit would be a much stronger book if it relied more on examples from actual software development efforts and less on examples from the movie Gung Ho.

* which I discovered through Joel Spolsky's The Best Software Writing I.
