Coding Horror

programming and human factors

Has Joel Spolsky Jumped the Shark?

When you're starting out as a technical blogger, you'll inevitably stumble across Joel on Software. He's been blogging since the year 2000, when computers were hand-carved of wood and the internet transmitted data via carrier pigeon. He has his own software development company, a few books under his belt, and he's an outstanding and entertaining writer by any measure. In many ways, Joel is a legend.

Although Joel's blog entries are generally pure gold, he has generated his fair share of controversy in the last six years. For example, he doesn't like programming using exceptions, despite the fact that they are the bread and butter of modern programming languages. He also said that teaching new programmers only Java is liable to poison their minds, although I think Java is the least of any budding new programmer's problems. But a few of Joel's recent posts go far, far beyond these minor gaffes.

For instance, two weeks ago we found out that Joel's company wrote their flagship product, FogBugz, in a proprietary language they created themselves.

FogBugz is written in Wasabi, a very advanced, functional-programming dialect of Basic with closures and lambdas and Rails-like active records that can be compiled down to VBScript, JavaScript, PHP4 or PHP5. Wasabi is a private, in-house language written by one of our best developers that is optimized specifically for developing FogBugz; the Wasabi compiler itself is written in C#.

You couldn't possibly have heard it, but that was the sound of fifty thousand programmers' heads simultaneously exploding.

Writing your own language is absolutely beyond the pale. It's a toxic decision that is so completely at odds with Joel's previous excellent and sane advice on software development that people literally thought he was joking. He had to write an entire follow-up post to explain that, no, he wasn't kidding.

Read his defense of Wasabi. If anything, it amplifies the insanity. Because, you know, installing a PHP/.NET/Java runtime at a customer site is totally unsupportable, even though it's the business model of 99.9% of the rest of the world. And with Wasabi, they can add any language features they want! Just like Lisp, right? And eventually they'll plug in a .NET CLR back-end to Wasabi and generate bytecode! Never mind the fact that your company's flagship application is still written in a freaky custom language based on VBScript that only three people in the world know how to program.

But wait! It gets worse!

Joel Spolsky -- WTF

Now Joel says that a dynamically typed language like Ruby can't possibly be fast enough to run FogBugz:

I understand the philosophy that developer cycles are more important than cpu cycles, but frankly that's just a bumper-sticker slogan and not fair to the people who are complaining about performance. Even though our product, FogBugz, seems like something that should be perfect for Ruby on Rails, we have several parts of code where performance is extremely important. In FogBugz 6 there's one place where we need to do literally millions of calculations to display a single chart on a single web page. We have gotten it down to 3 seconds or so in our current development environment with a lot of optimization, but frankly with a duck-typed function call I really don't think we could do it before the web browser gave up and timed out and the sun cooled down a couple of degrees.

Let me get this straight. Let me make sure I'm understanding this. Because I think I've gone cross-eyed.

  1. I don't see how Wasabi-- a language that, per Joel, compiles down to VBScript on Windows-- could actually be faster than Ruby. VBScript certainly isn't compiled, and it isn't exactly known for its blazing speed. Speed improvement is one of the many bullet points used to justify the switch from ASP to ASP.NET.

  2. If performance is so critically important in this section of the code, why wouldn't Joel simply build that section of the code in a compiled language and call it from the other language? Am I missing something here? Is there some law that states all code for a web application must be in the same exact language?

  3. Justifying any language choice based on one tiny section of the code makes no sense whatsoever. It's a complete reversal of the well-known 90/10 rule. If we followed Joel's logic, we should reject all dynamically typed languages. Even in a world filled with 3 gigahertz $200 dual-core CPUs that get cheaper every nanosecond. Because, y'know, there's this one part here that's kinda slow.
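Point 2 isn't exotic; it's how dynamic-language shops handle hot spots every day. Here's a rough Python sketch of the idea, with Python's C-implemented sum() standing in for a real compiled extension doing the millions of calculations:

```python
import timeit

# A million data points, roughly the scale Joel describes for his chart.
data = list(range(1_000_000))

def hot_loop_pure():
    # The "slow" path: interpreted, duck-typed iteration.
    total = 0
    for x in data:
        total += x
    return total

def hot_loop_compiled():
    # The hybrid path: same work, handed off to compiled code.
    # (sum() is implemented in C; a real app would use a C extension.)
    return sum(data)

print("pure Python:", timeit.timeit(hot_loop_pure, number=3))
print("compiled:   ", timeit.timeit(hot_loop_compiled, number=3))
```

Same answer, one dynamic codebase, one small compiled escape hatch for the performance-critical section. That's the 90/10 rule in action.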

All of this makes me wonder: has Joel Spolsky jumped the shark?

I reject this new, highly illogical Joel Spolsky. I demand the immediate return of the sage, sane, wise Joel Spolsky of years past. But maybe it's like wishing for a long-running television show to return to its previous glories.

Discussion

Technological Racism

Brian Kuhn recently described the real risk of technocentrism.

[..] people use (or have rejected) particular operating systems, tools, and software that has in turn shaped their perceptions when it comes to making judgments on the various merits of particular technologies. People tend to categorize or identify themselves with particular "technological cultures"; some of the most common being type of operating system they use, development platform, and programming language. Participation and identification with these cultures brings with it a tendency to look at the world primarily from the perspective of one's own technological culture, or technocentrism.

If you do not make an effort to be aware of being technocentric, it is easy to become a zealot of the technology you identify most with. I find myself guilty of this at times, and see the results of technocentrism most often in the back and forth arguments of many blog comment posts. You can find this sort of rabid fanaticism and heated arguments all over the Internet, and in my opinion it is to the detriment of all participants.

The beginning of a very long flight

[We can all] co-exist without all of us ending up resorting to some sort of technology jihad.

Developers are no strangers to technology jihads; it's an occupational hazard. But creating a website that intentionally and maliciously makes content look worse in Internet Explorer 6, well, that's crossing the line from technocentrism into the disturbing realm of technological racism.

If you happen to be using a non-Internet Explorer browser, let me quickly explain the insanity that is unfolding. It seems some people have intentionally modified their blogs and web sites so that they render differently based on the browser you are using. Browser detection is usually used to ensure that a web site renders properly across a variety of platforms, but now we see that this ability can be used in reverse. If you are using a version of Internet Explorer, you get the web site in black and white coloring, and it tends to be very, very ugly. If you are using another browser, such as Firefox, it renders in color and the font is actually readable. Internet Explorer users also get a message like "Why pay for black and white when you can get full color for free?" at the top of the page.
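The mechanics behind this are trivial, which is part of what makes it so tempting. A hypothetical server-side sketch of the tactic (the stylesheet file names are made up for illustration):

```python
def stylesheet_for(user_agent: str) -> str:
    """Pick a stylesheet based on the browser's User-Agent header.
    Sniffing for "MSIE" is how IE was typically detected circa 2006."""
    if "MSIE" in user_agent:
        return "monochrome.css"   # the black-and-white "protest" styling
    return "fullcolor.css"        # everyone else gets the real design

print(stylesheet_for("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))
```

The same conditional that makes a site degrade gracefully for old browsers can just as easily make it degrade spitefully.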

Although I fully realize that IE6 is the new Netscape 4.7x, such heavy-handed methods are more likely to hurt the cause than advance it. But perhaps Brian's criticism was ultimately taken to heart; the site in question changed to an alert at the top of the page (visible only in IE) instead of actively making the page content painful to read.

All That Malarkey, a popular CSS design website, does the same thing, but with a decidedly more tongue-in-cheek bent. Here's a side-by-side shot of the same All That Malarkey page in IE6 and IE7:

Malarkey page in IE6   Malarkey page in IE7

It's a funny nod to the black-and-white fashion style of 2 Tone musical artists. If you scroll to the bottom, the joke is revealed:

Internet Explorer 2 old, stomp to da betta browsa

It's clever, but the use of color to discriminate between browsers in both cases is unfortunate; it evokes comparisons with our cultural history of racism and segregation.

colored only sign

I empathize with the pain that IE6 causes. I really do. But keep the goal in mind: getting people to switch away from a browser that's nearly six years old. The zealotry and vitriol of technological racism is not a particularly effective way to realize change. The best way to get rid of IE6 is through gentle evangelism. Don't waste your time attacking the status quo. Instead, spend your time making the alternatives more attractive by supporting and encouraging them.

You'll always get more flies with honey than vinegar.

Discussion

Have You Ever Been Windows Experienced?

Now that Windows Vista Release Candidate 1 is sorta-kinda available to everyone, let's see what it takes to run it. Here's a comparison of the Vista hardware requirements with the hardware requirements of Windows XP:

        Windows XP (2001)         Windows Vista (2007)
CPU     233 MHz                   800 MHz
                                  (1 GHz recommended)
RAM     64 MB                     512 MB
        (128 MB recommended)      (1 GB recommended)
Video   Super VGA (800 x 600)     DirectX 9 video card
        display                   (128 MB video RAM recommended)
HDD     1.5 GB                    15 GB

Vista requires 10x the drive space, 8x the memory, and 4x the CPU power. It also substantially raises the bar for video; most integrated video solutions are no longer acceptable. The increase in minimum spec is not unreasonable, considering it's been 6 long years since the last release of a mainstream desktop operating system from Microsoft.

Still, Vista-capable PCs can be had on the cheap. Even a basic $449 eMachines box exceeds these requirements. Granted, you might need to spend a bit of extra money to upgrade the memory from the default 512 megabytes, but even then you could buy a cheap 512 megabyte USB key and use it as a ReadyBoost cache.

To deal with Vista's increased system requirements, Microsoft baked in a system benchmarking tool known as the Windows Experience Index. At first boot, your system is profiled, and features are enabled or disabled based on your machine's Windows Experience Index score. Here's what my home PC scored:

a Windows Experience Index score

Unlike some of the new Microsoft Vista features, this one is remarkably well thought out. For one thing, it expresses the total score as the lowest subscore. This is an incredibly intuitive way to highlight that your PC's performance is only as good as the slowest subsystem. You know immediately which part of your system will give you the most bang for the buck when upgrading.*
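The scoring rule is simple enough to fit in a few lines. A sketch of the weakest-link logic (the subscore values here are illustrative, not from a real machine):

```python
# WEI-style scoring: the base score is simply the lowest subscore,
# so the number you see is the number of your bottleneck.
subscores = {
    "Processor": 4.8,
    "Memory": 4.5,
    "Graphics": 3.9,
    "Gaming graphics": 4.1,
    "Primary hard disk": 4.2,
}

base_score = min(subscores.values())
bottleneck = min(subscores, key=subscores.get)
print(f"Base score: {base_score} (limited by {bottleneck})")
```

No weighted averages to obscure the result: upgrade the bottleneck component and the base score moves; upgrade anything else and it doesn't.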

Clicking the View and Print Details button results in a great detailed system summary as well. Here's the Windows Experience Index summary page for my Asus W3J laptop:

Vista's More Details About My Computer report

It's a nice, balanced set of results, exactly what I was shooting for with this laptop. I also planned to upgrade the laptop's hard drive later in its life to boost its performance, so this confirms that choice as well. But what exactly is being measured here?

If you browse to c:\windows\performance\winsat and look in the datastore folder, you'll find an XML file that describes the test results in detail. Here's the relevant Metrics section:

<Metrics>
  <CPUMetrics>
    <CompressionMetric units="MB/s">43.83377</CompressionMetric>
    <EncryptionMetric units="MB/s">23.30456</EncryptionMetric>
    <Compression2Metric units="MB/s">138.22060</Compression2Metric>
    <Encryption2Metric units="MB/s">178.69444</Encryption2Metric>
    <DshowEncodeTime units="s">19.18101</DshowEncodeTime>
  </CPUMetrics>
  <MemoryMetrics>
    <Bandwidth units="MB/s">3316.58691</Bandwidth>
  </MemoryMetrics>
  <GamingMetrics>
    <AlphaFps units="F/s">49.85000</AlphaFps>
    <ALUFps units="F/s">40.82000</ALUFps>
    <TexFps units="F/s">45.64000</TexFps>
  </GamingMetrics>
  <GraphicsMetrics>
    <DWMFps units="F/s">88.73640</DWMFps>
    <VideoMemBandwidth units="MB/s">4695.65000</VideoMemBandwidth>
    <MFVideoDecodeDur units="s">2.93202</MFVideoDecodeDur>
  </GraphicsMetrics>
  <DiskMetrics>
    <AvgThroughput units="MB/s">31.75583</AvgThroughput>
  </DiskMetrics>
</Metrics>
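If you want to pull numbers out of that datastore file programmatically, a few lines of Python will do it. A sketch assuming the same element layout, trimmed here to two sections:

```python
import xml.etree.ElementTree as ET

# Trimmed sample of the WinSAT Metrics section shown above.
winsat_xml = """<Metrics>
  <CPUMetrics>
    <CompressionMetric units="MB/s">43.83377</CompressionMetric>
  </CPUMetrics>
  <DiskMetrics>
    <AvgThroughput units="MB/s">31.75583</AvgThroughput>
  </DiskMetrics>
</Metrics>"""

root = ET.fromstring(winsat_xml)
# Build {section: {metric: (value, units)}} from the element tree.
metrics = {section.tag: {m.tag: (float(m.text), m.get("units"))
                         for m in section}
           for section in root}
print(metrics["DiskMetrics"]["AvgThroughput"])
```

In a real script you'd point ET.parse() at the newest file in the datastore folder instead of a literal string.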

I can see the Windows Experience Index base score becoming a standard tool for bragging rights amongst OEM PC builders. And because Microsoft used a balanced set of benchmarks with a logical scoring mechanism based on the weakest link in the system, the WEI score is more meaningful than a third party synthetic benchmark: when these scores improve, users win.

Just kidding. My WEI is bigger than yours.

* There is one caveat here: gaming. A low-ish ~2 video card rating will be plenty for Aero and WPF apps. A higher video rating will only really matter for users that play games, so it might be unfair to reduce the entire system's score to the video card score.

Discussion

Software: It's a Gas

Nathan Myhrvold, the former CTO of Microsoft, is also a bona fide physicist. He holds physics degrees from UCLA and Princeton. He even had a postdoctoral fellowship under the famous Stephen Hawking. Thus, as you might expect, his 1997 ACM keynote presentation, The Next Fifty Years of Software, is full of physics and science metaphors.

It starts with Nathan's four Laws of Software:

  1. Software is a gas
    Software always expands to fit whatever container it is stored in.

  2. Software grows until it becomes limited by Moore's Law
    The initial growth of software is rapid, like gas expanding, but is inevitably limited by the rate of increase in hardware speed.

  3. Software growth makes Moore's Law possible
    People buy new hardware because the software requires it.

  4. Software is only limited by human ambition and expectation
    We'll always find new algorithms, new applications, and new users.

Myhrvold goes on to describe software development as a state of Perpetual Crisis. The size and complexity of software is constantly rising, with no limit in sight. As we develop more advanced software-- and as we develop solutions to manage the ever-increasing complexity of this software-- the benefits of the new software are absorbed by the rising tide of customer expectations. Software development will never be easy; new software always has to push against the current complexity boundary if it wants to be commercially successful.

This was all written in 1997. Nearly ten years later, are his points still valid? Software is certainly still a gas. Now that we're entering the multi-core era, there is one crucial difference. Historically hardware has gotten more complex because of limitations in the ability of software to scale; now the software needs to get more complex because of limitations in the ability of hardware to scale. The burden of scaling now falls on the software.

Myhrvold then makes an interesting point about the amount of storage required to capture human diversity. If..

  • the human Genome is approximately 1 gigabyte of data;
  • the individual difference between any two humans is 0.25% of their Genome;
  • we assume a lossless compression rate of 2:1;

The individually unique part of the human Genome can be stored in ~1.2 megabytes. Thus, you fit on a 3.5" floppy disk.

In fact, the entirety of human genetic diversity for every living human being could be stored in a 3.7 terabyte drive array. And the entire genetic diversity of every living thing on earth could be stored in roughly the size of the internet circa 2001.
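The floppy-disk arithmetic holds up, for what it's worth. A quick check of Myhrvold's stated assumptions:

```python
genome_bytes = 1 * 1024**3      # ~1 GB for the full human genome (Myhrvold's figure)
unique_fraction = 0.0025        # 0.25% difference between any two individuals
compression_ratio = 2           # assumed 2:1 lossless compression

unique_megabytes = genome_bytes * unique_fraction / compression_ratio / 1024**2
print(round(unique_megabytes, 2))   # comfortably under a 1.44 MB floppy
```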

I'm not sure what that means, exactly, but I love the idea that I can fit myself on a 3.5" floppy disk.

Discussion

Unnecessary Dialogs: Stopping the Proceedings with Idiocy

Although I like Notepad2, it has some pathological alert dialog behavior, particularly when it comes to searching. Here's an alert dialog I almost always get when searching a document:

Reached the end of the document. Restarting search at the beginning.

Thanks for the update, Notepad2. I really wanted a whole modal alert dialog to tell me this important fact. And if my search text is not found?

The specified text was not found.

Because I couldn't possibly tell if the text was not found without a giant.. alert.. dialog.

These are both unnecessary alert dialogs that, collectively, destroy the flow of my search task. It's the GUI equivalent of that annoying little kid from Jerry Maguire telling me that:

the human head weighs eight pounds   bees and dogs can smell fear   my next door neighbor has three rabbits

Meanwhile, I'm trying to get some freaking work done.
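There's no law that search status has to arrive as a modal interruption; it can come back as a plain value and land in a status bar. A minimal sketch of a dialog-free wraparound search (the function and messages are mine, not Notepad2's):

```python
def find_next(text: str, query: str, start: int) -> tuple[int, str]:
    """Find the next occurrence of query at or after start.
    Wraps around silently; the second element of the result is a
    status-bar message, never a modal dialog."""
    pos = text.find(query, start)
    if pos != -1:
        return pos, ""
    pos = text.find(query)  # wrap to the top of the document
    if pos != -1:
        return pos, "Search wrapped to the beginning"
    return -1, "Not found"

print(find_next("one two one", "one", 4))
```

The flow-preserving fix is that simple: same information, delivered without stopping the proceedings.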

Alan Cooper, in his book About Face 2.0, calls this stopping the proceedings with idiocy. And that's exactly what it is.

There is a particular form of excise that is so prevalent it deserves special attention. In Chapter 9, we introduced the concept of flow, where the user enters a highly productive mental state by working in harmony with his tools. Flow is a natural state, and people will enter it without much prodding. It takes some effort to break into flow after someone has achieved it. Interruptions like a ringing telephone will do it, as will an error message box. Most interruptions are avoidable; a few aren't. But interrupting a user's flow for no good reason is stopping the proceedings with idiocy and is one of the most disruptive forms of excise.

Poorly designed software will make assertions that no self-respecting individual would ever make. It states unequivocally, for example, that a file doesn't exist merely because it is too stupid to look for it in the right place, and then implicitly blames you for losing it. A program will cheerfully execute an impossible query that hangs up your system until you decide to reboot. Users view such software behavior as idiocy, and with just cause.

The canonical example of unnecessary dialogs, the delete confirmation dialog, lives on in Windows Vista:

Are you sure you want to move this file to the recycle bin?

The real irony here is that moving files to the recycle bin is a completely recoverable action. It doesn't matter what I click in this dialog. If I mistakenly "delete" files this way, I can simply recover them from the recycle bin. This alert dialog is utterly superfluous. It's just another button I have to click through to get my work done. Sure, I can disable it (and I have), but the fact that the delete confirmation dialog exists at all is a giant sock in the nose for usability professionals everywhere.
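The alternative to confirming a recoverable action is to skip the question and offer undo. A rough Python sketch of that pattern -- a toy trash can, not how Windows actually implements the recycle bin, and it ignores name collisions in the trash folder:

```python
import os
import shutil
import tempfile

class TrashCan:
    """Delete without confirmation; recover with undo."""

    def __init__(self):
        self.trash = tempfile.mkdtemp(prefix="trash-")
        self.history = []

    def delete(self, path):
        # No "Are you sure?" -- just move the file aside, recoverably.
        dest = os.path.join(self.trash, os.path.basename(path))
        shutil.move(path, dest)
        self.history.append((dest, path))

    def undo(self):
        # Put the most recently deleted file back where it was.
        dest, path = self.history.pop()
        shutil.move(dest, path)
```

Undo respects the user who meant it and rescues the user who didn't, all without a single interrupting dialog.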

Here's how strongly I feel about this: every time you send your users to an alert dialog, you have failed them. In a perfect world, we should never see a single alert dialog. Ever.

Heck, I'm not even sure the dialog box itself was ever a good idea. There's a lot of evidence that users never read dialogs and quickly train themselves to mindlessly click through them. Consider Mike's latest experience:

Over the weekend I was at a meeting of parents where many laptops were open and we were all looking at a spreadsheet. One parent opened the spreadsheet in Excel and was confronted by that big dialog box that warns you about macros.

It was classic. She just stopped and said "What am I supposed to do!? What does this mean!?" She did not read one single word of the dialog box text. Being me and all, I said "What does it say?" and more-or-less made her read the text, mostly on principle. Not that it probably helped much, because even when she'd read that "macros can be harmful" or whatever it says, she asked me "But it's OK to open this spreadsheet?" Yes, it was. Who knows what she would have done if she'd been on her own. I'm not sure she knows what a macro actually is.

The web has been a dialog-free world for years, and nobody seems to miss them very much. Maybe this is why. I dream of the day when we can produce a dialog-free GUI.

Discussion