Coding Horror

programming and human factors

Compiler, It Hurts When I Do This

Here's a question that recently came up on an internal mailing list: how do I create an enum whose members are named after C# keywords?

I immediately knew the answer for VB.NET: you use brackets to delimit the word.

Public Enum test
    [Public]
    [Private]
End Enum

Sub Main()
    Dim e As test = test.Private
End Sub

A little internet searching revealed that such things are called escaped identifiers; the C# equivalent is the @ prefix.

public enum test
{
    @public,
    @private
}

static void Main()
{
    test e = test.@private;
}

They work the same, but they don't look the same. In C#, you have to type the escaped identifier every time you use the enum, and it even shows up with the @ prefix in IntelliSense. However, if you echo back the enum value, it prints as "private", not "@private", as expected.
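For comparison, Python has no escaped-identifier syntax at all: an enum member named after a keyword can only be created and accessed through string-based lookup, which makes much the same point. A small sketch (using "class", which is a Python keyword):

```python
from enum import Enum

# "class" is a Python keyword, so Test.class would be a syntax error;
# the member can only be reached by string-based lookup.
Test = Enum("Test", ["public", "class"])

print(Test["class"].name)      # index with the member name as a string
print(getattr(Test, "class"))  # or use getattr for attribute-style access
```

Either way, the language is making the awkward name awkward to use, which is arguably the right message.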

However, after spending 30 minutes researching the answer and playing with the results, I began to wonder if the real answer to this question should be another question: why do you need to do this? At some point it all becomes a little ridiculous. What's next-- an enum named "enum"? A variable named "variable"?

Stop me if you've heard this one before:

A man goes to a doctor's office. He says, "Doctor, it hurts when I raise my arm over my head."

The doctor replies, "Then don't raise your arm over your head."

If the compiler is telling you it hurts when you do something, maybe you should stop doing it. Just something to consider before merrily swimming your way upstream.

Discussion

Information Density and Dr. Bronner

Edward Tufte, in his new book, Beautiful Evidence, continues on his crusade for information density. Here's a representative recap of a Tufte seminar from 2001:

Tufte spent most of his talk walking around the room while talking on a wireless mike. He had two projectors set up, but for the most part he only displayed pages or pictures from his books, instructing the audience to follow along in their own copies (which had been provided to every attendee). He occasionally carried around some other props, in particular a few 400-year-old books from his personal library. This style not only entertained and engaged the audience, it also emphasized one of his main points, which is that progress is often measured in data density - how many bits per unit of area can be accommodated by a hard drive or a display.

In terms of text display, a page in a phone book can hold 36K of information, while the best display can only show about 5K. If you look at something like a topographical map, the resolution available on paper is a factor of ten, at least, beyond what can be shown on a screen.

Tufte feels that the same mantra about data density should be applied to web sites, and in fact to the entire contents of the computer display that the user sees when navigating a web site. Thus, he dislikes task bars, menu bars, status bars, and other GUI screen overhead, since they constrict how much of the display can be used for content. Once you get to the actual site, he has similar disdain for banner ads, navigation bars, graphical frills, and the like.

Tufte feels that the main measure of a web site (or any computer interface) should be the percentage of the screen that is actually devoted to the task at hand. He wants web pages to use words instead of icons, because [words] can display information more compactly. He does not like navigation bars, but instead wants as many choices as possible on the main page.

You'll find the same theme repeated in all of Tufte's books: progress is measured in information density.

Although I definitely understand the desire for maximizing content and minimizing UI clutter, I have a hard time squaring the desire for maximum information density with the current Web 2.0 drive for minimalist content.

These days, you rarely see screens packed densely with content and hundreds of links, but that's what Tufte seems to be asking for. We even make fun of the Yahoo home page because it has become so dense over time. Are we wrong, and Tufte right? Average display resolutions haven't increased much between 1996 and 2006; we went from 800x600 to 1280x1024 or thereabouts. And we have the RGB magic of ClearType, which increases effective horizontal resolution by roughly 3x.

Maybe the Yahoo home page design overreaches because it's now being designed as if it were a printed page. We have higher resolutions, sure, but computer displays are still nowhere near the resolution of a printed page. Perhaps the current trend of design minimalism is simply eliminating wishful thinking: mating the very low resolution of a computer screen (as compared to a printed page) with a corresponding reduction in content.

But Tufte isn't the only design guru to worship at the altar of information density. Jef Raskin, in The Humane Interface, talks about this at some length. He even references Tufte directly:

We seem to have a real fear of displaying data in our interfaces. We know that people can quickly find one among a few items much more quickly than they can find one among dozens: there is less to look through. But it does not follow, as some seem to think, that it is therefore better to have fewer items on each screen. If you have hundreds of items and split them up among dozens of screens, you lose more in navigation time than you gain in searching for the individual item, even if the one you seek is swimming in a sea of similar-looking items.

Visual designer Edward Tufte's first three principles for displaying information are:

  • Above all else, show the data.
  • Maximize the data-ink ratio.
  • Erase nondata ink.

All we need to do is substitute pixels for ink for his advice to apply to display-based devices. A serious, professional user wants screens packed with useful stuff. Screens should be well labeled, with methods to make finding things easier and dense with the information that represents the real value of each screen.

One of the most remarkable examples of information density, at least in a commercial product, is Dr. Bronner's soaps:

[Image: Dr. Bronner's peppermint soap label]

Click the image to see a larger version. You can also obtain PDF versions of the labels directly from the company website (scroll to the bottom).

I remember the first time I saw a Dr. Bronner product; the incredible density of the tiny text on the label drew me to it. Yes, they're filled with half-crazy religious ravings. Not so fun in person, but if someone is this jazzed about a bar of soap, it's somehow endearing. You can see a small video clip of Bronner ranting in person via the Dr. Bronner's Magic Soapbox documentary trailer.

You'd think a label filled with reams of tiny, indecipherable text would be the kiss of death for any commercial product. Not so for eccentric Dr. Bronner and his soaps. Is it a victory for information density? Maybe. I think Craigslist is conceptually pretty close to what Dr. Bronner was doing.

Discussion

What is "Modern Software Development"?

Joel Spolsky came up with a twelve-item checklist in August 2000 that provides a rough measure of – in his words – "how good a software team is":

  1. Do you use source control?
  2. Can you make a build in one step?
  3. Do you make daily builds?
  4. Do you have a bug database?
  5. Do you fix bugs before writing new code?
  6. Do you have an up-to-date schedule?
  7. Do you have a spec?
  8. Do programmers have quiet working conditions?
  9. Do you use the best tools money can buy?
  10. Do you have testers?
  11. Do new candidates write code during their interview?
  12. Do you do hallway usability testing?

Steve McConnell enumerated Software's Ten Essentials in 1997, ten things that every software project should have:

  1. A product specification
  2. A detailed user interface prototype
  3. A realistic schedule
  4. Explicit priorities
  5. Active risk management
  6. A quality assurance plan
  7. Detailed activity lists
  8. Software configuration management
  9. Software architecture
  10. An integration plan

These are great lists. But Spolsky's list is 6 years old; McConnell's is almost 10 years old! Does your software project meet all these criteria?

The lists are still highly relevant and definitely worth revisiting today. But I wonder if the field of software development has advanced far enough that we can take any of the items on this list for granted. I also wonder if any new practices have emerged in the last 6 years that aren't accounted for on either list.

So here's my question to you: what core set of practices constitutes modern software development in 2006?

Discussion

I Pity The Fool Who Doesn't Write Unit Tests

J. Timothy King has a nice piece on the twelve benefits of writing unit tests first. Unfortunately, he seriously undermines his message by ending with this:

However, if you are one of the [coders who won't give up code-first], one of those curmudgeon coders who would rather be right than to design good software… Well, you truly have my pity.

Extending your pity to anyone who doesn't agree with you isn't exactly the most effective way to get your message across.

[Image: Mr. T]

Consider Mr. T. He's been pitying fools since the early '80s, and the world is still awash in foolishness.

It's too bad, because the message is an important one. The general adoption of unit testing is one of the most fundamental advances in software development in the last 5 to 7 years.

How do you solve a software problem? How do they teach you to handle it in school? What's the first thing you do? You think about how to solve it. You ask, "What code will I write to generate a solution?" But that's backward. The first thing you should be doing -- In fact, this is what they say in school, too, though in my experience it's paid more lip-service than actual service -- The first thing you ask is not "What code will I write?" The first thing you ask is "How will I know that I've solved the problem?"

We're taught to assume we already know how to tell whether our solution works. It's a non-question. Like indecency, we'll know it when we see it. We believe we don't actually need to think, before we write our code, about what it needs to do. This belief is so deeply ingrained, it's difficult for most of us to change.

King presents a list of 12 specific ways adopting a test-first mentality has helped him write better code:

  1. Unit tests prove that your code actually works
  2. You get a low-level regression-test suite
  3. You can improve the design without breaking it
  4. It's more fun to code with them than without
  5. They demonstrate concrete progress
  6. Unit tests are a form of sample code
  7. It forces you to plan before you code
  8. It reduces the cost of bugs
  9. It's even better than code inspections
  10. It virtually eliminates coder's block
  11. Unit tests make better designs
  12. It's faster than writing code without tests
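Several items on that list boil down to one habit: state the expected behavior as an executable check before writing the implementation. A minimal sketch in Python's unittest style, using a hypothetical word_count function invented for illustration:

```python
import unittest

def word_count(text):
    """Count whitespace-separated words; written after the tests below."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    # Written first, these tests answer the question
    # "how will I know that I've solved the problem?"
    def test_simple_sentence(self):
        self.assertEqual(word_count("the quick brown fox"), 4)

    def test_empty_string(self):
        self.assertEqual(word_count(""), 0)

if __name__ == "__main__":
    unittest.main()
```

Nothing here is sophisticated; the point is only the ordering of the work, not the code itself.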

Even if you only agree with a quarter of the items on that list-- and I'd say at least half of them are true in my experience-- that is a huge step forward for software developers. You'll get no argument from me on the overall importance of unit tests. I've increasingly come to believe that unit tests are so important that they should be a first-class language construct.

However, I think the test-first dogmatists tend to be a little too religious for their own good. Asking developers to fundamentally change the way they approach writing software overnight is asking a lot. Particularly if those developers have yet to write their first unit test. I don't think any software development shop is ready for test-first development until they've adopted unit testing as a standard methodology on every software project they undertake. Excessive religious fervor could sour them on the entire concept of unit testing.

And that's a shame, because any tests are better than zero tests. And isn't unit testing just a barely more formal way of doing the ad-hoc testing we've been doing all along? I think Fowler said it best:

Whenever you are tempted to type something into a print statement or a debugger expression, write it as a test instead.
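Fowler's advice translates almost mechanically: the value you were about to print becomes the left-hand side of an assertion. A sketch, using a hypothetical parse_price helper:

```python
import unittest

def parse_price(s):
    """Turn a string like '$1,234.50' into a float."""
    return float(s.replace("$", "").replace(",", ""))

# The debugging impulse:
#   print(parse_price("$1,234.50"))   # eyeball the output, then delete it
#
# The same check, kept forever as a test:
class ParsePriceTest(unittest.TestCase):
    def test_dollar_amount(self):
        self.assertEqual(parse_price("$1,234.50"), 1234.50)

if __name__ == "__main__":
    unittest.main()
```

The print statement vanishes the moment you delete it; the assertion keeps paying dividends on every build.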

I encourage developers to see the value of unit testing; I urge them to get into the habit of writing structured tests alongside their code. That small change in mindset could eventually lead to bigger shifts like test-first development-- but you have to crawl before you can sprint.

Discussion