Coding Horror

programming and human factors

Code Tells You How, Comments Tell You Why

In an earlier post on the philosophy of code comments, I noted that the best kind of comments are the ones you don't need. Allow me to clarify that point. You should first strive to make your code as simple as possible to understand without relying on comments as a crutch. Only at the point where the code cannot be made easier to understand should you begin to add comments.

It helps to keep your audience in mind when you're writing code. The classic book Structure and Interpretation of Computer Programs, originally published in 1985, gets right to the point in the preface:

Programs must be written for people to read, and only incidentally for machines to execute.

Knuth covers similar ground in his classic 1984 essay on Literate Programming (pdf):

Let us change our traditional attitude to the construction of programs: Instead of imagining that our main task is to instruct a computer what to do, let us concentrate rather on explaining to human beings what we want a computer to do.

The practitioner of literate programming can be regarded as an essayist, whose main concern is with exposition and excellence of style. Such an author, with thesaurus in hand, chooses the names of variables carefully and explains what each variable means. He or she strives for a program that is comprehensible because its concepts have been introduced in an order that is best for human understanding, using a mixture of formal and informal methods that reinforce each other.

If you write your code to be consumed by other programmers first, and by the compiler second, you may find the need for additional comments to be greatly reduced. Here's an excellent example of using comments as a crutch:

This is a snippet of code from a well-funded, closed-source system that has been deployed in production for years.

float _x = abs(x - deviceInfo->position.x) / scale;
int directionCode;
if (0 < _x & x != deviceInfo->position.x) {
    if (0 > x - deviceInfo->position.x) {
        directionCode = 0x04 /*left*/;
    } else if (0 < x - deviceInfo->position.x) {
        directionCode = 0x02 /*right*/;
    }
}

This is equivalent to the following, more readable code, with a bugfix.

static const int DIRECTIONCODE_RIGHT = 0x02;
static const int DIRECTIONCODE_LEFT = 0x04;
static const int DIRECTIONCODE_NONE = 0x00;

int oldX = deviceInfo->position.x;
int directionCode = (x > oldX) ? DIRECTIONCODE_RIGHT
                  : (x < oldX) ? DIRECTIONCODE_LEFT
                  : DIRECTIONCODE_NONE;

Note that more comments do not mean more readable code. They didn't in this example. The comments in the snippet above – if you even noticed them – only clutter the code even more. Sometimes fewer comments make for more readable code, especially when they force you to use meaningful symbol names instead.

Although there are almost infinite opportunities to refactor and simplify code to obviate the need for comments, explaining yourself exclusively in code has its limits.

No matter how simple, concise, and clear your code may end up being, it's impossible for code to be completely self-documenting. Comments can never be replaced by code alone. Just ask Jef Raskin:

Code can't explain why the program is being written, and the rationale for choosing this or that method. Code cannot discuss the reasons certain alternative approaches were taken. For example:

/* A binary search turned out to be slower than the Boyer-Moore algorithm for the data sets of interest, thus we have used the more complex, but faster method even though this problem does not at first seem amenable to a string search technique. */

What is perfectly, transparently obvious to one developer may be utterly opaque to another developer who has no context. Consider this bit of commenting advice:

You may very well know that

$string = join('',reverse(split('',$string)));

reverses your string, but how hard is it to insert

# Reverse the string

into your Perl file?

Indeed. It's not hard at all. Code can only tell you how the program works; comments can tell you why it works. Try not to shortchange your fellow developers in either area.
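To make the distinction concrete, here's a minimal sketch that reuses the direction code constants from above. The firmware constraint described in the comment is hypothetical, invented purely for illustration; the point is that the code already says how the direction is chosen, while only the comment can say why the values are what they are.

#include <cstdlib>

static const int DIRECTIONCODE_RIGHT = 0x02;
static const int DIRECTIONCODE_LEFT = 0x04;
static const int DIRECTIONCODE_NONE = 0x00;

// Why 0x02 and 0x04 rather than a simple enum: in this (hypothetical)
// system the device firmware expects these exact bit flags on the wire,
// so renumbering them would silently break deployed devices.
int directionCode(int x, int oldX) {
    return (x > oldX) ? DIRECTIONCODE_RIGHT
         : (x < oldX) ? DIRECTIONCODE_LEFT
         : DIRECTIONCODE_NONE;
}

int main() {
    // The function body shows *how*; the comment above records *why*.
    return directionCode(10, 5) == DIRECTIONCODE_RIGHT ? EXIT_SUCCESS : EXIT_FAILURE;
}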


Hard Drive Temperatures: Be Afraid

I recently had a noisy fan failure in my ASUS Vento 3600 case. The fan that failed was the 80mm fan in the front panel, which is responsible for moving air past the hard drives mounted at the front of the case. I disconnected it while I considered my options. This case doesn't provide much airflow past the hard drives to begin with. I've actually had a hard drive failure in this system before, one I strongly suspect was caused by running with the front fan disconnected.

The two hard drives are mounted with rubber grommets to reduce conducted vibration noise, a standard feature of many new PC cases.

hard-drive-grommets.jpg

Avoiding direct metal-to-metal contact will always help quiet drives-- they are, after all, giant hunks of metal spinning at 7,200 or 10,000 RPM. But the lack of metal-to-metal contact also means the drives don't benefit from the significant auxiliary cooling effects of metal contact.

Of course, hard drives don't generate nearly as much heat as your CPU and video card do. They only consume around 10 or 12 watts under load, and around 7 watts at idle. But unlike your CPU, they're generating a lot of mechanical movement, which means friction-- and heat disproportionate to the power input. They still need some airflow to stay at a reasonable temperature.

I often read about users obsessing over their CPU or GPU temperatures, while ignoring their hard drive temperatures entirely. That's a shame, because the hard drive is the most temperature-sensitive device inside your computer. Most manufacturers rate CPUs up to 70C, and GPUs are commonly rated to 90C and beyond.

Hard drive manufacturers specify a surprisingly modest range of operating temperatures, from +5C to +55C as a rule, and occasionally up to +60C. This operating range is much lower than that of processors, video cards, or chipsets. Moreover, hard drive reliability depends heavily on operating temperature. One study of drive reliability found that increasing HDD temperature by 5C has the same effect on reliability as switching from a 10% to a 100% HDD workload, and that each one-degree drop in HDD temperature is equivalent to a 10% increase in HDD service life.

Hard drives are only rated to 55C in most cases. Although there's still a lot of ongoing discussion on what exactly a "safe" temperature is for a hard drive, the general consensus is that high temperatures are much more risky for the hard drive than any other component inside your computer.

When your CPU, video card, or motherboard fails, you buy a new one and replace it. Big deal. Life goes on. But when your hard drive fails, unless you have a rigorous backup regime, you just lost all your data. Failure of a hard drive tends to have catastrophic consequences for your data. That's why I'm always very careful with hard drive temperatures. When I disconnected the failing fan, I used the excellent DTemp hard-drive temperature monitoring utility to keep an eye on the temperatures.

dtemp-screenshot.png

Sure enough, with the front fan disconnected, both drives inched up to 46C within 15 minutes. And that was at idle. I can only imagine what the temperatures would look like under load, once the rest of the system heated up the inside of the case. I've already had one drive failure in this case with sustained temperatures around that level, so some kind of replacement airflow is essential. I used foam tape to mount an 80mm fan on the front of the drives, blowing across them and back towards the case. As I write this, they're down to 33C -- a whopping 13 degree drop.

Hard drive temperature is arguably the most important temperature to monitor in your computer. If you regularly see temperatures of 45C or higher on your drive, consider improving airflow in your case. If you don't, you've substantially increased your risk of hard drive failure or data loss.
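If you'd rather keep an eye on this programmatically, here's a minimal sketch of the same idea (my own illustration, not how DTemp works): it assumes a Unix-like system with smartmontools installed and a drive that reports SMART attribute 194, and it simply shells out to smartctl and checks the reading against the 45C threshold. The device path is a placeholder.

#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <string>

// Returns the drive temperature in Celsius, or -1 if it couldn't be read.
// Relies on smartctl -A output, where attribute 194 (Temperature_Celsius)
// carries the current temperature in its RAW_VALUE column (the 10th field).
int ReadDriveTemperature(const std::string& device) {
    std::string cmd = "smartctl -A " + device;
    FILE* pipe = popen(cmd.c_str(), "r");
    if (!pipe) return -1;

    int temperature = -1;
    char line[512];
    while (fgets(line, sizeof(line), pipe)) {
        if (strstr(line, "Temperature_Celsius")) {
            char* token = strtok(line, " \t\n");
            for (int field = 1; token && field < 10; ++field)
                token = strtok(nullptr, " \t\n");
            if (token) temperature = atoi(token);
            break;
        }
    }
    pclose(pipe);
    return temperature;
}

int main() {
    const int kWarnThreshold = 45;  // degrees C, per the advice above
    int temp = ReadDriveTemperature("/dev/sda");  // placeholder device path
    if (temp < 0)
        printf("Could not read drive temperature.\n");
    else if (temp >= kWarnThreshold)
        printf("Drive is at %dC -- improve your case airflow!\n", temp);
    else
        printf("Drive is at %dC -- fine.\n", temp);
    return 0;
}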


Next-Gen DVD: Are Those Additional Pixels Worth Your Money?

Next generation DVD formats promise a huge jump in resolution, from the 720 x 480 of standard DVD to the 1920 x 1080 of HD-DVD and Blu-Ray.

DVD resolution vs. Next-Gen DVD resolutions

Additional resolution is always welcome, of course. But it's not free. You'll have to purchase an HD-DVD or Blu-Ray player, and a television set capable of displaying the new high definition formats. Then you'll need to rent or buy movies in the new optical disc formats as they become available.

But are those additional pixels worth your money?

Decide for yourself. Consider this detailed comparison of Fellowship of the Ring in both formats. Mouse over each image to see before and after; click the images to see the full-size versions.

Here's a closeup of one particular section to illustrate the difference in fine detail:

HD-DVD Fellowship of the Ring detail / DVD Fellowship of the Ring detail

There is more detail, but I'm left wondering how much of that detail I would notice on a 42" display ten feet in front of my couch. Edward Tufte's arguments in favor of information density are for the printed page, which you read less than a foot from your eyes. I don't think the same rules apply to video ten feet or more away. And how, exactly, do we explain the runaway success of YouTube, where video quality is never less than appalling? Clearly, the rules are different for images in motion.
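Here's a rough back-of-the-envelope sketch of the viewing distance question (my own illustration, not from any source): it uses the 42" screen and ten-foot couch distance mentioned above, and the common rule of thumb that 20/20 vision resolves about one arcminute of detail.

#include <cmath>
#include <cstdio>

// How large is a single 1080p pixel, in arcminutes, on a 42" 16:9 screen
// viewed from ten feet away? If it's well under one arcminute, individual
// pixels are below the usual 20/20 acuity threshold at that distance.
int main() {
    const double kPi            = 3.14159265358979323846;
    const double diagonalInches = 42.0;
    const double viewingInches  = 120.0;  // ten feet
    const double screenWidth    = diagonalInches * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    const double pixelPitch     = screenWidth / 1920.0;  // inches per pixel

    // Angle subtended by one pixel, converted from radians to arcminutes.
    const double arcmin = std::atan2(pixelPitch, viewingInches) * 180.0 / kPi * 60.0;

    printf("One 1080p pixel spans about %.2f arcminutes at %.0f inches;\n"
           "20/20 vision resolves roughly 1 arcminute.\n", arcmin, viewingInches);
    return 0;
}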

As screens grow in size, you do need more detail. I doubt a DVD would look very good projected on a movie screen in a typical movie theater. But I also think people tend to overestimate how much detail is needed for an image of a given size; witness David Pogue's street experiment:

On the show, we did a test. We blew up a photograph to 16 x 24 inches at a professional photo lab. One print had 13-megapixel resolution; one had 8; the third had 5. Same exact photo, down-rezzed twice, all three printed at the same poster size. I wanted to hang them all on a wall in Times Square and challenge passersby to see if they could tell the difference.

Even the technician at the photo lab told me that I was crazy, that there'd be a huge difference between 5 megapixels and 13.

I'm prepared to give away the punch line of this segment, because hey -- the show doesn't air till February, and you'll have forgotten all about what you read here today, right?

Anyway, we ran the test for about 45 minutes. Dozens of people stopped to take the test; a little crowd gathered. About 95 percent of the volunteers gave up, announcing that there was no possible way to tell the difference, even when mashing their faces right up against the prints. A handful of them attempted guesses -- but were wrong. Only one person correctly ranked the prints in megapixel order, although (a) she was a photography professor, and (b) I believe she just got lucky.

I'm telling you, there was NO DIFFERENCE.

So, by the same logic, are high definition DVD formats a waste of money?

Perhaps. But I've noticed that typical DVD playback pales in comparison to high definition video clips, even when downscaled to the 800x480 of my 42" EDTV plasma. It's a form of supersampling anti-aliasing; higher resolution sources scale better to larger and smaller screens. I don't think you necessarily need a giant screen, or even a particularly high resolution screen, to benefit from the additional source resolution.
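Here's a minimal sketch of what I mean by the supersampling effect (my own illustration, with made-up dimensions): when a high resolution frame is downscaled, each output pixel becomes the average of several source pixels, which smooths edges and noise in a way a native low resolution source never gets.

#include <cstdint>
#include <vector>

// Downscale a grayscale frame by an integer factor, averaging each
// factor x factor block of source pixels into one output pixel.
// Averaging many source samples per output pixel is the same idea as
// supersampling anti-aliasing, which is why a high resolution source
// tends to look smoother than a native low resolution source.
std::vector<uint8_t> Downscale(const std::vector<uint8_t>& src,
                               int width, int height, int factor) {
    int outW = width / factor, outH = height / factor;
    std::vector<uint8_t> dst(outW * outH);
    for (int y = 0; y < outH; ++y) {
        for (int x = 0; x < outW; ++x) {
            int sum = 0;
            for (int dy = 0; dy < factor; ++dy)
                for (int dx = 0; dx < factor; ++dx)
                    sum += src[(y * factor + dy) * width + (x * factor + dx)];
            dst[y * outW + x] = static_cast<uint8_t>(sum / (factor * factor));
        }
    }
    return dst;
}

int main() {
    // e.g. a 1920 x 1080 source halved to 960 x 540: each output pixel
    // blends four source pixels instead of sampling just one.
    std::vector<uint8_t> frame(1920 * 1080, 128);
    std::vector<uint8_t> small = Downscale(frame, 1920, 1080, 2);
    return small.size() == 960 * 540 ? 0 : 1;
}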

Next-gen DVDs offer six times the resolution of standard DVDs: 1920 x 1080 is 2,073,600 pixels per frame, versus 345,600 pixels for 720 x 480. Despite the math, the practical difference between DVD resolution and next-gen DVD resolution is highly subjective. And it's arguably much less significant than the giant jump we took over the last 10 years to get from standard television resolution to DVD resolution. Have we already reached the point of diminishing returns? Is anyone actually arguing that DVDs don't look good enough? And is the market willing to pay more for higher resolutions?

To make matters worse, the road to next-gen DVD is currently fraught with risks: technological copy-protection pitfalls, additional costs, and an ongoing format war. PCFormat UK writes that both formats reek of exploitation, and offers an apples to apples comparison of DVD, HD-DVD and Blu-Ray stills from the movie Training Day. Judge for yourself.

Although I'll always be a fan of increased resolution, I'm not sure these additional pixels have earned my money yet.


High-Definition Video on the PC

Now that HD-DVD and Blu-Ray are starting to enter the market in earnest, I thought I'd revisit HD video playback on the PC. I'm seriously considering buying one of the $199 Xbox 360 HD-DVD kits and retrofitting it into my Home Theater PC.

I'll start by comparing three clips from the Microsoft Windows Media High Definition Showcase page. It's a tad out of date now – most of these sample high-definition clips were mastered in late 2003 or early 2004, long before HD-DVD or Blu-Ray. But the bitrates and sizes are still representative.

Most of the clips had similar performance characteristics, so I chose one clip from each category.

              T2 Promo (720p)       T2 Promo (1080p)      Step Into Liquid Promo (1080p)
Resolution    1280 x 720            1440 x 1080           1440 x 1080
Bitrate       6.93 Mbps             8.44 Mbps             8.44 Mbps
Audio Codec   WMA 9 Pro,            WMA 9 Pro,            WMA 9 Pro,
              5.1 channel,          5.1 channel,          2 channel,
              384 Kbps              384 Kbps              384 Kbps
Video Codec   WMV 9                 WMV 9                 WMV 9

Note that I included the Step Into Liquid promo because it's the only clip on the WMV HD Showcase page that requires an abnormally large amount of CPU power to decode. I'm still not sure exactly why. The resolution is the same, and the bitrate looks comparable. You'll also note that Microsoft has an odd definition of 1080p. The official television resolutions break down as follows:

         Resolution                 Pixels per refresh
480i     704 x 480 interlaced          168,960
480p     704 x 480 progressive         337,920
720p     1280 x 720 progressive        921,600
1080i    1920 x 1080 interlaced      1,036,800
1080p    1920 x 1080 progressive     2,073,600

The official definition of 1080p is 1920 x 1080, so I'm not sure what Microsoft was thinking there. Interlaced means the vertical resolution is effectively halved over time: each refresh draws only the odd or the even lines, alternating between the two every cycle.

Interlaced video modes are generally considered inferior to progressive modes; interlacing causes enough problems that it should be avoided whenever a progressive mode is available.
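To put rough numbers on the table above, here's a back-of-the-envelope sketch (my own illustration, not from any source). It assumes 30 refreshes per second and uncompressed 24-bit color, which is not how real codecs store video, but it shows why 1080-class material is so much more demanding to push around than 480-class material.

#include <cstdio>

// Back-of-the-envelope pixel rates for the standard video modes.
// Interlaced modes draw only half the lines per refresh, which is why
// 1080i carries half the pixels per refresh of 1080p.
int main() {
    struct Mode { const char* name; int w, h; bool interlaced; };
    const Mode modes[] = {
        { "480i",  704,  480,  true  },
        { "480p",  704,  480,  false },
        { "720p",  1280, 720,  false },
        { "1080i", 1920, 1080, true  },
        { "1080p", 1920, 1080, false },
    };

    const double fps = 30.0;          // assumed refresh rate
    const double bytesPerPixel = 3.0; // assumed uncompressed 24-bit color

    for (const Mode& m : modes) {
        long pixels = (long)m.w * m.h / (m.interlaced ? 2 : 1);
        double mbPerSec = pixels * bytesPerPixel * fps / (1024.0 * 1024.0);
        printf("%-6s %9ld pixels per refresh, ~%6.1f MB/s uncompressed\n",
               m.name, pixels, mbPerSec);
    }
    return 0;
}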

I compared CPU usage in Task Manager during full-screen playback of each clip in Windows Media Player 11 on a few systems I have around the house. Note that DirectX Video Acceleration was enabled in each case.

                                 Pentium-M 1.75 GHz             Athlon 64 1.8 GHz        Core Duo 2.0 GHz
T2 Promo (720p)                  75% CPU, perfect               50% CPU, perfect         25% CPU, perfect
T2 Promo (1080p)                 85% CPU, perfect               75% CPU, perfect         33% CPU, perfect
Step Into Liquid Promo (1080p)   100% CPU, unwatchably choppy   95% CPU, very choppy     40% CPU, perfect

I have a sneaking suspicion the reason Microsoft redefined "1080p" as 1440 x 1080 had to do with performance. I doubt any PC system could play a true 1080p clip at the time these clips were mastered. It clearly takes a lot of CPU horsepower to render high definition video. Driving 1920 x 1080 requires a lot of grunt – both in terms of pixel-pushing video bandwidth, and in terms of the CPU power needed to decode the advanced compression that keeps file sizes down at these massive resolutions. Dual core handles this especially well, although it appears to do so by brute force load sharing rather than through any special multiprocessor optimizations in the decoder. The mainstream PC is only now catching up to the performance required to watch high definition video.

All the video samples I cited are in Windows Media format. Windows Media, by all accounts, offers a very good next generation encoder, but it isn't the only encoder on the block. Both Blu-Ray and HD-DVD allow three different video codecs: the venerable MPEG-2, VC-1 (based on Windows Media Video 9), and H.264 (MPEG-4 AVC).

Woe to the poor consumer who buys an HD-DVD or Blu-Ray disc encoded with the ancient MPEG-2 codec. It'll look awful and take up a lot more room than it should. H.264 and VC-1, however, are truly next generation codecs. They look better in less space, but they also require a lot more CPU power to decode. At least we have a few legitimate uses left for the ridiculous amounts of CPU power we've inherited over the years.

If you own a relatively new video card, it is possible to offload some of the video decoding chores from your CPU to your video card's GPU. But the configuration, drivers, and software necessary to achieve this acceleration are daunting. Anandtech found that, when properly configured, the latest video cards can significantly reduce the H.264 decoding burden on the CPU: ATI reduced it by half, and Nvidia by a fifth. But good luck getting everything set up in Windows XP. Here's hoping Windows Vista and DirectX Video Acceleration 2.0 enable hardware accelerated video decoding out of the box.
