Coding Horror

programming and human factors

This Is What Happens When You Let Developers Create UI

Deep down inside every software developer, there's a budding graphic designer waiting to get out. And if you let that happen, you're in trouble. Or at least your users will be, anyway:

wgetgui screenshot

Joseph Cooney calls this The Dialog:

A developer needed a screen for something, one or two text boxes and not much more, so they created "the dialog", maybe just to "try something out" and always with the intention of removing it before the product ships. They discovered they needed a few more parameters, so a couple more controls were added in a fairly haphazard fassion. "The dialog" exposes "the feature", something cool or quite useful. Admittedly "the feature" is more tailored towards power users, but it's still pretty cool. The developer thinks of new parameters that would make "the feature" even more powerful and so adds them to the dialog. Maybe a few other developers or power users see "the dialog" and also like "the feature". But why doesn't it expose this parameter? New controls are added. Pretty soon the technical team are so used to seeing "the dialog" the way it is that they become blind to its strange appearance. Ship time approaches and the product goes through more thorough testing, and "the dialog" is discovered, but it is too late to be heavily re-worked. Instead it is given a cursory spruce-up.

If you let your developers create your UI, hilarity ensues, as in this classic OK/Cancel strip. But when The FileMatrix is unleashed upon unsuspecting users, it's more like a horror movie. I still get chills. And like a bad horror movie franchise, the FileMatrix is still alive and kicking, folks.

Friends don't let friends produce Developer UI.

Part of being a good software developer is knowing your limits. Either copy something that's already well designed, or have the good sense to stick to coding and leave the graphic design to the experts.

Discussion

Discussions: Flat or Threaded?

Clay Shirky's classic articles on social software should be required reading for all software developers working on web applications. As near as I can tell, that's pretty much every developer these days.

But I somehow missed Joel Spolsky's related 2003 article on social software, Building Communities With Software.* It's an excellent, albeit somewhat long-winded, explanation of the way Joel runs his community forums. Although I recently accused Joel of jumping the shark, his scathing criticism of Usenet, Slashdot, and IRC is right on the money. All three are deeply flawed social software models, incapable of sustaining civilized discussion.

Joel advocates policies on his discussion boards that seem unworkable, even borderline anarchic:

  • No registration
  • No user moderation
  • No email notifications for new posts
  • No posted rules
  • No support for quoting or reply shortcuts
  • No unread post shortcuts
  • Arbitrary deletion of off-topic posts

Reads like a recipe for disaster, doesn't it? But with one minor exception**, I'm in complete agreement. When it comes to writing social software, Joel's curmudgeonly advice may very well be the right approach. Read the rest of Joel's post to understand why.

In particular, I share Joel's intense dislike of threaded conversations:

Q. OK, but can't you at least have branching? If someone gets off on a tangent, that should be its own branch which you can follow or go back to the main branch.

A. Branching is very logical to a programmer's mind but it doesn't correspond to the way conversations take place in the real world. Branched discussions are disjointed to follow and distracting. [..] Branching makes discussions get off track, and reading a thread that is branched is discombobulating and unnatural. Better to force people to start a new topic if they want to get off topic.

Threaded view: discussion-threaded.png
Flat view: discussion-flat.png

Two of the oldest and most popular discussion board packages on the web, phpBB and vBulletin, avoid threaded views. The phpBB developers won't add threading. vBulletin offers threaded views, but they're off by default-- and often disabled completely by administrators.

Personally, I have yet to find a threaded discussion format I like. Aside from the philosophical objections Joel raises, threaded discussions are painful to use. You're forced to click through to see the responses, and once you do, there's far too much pogo-ing up and down the discussion hierarchy. It's all so... unnecessary.

Flat discussion views have their limitations, too. But they're minor compared to the trainwreck that is threaded discussions. Until we can come up with a new discussion model that doesn't add a slew of new problems, let's take Joel's advice and stick with simple, flat discussion views.
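The usability difference falls straight out of the data model. Here's a minimal Python sketch of the two views (the Post structure and sample data are hypothetical, not any real forum's schema):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    id: int
    parent_id: Optional[int]  # None for a top-level post
    text: str

posts = [
    Post(1, None, "Original question"),
    Post(2, 1, "First reply"),
    Post(3, 1, "Second reply"),
    Post(4, 2, "Reply to the first reply"),
]

def render_flat(posts):
    # Flat view: one chronological list; every new reply lands at the bottom.
    return [p.text for p in posts]

def render_threaded(posts, parent_id=None, depth=0):
    # Threaded view: recursive descent through the reply tree. Readers
    # must pogo up and down this hierarchy to follow the conversation.
    out = []
    for p in posts:
        if p.parent_id == parent_id:
            out.append("  " * depth + p.text)
            out.extend(render_threaded(posts, p.id, depth + 1))
    return out
```

Notice that the flat renderer is a one-liner, while the threaded one reorders posts out of chronological sequence-- exactly the disjointedness Joel is complaining about.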

* Thanks to Phil for pointing this article out to me.

** Quoted snippets are helpful if used in moderation. Unlike Joel, I don't have total recall of the last five posts I just read; judicious use of a few contextual quotes helps me keep the rest of the conversation in my brain.


CPU vs. GPU

Intel's latest quad-core CPU, the Core 2 Extreme QX6700, consists of 582 million transistors. That's a lot. But it pales in comparison to the 680 million transistors of nVidia's latest video card, the 8800 GTX. Here's a small chart of transistor counts for recent CPUs and GPUs:

AMD Athlon 64 X2       CPU   154 M
Intel Core 2 Duo       CPU   291 M
Intel Pentium D 900    CPU   376 M
ATI X1950 XTX          GPU   384 M
Intel Core 2 Quad      CPU   582 M
NVIDIA 8800 GTX        GPU   680 M

ATI won't release a new video card until next year. But their current X1950 XTX isn't exactly chopped liver: 384 million transistors is more than any current dual-core CPU.

Of course, comparing GPUs to CPUs isn't an apples-to-apples comparison. The clock rates are lower, the architectures are radically different, and the problems they're trying to solve are almost completely unrelated. But GPUs now exceed the complexity of modern CPUs in terms of absolute transistor count. And like CPUs, they're becoming programmable-- it's possible to harness all that graphics power to do something other than graphics.

There's a nice overview on AnandTech which provides some background on this architectural sea change in video cards:

So far, the only types of programs that have effectively tapped GPU power-- other than the obvious applications and games requiring 3D rendering-- have also been video related: video decoders, encoders, video effect processors, and so forth. But there are many non-video tasks that are floating-point intensive, and these programs have been unable to harness the power of the GPU.

Meanwhile, the academic world has designed and utilized custom-built floating-point research hardware for years. These devices are known as stream processors. Stream processors are extremely powerful floating-point processors able to process whole blocks of data at once, whereas CPUs carry out only a handful of numerical operations at a time. We've seen CPUs implement some stream processing with instruction sets like SSE and 3DNow!, but these efforts pale in comparison to what custom hardware has been able to do.
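The contrast between a CPU carrying out a handful of operations at a time and a stream processor applying one kernel to a whole block of data can be sketched in a few lines of Python. This is a conceptual model only-- Python's map is still sequential, it just mirrors the programming style-- and the SAXPY kernel here (a*x + y) is simply a classic example, not anything from the article:

```python
def saxpy_scalar(a, x, y):
    # CPU-style: one explicit multiply-add per loop iteration.
    result = []
    for xi, yi in zip(x, y):
        result.append(a * xi + yi)
    return result

def saxpy_stream(a, x, y):
    # Stream-style: the same kernel is declared once and applied to the
    # whole block of data; on real stream hardware, every element would
    # run through its own execution lane in parallel.
    kernel = lambda xi, yi: a * xi + yi
    return list(map(kernel, x, y))
```

Both produce identical results; the difference is that the stream version separates the kernel from the data it sweeps over, which is what lets the hardware parallelize it.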

3D rendering is also a streaming task. Modern GPUs have evolved into stream processors, sharing much in common with the customized hardware of researchers. GPU designers have cut corners where they don't need certain functionality for 3D rendering, but they have ultimately developed extremely fast and flexible stream processors. Modern GPUs are just as fast as custom hardware, but due to economies of scale are many, many times cheaper than custom hardware.

Dedicated, task-specific hardware is orders of magnitude faster than what you can achieve with a general purpose CPU. If you need proof of this, just look at the chess benchmarks. IBM's Deep Blue was capable of evaluating 200 million chess moves per second in 1997. Ten years later, the fastest quad-core desktop system can only evaluate 8 million chess moves per second. Ten-year-old custom hardware is still 25 times faster than the best general purpose CPUs. Amazing.

The most high profile application for all this GPU power at the moment is Stanford's Folding@Home. There's no shortage of exciting PR on this topic:

The processing power of just 5,000 ATI processors is also enough to rival that of the existing 200,000 computers currently involved in the Folding@home project; and it is estimated that if a mere 10,000 computers were to each use an ATI processor to conduct folding research, that the Folding@home program would effectively perform faster than the fastest supercomputer in existence today, surpassing the 1 petaFLOP level.

Stanford recently introduced a high performance folding client which runs on ATI's X1800 and X1900 series video cards. TechReport tested the new high performance folding client and came away a little disappointed:

Over five days, our Radeon X1900 XTX crunched eight work units for a total of 2,640 points. During the same period, our single Opteron 180 core chewed its way through six smaller work units for a score of 899 -- just about one third the point production of the Radeon. However, had we been running the CPU client on both of our system's cores, the point output should have been closer to 1800, putting the Radeon ahead by less than 50%.

The GPU may be doing 20 to 40 times more work, but the scores are calibrated to a baseline system, not the absolute amount of work that's done. It's a little anticlimactic.

Stanford's advanced folding client exploits the Brook Language, an extension to ANSI C that allows them to compile C-like code that runs on the GPU. It leverages ATI's Stream API to communicate with the GPU. NVIDIA offers something similar to Brook in their CUDA technology:

GPU computing with CUDA technology is an innovative combination of computing features in next generation NVIDIA GPUs that are accessed through a standard C language. Where previous generation GPUs were based on "streaming shader programs", CUDA programmers use C to create programs called threads that are similar to multi-threading programs on traditional CPUs. In contrast to multi-core CPUs, where only a few threads execute at the same time, NVIDIA GPUs featuring CUDA technology process thousands of threads simultaneously enabling a higher capacity of information flow.

Of course, CUDA only works on the latest G80 series of cards, just like ATI's Stream technology is really only useful on their latest X1900 series. All this potential programmability is a very recent development.
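To make the thread-per-element model from that CUDA description concrete, here's a hedged sketch in plain Python. The function names are illustrative, not CUDA's actual API, and the "launch" loop runs sequentially-- a real GPU would run these thousands of threads simultaneously:

```python
def square_kernel(thread_id, x, out):
    # Each "thread" handles exactly one element, selected by its own id.
    # This is the CUDA-style decomposition: the kernel is written from
    # the perspective of a single thread.
    out[thread_id] = x[thread_id] * x[thread_id]

def launch(kernel, n_threads, x):
    # Simulated launch: one kernel invocation per thread id. On the GPU,
    # the hardware scheduler would run these concurrently across cores.
    out = [0] * n_threads
    for tid in range(n_threads):
        kernel(tid, x, out)
    return out
```

The key idea is that the programmer never writes the loop over the data-- only the per-element kernel-- which is what frees the hardware to process thousands of elements at once.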

I expect the relationship between CPU and GPU to largely be a symbiotic one: they're good at different things. But I also expect quite a few computing problems to make the jump from CPU to GPU in the next 5 years. The potential order-of-magnitude performance improvements are just too large to ignore.


Exploring Vista's Advanced Search

I used the file search function in Windows XP a lot, particularly to find groups of files. But the XP search syntax doesn't work in Vista; Vista uses the Windows Desktop Search query syntax, which means

   "*.vbproj;*.csproj"

becomes

   "ext:(*.vbproj OR *.csproj)"

Vista's search box in Windows Explorer

Note that the Boolean operator must be in all-caps to work. That was painful to figure out.
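When I need the same multi-extension filter in a script rather than in Explorer, a few lines of Python do the job. This is just a sketch-- find_projects is a made-up helper, not part of Windows Desktop Search:

```python
from pathlib import Path

def find_projects(root):
    # Match the same two extensions as the Explorer query
    # ext:(*.vbproj OR *.csproj), recursing under root.
    extensions = {".vbproj", ".csproj"}
    return sorted(p for p in Path(root).rglob("*")
                  if p.suffix.lower() in extensions)
```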

I highly recommend reading through the Windows Desktop Search advanced query reference. First of all, it's completely different than searching in XP, so you'll need to retrain your brain. But it's also a far richer search paradigm than we ever had in XP. And you can use the same CTRL+E search keyboard shortcut that works in your browser to harness its power in Windows Explorer.

When you perform a search, note that the Search Tools menu is available; that's our main interface for all the new search options.

vista-search-tools.png

From here, you can bring up the Search Pane, which lets you filter your searches to particular file types, and includes an expandable Advanced Search pane.

vista-search-pane.png

As you fill in values in the Advanced Search pane and click Search, the equivalent query terms will be populated in the CTRL+E search box. It's a good way to learn basic search syntax. Once you've learned the new Vista search syntax, you won't need the Search Pane training wheels any more; you can press CTRL+E and type in what you want. It's Google-icious.

There's also an important distinction between indexed search locations and non-indexed search locations. To see the difference, choose "Search Options" from the Search Tools menu.

vista-folder-search-options.png

Most notably, your search terms will only extend to file contents in indexed locations. I'm also very glad to see search now ignores compressed files by default. This was a real pain in XP, which insisted on digging through 600 megabyte ZIP files as a part of any search.

To view indexed locations, or add your own, select Modify Index Locations from the Search Tools menu. On a default Vista install, there are only three indexed locations:

  • Offline Files
  • c:\ProgramData\Microsoft\Windows\Start Menu
  • c:\Users

There is one big caveat here: the full-text indexer only indexes file extensions that it understands. To view or modify the list of file extensions the indexer understands, click the Advanced Options button on the Modify Index Locations dialog, then select the File Types tab.

Vista indexes, advanced options button, file types tab

Perhaps the coolest new search feature is that you can enter searches directly from the Windows start menu. Try it. Hit the Windows key and just start typing search queries. There's nothing to install, nothing to configure, searching just works in Vista. It's about time.


iPod Alternatives

I have a great deal of respect for Apple's iPod juggernaut. They've almost single-handedly legitimized the market for downloadable music. The kind you pay for. The kind that, at least in theory, supports the artists who produce the music instead of ripping them off.

That said, I have some problems with the iPod.

  1. The iPod is boring. How can I properly rage against the machine with the same standard, factory-issue music player that everyone else has? I don't want this to devolve into a knee-jerk rejection of all iThings, but let's be honest here: when every soccer mom carries an iPod, it's no longer a cool technical accessory. It's completely mainstream. I'd be lying if I said this didn't matter to me.

  2. The iPod has no support for subscription services. I'm a member of Yahoo Music Unlimited, which gives me unlimited access to a massive library of music for 6 bucks a month. I can stream any of this music to multiple PCs, or I can download it to my hard drive or mobile audio players. And it's in a very respectable 192kbps 2-pass CBR format, too. For that same six bucks a month, I could buy a whopping six tracks from the iTunes store. While I can certainly understand the desire to own music, why not give us a choice? Apple's insistence on purchase-only models is a huge mistake.

  3. The iPod does not support WMA. Although Jobs grudgingly made the iPod Windows compatible two years after its introduction, he still gets his jabs in. The conspicuous lack of WMA support is a not-so-subtle f*ck you to the Windows community. And what of OGG? Or FLAC? Clearly, the hardware is capable, but the political forces inside Apple won't allow it. You'd figure a company that had the guts to make a stunning, wholesale switch to x86 processors could deign to support a few alternative audio formats on their music players. But no.

  4. The iPod lacks features. I'll never understand why the iPod chooses to deliberately ignore FM radio and its rich history in the music industry. Heck, you might even want to record FM radio. That's just crazy talk! And the list goes on: there's no voice recording, no EQ settings, no gapless playback, etcetera.

  5. The iPod requires custom software to work. Every music player on the market should have this down to a science by now:

    • plug in the USB cable
    • drag and drop your music on the device
    • disconnect the cable and ROCK

    The iPod fails miserably on this count: it requires iTunes installed (or another custom application) to transfer any music to the device. You can't even use it as an external hard drive without setting up a separate, special partition on the device first. Of course, use iTunes if you want, but you shouldn't be forced to use iTunes because the hardware is a brick if you don't. How did Apple get this so very, very wrong?

Now, your goals may not be my goals. But when my wife wanted a new music player to replace her aging Rio Carbon (RIP-- a great little player for its time), these are the criteria I used to evaluate them.

Unfortunately, music devices that can be used seamlessly and interchangeably as a generic external USB hard drive and digital music player are quite rare. The sole exception, at least for hard-disk devices, is the Cowon X5L. The Cowon is a decent player, but it suffers from Soviet Russia-era design aesthetics. Due to lack of choices, I was forced to compromise on devices that support Microsoft's Media Transfer Protocol. When connected to a Windows XP or Windows Vista machine, MTP support allows you to drag and drop music directly on to the device-- without installing any software. It's not ideal, since it's tied to Microsoft, but it's the best I can do.

The Digital Audio Players Review website had the most helpful advice. Their top pick was the Creative Zen Vision:M. I agreed, so I went with the pink one. You know, for the ladies.

Creative Zen Vision:M

It's a great little device, and as promised, we just dragged and dropped our music on it-- which happens to be a mix of MP3 and WMA files. And it worked with our Yahoo Music Unlimited subscription as well.

To complement the 30 GB hard drive player, I also picked up a flash device-- the new, larger 4 GB iRiver Clix.

The 4 GB iRiver Clix

I've owned a few iRiver products in the past and they've always been excellent. dapreview gave the Clix high marks, and so has everyone else who has reviewed it. The feature set is great. It meets every one of my criteria, throws in video support, and even goes a little beyond with support for Flash Lite games.

I respect the way the pioneering iPod has collectively led the industry out of the dark Napster ages. And I like the iPod design. But until Apple at least supports subscription services and the WMA/FLAC/OGG file formats, I can't justify purchasing any iPod hardware.
