Coding Horror

programming and human factors

The Sad State of Digital Software Distribution

In this era of pervasive broadband, I'm constantly surprised how often I am forced to buy a physical CD or DVD to obtain the software I want. Physical distribution methods have their place, but they should be on the decline by now. Software is best distributed digitally through our high-speed internet connections-- using BitTorrent if necessary.

Instead, I find that download options for commercial software are quite rare. Even when the download option is available, you end up paying the same price as retail or even more. Here's a typical example. I purchased Titan Quest: Gold from Steam about a month ago. I paid $29.95, which is the standard retail box price. But online discounters sell boxed copies of the very same game for $22.90.

Titan Quest Gold pricing comparison: downloadable version $29.95, retail version $22.90

Selling directly to the consumer via download means bypassing the entire brick-and-mortar sales chain. This should result in cheaper prices than retail, not the same prices-- and it should never result in higher prices. Paying a premium for the privilege of downloading software is a complete ripoff, and yet it happens all the time.

In this case, Valve is the distributor, so they're getting a healthy cut of the sale price (rumor says 50%). That's still a fantastic deal compared to retail software sales, where the authors will be lucky to get 10% of the sale price. But this "download is the same cost as retail" pricing strategy is particularly egregious when you buy the software directly from the company that created it. That's pure profit, as Greg Costikyan points out:

If you can retain the right to sell [your software] off your own site, do, obviously. Even if your traffic is low, you keep 90% of the revenues, and that's gravy.

Microsoft does allow us to purchase and download upgrade versions of Vista digitally. But as usual, you'll be paying full retail price for the privilege. The downloadable Vista Ultimate upgrade is $259.95, but you can purchase the same product in a retail box for $249.99.

Vista Ultimate upgrade pricing comparison: downloadable version $259.95, retail version $249.99

I don't mean to single out Microsoft here. At least they provide the download option for Vista (but, oddly, not for Office, their other cash cow). I've also purchased games directly from EA using their EA Link download service, and you always pay full retail price there, too. Sadly, paying full retail price to download software is a standard practice in the software industry. Oh sure, sometimes they'll throw in some cheesy extras like downloadable soundtracks and so forth-- but does that really make up for the fact that you just increased their profit margin on the sale by a factor of five (a 50% cut through a download portal versus the 10% they'd see from a retail sale)? I don't think so. About the only "benefit" of buying game software digitally is that they'll (sometimes) let you unlock it at midnight on the street date, so you get a few bonus hours of play before everyone else.

I can understand the desire not to undercut their own distribution channel. I'm sure Best Buy wouldn't be too happy with Microsoft or EA selling software directly to consumers for less than they can on their store shelves. But do vendors assume we are completely ignorant of basic retail economics? Digital software distribution should cost less:

  1. When vendors sell direct, it's insanely profitable (90% profit).
  2. When selling through a third-party portal, it's still extremely profitable, far more than retail sales (50% profit).
  3. It's more efficient. There are no trucks full of boxes, manuals, jewel cases, and other atoms to be distributed across the world. Distribution costs effectively drop to zero.
  4. It's more work for consumers, so a discount is the least vendors could offer. There are a bunch of additional hoops you don't have with physical media: DRM wrappers, helper software to install, and a long download period. It shouldn't be like this. Standard Vista-style online activation from an ISO image should be all that's required. But you typically get hogtied into vendor-specific downloaders and wrappers that have to be installed on your machine, such as Steam and the EA downloader.

Unfortunately, the state of digital software distribution is so bad right now that it's almost a parody of itself. It should be a wondrous, democratizing tool that pushes software pricing down by naturally leveraging the inherent efficiency of bits over atoms. Instead, as it exists today, the digital distribution of commercial software is intentionally crippled. It's only useful for the rich and impatient, a fact vendors exploit to line their pockets with obscene profit margins (even by software industry standards, which is saying a lot). The average consumer avoids digital software distribution entirely in favor of retail discounters. Can you blame them? With every download at retail prices, you're effectively paying vendors five times as much for the same software, and that's a huge ripoff.

It seems to me that, in the area of digital distribution efficiencies, commercial software still has a lot to learn from the open source world-- where everything is downloadable by design. I hope commercial vendors can adapt before they're forced into extinction.


Is It Time for 64-bit on the Desktop?

I've been wary of 64-bit on the desktop, as the benefits are usually outweighed by the compatibility problems. I agree that 64-bit operating systems are inevitable in the big scheme of things, but I've struggled to see the relevance of 64-bit for typical desktop and laptop users. It's a novelty, albeit a necessary one for particular niche applications. However, I'm now beginning to think we could see a fairly broad switch to 64-bit desktop operating systems over the next few years-- much sooner than I anticipated.

Why?

  1. 64-bit versions of popular consumer desktop operating systems are commonly available. Both Vista and OS X 10.5 fully support 64-bit apps out of the box, although evidently the OS X kernel is still 32-bit.

  2. Memory is cheap. Dirt cheap. As of this writing, you can buy 4 gigabytes of quality DDR2 memory for around $120. The memory industry has a nasty habit of switching to newer, faster, more expensive memory types over time, but it looks like this plateau might be here to stay. 4 GB of memory is no longer a rare extravagance for rich users; it's becoming commonplace, even mundane.

  3. The 32-bit x86 architecture doesn't scale very well beyond 2 gigabytes. If you install 4 gigabytes of memory, you may find yourself wondering -- Dude, Where's My 4 Gigabytes of RAM? The missing gigabyte isn't gone: devices like video cards claim large chunks of the 4 GB address space for memory-mapped I/O, and the RAM they displace becomes unaddressable (there's a back-of-the-envelope sketch after this list). Good luck explaining to the average user why their computer says they only have 3 GB of memory, even though they paid for 4. It's a tough sell. And honestly, who has time to listen to a bunch of arcane technical explanations for this bizarre limitation? People just want full use of the memory they paid for.

  4. Modern video cards do not play well with 32-bit memory limits. Newer operating systems emphasize the importance of good, discrete video hardware. To get the full suite of cool desktop effects, through Aero, Beryl, or Core Image, you need a decent midrange video card. I'd say the average amount of memory on a midrange video card today is 256 megabytes, and in the enthusiast class it's closer to 512 megabytes. I can easily see that doubling over the next two years. That's a massive chunk of the 32-bit address space carved out for required hardware. And if you're a hardcore gamer or multiple monitor enthusiast with more than one video card, it's worse. Much worse.
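
Here's that back-of-the-envelope sketch, in C. The carve-out sizes are illustrative assumptions (a 512 MB video card plus another 512 MB of BIOS, PCI, and other memory-mapped I/O), not measurements from any particular machine:

    #include <stdio.h>

    int main(void)
    {
        /* A 32-bit processor can address 2^32 bytes = 4096 MB, total. */
        unsigned long long address_space_mb = 4096;

        /* Devices must be mapped somewhere in that same 4 GB window.
           These figures are assumptions, not exact values. */
        unsigned long long video_card_mb = 512;
        unsigned long long other_mmio_mb = 512;   /* BIOS, PCI, etc. */

        /* RAM displaced by device mappings is simply unaddressable. */
        unsigned long long visible_mb =
            address_space_mb - video_card_mb - other_mmio_mb;

        printf("Installed RAM:   %llu MB\n", address_space_mb);
        printf("Addressable RAM: %llu MB (~%.1f GB)\n",
               visible_mb, visible_mb / 1024.0);
        return 0;
    }

Run it and you get 3072 MB-- the mysterious "3 GB" the average user sees.
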

The switch to 64-bit is interesting because there's a certain air of finality to it. It may be the last bit transition in our lifetimes.

8-bit    2^8     256 bytes
16-bit   2^16    64 KB
32-bit   2^32    4 GB
64-bit   2^64    16 EB

Sure, nobody will ever need more than 640 kilobytes of memory, but this is a whole new ballgame. To put the size of the 64-bit memory address space in context, here's a chart showing the respective sizes of each. Note that the scale is logarithmic.

Graph of 8,16,32,64 bit memory spaces on a logarithmic scale

The transition from 16 to 32 bit increased our address space by a factor of 65 thousand. That's big. We've been in the 32-bit era since about 1992; that address space has been good for about fifteen years, give or take a few. The transition from 32 to 64 bit, whenever we finally make it, will increase our address space by a factor of four billion. Will there be a transition to 128-bit machines and operating systems? Absolutely. But I'm not sure it'll happen while we're still alive.

You certainly won't be upgrading to 64-bit applications for better performance. Or at least you shouldn't be, unless you enjoy disappointment. 64-bit offers compelling performance benefits on servers, but on desktops, it's a bit of a wash. On one hand, the x86 architecture simply works better in 64-bit mode:

The x86 instruction set was created in the 16-bit era and has accumulated quite a bit of cruft going from 16-bit to 32-bit. Some of that cruft was wisely abandoned during the transition from 32-bit to 64-bit. Applications compiled for x86_64 don't just get larger registers, they get more registers, plus a more modern calling convention and more addressing modes. Every 32-bit x86 application can benefit from these changes, it's just a question of how significant that benefit will be.

On the other hand, stuff is just plain larger in 64-bit land-- your pointers now take up twice as much room, and pointer-heavy data structures balloon along with them. That 2 megabytes of cache on your CPU won't be able to fit as many things in as it used to.
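
To make that concrete, here's a minimal C sketch. It assumes gcc's -m32 and -m64 flags and typical x86 data models; exact sizes vary by compiler and platform:

    #include <stdio.h>

    /* A doubly-linked list node: two pointers and a payload. */
    struct node {
        struct node *next;   /* 4 bytes on 32-bit, 8 bytes on 64-bit */
        struct node *prev;
        int value;           /* 4 bytes either way */
    };

    int main(void)
    {
        /* Typical output: 4 and 12 when built with -m32,
           8 and 24 with -m64 (alignment pads the struct out). */
        printf("pointer: %zu bytes\n", sizeof(void *));
        printf("node:    %zu bytes\n", sizeof(struct node));
        return 0;
    }

Twice the bytes per node means half as many nodes fit in that same CPU cache.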

Once you factor in the pros and cons, you end up with a 64-bit machine that runs desktop applications a few percentage points faster than the 32-bit machine it replaced. There are some exceptions, of course-- most notably games and audio/video editing-- but on average, performance remains roughly the same for typical desktop applications. It's hard to find a definitive set of benchmarks that tell the entire 64-bit versus 32-bit performance story, but all the ones I've seen show rough parity.

I recently upgraded both my work and home machines to 4 GB of memory. Based on the positive Vista x64 experiences related by coworkers and Scott Hanselman, I took the plunge and upgraded to Vista x64. It was the only way to use anything close to the full 4 GB of memory. I resisted mightily, because I expected 64-bit driver and software problems, but much to my surprise, I've had none. Zero. Zilch. It's been unbelievably smooth. Perhaps it's because I waited a good six months after the initial release of Vista to move to x64, but everything "just works". All my hardware has 64-bit drivers. Many of my applications even come in x64 flavors, and the ones that don't still work flawlessly. I didn't change any of the hardware other than adding memory, but I'd swear my system is more responsive under x64 in daily use. And I no longer run into certain aggravating 32-bit operating system limits.

Of course, my original advice regarding 64-bit operating systems hasn't changed. Unless you have more than 2 GB of memory, there's no reason to bother with 64-bit. But have you priced memory recently? Now that 4 GB configurations are approaching mainstream, it's encouraging to know that 64-bit operating systems are out there, and that they work with a minimum of fuss. It's certainly taken long enough to tackle this problem. Hopefully we can stay with 64-bit for the foreseeable future, and leave that pesky 128-bit problem for our kids to deal with.


Making Donations Easy

In my continuing quest for a decently full-featured graphics editor that hasn't succumbed to feature bloat, I recently installed Paint.NET for the first time. I'll admit that I had low expectations based on the abysmal user interfaces I've experienced in other open source projects. Imagine my surprise when Paint.NET turned out to be... well, incredibly freaking great. Not only is the UI actually friendly, modern, and easy to use, but the whole thing is so polished: the installer, the website, the tutorials and forums. It's the complete package.

But enough of my gushing about how great Paint.NET is. Last year, I declared December 1st "Support Your Favorite Small Software Vendor" day.

Check your hard drive, and I'm sure you, too, will find some bit of software written by a small software development shop, maybe even a single developer. Something you find incredibly useful. Something you rely on every day. Something you recommend without reservation to friends and peers. Something that makes using the computer that much more enjoyable. Or at least less painful.

Stop reading this post right now and buy that software. If it's not commercial software, don't let that stop you. Share the love by sending money to the person/shop/organization that created it.

This month it's MediaMonkey. Next month it might be ClipX, or Beyond Compare, or RegexBuddy, or TimeSnapper. It's time to stop floating by on the "free" version and give something back. If I can't come up with the scratch to spend a measly $20 a month supporting the very best work of my fellow independent software developers, can I really call myself a professional software developer? Can you?

As a Windows user, I work extra hard to avoid reinforcing the negative stereotype that Windows users never pay for independent software. I believe in the little guy writing cool Windows software. And by "believe in", I mean "pay". And so should you. Whatever operating system you choose to run, try to support the little guys writing the apps you use. We owe it to them. And, more importantly, we owe it to ourselves.

I've set a goal for myself, and I intend to stick to that goal. Whenever I encounter truly excellent software, I vote with my wallet. I pay them. Paint.NET is an open source project, though, and it can sometimes be difficult to figure out how to vote with your wallet when there's nothing to buy, and nobody to pay.

But look how easy the Paint.NET project has made it for me. The install dialog provides a gentle, unobtrusive link for me to "show my appreciation and support future development". That's exactly what I want to do.

Paint.NET install donation dialog

The donation page is similarly helpful, providing one-click PayPal donation buttons for common currency types-- along with the snail mail address if you're old school.

Paint.NET PayPal donation links

This is yet another way Paint.NET demonstrates that it is a thoroughly professional open source project. It raises the quality bar, particularly in the .NET ecosystem, where open source is often a second-class citizen.

Life is easier for commercial projects-- they have to ask you for money. But open source projects don't-- so they often have no provision for payment of any kind. That is a mistake. If I want to vote with my wallet, make it easy for me to give you my money. Set up a clearly marked donation page, and pre-populate it with brainlessly simple, one-click methods to donate. If you don't want my money, that's fine too. Just tell me what charity I can donate to on behalf of your project.

I think it's hugely important to ask for donations on any non-commercial project. Not everyone can contribute time and effort. Help us help your project. Let us vote with our wallets.

(Speaking of contributions, yes, I am still planning to donate $10,000 to open-source projects in the .NET ecosystem. The money is set aside and earmarked. I'm sorry it has taken so long to set up, but I promise that it will happen by the end of the year.)


Who Wrote This Crap?

Does this sound familiar?

your program (n): a maze of non-sequiturs littered with clever-clever tricks and irrelevant comments. Compare MY PROGRAM.

my program (n): a gem of algorithmic precision, offering the most sublime balance between compact, efficient coding on the one hand, and fully commented legibility for posterity on the other. Compare YOUR PROGRAM.

I first read this in the original 1993 edition of Code Complete. It's quoted from a much earlier book, Stan Kelly-Bootle's The Devil's DP Dictionary, which was published in 1981. It's still true, more than 25 years later. There's a knee-jerk predisposition to look at code you didn't write and, for various reasons large and small, proclaim it absolute crap. But figuring out who's actually responsible for that crappy code takes some detective work.

The upcoming Visual Studio 2008, or at least the Team System flavor of it, finally delivers a feature I've wanted for years: it can display the code side-by-side with the person who last changed that code.

Visual Studio IDE with source control annotations (blame)

The last person to change any particular line is identified right there, next to the lines they changed, along with the date and revision number. Hovering over the revision number reveals a tooltip containing any checkin comments associated with that change. Clicking on the revision number brings up the full details dialog for that checkin.

Although I have mixed feelings about source control integration with the IDE, I think this is a fairly compelling argument in favor of it. Sometimes you just want to know who wrote this crap, and having that information directly next to the code in your editor saves many tedious steps of manually tracking down the owner of those particular lines.

This feature is called "annotate" in Team System source control, but it's called "blame" in Subversion and in Vault. So if you're wondering who to blame, now you know. It's all those other developers. Obviously.
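
Outside the IDE, the same information is available from the command line. In Subversion, for instance, this prints every line of the file prefixed with the revision and author that last changed it (the file name here is just a placeholder):

    svn blame Program.cs
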


Don't Click Here: The Art of Hyperlinking

I've often thought there is a subtle art to the humble hyperlink, that stalwart building block of hypertext, the stuff that Ted Nelson's Xanadu dream was made of.

The word hypertext was coined by Nelson and published in a paper delivered to a national conference of the Association for Computing Machinery in 1965. Adding to his design for a nonsequential writing tool, Nelson proposed a feature called "zippered lists," in which elements in one text would be linked to related or identical elements in other texts. Nelson's two interests, screen editing and nonsequential writing, were merging. With zippered lists, links could be made between large sections, small sections, whole pages, or single paragraphs. The writer and reader could manufacture a unique document by following a set of links between discrete documents that were "zipped" together.

Many precedents for the idea of hypertext existed in literature and science. The Talmud, for instance, is a sort of hypertext, with blocks of commentary arranged in concentric rectangles around the page. So are scholarly footnotes, with their numbered links between the main body of the text and supplementary scholarship.

In July 1945, long before Nelson turned his attention to electronic information systems, Vannevar Bush published an essay titled "As We May Think" in The Atlantic Monthly, which described a hypothetical system of information storage and retrieval called "memex." Memex would allow readers to create personal indexes to documents, and to link passages from different documents together with special markers. While Bush's description was purely speculative, he gave a brilliant and influential preview of some of the features Nelson would attempt to realize in Xanadu.

The inventor's original hypertext design predicted most of the essential components of today's hypertext systems. Nonetheless, his talk to the Association for Computing Machinery had little impact. There was a brief burst of interest in this strange researcher, but although his ideas were intriguing, Nelson lacked the technical knowledge to prove that it was possible to build the system he envisioned.

I distinctly remember reading this 1995 Wired article on Ted Nelson and Xanadu when it was published. It had a profound impact on me. I've always remembered it, long after that initial read. I know it's novella-length, but it's arguably the best single article I've ever read in Wired; I encourage you to read it in its entirety when you have time. It speaks volumes about the souls of computers-- and the software developers who love them.

Xanadu was vaporware long before the term even existed. You might think that Ted Nelson would be pleased that HTML and the world wide web have delivered much of the Xanadu dream, almost 40 years later. But you'd be wrong:

HTML is precisely what we were trying to prevent – ever-breaking links, links going outward only, quotes you can't follow to their origins, no version management, no rights management.

I suspect Wikipedia may be closer to Ted's vision of Xanadu: a self-contained constellation of highly interlinked information, with provisions for identity, versioning, and rights management.

But enough about the history of the hyperlink. How can we use links effectively in the here and now? I thoroughly enjoyed Philipp Lenssen's recent link usability tips. I liked the list so much, in fact, that I'm using it as a template for a visual compendium of link usability tips-- the art of hyperlinking.

  1. Ensure your links are large enough to easily click. When building links, don't run afoul of Fitts' Law. If what you're linking is small, make it bigger. If you can't make it bigger, at least fluff it up a bit with clickable borders so it's easier for people to accurately click. In the screenshot below, only the numbers are linked, which is a shame.

    Example of small, hard to click hyperlinks

  2. The first link is the most important one. The first link will garner most of the reader's attention, and the highest clickthrough rates. Choose your first link appropriately. Start with the important stuff. Don't squander your first link on a triviality.

    Example article with first link as the most relevant one

  3. Don't link everything. Using too many links will turn your text into noise. Excessive linking hurts in two ways: it makes the text harder to read, and it deflates the value of every other link on the page. Link in moderation. Only link things important enough to warrant a link.

    Example of excessively hyperlinked text

  4. Don't radically alter link behavior. Links are the cornerstone of the web. Users have built up years of expectations based on existing behavior in their web browsers. When you change the way hyperlinks work, you're redefining a fundamental part of the web. Is this really what you want? Is this really what your readers want?

    Example of link gadget that radically alters link behavior

  5. Don't title your link "Click Here". Don't even use the words "Click" or "Here" anywhere in your link text. Describe what the link will do for the user when they click on it.

    Example of unnecessary 'Click Here' text in a hyperlink

  6. Don't link things the user might want to select and copy. Woe upon the poor user who needs to select and copy hyperlinked text. It requires a complex ballet of very precise mouse movements to get it to work at all. Here, I'm trying to select the name "Ralph Waldo Emerson", which is part of the hyperlink. Granted, this is not a terribly common scenario – it's probably the most subtle tip on Philipp's list. But when it happens, it's awkward and unpleasant, so do give it some consideration.

    Example of hyperlink making it difficult to select text

  7. Don't include icons on every link. If we're linking in moderation, we should be using link icons in extreme moderation. If every other link has an icon, it's noise. Only highly unusual or irregular links should include icons. I'd also argue that your text, if written properly, can easily communicate the type of link as well as an icon can, but this gets into the realm of personal preference.

    Example of link icons

  8. Don't make your content depend on links to work. Not everyone will click on your hyperlinks. Either they're too busy to click every single link you put in front of them, or maybe they're reading your article in another format where they can't click on the links: print, offline, or mobile. Either way, it's important to provide the context necessary to make your content understandable without the need to visit whatever is behind those hyperlinks. (If you're wondering what this example is about, I should warn you – it's not worth it. For once the inanity of Digg comments was totally appropriate: "retarded blog war".)

    Example of links which provide very little context

  9. Don't hide your links. Hyperlinks should look like hyperlinks. Give them a distinct style, so they cannot be confused with any of the other text on the page. Definitely choose a unique color not used anywhere else on your page, and consider using the well-worn convention of the link underline when necessary. What's clickable here?

    Example of link text that can easily be confused

  10. Don't mix advertising and links. These look like hyperlinks, but they're actually advertising. Which type of link is which, again? And why should the user have to think about this?

    Example of special type of advertising hyperlinks

  11. Don't obfuscate your URLs. Users can preview where your link will ultimately send them by hovering their mouse over it and viewing the URL in the status bar. Avoid using redirects or URL shortening services which make the URL totally opaque. The user shouldn't have to take a leap of faith when clicking on your links.

    Example of obfuscated hyperlink

To head off any potential hate mail headed my way, these are guidelines, not rules. If you know what you're doing, you also know that rules were made to be broken in the right circumstances. The problem is that most people writing HTML don't know what they're doing. A search for "click here" is ample proof of that.

Most of this is advice on writing HTML-- which, in my estimation, is basic writing advice in today's online world. Hyperlinking should be taught alongside Strunk & White as far as I'm concerned. Knowing how to hyperlink effectively is fundamental. But as software developers, we can go farther when writing code-- we can control the text of the links we generate, too. I touched on this briefly in Don't Devalue The Address Bar, but it's worthy of an entire blog post. In the meantime, Keyvan Nayyeri's Simplify your URLs is a fantastic starting point.
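
As a sketch of what controlling generated link text can look like, here's a hypothetical slugify helper in C-- the function name and behavior are illustrative, not taken from either article. It lowercases letters and digits, drops apostrophes, and collapses everything else into single hyphens:

    #include <ctype.h>
    #include <stdio.h>

    /* Hypothetical helper: turn an article title into a URL slug. */
    static void slugify(const char *title, char *slug, size_t size)
    {
        size_t n = 0;
        int pending_hyphen = 0;

        for (; *title && n + 1 < size; title++) {
            unsigned char c = (unsigned char)*title;
            if (c == '\'')
                continue;              /* drop apostrophes: "Don't" -> "dont" */
            if (isalnum(c)) {
                if (pending_hyphen && n > 0)
                    slug[n++] = '-';   /* collapse separators into one hyphen */
                pending_hyphen = 0;
                if (n + 1 < size)
                    slug[n++] = (char)tolower(c);
            } else {
                pending_hyphen = 1;    /* defer; avoids doubled hyphens */
            }
        }
        slug[n] = '\0';
    }

    int main(void)
    {
        char slug[128];
        slugify("Don't Click Here: The Art of Hyperlinking", slug, sizeof(slug));
        printf("%s\n", slug);   /* dont-click-here-the-art-of-hyperlinking */
        return 0;
    }

A real implementation would worry about Unicode and collisions, but even this toy honors the guidelines above: the resulting URL is readable and transparent rather than opaque.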
