Coding Horror

programming and human factors

Typography: Where Engineers and Designers Meet

Over the Christmas break, my wife and I visited New York City for the first time. One of the many highlights of our trip was the Museum of Modern Art, which is running a year-long special exhibit, 50 Years of Helvetica. It's a tiny exhibit tucked away in a corner of MoMA. Blink and you'll miss it amongst all the other wonderful art. But even a small exhibit provides ample physical evidence that Helvetica-- a humble font, nothing more than a collection of mathematical curves shaped into letterforms-- had a huge impact on the world.

Helvetica is so highly regarded in the design world that there's a full-length documentary on the topic: Helvetica the Movie.

Helvetica the Movie

Another little-known fact about Helvetica is the relationship between this timeless classic and another font you've almost certainly encountered before: Arial. John Gruber explains:

Helvetica is perhaps the most popular typeface in the world, and is widely acclaimed as one of the best. Arial is a tawdry, inferior knock-off of Helvetica, but which, to the detriment of the world, Microsoft chose to license for Windows simply because it was cheaper. Because Arial is a default Windows font and Helvetica is not, it is ubiquitous. Mark Simonson's "The Scourge of Arial" is an excellent resource on both Arial's history and its typographic deficiencies; his accompanying sidebar is an excellent primer on the specific differences between Arial and Helvetica.

You do have to be something of a font geek to appreciate the subtle differences between Helvetica and Arial, much less Helvetica and its precursor, Akzidenz-Grotesk. But the discussion leads directly to another hugely important twenty-first century problem: how do you copyright the completely abstract, pure intellectual property that is a font?

All computer geeks tend to fall in love with typography at some point in their careers. Donald Knuth is a fine example; frustrated with the limited typesetting options available for his books in the late 70s, Knuth went on a "brief hiatus" to come up with something better. Ten years later, he unleashed TeX upon the world.

In 1977, Knuth halted research on his books for what he expected to be a one-year hiatus. Instead, it took 10. Accompanied by Jill, Knuth took design classes from Stanford art professor Matthew Kahn. Knuth, trying to train his programmer's brain to think like an artist's, wanted to create a program that would understand why each stroke in a typeface would be pleasing to the eye. "I wanted to try to capture the intelligence of the design, not just the outcome of the design," he says. For example, how do you insert line breaks into a paragraph so there isn't too much space between words and so that most of the lines don't end in hyphens? Although this seems like an aesthetic challenge to be solved by human taste, Knuth says, computers do it well. "This is a combinatorial problem," he explains. "There might be a thousand ways to break a paragraph into lines and each way has a score." His solution was to build a computer program capable of ranking the thousand options and picking the best one.
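Knuth's "thousand ways to break a paragraph into lines, each with a score" is a classic dynamic programming problem. Here's a minimal sketch of the idea in Python-- scoring each line by its squared leftover space, a crude stand-in for TeX's far richer model of stretchable glue, hyphenation, and demerits:

```python
def break_paragraph(words, width):
    """Choose line breaks minimizing total badness (squared slack per line)."""
    n = len(words)
    # best[i] = (minimum cost of typesetting words[i:], index of next break)
    best = [(float("inf"), None)] * n + [(0.0, n)]
    for i in range(n - 1, -1, -1):
        line_len = -1                      # offsets the space added below
        for j in range(i, n):
            line_len += len(words[j]) + 1  # word plus its preceding space
            if line_len > width:           # assumes no word exceeds width
                break
            slack = width - line_len
            badness = 0.0 if j == n - 1 else slack ** 2  # last line is free
            cost = badness + best[j + 1][0]
            if cost < best[i][0]:
                best[i] = (cost, j + 1)
    lines, i = [], 0
    while i < n:                           # walk the chosen breakpoints
        j = best[i][1]
        lines.append(" ".join(words[i:j]))
        i = j
    return lines

text = "How do you insert line breaks so there is not too much space between words"
print("\n".join(break_paragraph(text.split(), 24)))
```

Unlike the greedy approach of stuffing each line as full as it will go, this considers every feasible breakpoint and picks the globally cheapest set-- which is why TeX paragraphs look so much more even than a typical word processor's.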

Typography and fonts are a rare and vital intersection point between software engineers and designers. And there's absolutely no better book on the topic than Thinking with Type: A Critical Guide for Designers, Writers, Editors, & Students. I recommend it without hesitation to all of the above, and certainly to software engineers with even the slightest passing interest in typography.

Thinking with Type: A Critical Guide for Designers, Writers, Editors, & Students

Like all great books, it teaches you "why", not "how":

This is not a book about fonts. It is a book about how to use them. Typefaces are an essential resource employed by graphic designers, just as glass, stone, steel, and countless other materials are employed by architects. Graphic designers sometimes create their own fonts and custom lettering. More commonly, however, they tap the vast library of existing typefaces, choosing and combining them in response to a particular audience or situation. To do this with wit and wisdom requires knowledge of how-- and why-- letterforms have evolved.

I think I can trace my initial interest in fonts way, way back to the crack credit screens on pirated Apple // software I encountered as a wayward teenager.

apple // crack screen

I count four fonts on this crack screen. There were countless disk sets of these low-resolution bitmap fonts to choose from. Even back in the mid-80s, these primitive fonts added a particular style, a feeling, an intonation to the text-- and we only had a dismal little 280 x 192 screen to work with. How wonderfully liberating it must feel to have thousands of RGB anti-aliased pixels to render beautiful fonts with today, much less the millions we'll eventually have.


The Five Browser Shortcuts Everyone Should Know

Nobody has time to memorize a complete list of web browser keyboard shortcuts, and really, why should they? I only know a handful of shortcuts myself, and I probably use the same five a hundred times a day. But not everyone knows about these five essential browser shortcuts. Let's fix that.

I spend more time in my browser than in any other single application on my computer. Launching such a commonly used application should be completely frictionless. I use the built-in Windows Vista Quick Launch shortcuts. My web browser is the first item on my Quick Launch bar, so all I need to do is tap Windows+1 to bring up a new browser instance.

winkey + 1

Have you set up a keyboard shortcut to launch your preferred web browser? If not, why not? Once the browser is up, I usually want to be in one of two places: the address bar, or the search box.

To navigate to the address bar, press Alt+D.

screenshot: browser keyboard shortcut, Alt+D

To navigate to the search box, press Ctrl+E.

screenshot: browser keyboard shortcut, Ctrl+E

Another nifty thing about these two shortcuts is that, if you're running Windows Vista, they work identically in Windows Explorer. Some keyboard conventions can follow you from the web back to your desktop, too.

Once you've entered the URL or search term, normally you'd press Enter, right? Wait a second. If you press Enter, whatever's currently displayed in your browser will be replaced with a different website. But it doesn't have to be. Rather than pressing Enter, press Alt+Enter to open the website or search in a new tab.

screenshot: keyboard shortcut, Alt+Enter

These four key sequences probably constitute 99% of the typing I do while browsing the web. If you want to get extra fancy you can use Ctrl+Tab to iterate through all those tabs you now have open in your browser, but it's not required. I'm no keyboard purist. I promote fully two-handed computer usage, whether those two hands are tapping away on the keyboard or split between the keyboard and the mouse. I'm often mousing away while I use these shortcuts.

The final shortcut is obvious, once you know it. A few days ago, I received a very nice email from Antonio complimenting my blog, while asking for a change in the design:

I have been reading your blog for a while now and have noticed that on almost all posts there are links to either past posts or other sites. My suggestion is to make the links open in a new window (or in my case a new tab). I want to continue reading your blog post and just have a tab open with the URL of the new link and not make the page I am on load the link's URL. I know I can just right click and then select "open on new tab", but it would be much easier to just click on the links :)

It's a great suggestion. So great, in fact, that this behavior is already built into the web browser. While you're reading, press the middle mouse button (the "mouse wheel" button) to open related links in a new tab. The links will open in a new tab in the background, so they don't interrupt the flow of what you're doing. When you're done, you can go back and explore the related sites in all those newly opened tabs at your convenience.

pressing the middle mouse button

What you end up with is a pile of new tabs. If the middle mouse button giveth, the middle mouse button can also taketh away: click the middle mouse button on a tab to close it. I've grown so enamored of this behavior that, like Paul Stovell, I expect middle-click conventions to work everywhere. I curse every time I middle-click on a taskbar button, expecting that app to close-- and it doesn't.

I apologize if you feel I've insulted your intelligence with such basic shortcuts. But realize that not everyone knows what you know. And that's a shame, because these five simple tips …

  1. Set up a keyboard shortcut to launch your browser
  2. Alt+D to navigate to the browser address bar
  3. Ctrl+E to navigate to the browser search box
  4. Alt+Enter to open searches or websites in a new tab
  5. The middle mouse button opens links in a new tab, and also closes tabs

… sure could make browsing the web a much more pleasant experience, if everyone knew about them.


What's On Your Keychain, 2008 Edition

Over the last few years, I've become mildly obsessive about the contents of my keychain. Here's what's on my keychain today:

What's on my keychain in 2008

In internet parlance, this is known as EDC, or everyday carry. There's an entire internet forum dedicated to the art and science of determining what goes in your pocket. As expected, in terms of strip-mining an obsession, the internet delivers.

I originally wrote about the evolution of my keychain in 2005 and again in 2006. Here's the current lineup:

  1. Leatherman Squirt S4 multitool
  2. Corsair 8 GB Flash Voyager thumb drive
  3. Fenix L0D-CE AAA LED flashlight

The one constant is the Leatherman Squirt. Mine is actually personalized with a Pulp Fiction joke that not everyone gets; I opted to flip it over this year so I wouldn't offend. You can view the text in previous years' photos, if you're curious. I absolutely adore the Squirt. There's a reason I've been carrying this great little multitool since 2005; I use it almost every single day. Prior to the Squirt, I carried the Leatherman Micra, but the Squirt is a far more versatile multitool in almost the same form factor and weight. If you're open to carrying a small multitool, I recommend the Squirt without reservation. I have yet to discover anything better in its weight class. Note that the Squirt comes in a few flavors, which do vary slightly.

I was, however, sorely tempted to get a Leatherman Skeletool. It's beautiful.

Leatherman Skeletool CX

(The carbon fiber CX model is pictured; it also comes in an all-metal version which is $20 cheaper.) According to the Leatherman site, it's twice the weight and size of the Squirt, which puts it squarely out of EDC contention for me.

In 2005, I carried a 512 MB thumb drive. In 2006, 1 GB. In 2007, 4 GB. This year it's a whopping 8 gigabytes. As capacities increase, speed of the thumb drive becomes paramount. What good is a gigantic 16 GB thumb drive if it takes you an hour to transfer your data? I'd prefer to carry a tiny USB thumb drive, but my research indicated that all the svelte, sexy, impossibly tiny USB thumb drives inevitably come with a hefty speed penalty. My previous 4 GB drive was tiny, the size of a half-stick of gum, but slow enough that I found it awkward to use in practice. After doing a bit more research for this generation of my keychain, I finally arrived at the Corsair 8 GB Flash Voyager thumb drive, which offered the best blend of size, speed, and cost. In my testing, I can read from the Flash Voyager at around 25 MB/sec (5 minutes to dump), and write to it at about 7 MB/sec (19 minutes to fill). Not too shabby. Be sure to consider speed when buying your next high capacity USB drive, or like me, you may end up disappointed.
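Those transfer times are just capacity divided by throughput. Here's the back-of-the-envelope math, treating 8 GB as 8,192 MB:

```python
# Rough transfer times for an 8 GB thumb drive at the rates I measured.
capacity_mb = 8 * 1024  # 8 GB, treated as 8,192 MB
for operation, mb_per_sec in [("dump (read)", 25), ("fill (write)", 7)]:
    minutes = capacity_mb / mb_per_sec / 60
    print(f"{operation}: {minutes:.1f} minutes at {mb_per_sec} MB/sec")
# dump (read): 5.5 minutes at 25 MB/sec
# fill (write): 19.5 minutes at 7 MB/sec
```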

I didn't realize how obsolete my barely two-year-old AAA battery powered LED flashlight was until I picked up the new Fenix L0D-CE. A commenter to my previous keychain post recommended this brand, which sports a fancy new Cree LED. I figured it'd be a minor upgrade, but I was blown away by the difference in brightness compared to my old LED flashlight-- the Fenix L0D is incredibly bright! Don't take my word for it; this experienced flashlight reviewer was impressed too:

The sheer volume of light produced is amazing for a single AAA cell light. My readings show that on the "high" setting the L0D-CE produces more overall light than a 3-D cell Maglite. On "medium" it produces more overall light than a common 2-D cell light. All this from one AAA cell.

You read that right: this little LED dynamo produces more light from a single teeny-tiny AAA than an older, traditional bulb technology Maglite produced from three enormous D cell batteries. Amazing! As alluded to in the review-- and unlike my previous LED flashlight-- this model has five different modes, all selectable by rapidly switching it off, then back on:

  1. Medium (default), 3.5 hours @ 20 lumens
  2. High, 1 hour @ 60 lumens
  3. Low, 8.5 hours @ 7.5 lumens
  4. Strobe light
  5. SOS pattern

The AAA model is constrained by the limitations of the battery. Imagine how bright the other, larger models in the Fenix family can get.

I had no idea LED technology was advancing so rapidly. Honestly, unless you enjoy blinding people for fun (this does have its charms), the single AAA model should suffice. It is astonishingly bright in any dim area.

That's probably far more than you wanted to know about what's on my keychain. So what's on your keychain this year, and why?


How Should We Teach Computer Science?

Greg Wilson recently emailed me the following question:

I'm teaching a software engineering class to third-year students at the University of Toronto starting in January, and would like to include at least one hour on deployment --- [deployment] never came up in any of my classes, and it's glossed over pretty quickly in most software engineering textbooks, but I have learned the hard way that it's often as big a challenge as getting the application written in the first place.

Deployment is a huge hurdle. It's a challenge even for the best software development teams, and it's incredibly important: if users can't get past the install step, none of the code you've written matters! And yet, as Greg notes, existing software engineering textbooks give this crucial topic only cursory treatment. Along the same lines, a few weeks ago, a younger coworker noted to me in passing that he never learned anything about source control in any of his computer science classes. How could that be? Source control is the very bedrock of software engineering.

If we aren't teaching fundamental software engineering skills like deployment and source control in college today, we're teaching computer science the wrong way. What good is learning to write code in the abstract if you can't work on that code as a team in a controlled environment, and you can't deploy the resulting software? As so many computer science graduates belatedly figure out after landing their first real programming job, it isn't any good at all.

Today's computer science students should develop software under conditions as close as possible to the real world, or the best available approximation thereof. Every line of code should be written under source control at all times. This is not negotiable. When it's time to deploy the code, try deploying to a commercial shared web host, and discover everything that entails. If it's an executable, create a standalone installer package that users have to download and install-- and give those users some mechanism to file bug reports when they inevitably can't get it to work. Students should personally follow up on each bug filed against the software they've written.

Will this be painful? Boy, oh boy, will it ever. It'll be excruciating. Students will hate it. They'll begin to question why anyone in their right mind would want to write software.

Welcome to the real world.

After I wrote my response to Greg, Joel Spolsky posted an entry on computer science education that, at least to my eye, seemed hauntingly similar to the advice I offered:

I think the solution would be to create a programming-intensive BFA in Software Development -- a Julliard for programmers. Such a program would consist of a practical studio requirement developing significant works of software on teams with very experienced teachers, with a sprinkling of liberal arts classes for balance.

When I said BFA, Bachelor of Fine Arts, I meant it: software development is an art, and the existing Computer Science education, where you're expected to learn a few things about NP completeness and Quicksort is singularly inadequate to training students how to develop software.

Imagine instead an undergraduate curriculum that consists of 1/3 liberal arts, and 2/3 software development work. The teachers are experienced software developers from industry. The studio operates like a software company. You might be able to major in Game Development and work on a significant game title, for example, and that's how you spend most of your time, just like a film student spends a lot of time actually making films and the dance students spend most of their time dancing.

This is not to say that computer science programs should neglect theory. Fundamental concepts such as algorithms and data structures are still important. My algorithms class was my favorite and by far the most useful class I took for my own computer science degree. But teaching these things at the expense of more prosaic real-world software engineering skills-- skills that you'll desperately need as a practicing software developer-- is a colossal mistake. It's what Steve Yegge was alluding to in his fantastical Wizard School essay... I think.

There is the concern that all those highfalutin' computer science degrees could degenerate into little more than vocational school programs, something Joel mentioned in his excellent Yale address:

At Ivy League institutions, everything is Unix, functional programming, and theoretical stuff about state machines. As you move down the chain to less and less selective schools Java starts to appear. Move even lower and you literally start to see classes in topics like Microsoft Visual Studio 2005 101, three credits. By the time you get to the 2 year institutions, you see the same kind of SQL-Server-in-21-days "certification" courses you see advertised on the weekends on cable TV. Isn't it time to start your career in (different voice) Java Enterprise Beans!

You can have it both ways. That's why I'm so gung-ho about internships. College CS classes tend to be so dry and academic that you must spend your summers working in industry; otherwise you won't have the crucial software engineering skills you'll need to survive once you graduate. Unimportant little things like, say, source control and deployment and learning to deal with users. I constantly harp on internships whenever I meet college students pursuing a computer science degree. It's for your own good.

It does strike me as a bit unfair to force students to rely on internships to complete their education in computer science. Or, perhaps, something even worse. "Want to learn computer science? No college necessary! Just download some ISO images and found your own social networking startup!" Unleashing the naked greed of the TechCrunch crowd on tender young programming minds seems downright cruel.

So how should we teach computer science? The more cynical among us might say you can't. I think that's a cop-out. If students want to prepare themselves for a career in software development, they need to shed the theory and spend a significant portion of their time creating software with all the warty, prickly, unglamorous bits included. Half of software engineering is pain mitigation. If you aren't cursing your web hosting provider every week, fighting with your source control system every day, deciphering angry bug reports from your users every hour-- you aren't being taught computer science.


The Enduring Art of Computer Programming

I saw on reddit that today, January 10th, is Donald Knuth's seventieth birthday.

Knuth is my Homeboy

Knuth is arguably the most famous living computer scientist, author of the seminal Art of Computer Programming series. Here's how serious Mr. Knuth is-- his books are dedicated, not to his wife or a loved one, but to a computer:

This series of books is affectionately dedicated
to the Type 650 computer once installed at
Case Institute of Technology,
in remembrance of many pleasant evenings.

Jeffrey Shallit compiled an excellent set of links commemorating the 70th birthday of this legendary figure:

  • The Genius of Donald Knuth: Typesetting with Boxes and Glue. "I don't know of any other software other than TeX implemented in the 1970s that remains absolutely and unquestionably dominant in its domain. And the glue-and-boxes model of text layout was a piece of absolute genius - one of the most masterful examples of capturing an extremely complex problem using an extremely simple model. It's beautiful. And it's typical of the kind of thing that Knuth does."
  • Opinion 86 "So Knuth is very right to worry about constants. And he gets his hands dirty and does the coding all by himself, and he gave us such great programs as TeX, and its fully-detailed manuals. He taught us by example the art of computer programming, and he modestly claims that it is art in the sense of the artisan rather than that of the artist. But his perfect artisanship became the most refined of fine arts."
  • Analyzing Algorithm X "Knuth was the first to use the phrase 'analysis of algorithms,' at the 1970 ICM in Nice. He popularized and extended O-notation (previously used in functional analysis) as an essential tool for algorithm analysis. And his Art of Computer Programming set the standards for the field and is still well worth reading today."
  • Volume 4 is already written (in our hearts). "But this being a lecture series, Knuth also fields questions from the audience about everything from sin and redemption to mathematical Platonism. He has a habit of parrying all the really difficult questions with humor; indeed, he does this so often one comes to suspect humor is his answer."
  • Don Knuth is 70 "As a member of a community whose life is punctuated by twice-yearly conferences, what I find most inspiring about Knuth is his dedication to perfection, whatever time it might take to achieve it."
  • Today is Knuth's 70th birthday!! "He was one of the first people to realize that an algorithm can be analysed in a mathematical and intelligent way without running it. This is one of the most important starting points for computer science theory. Perhaps even for computer science."
  • Happy Birthday, Don Knuth! "Don Knuth straddled both worlds effortlessly, gaining respect from 15 year old hackers and 50 year old researchers alike. And that's the most tremendous feat of all."
  • Donald Knuth and Me: "Later, when I attended university, I began to understand Knuth's wider influence. Almost everywhere I turned, Knuth had been there before."

For mainstream press coverage of Donald Knuth, see the additional articles Jeffrey recommends in his post.

My very favorite thing about Mr. Knuth is that, despite the profound and enduring depth of his contributions to the field of computer science, he has a great sense of humor. For proof, let's go back in time. Way, way back, to Mad Magazine #33, originally published in 1957.

These images are from Absolutely Mad: 50 Years of Mad Magazine, a DVD-ROM containing (almost) every issue of Mad. Surprisingly, the disc isn't encumbered by any bizarre DRM scheme; every issue is a simple PDF file in a folder on the disc. The resolution isn't as high as I would like, but I'm not about to complain after paying thirty-three measly bucks for a nearly complete digital library of Mad.

(And now, even better, you can get extremely high resolution versions of early Mad Magazines from Comixology. Sadly, it only goes up to issue #23 at the moment.)

As a longtime fan of Mad Magazine, I was delighted to discover that Donald Knuth contributed an article to Mad, "The Potrzebie System of Weights and Measures", while he was still in high school. It's a little difficult to read the introductory text that ties the article to Knuth, so I'll quote it here.

When Milwaukee's Donald Knuth first presented his revolutionary system of weights and measures to the members of the Wisconsin Academy of Science, Arts, and Letters, they were astounded... mainly because Donald also has two heads. All kidding aside, Donald's system won first prize as the "most original presentation". So far, the system has been adopted in Tierra del Fuego, Afghanistan, and Southern Rhodesia. The U.N. is considering it for world adoption.

This new system of measuring, which is destined to become the measuring system of the future, has decided improvements over the other systems now in use. It is based on measurements taken 6-9-12 at the Physics Lab of Milwaukee Lutheran High School, in Milwaukee, Wis., when the thickness of Mad Magazine #26 was determined to be 2.263348517438173216473 mm. This length is the basis for the entire system, and is called one potrzebie of length. The Potrzebie has also been standardized at 3515.3502 wave lengths of the red line in the spectrum of cadmium. A partial table of the Potrzebie System, the measuring system of the future, is given below.
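Amusingly, the numbers hold up. If you assume the "red line in the spectrum of cadmium" is the classic 6438.4696 angstrom length standard of the era-- my assumption, the article doesn't say-- the wavelength count reproduces the measured thickness of Mad #26:

```python
# Sanity-checking the Potrzebie. Assumption: the cadmium red line here
# is the old metrology length standard of 6438.4696 angstroms.
angstrom_mm = 1e-7                            # 1 angstrom = 1e-7 mm
potrzebie_mm = 3515.3502 * 6438.4696 * angstrom_mm
print(f"{potrzebie_mm:.5f} mm")               # 2.26335 mm -- the measured
                                              # thickness of Mad Magazine #26
```

Even in high school, Knuth did his homework.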

I still subscribe to Mad Magazine; the biting satire and political humor haven't aged a bit in the intervening fifty years. I know it sounds crazy for a grown man to extol the virtues of what most would charitably consider a kids' humor rag. But I'm not the only one. Just ask the Los Angeles Times' Robert Boyd:

[Mad Magazine] instilled in me a habit of mind, a way of thinking about a world rife with false fronts, small print, deceptive ads, booby traps, treacherous language, double standards, half truths, subliminal pitches and product placements; it warned me that I was often merely the target of people who claimed to be my friend; it prompted me to mistrust authority, to read between the lines, to take nothing at face value, to see patterns in the often shoddy construction of movies and TV shows; and it got me to think critically in a way that few actual humans charged with my care ever bothered to.

Programming algorithms are hard science, backed by some serious math. Thanks for the reminder, Mr. Knuth, that computer science is indeed serious stuff, but it's also a lot of fun. Here's to you-- and to the enduring art of computer programming you introduced us all to.
