Coding Horror

programming and human factors

In Programming, One Is The Loneliest Number

Is software development an activity preferred by anti-social, misanthropic individuals who'd rather deal with computers than other people? If so, does it then follow that all software projects are best performed by a single person, working alone?

digital silhouette

The answer to the first question may be a reluctant yes, but the answer to the second question is a resounding and definitive no. I was struck by this beautifully written piece which explains the dangers of programming alone:

Some folks have claimed that [working alone] presents a great opportunity to establish your own process. In my experience, there is no process in a team of one. There's nothing in place to hold off the torrents of work that come your way. There's no one to correct you when the urge to gold-plate the code comes along. There's no one to review your code. There's no one to ensure that your code is checked in on time, labeled properly, unit tested regularly. There's no one to ensure that you're following a coding standard. There's no one to monitor your timeliness on defect correction. There's no one to verify that you're not just marking defects as "not reproducible" when, in fact, they are. There's no one to double-check your estimates, and call you on it when you're just yanking something out of your ass.

There's no one to pick up the slack when you're sick, or away on a business trip. There's no one to help out when you're overworked, sidetracked with phone calls, pointless meetings, and menial tasks that someone springs on you at the last minute and absolutely must be done right now. There's no one to bounce ideas off of, no one to help you figure your way out of a bind, no one to collaborate with on designs, architectures or technologies. You're working in a vacuum. And in a vacuum, no one can hear you scream.

If anyone's reading this, let this be a lesson to you. Think hard before you accept a job as the sole developer at a company. It's a whole new kind of hell. If given the chance, take the job working with other developers, where you can at least work with others who can mentor you and help you develop your skill set, and keep you abreast of current technology.

Working alone is a temptation for many desperate software developers who feel trapped, surrounded by incompetence and mismanagement in the desert of the real. Working alone means complete control over a software project, wielding ultimate power over every decision. But working on a software project all by yourself, instead of being empowering, is paradoxically debilitating. It's a shifting mirage that offers the tantalizing promise of relief, while somehow leaving you thirstier and weaker than you started.

Like many programmers, I was drawn to computers as a child because I was an introvert. The world of computers – that calm, rational oasis of ones and zeros – seemed so much more inviting than the irrational, unexplainable world of people and social interactions with no clear right and wrong. Computers weren't better than people, exactly, but they were sure one heck of a lot easier to understand.

Computing in the early, pre-internet era was the very definition of a solitary activity. Dani Bunten Berry, the designer of M.U.L.E., summed it up with a famous quote: "No one ever said on their deathbed, 'Gee, I wish I had spent more time alone with my computer.'" But we've long since left the days of solitary 8-bit programming behind. The internet, and the increasing scope and complexity of software, have made sure of that. I can barely program these days without an active internet connection; I feel crippled when I'm not networked into the vast hive mind of programming knowledge on the internet.

What good are nifty coding tricks if you can't show them off to anyone? How can you possibly learn the craft without being exposed to other programmers with different ideas, different approaches, and different skillsets? Who will review your code and tell you when there's an easier approach you didn't see? If you're serious about programming, you should demand to work with your peers.

There's only so far you can go in this field by yourself. Seek out other smart programmers. Work with them. Endeavor to be the dumbest guy in the room, and you'll quickly discover that software development is a far more social activity than most people realize. There's a lot you can learn from your fellow introverts.

Discussion

Escaping From Gilligan's Island

I find it helpful to revisit Steve McConnell's list of classic development process mistakes, and the accompanying case study, at least once every year. Stop me if you've heard this one before:

"Look, Mike," Tomas said. "I can hand off my code today and call it 'feature complete', but I've probably got three weeks of cleanup work to do once I hand it off." Mike asked what Tomas meant by "cleanup." "I haven't gotten the company logo to show up on every page, and I haven't gotten the agent's name and phone number to print on the bottom of every page. It's little stuff like that. All of the important stuff works fine. I'm 99-percent done."

As that old software proverb goes, the first ninety percent of the task takes ninety percent of the time, and the last ten percent takes the other ninety percent.

The Classic Mistakes case study is unnerving to read; it's like those staged re-enactments you see on America's Most Wanted. It's an exaggerated but strangely accurate summary of every pathological software project I've ever participated in, which is to say almost all of them.

This is the phenomenon McConnell likens to Gilligan's Island. Every week there's some new, crazy scheme to escape the island, but at the end of the episode, the castaways always end up stuck on the island for yet another week.

The cast of Gilligan's Island

If you don't immediately see the parallels with software development, allow me to reacquaint you with the long, dismal history of software project failure. Classic mistakes are classic because they're so seductive. You have to actively recognize when you're falling into one of these traps. As Steve once said in an interview:

Actually succeeding in a software project depends a whole lot less on not doing a few things wrong, and a whole lot more on doing almost everything right.

Which is why you should have every single one of the 36 classic mistakes outlined in McConnell's Rapid Development committed to memory by now:

People Mistakes

  • Undermined motivation
  • Weak personnel
  • Uncontrolled problem employees
  • Heroics
  • Adding people to a late project
  • Noisy, crowded offices
  • Friction between developers and customers
  • Unrealistic expectations
  • Lack of effective project sponsorship
  • Lack of stakeholder buy-in
  • Lack of user input
  • Politics placed over substance
  • Wishful thinking

Process Mistakes

  • Overly optimistic schedules
  • Insufficient risk management
  • Contractor failure
  • Insufficient planning
  • Abandonment of planning under pressure
  • Wasted time during the fuzzy front end
  • Shortchanged upstream activities
  • Inadequate design
  • Shortchanged quality assurance
  • Insufficient management controls
  • Premature or too frequent convergence
  • Omitting necessary tasks from estimates
  • Planning to catch up later
  • Code-like-hell programming

Product Mistakes

  • Requirements gold-plating
  • Feature creep
  • Developer gold-plating
  • Push me, pull me negotiation
  • Research-oriented development

Technology Mistakes

  • Silver-bullet syndrome
  • Overestimated savings from new tools or methods
  • Switching tools in the middle of a project
  • Lack of automated source control

I've increasingly come to believe the only difference between experienced and inexperienced software developers is that the experienced ones realize when they're making mistakes. The same rule applies to software projects and project managers. If you're not actively scanning through the list of Classic Software Development Mistakes as you run your software project, you have no idea how likely it is you're making one of these mistakes right now.

Making mistakes is inevitable, but repeating the same ones over and over doesn't have to be. You should endeavor to make all-new, spectacular, never-seen-before mistakes. To that end, Steve McConnell highlighted a few new classic mistakes in his blog that he's about to add to the canon, 10 years later:

  • Confusing estimates with targets
  • Excessive multi-tasking
  • Assuming global development has a negligible impact on total effort
  • Unclear project vision
  • Trusting the map more than the terrain
  • Outsourcing to reduce cost
  • Letting a team go dark (replaces the previous "insufficient management controls")

Steve is also looking for our feedback. He published a Classic Mistakes Survey and invited everyone to participate. If you have any kind of software project experience under your belt, please do.

It's true, the odds are against you. But it's a good idea to periodically remind yourself that maybe, just maybe – if you can avoid making the same classic mistakes as so many other software projects before you – you might actually manage to escape from the island one of these days.

Discussion

How to Clean Up a Windows Spyware Infestation

I recently upgraded my dedicated racing simulation PC, so I was forced to re-install Windows XP SP2, along with all the games. As I was downloading the no-cd patches for the various racing sims I own, I was suddenly and inexplicably deluged with popups, icons, and unwanted software installations. I got that sinking feeling: I had become the unfortunate victim of a spyware infestation.

Of course, this is completely my own fault for browsing the web using the 2004-era web browser included with a default install of Windows XP Service Pack 2. If I had been thinking rationally, I would have downloaded Firefox first, or at least connected to Windows Update to get the latest patches, before venturing on to the open internet. But I figured I'd save myself that work, and just pop into a few specific web sites for a few quick downloads. Couldn't hurt, right? Let my mistake be a lesson to everyone reading this: never browse the web without the very latest version of your preferred web browser. Intentionally choosing to browse the web with a three-year-old browser, as I did, is an incredibly dangerous thing to do.

The consequences in this case are fairly minimal since this isn't even my secondary machine-- it's a special-purpose PC dedicated to gaming. Reinstalling the operating system is no big deal. But it's still an inconvenient timesink, and in any case, the spyware infestation has to be dealt with because it causes serious performance problems and will even interrupt gameplay with incessant popups.

The two most common sites for no-cd patches are MegaGames and GameCopyWorld. In case you're wondering, yes, I do own all my games. I download no-cd patches for convenience's sake; I consider them a privilege of ownership for knowledgeable, ethical PC gamers. I figured the infection came from one of these sites. So I set up a honeypot virtual machine under Virtual PC 2007, using the ancient, original 2001 release of Windows XP and the classic Devil's Own key, and began testing.

Here's a shot of Task Manager at the desktop, after installing the necessary virtual machine additions. This is a completely plain vanilla, clean Windows XP installation: no service packs, no updates, no nothing. This system is connected to the internet, but it's not as dangerous as it sounds. Because it's behind a NAT router that blocks all incoming connections, there's no way it can get passively infected. I let it connect to the internet and quiesce at the desktop for about an hour, just to prove my point. No passive infections occurred behind a NAT router, even for this woefully out of date September 2001 era install of Windows XP.

spyware: taskman before

Now we're leaving passivity behind, and unwisely browsing the open internet with the unpatched, six-year-old original version of Internet Explorer 6.0. Danger, Will Robinson! I left Task Manager running as I browsed to MegaGames, downloaded a no-cd patch, and... nothing. I then visited GameCopyWorld, downloaded a no-cd patch, and... all of a sudden, it's crystal clear who the culprit is. Check out Task Manager now:

spyware: taskman after

This comes as a shock to me, because GameCopyWorld is recommended often in gaming forums. I consider(ed) it a reputable web site. I've never had a problem with the site before, because I usually surf with the latest updates. But with an unpatched browser, the spyware infestation from visiting GCW-- just from visiting the web pages, even if you don't download a single thing-- is nearly immediate and completely devastating. The virtual machine desktop, after a few scant minutes, tells the story:

spyware: desktop after

It isn't pretty, and let me tell you, I have a new degree of sympathy for the poor users who become the unfortunate victims of spyware infestations. The machine becomes borderline unusable, between...

  • new icons that magically appear on your desktop
  • full-screen popups that occur every two minutes
  • dialog boxes that offer to "install antivirus software" with only an OK button
  • system performance degradation from all those spyware background processes

... it's a wonder people don't just give up on computing altogether. Once the door is open, it seems the entire neighborhood of malware, spyware, and adware vendors takes up residence in your machine. There should be a special circle of hell reserved for companies who make money doing this to people.

At first, I was mad at myself for letting this happen. I should know better, and I do know better. Then I channeled that anger into action: this is my machine, and I'll be damned if I will stand for any slimy, unwanted malware, adware, or spyware that takes up residence on it. I resolved to clean up my own machine and fix the mess I made. It's easier than you might think, and I'll show you exactly how I did it.

Our first order of business is to stop any spyware that's currently running. You'll need something a bit more heavy-duty than mere Task Manager-- get Sysinternals' Process Explorer. Download it, run it, and sort the process list by Company Name.

spyware: process explorer screenshot

Kill any processes that don't have a Company Name (with the exception of DPCs, Interrupts, System, and System Idle Process). Right-click the processes and select Kill Process, or select them and press the Delete key. You can use my initial screenshot of Task Manager, earlier in this post, as a reference for what should be running in a clean Windows XP installation. But there's usually no need to be that specific; unless it has a Company Name you recognize, it's highly likely to be a rogue application and should be terminated.
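If you'd rather script this first pass than click through Process Explorer by hand, here's a rough sketch of the same idea in Python. It assumes the third-party psutil and pywin32 packages are installed, and it only prints candidates for review rather than killing anything automatically:

    # Rough scripted version of the manual triage above: flag running processes
    # whose executable carries no CompanyName in its version resources.
    # Requires the third-party psutil and pywin32 packages (pip install psutil pywin32).
    import psutil
    import win32api

    def company_name(exe_path):
        """Return the CompanyName from a file's version resource, or None."""
        try:
            lang, codepage = win32api.GetFileVersionInfo(exe_path, r"\VarFileInfo\Translation")[0]
            key = r"\StringFileInfo\%04x%04x\CompanyName" % (lang, codepage)
            return win32api.GetFileVersionInfo(exe_path, key)
        except Exception:
            return None

    for proc in psutil.process_iter(["pid", "name", "exe"]):
        exe = proc.info["exe"]
        if not exe:
            continue  # System, Idle, and protected processes expose no path
        if not company_name(exe):
            print("suspicious: pid=%5d  %-20s %s" % (proc.info["pid"], proc.info["name"], exe))
            # After manual review, terminate with: psutil.Process(proc.info["pid"]).kill()

It's only a heuristic-- plenty of legitimate software ships without version information-- so treat the output as a list of things to look at, not an automatic kill list.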

Stopping the running spyware is only half the battle. Now we need to stop the spyware from restarting the next time we boot the system. Msconfig is a partial solution, but again we need something more powerful than what's provided out of the box: Sysinternals' Autoruns utility. Download it, run it, and start browsing through the list that appears:

spyware: autoruns screenshot

As you can see, there's a bunch of spyware, malware, adware, and god knows what else gunking up the works-- all from visiting a single website! Scroll through the list, all the way to the bottom, scanning for blank Publishers, or any Publisher you don't recognize. If you see anything that's suspect, delete it! In a default Windows install, 99.5% of the entries will have "Microsoft Corporation" as the Publisher. Any reputable vendor will have no problem attaching their name to their work, so it's generally only the blank entries you need to worry about.
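Autoruns inspects dozens of autostart locations-- services, drivers, Winlogon notifications, browser helper objects, and more-- so there's no substitute for the real thing. But if you just want a quick scripted peek at the most commonly abused spot, the classic Run keys, a few lines of Python using the standard winreg module will do it:

    # Dump the classic Run keys, one of the many autostart locations Autoruns covers.
    # Standard-library winreg only; Windows-specific.
    import winreg

    RUN_KEYS = [
        ("HKLM", winreg.HKEY_LOCAL_MACHINE, r"Software\Microsoft\Windows\CurrentVersion\Run"),
        ("HKCU", winreg.HKEY_CURRENT_USER, r"Software\Microsoft\Windows\CurrentVersion\Run"),
    ]

    for label, hive, subkey in RUN_KEYS:
        try:
            key = winreg.OpenKey(hive, subkey)
        except OSError:
            continue
        print("%s\\%s" % (label, subkey))
        index = 0
        while True:
            try:
                name, value, _kind = winreg.EnumValue(key, index)
            except OSError:
                break  # no more values under this key
            print("  %-30s %s" % (name, value))
            index += 1
        winreg.CloseKey(key)

Anything listed there that you don't recognize deserves a closer look.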

Now reboot the system. We've removed most of the spyware infestation, but there's a certain much more virulent class of spyware that can survive this treatment. We'll deal with them next.

After rebooting, check Process Explorer and Autoruns for anything suspicious, exactly as we did before. The first thing I noticed that "came back" in Autoruns was a suspicious driver, core.sys, that didn't have a Publisher. I used the powerful Find | Find Handle or DLL menu in Process Explorer to locate any active references to this file.

spyware: process explorer find

Unfortunately I didn't capture the right screenshot at the time, so I'm showing a generic search result above. Anyway, there was exactly one open handle to the core.sys file. I selected the result, which highlights the corresponding handle in the lower pane of the Process Explorer view. Right-click the handle entry in the lower pane and click "Close Handle".

spyware: process explorer, close handle

After I closed the handle, I could physically delete the rogue core.sys file from the filesystem, along with the Autoruns entry for it. Problem solved!

The other item that reappeared in Autoruns after the reboot was an oddly named DLL file with hooks into Winlogon and Explorer. In addition to the suspicious name, each entry carries the tell-tale sign of the missing Publisher value:

spyware: winlogon hooks

Delete the entries in Autoruns all you want; they'll keep coming back when you press F5 to refresh. This rogue, randomly named DLL continually monitors to make sure its ugly little hooks are in place. The nasty thing about processes attached to Winlogon is that they're very difficult to kill or remove. We can kill Explorer, but killing Winlogon is not an option; it's a critical system process, so shutting it down takes the whole OS down with it. It's a difficult catch-22.

But we're smarter than the malware vendors. Fire up Process Explorer and use the Find | Find Handle or DLL menu to locate all the instances of this DLL by name. (See, I told you this option was powerful.) Kill any open handles to this file that you find, exactly as we did before. But you'll need to go one step further. We know from the Autoruns entries that this DLL is likely to be attached to the Explorer and Winlogon processes, but let the find results be your guide. Double-click on any processes you found that reference this DLL. In the process properties dialog, select the Threads tab. Scroll through the threads and kill every one that has the rogue DLL loaded.

spyware: killing threads in process explorer
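There's no simple scripted equivalent of killing individual threads, but if you just want to know which processes have the rogue DLL mapped-- roughly what the Find Handle or DLL search does-- a small psutil sketch like this works. The DLL name below is a made-up placeholder; substitute the randomly named file you found in Autoruns:

    # Find every process with a given DLL mapped, approximating Process Explorer's
    # Find Handle or DLL search. Requires the third-party psutil package.
    import psutil

    TARGET = "xqzrvw32.dll"  # placeholder; use the rogue DLL name from Autoruns

    for proc in psutil.process_iter(["pid", "name"]):
        try:
            maps = proc.memory_maps()
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            continue
        if any(m.path.lower().endswith(TARGET) for m in maps):
            print("pid=%5d  %s has %s loaded" % (proc.info["pid"], proc.info["name"], TARGET))

Killing the individual threads, as described above, is still a job for Process Explorer itself.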

Once you've killed all the threads, you can finally delete the entries in Autoruns without them coming back. Reboot, and your machine is now completely free of spyware. I count 17 entries in Task Manager, exactly the same number as when I originally started.

Of course, the smartest thing to do is not to get infected with spyware, malware, or adware in the first place. I can't emphasize this enough: always browse with the latest patches for your preferred web browser. But if you do happen to get infected, at least now you have the tools and knowledge to banish these evildoers from your machine forever.

Update: If you're worried about spyware, malware, and adware, you should strongly consider not running as an Administrator.

Discussion

Incremental Feature Search in Applications

I'm a big fan of incremental search. But incremental search isn't just for navigating large text documents. As applications get larger and more complicated, incremental search is also useful for navigating the sea of features that modern applications offer.

Office 2007's design overhaul is arguably one of the most significant innovations in GUI applications since the invention of menus and toolbars:

Of course, as you know if you've read Part 1 of the story, many of today's UI paradigms attributed to Apple were introduced well before the Lisa or the Macintosh. Regardless of who gets credit for them, they're good paradigms. There's nothing wrong with menus and toolbars-based UI for certain applications. Truth be told, these paradigms served Office well for a number of releases.

It's not that menus and toolbars are bad or that the people who created them weren't smart. The problem is that Office has outgrown them. There's a point beyond which menus and toolbars cease to scale well. A flat menu with 8 well-organized commands on it works just great; a three-level hierarchical menu containing 35 loosely-related commands can be a bit of a disaster.

In short, we're not trying to destroy anything. Our goal is to create a new standard user interface for full-featured productivity applications. The original team who built Word or Excel couldn't have imagined how much their products would be able to do today. I want us to step back, to think through the question: "what kind of interface would they have built knowing how Word turned out?"

It's absolutely true that menus and toolbars don't scale. The Office 2007 ribbon takes cues from web design to make navigating the thousands of features in Word, Excel, and PowerPoint much easier. But the ribbon, although it's a major improvement over menus and toolbars, isn't perfect, either:

I was working with Excel all day yesterday, trying to find a command I know existed in Excel 2003 and can be found quite easily. I was clicking every tab and hovering over all the buttons. I must have gone through the Ribbon at least 5 times. In the end, the stupid command wasn't even in the ribbon to begin with. You had to manually add it to the "Quick Access Toolbar". If I had "Scout", I could have saved at least the frustration of not being able to find a tool that I know is there, not to mention the time and effort wasted.

I know a star developer who is expert at Word, and the same exact thing happened to her. How do you find what isn't in the ribbon? Well, you could use incremental search to find it. Microsoft has an experimental beta of an incremental ribbon search feature for Office 2007, codenamed "Scout":

Office 2007 search feature codename 'scout'

Unfortunately, it looks like internal politics at Microsoft may have killed the ribbon search add-in, which is a shame. A search feature doesn't take anything away from the ribbon. They serve two different audiences, and complement each other perfectly. I'm with Long Zheng: the ribbon search feature should be shipped as a PowerToy.

The first time I saw an application use an incremental feature searching technique was back in 2004. The options dialog for Quest's Toad database utility became so complex that it required a search function to find anything in it. At the time, I wasn't too keen on the idea of an options dialog that complicated, but I have since bowed to its inevitability. Applications get more feature-rich over time, and navigation methods have to evolve to keep up.

You probably already know that Vista's revamped start menu takes advantage of incremental searching. But other Microsoft applications are starting to adopt this paradigm as well. Take Microsoft's new Expression design tools, for example. In most development tools, you're facing down enormous lists of properties all the time. How do you find the particular property you're looking for? You guessed it: incremental search.

Expression Blend property search filtering

In the above screenshot, I'm filtering the properties for a Windows Presentation Foundation button by typing "ind" in the properties search field. Note how the interface dynamically filters, showing only button properties that match what I've typed as I type it. Isn't that much faster than scrolling through a list?
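The mechanic behind this is dead simple-- re-filter the full list on every keystroke instead of making the user scroll and scan. Here's a toy sketch in Python; the property names are made-up examples, not the actual WPF Button property list:

    # Toy illustration of incremental feature search: filter on every keystroke.
    PROPERTIES = [
        "Background", "BorderBrush", "Cursor", "Height", "IsEnabled",
        "IsIndeterminate", "Margin", "MinHeight", "TabIndex", "ZIndex",
    ]

    def incremental_filter(items, query):
        """Return the items containing the query, case-insensitively."""
        q = query.lower()
        return [item for item in items if q in item.lower()]

    # Simulate typing "ind" one character at a time.
    for length in range(1, 4):
        typed = "ind"[:length]
        print('"%s" -> %s' % (typed, incremental_filter(PROPERTIES, typed)))

Every keystroke narrows the list, so by the third character you're down to a handful of matches.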

If the evolution of the web has taught us anything, it's that search inevitably becomes the dominant navigation metaphor. Simple applications may be able to get away with menus and toolbars, or better yet, a ribbon. But as the application grows larger and more complex, it's faster to incrementally search for the feature we need.

Discussion

Where Are The High Resolution Displays?

In a recent post, Dave Shea documented his love/hate relationship with the pixel grid:

Here's the caveat though -- high resolution displays. At 100dpi, ClearType wins out, but we're not going to be stuck here much longer. Give it a few years, let's do this comparison again when 200dpi is standard. I suspect the pixel grid won't matter nearly so much then.

I was somewhat curious about Dave's claim that in "a few years" displays with 200 DPI will be standard fare. So I did some research to document how far we've come in display resolution over the last twenty years.

Year   Model                            Resolution    Size          DPI
1984   Original Macintosh               512 x 342     9" (8.5")     72
1984   IBM PC AT                        640 x 350     13" (12.3")   60
1994   Apple Multiple Scan 17 Display   1024 x 768    17" (16.1")   80
2004   Apple Cinema HD display          2560 x 1600   30"           100

I used the Tag studios Monitor DPI calculator to arrive at the DPI numbers in the above table. I couldn't quite figure out what the actual displayable area of those early CRT monitors was, so I estimated about 5% non-displayable area based on the diagonal measurement.
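For reference, the arithmetic behind that calculator is straightforward: divide the number of pixels along the diagonal by the diagonal's length in inches. Here's the same calculation in a few lines of Python, using the estimated viewable sizes from the table (the small differences against the table come from those estimates and rounding):

    # DPI = pixels along the diagonal / diagonal length in inches.
    import math

    def dpi(width_px, height_px, diagonal_inches):
        return math.hypot(width_px, height_px) / diagonal_inches

    print(round(dpi(512, 342, 8.5)))     # Original Macintosh      -> 72
    print(round(dpi(640, 350, 12.3)))    # IBM PC AT               -> 59 (table: 60)
    print(round(dpi(1024, 768, 16.1)))   # Apple Multiple Scan 17  -> 80
    print(round(dpi(2560, 1600, 30.0)))  # Apple Cinema HD 30"     -> 101 (table: 100)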

Regardless, it's sobering to consider that the resolution of computer displays has increased by less than a factor of two over the last twenty years. Sure, displays have gotten larger-- much larger-- but actual display resolution in terms of pixels per inch has only gone up by a factor of about 1.6.

I can't think of any other piece of computer hardware that has improved so little since 1984.

Some manufacturers do make high resolution displays, but they're far from common, and very few get anywhere close to 200 DPI. Here's one model ViewSonic was demonstrating in 2002:

This 22.2-inch LCD panel being sold by Viewsonic uses the same panel developed and marketed by IBM last year (T220/T221). The difference is that IBM charged nearly $20,000 for its version; Viewsonic plans on selling this one for around $8,000. That's still pretty pricey -- what makes this panel so special?

Try 9.2 million pixels, for one thing. This 16:10 aspect panel has a native resolution of 3840x2400 pixels. That translates to roughly 200 dots per inch. In fact, you have to put your nose up to the screen to really notice the pixels. Scanned topographical maps could be easily read, even down to the smallest typeface. The monitor is targeted towards specialized image processing and CAD applications, and offers a 400:1 contrast ratio. Driving 9.2 megapixels requires a graphics card with twin TMDS transmitters.

High pixel density monitors are far outside the mainstream. The large versions are prohibitively expensive; the small versions can't justify their price premium over lower-resolution competitors with larger physical sizes. It's telling that today, in 2007, the Apple store doesn't even sell a single standalone LCD offering over 100 DPI. Nor can I find a single high resolution LCD of any type on newegg. I have no doubt that if I had $10,000 burning a hole in my pocket, I could buy a 200 DPI display somewhere, but at consumer prices and through consumer outlets, high resolution displays simply don't exist.

Most of the time, you see high resolution display options on laptops, where the notebook form factor physically precludes the display from getting any larger. Manufacturers are forced to pack more and more pixels into an LCD panel of a fixed size:

When I purchased my notebook I had a choice of three monitor resolutions - the standard 1280 x 800, 1680 x 1050, and 1920 x 1200. The diagonal screen size is 15.4" giving me the three corresponding pixel densities of 98, 129, and a whopping 147 ppi!

It's hard to see this choice of display resolutions as anything other than a side-effect of laptop size restrictions. If notebook vendors could somehow fit a folding 30" LCD panel into a laptop, they absolutely would. But even at 147 DPI, we're only halfway to our goal. To reach 200 DPI, that same 15.4" laptop display would have to pack in 2560 x 1600 pixels. Imagine a 30" Apple Cinema HD display shrunken by half, and you'll get the idea.
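The same diagonal arithmetic from earlier backs up that figure:

    # 2560 x 1600 pixels on a 15.4" diagonal:
    import math
    print(math.hypot(2560, 1600) / 15.4)  # roughly 196 DPI, close enough to 200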

Short of some kind of miraculous technological breakthrough, I can't see computer displays reaching 200 DPI in "a few years". It's unlikely we'll even get there in ten years. I'd love to be proven wrong, but all the evidence of history-- not to mention typical consumer "bigger is better" behavior-- is overwhelming.

Discussion