Coding Horror

programming and human factors

Mort, Elvis, Einstein, and You

Earlier this week I wrote about The Two Types of Programmers. Based on the huge number of comments, it seemed to strike a nerve. Or two. This surprised me, because it was never meant to be the inflammatory, provocative diatribe that many people interpreted it as. It got so out of hand that Ben Collins-Sussman, the original author of the post I quoted, was driven to post a followup clarifying his original post.

Many of the commenters were offended that I somehow lumped them into a vast unwashed eighty-percent sea of vocational programmers. Here's what's particularly ironic: the very act of commenting on an article about software development automatically means you're not a vocational eighty-percenter. Trust me. I absolutely was not calling any of my readers inadequate. I don't say that because I'm a world-class suckup; my blog isn't some kind of special case or even particularly good. I say it because if you're reading any programming blog whatsoever, you've demonstrated a willingness to improve your skills and learn more about your chosen profession.

Thus, if you read the article, you are most assuredly in the twenty percent category. The other eighty percent are not actively thinking about the craft of software development. They would never find that piece, much less read it. They simply don't read programming blogs – other than as the result of web searches to find quick-fix answers to a specific problem they're having. Nor have they read any of the books in my recommended reading list. The defining characteristic of the vast majority of these so-called "vocational" programmers is that they are unreachable. It doesn't matter what you, I or anyone else writes here – they'll never see it.

The problem isn't the other 80%. The problem is that we're stuck inside our own insular little 20% world, and we forget that there's a very large group of programmers we have almost no influence over. Very little we do will make any difference outside our relatively small group. The problem, as I obviously failed to make clear in the post, is figuring out how to reach the unreachable. That's how you make lasting and permanent changes in the craft of software development. Not by catering to the elite – these people take care of themselves – but by reaching out to the majority of everyday programmers.

That was my point. I'm sorry I did such a bad job of communicating it. But on the plus side, at least it got people thinking and talking about the issue.

Some people objected to the very idea of categorizing programmers into groups of any kind. But there's a rich history of doing exactly that, with interesting and sometimes unintended consequences. In early 2004, Nikhil Kothari wrote about three personas Microsoft came up with while working on Visual Studio 2005.

We have three primary personas across the developer division: Mort, Elvis and Einstein.

Mort, the opportunistic developer, likes to create quick-working solutions for immediate problems. He focuses on productivity and learns as needed.

Elvis, the pragmatic programmer, likes to create long-lasting solutions addressing the problem domain, and to learn while working on the solution.

Einstein, the paranoid programmer, likes to create the most efficient solution to a given problem, and typically learns in advance before working on the solution.

These personas helped guide the design of features during the Visual Studio 2005 product cycle.

The description above is only a rough summarization of several characteristics collected and documented by our usability folks. During the meeting, a program manager on our team applied these personas in the context of server controls rather well:

  • Mort would be a developer most comfortable and satisfied if the control could be used as-is and it just worked.
  • Elvis would like to be able to customize the control to get the desired behavior through properties and code, or be willing to wire up multiple controls together.
  • Einstein would love to be able to deeply understand the control implementation, and want to be able to extend it to give it different behavior, or go so far as to re-implement it.
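
Those three bullets map neatly onto code. Here's a minimal sketch of the three levels of engagement-- in Python rather than a .NET language, with a "Chart" control invented purely for illustration (the class and all of its members are hypothetical, not taken from any real toolkit):

    import math

    # A hypothetical control; sensible defaults are the Mort-facing surface.
    class Chart:
        def __init__(self, title="Untitled", color="blue"):
            self.title = title   # Elvis-facing knobs: behavior set via properties
            self.color = color

        def render(self, data):
            # Einstein-facing extension point: subclasses may override this.
            return f"[{self.color}] {self.title}: {sorted(data)}"

    # Mort uses the control as-is, and it just works.
    print(Chart().render([3, 1, 2]))

    # Elvis customizes the desired behavior through properties.
    print(Chart(title="Q3 Sales", color="green").render([40, 10, 25]))

    # Einstein understands the implementation deeply enough to re-implement it.
    class LogChart(Chart):
        def render(self, data):
            scaled = [round(math.log10(x), 2) for x in data]
            return super().render(scaled)

    print(LogChart(title="Q3 Sales (log scale)").render([40, 10, 25]))

The design problem the personas capture is that one component has to serve all three depths at once: defaults that just work, knobs that are discoverable, and internals that can be replaced.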

I can't date exactly when these personas came to exist at Microsoft. Wesner Moise has an even earlier reference to these personas, wherein he amusingly refers to himself as "used to be an Einstein." Wes, old buddy, I'm afraid you're the archetypal Einstein, no matter how much you might think otherwise.

These personas have been controversial for years; they've sparked a lot of intense discussion. Evidently there's a fine line between "persona" and "stereotype":

The Microsoft developer personas that include Mort, Elvis, and Einstein are ultimately an ethically bankrupt mechanism to pigeonhole software developers into the kind of overly simplified categories that a typical marketing staffer is comfortable with. While intended to help this particular parasitic segment of the corporate world to behaviorally model the psychological predispositions of software developers at their work in an unrealistically simple way, it has instead turned into a system of limitations that developers have begun to impose upon themselves to the detriment of the advancement of software development practice and industry. It appears to be a bid by developers to rid themselves of the capacity for rational thought in favor of tribal identification with corporate brands and software rock stars.

Personas, in and of themselves, are not a bad thing. I've written before about the importance of API usability, and personas let you get a leg up on usability by considering the different audiences that will be using your code.

But I can empathize. As a long-time Visual Basic and VB.NET developer by trade, I truly resented being lumped in with Mort. I'm not just some clock-punching code monkey – I actually care about the craft of software development. So what if I happen to write code in a language that doesn't brutalize me with case sensitivity and curly-bracket megalomania? My language choice is ultimately no more meaningful than the choice between caffeinated cola beverages – an illusory difference at that.

Paul Vick works on the VB language team at Microsoft and he echoes some of my concerns:

The fundamental error I think most people make with the personas is that they see them as mutually exclusive rather than points along the experience spectrum. When I'm working on the VB compiler, I'm definitely an Einstein, thinking at a very high level. When I'm working on stuff like the VBParser sample, I'm generally an Elvis, thinking at a somewhat lower level. And when I'm writing batch scripts or ad-hoc data analysis tools, I'm definitely a Mort, hacking around to figure out what I'm trying to do.

The point really is that most people are usually Mort, Elvis and Einstein all at the same time, depending on what they're doing. And by building tools that target one or the other, we're artificially segregating people's work into buckets that don't really map onto their daily lives. (I would also argue that the past several releases of Visual Studio have emphasized some personas over others.) Finding a way to better serve people as they move through the flow of their day-to-day work is something that is in need of some serious attention.

Mort, like the twenty percent analogy Ben originally came up with, is more than a persona or a stereotype. It's a call to action.

I think the solution is to quit pandering to Mort with our condescending paternalistic attitude, and instead demand better from Mort. If the capabilities of the average developer truly are as bleak as many make them out to be, we shouldn't just accept it, but work to raise the quality of the average developer. "Average developer" should describe an acceptable level of competence.

We have to realize that Mort is responsible for a lot of important systems. Systems that affect the general population. When I hear of recent cases of identity theft at ChoicePoint, especially those caused by lax security such as using default passwords for the database, I think of Mort. When I read that $250 million worth of taxpayer money has gone into an overhaul of the FBI Case File system, and the system has to be scrapped, I think of Mort.

Given this much responsibility, we should expect more from Mort. So Mort, I hate to say this, but software development is not like working the register at McDonald's, where putting in your 9 to 5 is enough. I am all for work-life balance, but you have to understand that software development is an incredibly challenging field, requiring intense concentration and strong mental faculties. It's time for you to attend a conference or two to improve your skills. It's time for you to subscribe to a few blogs and read a few more books. But read deeper books than How to Program the VCR in 21 Days. For example, read a book on Design Patterns or Refactoring. Mort, I am afraid it's time for you to quit coasting. It's time for you to step it up a notch.

I firmly believe it is our job to leave the craft of software development better than we found it. If you're anything like me, you wrote horrible code when you started out as a fledgling programmer, too. But through concerted effort and practice, I've been determined to suck less every year. I'll admit this is sort of painful, because we programmers aren't exactly known for our people skills. But we owe it to our craft – and to ourselves – to reach out and help our fellow programmers, at least in some small way.

Being a professional programmer is more than just writing great code. Being a professional programmer means helping other programmers become professionals, too. We're all in this thing together. Not everyone can be reached. But some can.

What If They Gave a Browser War and Microsoft Never Came?

Two weeks ago, Apple announced a new version of WebKit, the underlying rendering technology of their Safari web browser. The feature list is impressive:

  • Enhanced Rich Text Editing
  • Faster JavaScript and DOM (~ 2x)
  • Faster Page Loading
  • SVG support
  • XPath support
  • Improved JavaScript XML technology (XSLT, DOMParser, XMLSerializer, and enhanced XMLHttpRequest support)
  • Styleable form controls
  • Additional advanced CSS support: 2.1, 3.0, and experimental.
  • Reduced memory use (~14%)
  • Web Developer Tools included

That's an awfully compelling list of new features for an essential application I spend many, many hours a day in-- my web browser. Although Safari on Windows is little more than a glorified, feature-poor Mac emulator, the killer core WebKit feature list is enough to convince me to download it and run it through its paces. Apple is a serious competitor in the browser space.

Last week, the first beta of Firefox 3.0 was released. I'm similarly impressed with the giant list of improvements and new features in this browser, too. It appears to have some innovative changes to the UI, along with native GUI rendering, which addresses one of my pet peeves with previous versions of Firefox. Firefox has been a contender since version 1.5, and it looks like version 3.0 will push its mindshare even further. Deservedly so. Firefox is great stuff, and the add-on ecosystem is second to none.

Clearly, the browser wars are heating up to a level we haven't seen since the heady bubble days of the late 90's. That's good news for everyone who uses the web. Nothing drives innovation quite like competition.

Given the level of fierce competition out there now, Microsoft must have some really killer features up their sleeves for Internet Explorer 8, right?

(Pretend that I've inserted the sound of gently chirping crickets here.)

Microsoft hasn't released any information on Internet Explorer 8. None. Nada. Zilch. Believe me, I've tried to pry it out of them:

During a session at Mix today, attendee Jeff Atwood asked Internet Explorer platform architect Chris Wilson for more information about when it might be released. The five-year gap between IE 6 and IE 7 notwithstanding, Atwood noted that people have come to expect a new version of a browser every couple of years. He asked whether the next IE would come with the next Windows version or before then -- "out of band," as they say.

Wilson reiterated Microsoft's promise that it will never again go five years "without an upgrade to the platform." He noted that the company was suggesting a 12- to 18-month development cycle at last year's Mix conference. "There's no exact date," he said, adding later, "I think that your expectation of having a new browser platform every couple of years is definitely a valid one."

Chris is an extremely nice guy, and clearly very technically competent. I'm sure he's under some kind of bizarre corporate gag order to say nothing. But how, exactly, does silence help the massive audience of people who use Internet Explorer on a daily basis? We're all left wondering-- what if they gave a browser war, and Microsoft never came?

IE 6 was a great browser-- in 2001. By 2005, not so much. IE 7 was a critical stopgap, because IE 6 devolved into Netscape 4.7x during the five years it was the latest and greatest and only version. So consider the history. The entire world was trapped in an abusive relationship with Microsoft for that long, dark five year period. I think we'd like-- no, I think we deserve-- some assurances that this abusive cycle will not repeat itself.

My friend and colleague Jon Galloway said it best in a recent Twitter update:

Just about every "Microsoft doesn't get it" problem boils down to long and secretive development cycles. Where's the IE 8 CTP?

Exactly. I don't think there's a more important single application on the planet right now than the web browser. If they can't get this right-- and soon-- I'm not sure there's any hope left.

The Big Ball of Mud and Other Architectural Disasters

Mistakes are inevitable on any software project. But mistakes, if handled appropriately, are OK. Mistakes can be intercepted, adjusted, and ultimately addressed. The root of deep, fatal software project problems is not knowing when you're making a mistake. These types of mistakes tend to fester into massive, systemic project failure. That's why I'm fond of citing McConnell's list of classic mistakes; I find it helpful to review every so often as a sort of triage self-check. I ask myself-- am I making any of these mistakes without even realizing it?

I suppose this could lead to a sort of project hypochondria, where you're constantly defending against mysterious, unseen project illnesses. I don't know about you, but I'd much rather be working on a project with a paranoid project manager than an oblivious one. Only the paranoid survive.

Perhaps that's also why I enjoy Brian Foote and Joseph Yoder's Big Ball of Mud paper so much. This paper was originally presented at the 1997 conference on Pattern Languages of Programs, amusingly acronymed PLoP. It describes classic architectural mistakes in software development.

The architecture that actually predominates in practice is the BIG BALL OF MUD.

A BIG BALL OF MUD is haphazardly structured, sprawling, sloppy, duct-tape and bailing wire, spaghetti code jungle. We've all seen them. These systems show unmistakable signs of unregulated growth, and repeated, expedient repair. Information is shared promiscuously among distant elements of the system, often to the point where nearly all the important information becomes global or duplicated. The overall structure of the system may never have been well defined. If it was, it may have eroded beyond recognition. Programmers with a shred of architectural sensibility shun these quagmires. Only those who are unconcerned about architecture, and, perhaps, are comfortable with the inertia of the day-to-day chore of patching the holes in these failing dikes, are content to work on such systems.

Still, this approach endures and thrives. Why is this architecture so popular? Is it as bad as it seems, or might it serve as a way-station on the road to more enduring, elegant artifacts? What forces drive good programmers to build ugly systems? Can we avoid this? Should we? How can we make such systems better?

It's a great read. The authors enumerate seven architectural pathologies:

1. Big Ball of Mud
(a.k.a. Shantytown, Spaghetti Code)

Shantytowns are usually built from common, inexpensive materials and simple tools. Shantytowns can be built using relatively unskilled labor. Even though the labor force is "unskilled" in the customary sense, the construction and maintenance of this sort of housing can be quite labor intensive. There is little specialization. Each housing unit is constructed and maintained primarily by its inhabitants, and each inhabitant must be a jack of all the necessary trades. There is little concern for infrastructure, since infrastructure requires coordination and capital, and specialized resources, equipment, and skills. There is little overall planning or regulation of growth. Shantytowns emerge where there is a need for housing, a surplus of unskilled labor, and a dearth of capital investment. Shantytowns fulfill an immediate, local need for housing by bringing available resources to bear on the problem. Loftier architectural goals are a luxury that has to wait.

shantytown, mon

Maintaining a shantytown is labor-intensive and requires a broad range of skills. One must be able to improvise repairs with the materials on-hand, and master tasks from roof repair to ad hoc sanitation. However, there is little of the sort of skilled specialization that one sees in a mature economy.

All too many of our software systems are, architecturally, little more than shantytowns. Investment in tools and infrastructure is too often inadequate. Tools are usually primitive, and infrastructure such as libraries and frameworks, is undercapitalized. Individual portions of the system grow unchecked, and the lack of infrastructure and architecture allows problems in one part of the system to erode and pollute adjacent portions. Deadlines loom like monsoons, and architectural elegance seems unattainable.
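
To put the authors' "information is shared promiscuously... global or duplicated" symptom in code terms, here's a tiny, invented Python illustration of mud (the domain and all the names are mine, not from the paper): every function reads and mutates one global blob, so distant parts of the system erode and pollute each other exactly as described.

    # Invented illustration: all important state lives in one global dict.
    STATE = {"user": None, "cart": [], "tax": 0.08, "log": []}

    def login(name):
        STATE["user"] = name
        STATE["log"].append(f"login {name}")   # auth code also owns logging...

    def add_item(price):
        STATE["cart"].append(price)
        STATE["log"].append(f"add {price}")    # ...and so does the cart code

    def checkout():
        total = sum(STATE["cart"]) * (1 + STATE["tax"])
        STATE["cart"].clear()   # distant mutation: anything else holding the cart is surprised
        return round(total, 2)

    login("mort")
    add_item(10.00)
    add_item(5.50)
    print(checkout())   # 16.74 -- works today; breaks when anyone reshapes STATE

It works, and it ships, which is exactly why the pattern endures.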

2. Throwaway Code
(a.k.a. Quick Hack, Kleenex Code, Disposable Code, Scripting, Killer Demo, Permanent Prototype, Boomtown)

A homeowner might erect a temporary storage shed or car port, with every intention of quickly tearing it down and replacing it with something more permanent. Such structures have a way of enduring indefinitely. The money expected to replace them might not become available. Or, once the new structure is constructed, the temptation to continue to use the old one for "a while" might be hard to resist.

garbage dump

Likewise, when you are prototyping a system, you are not usually concerned with how elegant or efficient your code is. You know that you will only use it to prove a concept. Once the prototype is done, the code will be thrown away and written properly. As the time nears to demonstrate the prototype, the temptation to load it with impressive but utterly inefficient realizations of the system's expected eventual functionality can be hard to resist. Sometimes, this strategy can be a bit too successful. The client, rather than funding the next phase of the project, may slate the prototype itself for release.

3. Piecemeal Growth
(a.k.a. Urban Sprawl, Iterative-Incremental Development)

Urban planning has an uneven history of success. For instance, Washington D.C. was laid out according to a master plan designed by the French architect L'Enfant. The capitals of Brazil (Brasilia) and Nigeria (Abuja) started as paper cities as well. Other cities, such as Houston, have grown without any overarching plan to guide them. Each approach has its problems. For instance, the radial street plans in L'Enfant's master plan become awkward past a certain distance from the center. The lack of any plan at all, on the other hand, leads to a patchwork of residential, commercial, and industrial areas that is dictated by the capricious interaction of local forces such as land ownership, capital, and zoning. Since concerns such as recreation, shopping close to homes, and noise and pollution away from homes are not brought directly into the mix, they are not adequately addressed.

urban sprawl in Beijing

Most cities are more like Houston than Abuja. They may begin as settlements, subdivisions, docks, or railway stops. Maybe people were drawn by gold, or lumber, access to transportation, or empty land. As time goes on, certain settlements achieve a critical mass, and a positive feedback cycle ensues. The city's success draws tradesmen, merchants, doctors, and clergymen. The growing population is able to support infrastructure, governmental institutions, and police protection. These, in turn, draw more people. Different sections of town develop distinct identities. With few exceptions, (Salt Lake City comes to mind) the founders of these settlements never stopped to think that they were founding major cities. Their ambitions were usually more modest, and immediate.

4. Keep It Working
(a.k.a. Vitality, Baby Steps, Daily Build, First Do No Harm)

Once a city establishes its infrastructure, it is imperative that it be kept working. For example, if the sewers break, and aren't quickly repaired, the consequences can escalate from merely unpleasant to genuinely life threatening. People come to expect that they can rely on their public utilities being available 24 hours per day. They (rightfully) expect to be able to demand that an outage be treated as an emergency.

Software can be like this. Often a business becomes dependent upon the data driving it. Businesses have become critically dependent on their software and computing infrastructures. There are numerous mission critical systems that must be on-the-air twenty-four hours a day/seven days per week. If these systems go down, inventories can not be checked, employees can not be paid, aircraft cannot be routed, and so on.

infrastructure, an electrical substation

There may be times where taking a system down for a major overhaul can be justified, but usually, doing so is fraught with peril. However, once the system is brought back up, it is difficult to tell which from among a large collection of modifications might have caused a new problem. Every change is suspect. Deferring such integration is a recipe for misery.

5. Shearing Layers

The notion of SHEARING LAYERS is one of the centerpieces of Brand's How Buildings Learn. Brand, in turn, synthesized his ideas from a variety of sources, including British designer Frank Duffy and ecologist R. V. O'Neill.

Brand quotes Duffy as saying: "Our basic argument is that there isn't any such thing as a building. A building properly conceived is several layers of longevity of built components".

Brand distilled Duffy's proposed layers into these six: Site, Structure, Skin, Services, Space Plan, and Stuff. Site is geographical setting. Structure is the load bearing elements, such as the foundation and skeleton. Skin is the exterior surface, such as siding and windows. Services are the circulatory and nervous systems of a building, such as its heating plant, wiring, and plumbing. The Space Plan includes walls, flooring, and ceilings. Stuff includes lamps, chairs, appliances, bulletin boards, and paintings.

shearing layers

These layers change at different rates. Site, they say, is eternal. Structure may last from 30 to 300 years. Skin lasts for around 20 years, as it responds to the elements, and to the whims of fashion. Services succumb to wear and technical obsolescence more quickly, in 7 to 15 years. Commercial Space Plans may turn over every 3 years. Stuff is, like software, subject to unrelenting flux.

6. Sweeping It Under The Rug
(a.k.a. Potemkin Village, Housecleaning, Pretty Face, Quarantine, Hiding it Under the Bed, Rehabilitation)

One of the most spectacular examples of sweeping a problem under the rug is the concrete sarcophagus that Soviet engineers constructed to put a 10,000 year lid on the infamous reactor number four at Chernobyl, in what is now Ukraine.

Chernobyl sarcophagus

If you can't make a mess go away, at least you can hide it. Urban renewal can begin by painting murals over graffiti and putting fences around abandoned property. Children often learn that a single heap in the closet is better than a scattered mess in the middle of the floor.

7. Reconstruction
(a.k.a. Total Rewrite, Demolition, Plan to Throw One Away, Start Over)

Atlanta's Fulton County Stadium was built in 1966 to serve as the home of baseball's Atlanta Braves, and football's Atlanta Falcons. In August of 1997, the stadium was demolished. Two factors contributed to its relatively rapid obsolescence. One was that the architecture of the original stadium was incapable of accommodating the addition of the "sky-box" suites that the spreadsheets of '90s sporting economics demanded. No conceivable retrofit could accommodate this requirement. Addressing it meant starting over, from the ground up. The second was that the stadium's attempt to provide a cheap, general solution to the problem of providing a forum for both baseball and football audiences compromised the needs of both. In only thirty-one years, the balance among these forces had shifted decidedly. The facility is being replaced by two new single-purpose stadia.

stadium demolition

Might there be lessons for us about unexpected requirements and designing general components here?

The first step in dealing with a problem is to admit you have one. If you catch glimpses of any of these themes in your current software project, I encourage you to read the relevant sections in the paper, which goes into much more detail-- and provides ideas for remediation strategies.

The Two Types of Programmers

Contrary to myth, there aren't fourteen types of programmers. There are really only two, as Ben Collins-Sussman reminds us.

There are two "classes" of programmers in the world of software development: I'm going to call them the 20% and the 80%.

The 20% folks are what many would call "alpha" programmers -- the leaders, trailblazers, trendsetters, the kind of folks that places like Google and Fog Creek software are obsessed with hiring. These folks were the first ones to install Linux at home in the 90's; the people who write lisp compilers and learn Haskell on weekends "just for fun"; they actively participate in open source projects; they're always aware of the latest, coolest new trends in programming and tools.

The 80% folks make up the bulk of the software development industry. They're not stupid; they're merely vocational. They went to school, learned just enough Java/C#/C++, then got a job writing internal apps for banks, governments, travel firms, law firms, etc. The world usually never sees their software. They use whatever tools Microsoft hands down to them -- usually VS.NET if they're doing C++, or maybe a GUI IDE like Eclipse or IntelliJ for Java development. They've never used Linux, and aren't very interested in it anyway. Many have never even used version control. If they have, it's only whatever tool shipped in the Microsoft box (like SourceSafe), or some ancient thing handed down to them. They know exactly enough to get their job done, then go home on the weekend and forget about computers.

As I work with teams of programmers in the field, I'm consistently struck by the yawning abyss between that 20% and the rest of the world. It makes the divide between the open-source and Microsoft camps look like a shallow ditch.

Shocking statement #1: Most of the software industry is made up of 80% programmers. Yes, most of the world is small Windows development shops, or small firms hiring internal programmers. Most companies have a few 20% folks, and they're usually the ones lobbying against pointy-haired bosses to change policies, or upgrade tools, or to use a sane version-control system.

Shocking statement #2: Most alpha-geeks forget about shocking statement #1. People who work on open source software, participate in passionate cryptography arguments on Slashdot, and download the latest GIT releases are extremely likely to lose sight of the fact that "the 80%" exists at all. They get all excited about the latest Linux distro or AJAX toolkit or distributed SCM system, spend all weekend on it, blog about it… and then are confounded about why they can't get their office to start using it.

Perhaps not shocking to me, but an excellent and important reminder for everyone, nonetheless.

I often think we're wasting our time writing blogs which are largely read by the same 20%. In my experience, there's precious little trickle-down effect from the alpha programmers to everyone else. And if there is, it takes decades. If you really want to change the software development status quo, if you want to make a difference this year, you have to help us reach outside our insular little group of alpha programmers and effect change in the other 80% of the world. And that is far, far more difficult than preaching to the converted 20%. It's why I admire people like Scott Mitchell so much, because he understands the importance of reaching out to the other 80%:

I like programming and really enjoy ASP.NET. I think it's neat and fun and interesting and cool how you can go from literally nothing to having a data-driven web application that can be used by people around the world in an amazingly fast amount of time. Furthermore, I want to spread that enthusiasm to folks. I want to say to those who may have never programmed, or to those who are using competing technologies, or to those who are just starting out - "Come over here and try out this ASP.NET stuff. Here, let me show you what it can do!" That's why I teach (which pays pennies compared to consulting). That's why I write (which pays better than teaching, but still is not anywhere near as lucrative as consulting). That's why I give free talks at local user groups and community-sponsored conferences here in Southern California. To get the word out!

To me, saying that titles like Teach Yourself X in 24 Hours cheapen the craft is tantamount to saying, "Our club is full. Go away." It's not saying, "Let's welcome the newbies and get them excited about this technology." Rather, it's saying, "Newbies are ok, but they must first realize how hard this is, how hard we've worked, and how much more we know than them." I worry that such sentiment from the community will come across as pompousness to those very people whom we should be welcoming.

I wish this was easier for me, because I agree with Scott. I'm terrible at the things he's describing. I think the true measure of success isn't how many alpha geeks you can get to pay attention to you. It's how many typical, average programmers you've reached out to, if only in some small way. If you really care about the craft of software development, you'll help us build that bridge between the 20% and the 80%, too.

Update: This was a controversial post. See my followup to this post for further explanation.

Has CAPTCHA Been "Broken"?

A recent Wall Street Journal article describes Ticketmaster's problems with online scalpers:

The Internet era has brought speed and convenience to all sorts of consumer transactions. For concertgoers, however, it has also led to ever-faster sellouts for hot events. Ticketmaster deploys technology that is supposed to stop brokers from gaining access to large numbers of seats via online sales. But it says brokers' software circumvents the company's protections.

That has placed large numbers of seats in the hands of brokers who use eBay Inc.'s StubHub, Craigslist and other online venues to resell the tickets at a big mark up.

One situation roiling consumers involves the 54-concert "Best of Both Worlds" tour in which singer-actress Miley Cyrus is performing sets as herself and as her fictional alter ego, Hannah Montana. Parents and children have found getting tickets for the shows difficult and expensive. The issue is drawing the attention of government officials. On Thursday -- in a rare Internet-age example of authorities enforcing antiscalping laws -- the attorneys general of Missouri and Arkansas filed lawsuits against people accused of illegally reselling Hannah Montana tickets.

According to StubHub, tickets for "Best of Both Worlds" are currently selling for an average $237, making them pricier than seats for the Police ($209), Justin Timberlake ($182) and Beyoncé ($212). The highest face value for a ticket on the Hannah Montana tour: $63.

They must have really pissed off some high ranking political parents to get that kind of attention. Not that they don't deserve it-- scalpers are evil, profiteering bastards, to be sure. They deserve all the pain we can send their way.

The "technology that is supposed to stop brokers" they're referring to is CAPTCHA.

For instance, companies like Ticketmaster require customers searching for tickets online to replicate a set of the squiggly letters and numbers, known as a "Captcha." Theoretically, only human customers can correctly identify the characters despite the odd fonts, screening out automated purchasing programs. But RMG's software, according to Mr. Kovach, can also "figure out the randomly generated characters and retype them automatically." Mr. Kovach said RMG employees also gave him advice on fooling Ticketmaster's computers into thinking his requests were coming from different Internet addresses. Neither Mr. Kovach nor his lawyer could be reached for comment.

So if online scalpers are somehow beating the system, does that mean CAPTCHA has been broken? I covered this topic a year ago, and my opinion has not changed. If CAPTCHAs were well and truly broken, Google, Yahoo, and Hotmail would stop using them. Why would they continue to use something that doesn't work? I'm not going to rehash all the arguments here, but if you have strong feelings on this topic, I urge you to read my earlier post before commenting.

Ticketmaster's problem is that their CAPTCHA is not good enough. Programmers don't seem to understand what makes a CAPTCHA difficult to "break"-- but it's not hard to find out. Heck, the hackers themselves will tell you how to do CAPTCHA correctly, if you know where to look. For example, this Chinese hacker's page breaks down a number of common CAPTCHAs, along with the price of the software he sells to defeat each one at a given success rate:

  CAPTCHA         Success rate    Price
  the9            100%            $500
  dvbbs           95%             $1,000
  Shanda          90%             $1,500
  Baidu           80%             $3,000
  eBay            70%             $4,000
  Ticketmaster    50%             $6,000
  Google          (unbreakable)
  Hotmail         (unbreakable)
  Yahoo           (unbreakable)

It seems an awful lot of programmers subscribe to the "add some crazy patterns and/or colors to the text and pray for the best" school of CAPTCHA design. That's not only sloppy, it just doesn't work-- the top of this chart is littered with their failed attempts. On some sites, that's OK; they don't need the world-class level of protection from bots and scripts that Ticketmaster does, where there's tremendous financial incentive for scalpers to break the system.

This particular hacker was claiming a 50% success rate against the Ticketmaster CAPTCHA long before the above article was published. No wonder those parents weren't able to buy their kids Hannah Montana tickets-- it's not a failing of CAPTCHA as a concept, it's that the Ticketmaster programmers failed to implement their CAPTCHA correctly.

Instead of hacking together their own partially effective (and often not even human-solvable) CAPTCHA, Ticketmaster's programmers should have studied prior art-- in particular, by outright copying the high-volume, extensively researched Yahoo, Google, and Hotmail CAPTCHAs. I'm awfully fond of Google's CAPTCHA technique; in my professional opinion, it is simultaneously the most readable for humans and the most hellishly difficult to OCR correctly. If you need industrial-strength protection from bots and scripts, that's where you want to start.
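
To make the difference between those two schools concrete, here's a minimal sketch-- my own illustration, not Ticketmaster's or Google's actual code-- in Python, using the Pillow imaging library. The naive version scatters noise over cleanly separated glyphs, which OCR strips out with a simple filter; the stronger version skips the decoration and instead crowds and warps the characters, attacking the segmentation step OCR depends on. The font path is an assumption; substitute any TrueType font you have on hand.

    import math
    import random
    from PIL import Image, ImageDraw, ImageFont

    def naive_captcha(text, font):
        """The 'crazy patterns and pray' school: clean, evenly spaced glyphs
        on a speckled background. A median filter removes the noise, and the
        well-separated characters segment trivially."""
        img = Image.new("L", (40 * len(text) + 10, 60), color=255)
        draw = ImageDraw.Draw(img)
        for _ in range(600):  # background speckle noise
            draw.point((random.randrange(img.width), random.randrange(img.height)), fill=0)
        for i, ch in enumerate(text):
            draw.text((10 + 40 * i, 10), ch, font=font, fill=0)
        return img

    def crowded_captcha(text, font):
        """Closer to the Google style: no background noise at all, but the
        glyphs overlap and the whole line is warped, so the OCR segmentation
        step-- deciding where one character ends-- fails."""
        img = Image.new("L", (30 * len(text) + 20, 60), color=255)
        draw = ImageDraw.Draw(img)
        x = 10
        for ch in text:
            draw.text((x, random.randint(5, 15)), ch, font=font, fill=0)
            x += random.randint(16, 22)  # narrower than the glyphs: forces overlap
        warped = Image.new("L", img.size, color=255)
        for col in range(img.width):  # sinusoidal vertical shear, column by column
            strip = img.crop((col, 0, col + 1, img.height))
            warped.paste(strip, (col, int(6 * math.sin(col / 15.0))))
        return warped

    font = ImageFont.truetype("DejaVuSans.ttf", 32)  # assumed font; any .ttf works
    naive_captcha("W3X8", font).save("naive.png")
    crowded_captcha("W3X8", font).save("crowded.png")

The specific distortions don't matter; the principle does. Machine difficulty comes from making characters hard to separate, not from decoration a filter can remove.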
