Coding Horror

programming and human factors

The iPhone Software Revolution

The original iPhone was for suckers -- that is, hard-core gadget enthusiasts -- only. But as I predicted, 12 months later, the iPhone 3G rectified all the shortcomings of the first version. And now, with the iPhone 3GS, we've reached the mythical third version:

A computer industry adage is that Microsoft does not make a successful product until version 3. Its Windows operating system was not a big success until the third version was introduced in 1990 and, similarly, its Internet Explorer browsing software was lackluster until the third version.

The platform is now so compelling and polished that even I took the plunge. For context, this is the first Apple product I've owned since 1984. Literally.

I am largely ambivalent towards Apple, but it's impossible to be ambivalent about the iPhone -- and in particular, the latest and greatest iPhone 3GS. It is the Pentium to the iPhone 3G's 486. A landmark, genre-defining product, no longer a mere smartphone but an honest-to-God fully capable, no-compromises computer in the palm of your hand.

Here's how far I am willing to go: I believe the iPhone will ultimately be judged a more important product than the original Apple Macintosh.

Yes, I am dead serious. Just check back here in fifteen to twenty years to see if I was right. (Hint: I will be.)

There's always been a weird tension in Apple's computer designs, because they attempt to control every nuance of the entire experience from end to end. For the best Apple™ experience, you run custom Apple™ applications on artfully designed Apple™ hardware. That's fundamentally at odds with the classic hacker mentality that birthed the general purpose computer. You can see it in the wild-west, anything-goes Linux ecosystem. You can even see it in the Wintel axis of evil, where a million motley mixtures of hardware, software, and operating system variants are allowed to bloom, like little beige-stickered flowers, for a price.

But a cell phone? It's a closed ecosystem, by definition, running on a proprietary network, operated by a status quo of incompetent megacorporations who wouldn't know user friendliness or good design if it ran up behind them and bit them in the rear end of their expensive, tailored suits. All those things that bugged me about Apple's computers are utter non-issues in the phone market. Proprietary handset? So is every other handset. Locked into a single vendor? Everyone signs a multi-year contract. One company controlling your entire experience? That's how it's always been done. Nokia, Sony/Ericsson, Microsoft, RIM -- these guys clearly had no idea what they were in for when Apple set their sights on the cell phone market -- a market that is a nearly perfect match for Apple's strengths.

Apple was born to make a kick-ass phone. And with the lead they have, I predict they will dominate the market for years to come.

Consider all the myriad devices that the iPhone 3GS can sub for, and in some cases, outright replace:

  • GPS
  • Netbook (for casual web browsing and email)
  • Gameboy
  • Watch
  • Camera
  • MP4 Video Recorder
  • MP3 player
  • DVD player
  • eBook reader

Oh yeah, and I heard you can make phone calls with it, too. Like any general purpose computer, it's a jack of all trades.

As impressive as the new hardware is, the software story is even bigger. If you're a software developer, the iPhone can become a career changing device, all thanks to one little teeny-tiny icon on the iPhone home screen:

The App Store makes it brainlessly easy to install, upgrade, and purchase new applications. But more importantly, any software developer -- at the mild entry cost of owning a Mac, and signing up for the $99 iPhone Developer Program -- can build an app and sell it to the worldwide audience of iPhone users. Apple makes this stuff look easy, when historically it has been anything but. How many successful garage developers do you know for Nintendo DS? For the Motorola Razr? For Palm? For Windows Mobile?

Apple has never been particularly great at supporting software developers, but I have to give them their due: with the iPhone developer program, they've changed the game. Nowhere is this more evident than in software pricing. I went on a software buying spree when I picked up my iPhone 3GS, ending up with almost three pages of new applications from the App Store. I was a little worried that I might rack up a substantial bill, but how can I resist when cool stuff like ports of the classic Amiga Pinball Dreams are available, or the historic Guru Meditation? The list of useful (and useless) apps is almost endless, and growing every day.

My total bill for three screens' worth of great iPhone software applications? About fifty bucks. I've paid more than that for Xbox 360 games I ended up playing for a total of maybe three hours! About half of the apps were free, and the rest were a few bucks. I think the most I paid was $9.99, and that was for an entire library. What's revolutionary here isn't just the development ecosystem, but the economics that support it, too. At these crazy low prices, why not fill your phone with cool and useful apps? You might wonder if developers can really make a living selling apps that only cost 99 cents. Sure they can, if they sell hundreds of thousands of copies:

Freeverse, one of the leading developers and publishers of iPhone games, sold the millionth copy of its Flick Fishing game over the weekend, making Flick Fishing the first paid application to reach the one million download milestone. Flick Fishing, which costs 99 cents, allows iPhone and iPod touch users to take a virtual fishing trip with the flick of a wrist. The game uses the iPhone's accelerometer to recreate a casting motion, then a combination of bait choice and fishing skill helps players land the big fish.

Preliminary weekly reports for the period from 23 March to 19 April indicate that Flight Control sold a total of 587,485 units during this time. We estimate total sales are now over 700,000 units, with the bulk of sales occurring in a 3 week period.
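
To put those numbers in perspective, here's a quick back-of-the-envelope sketch. The 70/30 developer/Apple split is the App Store's standard revenue arrangement; the unit counts are the ones quoted above:

// Rough developer revenue for a 99-cent app, assuming the App Store's
// standard split: the developer keeps 70%, Apple keeps 30%.
function developerRevenue(unitsSold, price) {
  return unitsSold * price * 0.70;
}

console.log(developerRevenue(1000000, 0.99));  // Flick Fishing's million copies: ~$693,000
console.log(developerRevenue(700000, 0.99));   // Flight Control's ~700,000 copies: ~$485,000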

That's an honorable way to get rich programming, and a nice business alternative to the dog-eat-dog world of advertising subsidized apps.

I love nothing more than supporting my fellow software developers by voting with my wallet. It does my heart good to see so many indie and garage developers making it big on the iPhone. (Also, I'm a sucker for physics games, and there are a bunch of great ones available in the App Store.) I'm more than happy to pitch in a few bucks every month for a great new iPhone app.

If this has all come across as too rah-rah, too uncritical a view of the iPhone, I apologize. There are certainly things to be critical about, such as the App Store's weird enforcement policies, the lack of support for emulators, or Flash, or anything else that might somehow undermine the platform as decided in some paranoid, secretive Apple back room. Not that we'd ever hear about it.

I didn't write this to kiss Apple's ass. I wrote this because I truly feel that the iPhone is a key inflection point in software development. We will look back on this as the time when "software" stopped being something that geeks buy (or worse, bootleg), and started being something that everyone buys, every day. You'd have to be a jaded developer indeed not to find something magical and transformative in this formula, and although others will clearly follow, the iPhone is leading the way.

"There's an app for that." Kudos, Apple. From the bottom of my hoary old software developer heart.

Scaling Up vs. Scaling Out: Hidden Costs

In My Scaling Hero, I described the amazing scaling story of plentyoffish.com. It's impressive by any measure, but also particularly relevant to us because we're on the Microsoft stack, too. I was intrigued when Markus posted this recent update:

Last monday we upgraded our core database server after a power outage knocked the site offline. I haven't touched this machine since 2005 so it was a major undertaking to do it last minute. We upgraded from a machine with 64 GB of ram and 8 CPUs to a HP ProLiant DL785 with 512 GB of ram and 32 CPUs ...

The HP ProLiant DL785 G5 starts at $16,999 -- and that's barebones, with nothing inside. Fully configured, as Markus describes, it's kind of a monster:

  • 7U size (a typical server is 2U, and mainstream servers are often 1U)
  • 8 CPU sockets
  • 64 memory sockets
  • 16 drive bays
  • 11 expansion slots
  • 6 power supplies

It's unclear if they bought it pre-configured, or added the disks, CPUs, and memory themselves. The most expensive configuration shown on the HP website is $37,398 and that includes only 4 processors, no drives, and a paltry 32 GB memory. When topped out with ultra-expensive 8 GB memory DIMMs, 8 high end Opterons, 10,000 RPM hard drives, and everything else -- by my estimates, it probably cost closer to $100,000. That might even be a lowball number, considering that the DL785 submitted to the TPC benchmark website (pdf) had a "system cost" of $186,700. And that machine only had 256 GB of RAM. (But, to be fair, that total included another major storage array, and a bunch of software.)

At any rate, let's assume $100,000 is a reasonable ballpark for the monster server Markus purchased. It is the very definition of scaling up -- a seriously big iron single server.

But what if you scaled out, instead -- Hadoop or MapReduce style, across lots and lots of inexpensive servers? After some initial configuration bumps, I've been happy with the inexpensive Lenovo ThinkServer RS110 servers we use. They're no match for that DL785 -- but they aren't exactly chopped liver, either:

  • Lenovo ThinkServer RS110 barebones: $600
  • 8 GB RAM: $100
  • 2 x eBay drive brackets: $50
  • 2 x 500 GB SATA hard drives, mirrored: $100
  • Intel Xeon X3360 2.83 GHz quad-core CPU: $300

Grand total of $1,150 per server. Plus another 10 percent for tax, shipping, and so forth. I replace the bundled CPU and memory that the server ships with, and then resell the salvaged parts on eBay for about $100 -- so let's call the total price per server $1,200.

Now, assuming a fixed spend of $100,000, we could build 83 of those 1U servers. Let's compare what we end up with for our money:

         Scaling Up    Scaling Out
  CPUs   32            332
  RAM    512 GB        664 GB
  Disk   4 TB          40.5 TB
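
If you want to poke at these numbers yourself, here's the back-of-the-envelope arithmetic behind the scale-out column -- a sketch using the per-server parts list above, with my own convention that 1 TB is 1,024 GB:

// Scale-out arithmetic for a fixed $100,000 budget, using the ~$1,200
// per-server figure worked out above.
var budget    = 100000;
var perServer = 1200;

var servers = Math.floor(budget / perServer);  // 83 servers
var cores   = servers * 4;                     // quad-core X3360: 332 cores
var ramGB   = servers * 8;                     // 8 GB each: 664 GB
var diskTB  = servers * 500 / 1024;            // 500 GB mirrored each: ~40.5 TB

console.log(servers + ' servers: ' + cores + ' cores, ' + ramGB +
            ' GB RAM, ' + diskTB.toFixed(1) + ' TB of disk');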

Now which approach makes more sense?

(These numbers are a bit skewed because that DL785 is at the absolute extreme end of the big iron spectrum. You pay a hefty premium for fully maxing out. It is possible to build a slightly less powerful server with far better bang for the buck.)

But there's something else to consider: software licensing.

        Scaling Up    Scaling Out
  OS    $2,310        $33,200*
  SQL   $8,318        $49,800*

(If you're using all open source software, then of course these costs will be very close to zero. We're assuming a Microsoft shop here, with the necessary licenses for Windows Server 2008 and SQL Server 2008.)

Now which approach makes more sense?

What about the power costs? Electricity and rack space aren't free.

                      Scaling Up    Scaling Out
  Peak Watts          1,200 W       16,600 W
  Power Cost / Year   $1,577        $21,815
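
For the curious, those power figures work out if you assume each box runs at peak draw around the clock and electricity costs roughly $0.15 per kilowatt-hour -- the rate is my assumption, not anyone's actual colo bill, and it ignores rack space entirely:

// Yearly electricity cost at a steady peak draw, 24 hours a day, 365 days a year.
function yearlyPowerCost(watts, dollarsPerKWh) {
  var hoursPerYear = 24 * 365;
  return (watts / 1000) * hoursPerYear * dollarsPerKWh;
}

console.log(yearlyPowerCost(1200, 0.15).toFixed(0));   // scaling up:  ~$1,577
console.log(yearlyPowerCost(16600, 0.15).toFixed(0));  // scaling out: ~$21,800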

Now which approach makes more sense?

I'm not picking favorites. This is presented as food for thought. There are at least a dozen other factors you'd want to consider depending on the particulars of your situation. Scaling up and scaling out are both viable solutions, depending on what problem you're trying to solve, and what resources (financial, software, and otherwise) you have at hand.

That said, I think it's fair to conclude that scaling out is only frictionless when you use open source software. Otherwise, you're in a bit of a conundrum: scaling up means paying less for licenses and a lot more for hardware, while scaling out means paying less for the hardware, and a whole lot more for licenses.

* I have no idea if these are the right prices for Windows Server 2008 and SQL Server 2008, because reading about the licensing models makes my brain hurt. If anything, it could be substantially more.

Monty Hall, Monty Fall, Monty Crawl

Remember The Problem of the Unfinished Game? And the almost 2,500 comments those two posts generated? I know, I like to pretend it didn't happen, either. Some objected to the way I asked the question, but it was a simple question asked in simple language. I think what they're really objecting to is how unintuitive the answer is.

Which reminds me of another question that you've probably heard of:

Suppose the contestants on a game show are given the choice of three doors: behind one door is a car; behind the others, goats. After a contestant picks a door, the host, who knows what's behind all the doors, opens one of the unchosen doors, which reveals a goat. He then asks the contestant, "Do you want to switch doors?"

Should the contestant switch doors?

This is, of course, the Monty Hall problem. It's been covered to death, and quite well I might add, by dozens of writers who are far more talented than I.

What's interesting about this problem, to me at least, is not the solution, but the vehemence with which people react to the solution – as described in The Drunkard's Walk: How Randomness Rules Our Lives.

It appears to be a pretty silly question. Two doors are available – open one and you win; open the other and you lose – so it seems self-evident that whether you change your choice or not, your chances of winning are 50/50. What could be simpler? The thing is, Marilyn said in her column that it is better to switch.

Despite the public's much-heralded lethargy when it comes to mathematical issues, Marilyn's readers reacted as if she'd advocated ceding California back to Mexico. Her denial of the obvious brought her an avalanche of mail, 10,000 letters by her estimate. If you ask the American people whether they agree that plants create the oxygen in the air, light travels faster than sound, or you cannot make radioactive milk by boiling it, you will get double-digit disagreement in each case (13 percent, 24 percent, and 35 percent, respectively). But on this issue, Americans were united: Ninety-two percent agreed Marilyn was wrong.

Perhaps the public can be forgiven their ignorance, but what of the experts? Surprisingly, the mathematicians fare little better.

Almost 1,000 Ph.D.s wrote in, many of them math professors, who seemed especially irate. "You blew it," wrote a mathematician from George Mason University. From Dickinson State University came this: "I am in shock that after being corrected by at least three mathematicians, you still do not see your mistake." From Georgetown: "How many irate mathematicians are needed to change your mind?" And someone from the U.S. Army Research Institute remarked, "If all those Ph.D.s are wrong the country would be in serious trouble." Responses continued in such great numbers and for such a long time that after devoting quite a bit of column space to the issue, Marilyn decided she would no longer address it.

The army PhD who wrote in may have been correct that if all those PhDs were wrong, it would be a sign of trouble. But Marilyn was correct. When told of this, Paul Erdos, one of the leading mathematicians of the 20th century, said, "That's impossible." Then, when presented with a formal mathematical proof of the correct answer, he still didn't believe it and grew angry. Only after a colleague arranged for a computer simulation in which Erdos watched hundreds of trials that came out 2-to-1 in favor of switching did Erdos concede that he was wrong.
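
If, like Erdos, you trust trials more than proofs, it takes only a few lines to run the same kind of simulation yourself. This is just a sketch of that sort of experiment, not the actual program he was shown:

// Monty Hall simulation: the host always opens a goat door you didn't pick.
function playMontyHall(doSwitch) {
  var car  = Math.floor(Math.random() * 3);   // door hiding the car
  var pick = Math.floor(Math.random() * 3);   // contestant's initial choice
  var opened;
  do {                                        // host opens a door that is
    opened = Math.floor(Math.random() * 3);   // neither the pick nor the car
  } while (opened === pick || opened === car);
  if (doSwitch) {
    pick = 3 - pick - opened;                 // switch to the remaining closed door
  }
  return pick === car;
}

function winRate(doSwitch, trials) {
  var wins = 0;
  for (var i = 0; i < trials; i++) {
    if (playMontyHall(doSwitch)) wins++;
  }
  return wins / trials;
}

console.log('stay:   ' + winRate(false, 100000));  // ~0.33
console.log('switch: ' + winRate(true, 100000));   // ~0.67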

You may recognize Paul Erdos from a particularly obscure XKCD cartoon last week. So if you feel like an idiot because you couldn't figure out the Monty Hall problem, take heart. The problem is so unintuitive one of the most notable mathematicians of the last century couldn't wrap his head around it. That's ... well, that's amazing.

How can something that seems so obvious be so wrong? Apparently our brains are not wired to do these sorts of probability problems very well. Personally, I found the text of Jeffrey Rosenthal's Monty Hall, Monty Fall, Monty Crawl (pdf) to be the most illuminating, because it asks us to consider some related possibilities, and how they might affect the outcome:

Monty Fall Problem: In this variant, once you have selected one of the three doors, the host slips on a banana peel and accidentally pushes open another door, which just happens not to contain the car. Now what are the probabilities that you will win, either by sticking with your original door, or switching doors?

Monty Crawl Problem: Once you have selected one of the three doors, the host then reveals one non-selected door which does not contain the car. However, the host is very tired, and crawls from his position (near Door #1) to the door he is to open. In particular, if he has a choice of doors to open, then he opens the smallest number available door. (For example, if you selected Door #1 and the car was indeed behind Door #1, then the host would always open Door #2, never Door #3.) Now what are the probabilities that you will win the car if you stick versus if you switch?
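
Rosenthal's variants are just as easy to simulate. Here's a sketch of the Monty Fall case; the trick is that you have to throw away the trials where the stumbling host accidentally reveals the car, and that conditioning is exactly what drags the odds back to 50/50:

// Monty Fall simulation: the host opens one of the two unpicked doors at random.
// Trials where he happens to reveal the car are discarded; in the rest,
// sticking and switching each win about half the time.
function montyFallTrial() {
  var car  = Math.floor(Math.random() * 3);
  var pick = Math.floor(Math.random() * 3);
  var others = [0, 1, 2].filter(function (d) { return d !== pick; });
  var opened = others[Math.floor(Math.random() * 2)];
  if (opened === car) return null;            // banana peel revealed the car: discard
  return { stickWins: pick === car, switchWins: (3 - pick - opened) === car };
}

var stick = 0, swtch = 0, valid = 0;
for (var i = 0; i < 100000; i++) {
  var t = montyFallTrial();
  if (t === null) continue;
  valid++;
  if (t.stickWins)  stick++;
  if (t.switchWins) swtch++;
}
console.log('stick: ' + stick / valid + ', switch: ' + swtch / valid);  // both ~0.5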

Paul Erdos was brilliant, but even he realized his own limits when presented with the highly unintuitive Monty Hall problem. For his epitaph, he suggested, in his native Hungarian, "Végre nem butulok tovább". This translates into English as "I've finally stopped getting dumber."

If only the rest of us could be so lucky.

We Done Been ... Framed!

In my previous post, Url Shorteners: Destroying the Web Since 2002, I mentioned that one of the "features" of the new generation of URL shortening services is to frame the target content.

Digg is one of the most popular sites to implement this strategy. Here's how it works. If you're logged in to Digg, every target link you click from Digg is a shortened URL of their own creation. If I click through to a Stack Overflow article someone else has "Dugg", I'm sent to this link.

http://digg.com/d1tBya

For logged in users, every outgoing Digg link is framed inside the "DiggBar". It's a way of dragging the Digg experience with you wherever you go -- while you're reading the target article, you can vote it up, see related articles, share, and so forth. And if you share this shortened URL with other users, they'll get the same behavior, provided they also hold a Digg login cookie.

At this point you're probably expecting me to rant about how evil the DiggBar is, and how it, too, is destroying the web, etcetera, etcetera, so on, and so forth. But I can't muster the indignant rage. I can give you, at best, ambivalence. Here's why:

  1. The DiggBar is not served to the vast majority of anonymous users, but only to users who have opted in to the Digg experience by signing up.
  2. The new rel="canonical" directive is used on target links so search engines can tell which links are the "real", authoritative links to the content. They won't be confused or have search engine juice diluted by Digg's shortened URLs. At least that's the theory, anyway.
  3. No Digg ads are served via the DiggBar, so the framed content is not "wrapped" in ads.
  4. I believe Digg users themselves can opt out of DiggBar via a preferences setting.

Digg is trying to build a business, just like we are with Stack Overflow. I can't fault them for their desire to extend the Digg community outward a little bit, given the zillions of outgoing links they feed to the world. Particularly when they attempted to do so in a semi-ethical way, actively soliciting community feedback along the way.

In short, Digg isn't the problem. But even if they were -- if you don't want to be framed by the DiggBar, or any other website for that matter, you could put so-called "frame-busting" JavaScript in your pages.

// Classic frame-busting: if this page has been loaded inside a frame,
// replace the top-level location with our own URL to break out of it.
if (parent.frames.length > 0) {
    top.location.replace(document.location);
}

Problem solved! This code (or the many frame-busting variants thereof) does work on the DiggBar. But not every framing site is as reputable as Digg. What happens when we put on our hypothetical black hats and start designing for evil?

I'll tell you what happens. This happens.

// Frame-busting-busting: run by the evil framing page to sabotage any
// frame-busting script inside the framed content.
var prevent_bust = 0;
// Every attempted navigation away (i.e. the frame-buster firing) bumps the counter.
window.onbeforeunload = function() { prevent_bust++; };
setInterval(function() {
    if (prevent_bust > 0) {
        prevent_bust -= 2;
        // Redirect the pending navigation to a URL that answers with HTTP 204
        // (No Content), which cancels the navigation and leaves the frame in place.
        window.top.location = 'http://server-which-responds-with-204.com';
    }
}, 1);

On most browsers a 204 (No Content) HTTP response will do nothing, meaning it will leave you on the current page. But the request attempt will override the previous frame busting attempt, rendering it useless. If the server responds quickly this will be almost invisible to the user.

When life serves you lemons, make a lemon cannon: frame-busting-busting JavaScript. This code does the following:

  • increments a counter every time the browser attempts to navigate away from the current page, via the window.onbeforeunload event handler
  • sets up a timer that fires every millisecond via setInterval(), and if it sees the counter incremented, changes the current location to a URL under the attacker's control
  • that URL serves up a page with HTTP status code 204, which does not cause the browser to navigate anywhere (a sketch of such a server follows below)
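
That "204 server" is almost trivially small. Here's a sketch of one in Node.js -- my choice of tooling, purely for illustration; any web server that can return a bare 204 status will do:

// A server whose only job is to answer every request with 204 No Content.
// Browsers treat 204 as "do nothing", so the frame-buster's navigation attempt
// is swallowed and the framed page stays right where it is.
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(204);
  res.end();
}).listen(8080);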

Net effect: frame-busting busted. Which might naturally lead you to wonder -- hey buster, can you bust the frame-busting buster? And, if so, where does it end?

In the 1998 movie The Big Hit, the protagonists kidnap the daughter of an extremely wealthy Japanese businessman. When they call to deliver the ransom demand, they turn to Gump, who employs a brand-name Trace Buster to prevent the police from tracing the call.

Unbeknownst to Gump, the father has a Trace-Buster-Buster at his disposal. This in turn triggers Gump to use his Trace-Buster-Buster-Buster in an ever escalating battle to evade detection.

What's really scary is that, as near as I can tell, there is no solution. Due to cross-domain JavaScript security restrictions, it is almost impossible for the framed site to block or interfere with the parent page's evil JavaScript as it intentionally and aggressively blocks the frame-busting.

If an evil website decides it's going to frame your website, you will be framed. Period. Frame-busting is nothing more than a false sense of security; it doesn't work. This was a disturbing revelation to me, because framing is the first step on the road to clickjacking:

A clickjacked page tricks a user into performing undesired actions by clicking on a concealed link. On a clickjacked page, the attackers show a set of dummy buttons, then load another page over it in a transparent layer. The users think that they are clicking the visible buttons, while they are actually performing actions on the hidden page. The hidden page may be an authentic page, and therefore the attackers can trick users into performing actions which the users never intended to do and there is no way of tracing such actions later, as the user was genuinely authenticated on the other page.

For example, a user might play a game in which they have to click on some buttons, but another authentic page like a web mail site from a popular service is loaded in a hidden iframe on top of the game. The iframe will load only if the user has saved the password for its respective site. The buttons in the game are placed such that their positions coincide exactly with the select all mail button and then the delete mail button. The consequence is that the user unknowingly deleted all the mail in their folder while playing a simple game. Other known exploits have been tricking users to enable their webcam and microphone through flash (which has since been corrected by Adobe), tricking users to make their social networking profile information public, making users follow someone on Twitter, etc.

I've fallen prey to a mild clickjacking exploit on Twitter myself! It really does happen -- and it's not hard to do.

Yes, Digg frames ethically, so your frame-busting of the DiggBar will appear to work. But if the framing site is evil, good luck. When faced with a determined, skilled adversary that wants to frame your content, all bets are off. I don't think it's possible to escape. So consider this a wake-up call: you should build clickjacking countermeasures as if your website could be framed at any time.

I was a skeptic. I didn't want to believe it either. But once I was shown the exploits on our own site -- fortunately, by a white hat security expert -- I came to regret that skepticism. Don't let frame-busting code lull you into a false sense of security, either.
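
What might an actual countermeasure look like? One option -- browser support was only beginning to appear as I wrote this -- is the X-Frame-Options response header, which asks the browser itself to refuse to render your pages inside anyone else's frame. Unlike frame-busting script, the framing page can't tamper with it. A minimal sketch, again using Node.js purely for illustration:

// Serve every page with X-Frame-Options so the browser refuses to frame it.
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/html',
    'X-Frame-Options': 'DENY'   // or 'SAMEORIGIN' to allow only your own frames
  });
  res.end('<html><body>No framing allowed here.</body></html>');
}).listen(8080);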

Url Shorteners: Destroying the Web Since 2002

Is anyone else as sick as I am of all the mainstream news coverage on Twitter? Don't get me wrong, I'm a Twitter fan, and I've been a user since 2006. To me, it's a form of public instant messaging -- yet another way to maximize the value of my keystrokes. Still, I'm a little perplexed as to the media's near-obsession with the service. If a day goes by now without the New York Times or CNN mentioning Twitter in some way, I become concerned. Am I really getting all the news? Or just the stupid, too long, non-140-character version of the news?

I guess I should be pleased that I was a (relatively) early adopter and advocate of software that has achieved the rarest of feats in the software industry -- critical mass. Adoption by the proverbial "average user". Whatever you may think of Twitter, consider this: as a software developer, you'll be fortunate to build one project that achieves critical mass in your entire life. And even then, only if you are a very, very lucky programmer: in the right place, at the right time, with the right idea, working with the right people. Most of us never get there. I don't think I will.

There is one side effect of this unprecedented popularity, though, that I definitely wouldn't have predicted: the mainstreaming of URL shortening services. You can barely use Twitter without being forced into near-mastery of URL shortening. For example, this is the super-secret, patented formula I often use when composing my Twitter messages:

"brief summary or opinion" [link for more detail]

Twitter only allows 140 characters in each status update. Some might view this as a limitation, but I consider it Twitter's best feature. I am all for enforced brevity. Maybe that's due to the pain of living through a lifetime of emfail. But depending on the size of the comment and the URL (and some URLs can be ridiculously long), I can't quite fit everything in there without sounding like an SMS-addled teenage girl. This is where URL shortening comes in.

Now, I know what you're thinking. You're a clever programmer. You could implement some kind of fancy jQuery callback to shorten the URL, and replace the longer URL with the shorter URL right there in the text as the user pauses in typing. But you don't even have to be that clever; most of the URL shortening services (that aren't in their infancy) deliver a rather predictable size for the URLs they return. You could simply estimate the size of the URL post-shortening -- maybe adding 1 character as a fudge factor for safety -- and allow the update.
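
Here's a sketch of the estimate I mean; the 25-character figure (plus one as the fudge factor) for a post-shortening URL is my assumption, not any particular service's guarantee:

// Will "brief summary or opinion" [shortened link] fit in a 140-character tweet?
var LIMIT = 140;
var SHORTENED_URL_ESTIMATE = 25 + 1;   // typical shortened URL, plus a fudge character

function fitsAfterShortening(comment) {
  return comment.length + 1 + SHORTENED_URL_ESTIMATE <= LIMIT;  // +1 for the space
}

console.log(fitsAfterShortening('Is anyone else as sick as I am of all the mainstream news coverage on Twitter?'));  // true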

Twitter, I can assure you, is far more brain damaged than you can possibly imagine. It will indeed shorten URLs that fit in the 140 character limit (whoopee!), but it does nothing for URLs that don't fit -- it will not allow you to submit the message. All part of its endearing charm.

Lame, yes, but it means that the typical, mainstream browser-based Twitter user is forced to become proficient with URL shortening services. Due to the increased exposure they've enjoyed through Twitter's meteoric rise to fame, the number of URL shortening services has exploded, and rapidly evolved -- they're no longer viewed as utility services to make URLs more convenient, but a way to subjugate, control, and perhaps worst of all, "monetize" the web experience.

This is dangerous territory we're veering into now, as Joshua Schachter explains.

So there are clear benefits for both the service (low cost of entry, potentially easy profit) and the linker (the quick rush of popularity). But URL shorteners are bad for the rest of us.

The worst problem is that shortening services add another layer of indirection to an already creaky system. A regular hyperlink implicates a browser, its DNS resolver, the publisher's DNS server, and the publisher's website. With a shortening service, you're adding something that acts like a third DNS resolver, except one that is assembled out of unvetted PHP and MySQL, without the benevolent oversight of luminaries like Dan Kaminsky and St. Postel.

The web is little more than a maze of hyperlinks, and if you can insert yourself as an intermediary in that maze, you can transform or undermine the experience in fundamental ways. Consider the disturbing directions newer URL shortening services are taking:

  • NotifyURL sends an email when the link is first visited.
  • SnipURL introduces social bookmarking features such as usernames and RSS feeds.
  • DwarfURL generates statistics.
  • Adjix, XR.com and Linkbee are ad-supported models of URL shorteners that share the revenue with their users.
  • bit.ly offers gratis click-through statistics and charts.
  • Digg offers a shortened URL which includes not just the target URL, but an iframed version that includes a set of Digg-related controls called the Digg bar.
  • Doiop allows the shortening to be selected by the user, and Unicode can be used to achieve really short URLs.

Believe it: the humble hyperlink, thanks to pervasive URL shortening, can now be wielded as a weapon. The internet is the house that PageRank built, and it's all predicated on hyperlinks. Once you start making every link your special flavor of "shortened" link, framing the target content -- heck, maybe wrapping it in a few ads for good measure -- you've completely turned that system on its head.

What's aggravating to me is that the current situation is completely accidental. If Twitter had provided a sane way to link a single word, none of these weaselly URL shortening clones would have reared their ugly heads at all. Consider how simple it is to decouple the hyperlink from the display text in, say, phpBB, or Markdown, or even good old HTML markup itself:

<a href="http://example.com">foo</a>
[url=http://example.com]foo[/url]
[foo](http://example.com)

Every tiny URL is another baby step towards destroying the web as we know it. Which is exactly what you'd want to do if you're attempting to build a business on top of the ruins. Personally, I'd prefer to see the big, objective search engines who naturally sit at the center of the web offer their own URL shortening services. Who better to generate short hashes of every possible URL than the companies who already have cached copies of every URL on the internet, anyway?
