Coding Horror

programming and human factors

Building Unbreakable Links

I was reading through some of the DataGrid Girl's oh-so-cute article links, and I encountered a few dead ones. It's not really Marcie's fault; dead links are inevitable on any page as it ages. Such is the nature of absolute links. For example, this one:

http://msdn.microsoft.com/msdnmag/issues/02/03/cutting/cutting0203.asp

A few years ago, I had this thought: why do we use traditional absolute URLs anymore? Why not build all of our links using relative Google search terms? For the above broken link, we can restate it like so:

http://www.google.com/search?q=msdnmag+asp.net+data+shaping&btnI=1

All you need to do is run a quick function to determine three or four of the most distinctive words on the page, then feed them to Google as query terms with the "I'm Feeling Lucky" parameter. Now you have a permanent, unbreakable link to that content. Well, permanent unless Google goes out of business, or the content disappears from the internet completely.
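
The link-building half is trivial. A minimal Python sketch might look like this (the language and the lucky_link name are my choices, and it assumes Google keeps honoring the btnI=1 redirect):

```python
from urllib.parse import urlencode

def lucky_link(*terms: str) -> str:
    """Build an "I'm Feeling Lucky" URL from a handful of search terms.

    btnI=1 asks Google to redirect straight to the top result, so the
    link survives even if the target page moves to a new address.
    """
    return "http://www.google.com/search?" + urlencode(
        {"q": " ".join(terms), "btnI": "1"}
    )

print(lucky_link("msdnmag", "asp.net", "data", "shaping"))
# -> http://www.google.com/search?q=msdnmag+asp.net+data+shaping&btnI=1
```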

Of course, it's unlikely everyone will adopt this approach for the most obvious reason: Google would become unbelievably powerful. They would be the "link DNS" for the entire internet. But as a practical solution to Marcie's problem, I think it is totally workable. Whenever I link to articles in my code, I try to do so through very specific Google search terms, which are likely to produce valid links many years from now-- even if the content moves to a different place on the internet.

All I need is some sort of web-based tool to automatically parse a page and produce four or five unique words from it. It's sort of like googlewhacking, but with a more practical bent.
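
Until that tool shows up, a rough heuristic gets you most of the way there: strip the markup, throw out common words, and keep the long, repeated ones. Here's a toy sketch of the idea (the heuristic, the stopword list, and the example URL are all mine; a serious version would rank words against a web-scale frequency corpus instead):

```python
import re
import urllib.request
from collections import Counter

# Crude stand-in for a real list of common English words.
COMMON = {"the", "and", "for", "that", "with", "this", "from", "are",
          "was", "you", "your", "have", "has", "not", "but", "can"}

def distinctive_words(url, count=5):
    """Fetch a page and guess its most distinctive words.

    Heuristic only: strip the tags, drop short and common words, then
    favor long words that appear repeatedly on the page.
    """
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    text = re.sub(r"<[^>]+>", " ", html)              # remove markup
    words = re.findall(r"[a-z][a-z0-9.]{3,}", text.lower())
    freq = Counter(w for w in words if w not in COMMON)
    ranked = sorted(freq, key=lambda w: freq[w] * len(w), reverse=True)
    return ranked[:count]

# Placeholder URL; feed the output straight into lucky_link() above.
print(distinctive_words("http://example.com/some-article"))
```

Chain the two together and you can mint an unbreakable link for any page you want to cite.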

Written by Jeff Atwood

Indoor enthusiast. Co-founder of Stack Overflow and Discourse. Disclaimer: I have no idea what I'm talking about. Find me here: https://infosec.exchange/@codinghorror