
The Principle of Least Power

Tim Berners-Lee on the Principle of Least Power:

Computer Science spent the last forty years making languages which were as powerful as possible. Nowadays we have to appreciate the reasons for picking not the most powerful solution but the least powerful. The less powerful the language, the more you can do with the data stored in that language. If you write it in a simple declarative form, anyone can write a program to analyze it. If, for example, a web page with weather data has RDF describing that data, a user can retrieve it as a table, perhaps average it, plot it, deduce things from it in combination with other information. At the other end of the scale is the weather information portrayed by the cunning Java applet. While this might allow a very cool user interface, it cannot be analyzed at all. The search engine finding the page will have no idea of what the data is or what it is about. The only way to find out what a Java applet means is to set it running in front of a person.
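Berners-Lee's contrast is easy to demonstrate. Here is a minimal sketch of my own (not from his article, and using JSON rather than RDF for brevity; every name and value in it is hypothetical): the same weather readings first published as plain declarative data, which any small script can average or plot, and then buried inside rendering code, where they exist only as side effects of running the program.

```typescript
// Hypothetical weather readings published as plain declarative data (JSON).
// Any consumer can parse and analyze this without running the publisher's code.
const weatherData = `[
  { "day": "Mon", "highC": 21 },
  { "day": "Tue", "highC": 24 },
  { "day": "Wed", "highC": 19 }
]`;

interface Reading {
  day: string;
  highC: number;
}

const readings: Reading[] = JSON.parse(weatherData);

// Because the format is declarative, analysis is a few lines for anyone:
const averageHigh =
  readings.reduce((sum, r) => sum + r.highC, 0) / readings.length;
console.log(`Average high: ${averageHigh.toFixed(1)} C`);

// Contrast with the "cunning applet" approach: the same numbers are reachable
// only by executing drawing code. A search engine or script sees nothing but
// opaque instructions; the data appears only when a person watches it run.
function renderWeatherWidget(ctx: { fillText(s: string, x: number, y: number): void }) {
  ctx.fillText("Mon 21", 10, 10);
  ctx.fillText("Tue 24", 10, 30);
  ctx.fillText("Wed 19", 10, 50);
}
```

The first representation can be indexed, averaged, and combined with other data by programs that have never seen it before; the second has to be set running in front of a person, which is exactly the failure Berners-Lee describes.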

This was later codified in a more formal W3C document, The Rule of Least Power. I propose a corollary to this rule, which in the spirit of recent memes, I'll call Atwood's Law: any application that can be written in JavaScript will eventually be written in JavaScript.

If you liked that article, I recommend the rest of Berners-Lee's architectural and philosophical points page. Although the content is quite old in internet time (only two of the articles were written in the last year), it still contains timeless nuggets of advice and insight from the guy who invented the World Wide Web.
