Putting the Science Back Into Computer Science

The term "computer science" is a borderline oxymoron. Very little of what we do in software development is science:

Many historians suggest that modern science began around 1600 in the time and with the efforts of Galileo Galilei (1564-1642), Johannes Kepler (1571-1630), and Francis Bacon (1561-1626). Their era punctuated the change from scholasticism of the Middle Ages and Renaissance to science as we know it. Scholasticism largely involved deductive reasoning from principles supplied by Aristotle, by scripture, or by notions of perfection (which largely involved circles and spheres). It was thus a "top-down" intellectual enterprise. Modern science instead involved induction from multiple observations of nature, and so worked "bottom-up" from basic observation or experiment to generalization. In the words of Bacon's Novum organum, "For man is but the servant or interpreter of nature; what he does and what he knows is only what he has observed of nature's order in fact or in thought; beyond this he knows nothing and can do nothing. . . . All depends on keeping the eye steadily fixed upon the facts of nature and so receiving the images simply as they are."

Galileo's and Kepler's work exemplified this fundamental change in attitude. Medieval thinking had assumed a centrality of humanity, so that the earth on which humans lived was thought to be the center of the universe. It had also assumed a perfection requiring orbits of heavenly bodies to be circular. Nicolaus Copernicus (1473-1543, and thus a hundred years before Galileo and Kepler) had cautiously broken with the first of these assumptions to conclude tentatively that the earth orbited the sun, but he clung to the idea of a perfectly circular orbit. Galileo argued much more forcefully for an earth orbiting the sun, ultimately breaking the earth-centered view that was based on human-centered logic. Kepler showed that the orbits of the planets are ellipses, rather than the circles required of a philosophically perfect universe. More recent observations - that those orbits are changing ellipses, that the earth is not perfectly spherical but is an oblate spheroid, and that the sun occupies no central position in just one galaxy among billions of galaxies - would all be very distasteful to the scholastic view of the world, which assumed geometric perfection and human or earthly centrality.

To summarize: The logic of modern science requires that observations or facts govern the validity of generalizations or theories. Previous thinking had often gone the opposite direction. Galileo was reminded of that previous direction when he was taken to Rome and condemned because his "proposition that the sun is in the centre of the world and immovable from its place is absurd . . . because it is expressly contrary to Holy Scripture" (to quote the official judgment of the court).

Inside a computer, we're back in the dark ages: everyone is God, and the earth is the center of the universe. Under those conditions, classical objective science -- establishing a hypothesis, determining an experimental method, then gathering data to support or disprove your hypothesis -- is difficult.

If software development were a science, I doubt you would hear it referred to as craftsmanship, or farming. Even the term software engineering makes me uneasy; if bridges and airplanes were engineered the same way I "engineer" software, I'd be afraid to leave the house. Of course, those guys have a big advantage over me: they're grounded in the reality of physics. They don't have to build a bridge on Jupiter, or construct an airplane out of brand new, previously unknown materials. We constantly get the rug pulled out from under us, because there are no rules in our universe -- at least, none other than the ones we make up.

That's one of the reasons I find usability research so reassuring: it puts the science back into computer science. For example, browse a few back issues of the newsletter published by the Software Usability Research Laboratory at Wichita State. Ironically, the only way to achieve anything resembling objective science with computers is to measure things outside the box...

'Hot Spot' eye tracking map of the Atkins homepage

... See what I mean?
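
To make that loop concrete -- establish a hypothesis, determine an experimental method, gather data to support or disprove it -- here's a minimal sketch of how it might look when pointed at a usability question. Everything in it is hypothetical: the two designs, the participants, and the completion times are invented for illustration, and it leans on SciPy's Welch t-test rather than anything specific to the research mentioned above.

# Hypothetical example: suppose we suspect redesigned homepage B lets
# users finish a signup task faster than current homepage A.
# Null hypothesis: mean completion time is the same for both designs.
from scipy import stats

# Made-up task-completion times (seconds), one value per lab participant.
design_a = [48.2, 51.9, 44.7, 60.3, 55.1, 49.8, 62.0, 53.4]
design_b = [41.5, 39.8, 45.2, 38.9, 47.1, 42.6, 40.3, 44.0]

# Welch's two-sample t-test: is the difference in means more than the
# noise you'd expect between two small groups of participants?
t_stat, p_value = stats.ttest_ind(design_a, design_b, equal_var=False)

mean_a = sum(design_a) / len(design_a)
mean_b = sum(design_b) / len(design_b)
print(f"mean A = {mean_a:.1f}s, mean B = {mean_b:.1f}s")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A small p-value (conventionally < 0.05) is evidence against the null
# hypothesis: the data, not our intuition, gets the final word.
if p_value < 0.05:
    print("Reject the null: design B really does appear faster.")
else:
    print("Not enough evidence to call either design faster.")

The eye-tracking heat map above is the same idea with richer instrumentation: instead of timing a task, you record where actual eyeballs land, then check your guesses about the layout against what the measurements show.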

Written by Jeff Atwood
