Alan Kay is one of my computing heroes. All this stuff we do every day as programmers? Kay had a hand in inventing a huge swath of it:
Computer scientist Kay was the leader of the group that invented object-oriented programming, the graphical user interface, 3D computer graphics, and ARPANET, the predecessor of the Internet
So as you might imagine, I was pretty thrilled to see he was dabbling a little in Stack Overflow. It's difficult to fathom the participation of a legend like Alan in a site for regular programmers. Maybe we should add a Turing Award badge. At least people can't complain that it is unobtainable.
Jeff Moser, an avid Stack Overflow user with an outstanding blog of his own, had the opportunity to meet Alan recently and ask him about it. Jeff gave me permission to reprint his field report here.
Since I knew I'd be seeing Alan Kay at Rebooting Computing, I decided to verify his Stack Overflow usage in person. According to Alan, he found the original question using an automated search alert just like Atwood had guessed.
We then proceeded to discuss how it's sad that identity is still hard online. For example, it's hard to prove if I'm telling the truth here. As for that, the best I can offer is to look at my picture on my blog and compare with this picture from the Summit:
(Alan is on my right)
Alan is a great person to talk to because of his vast experience in the computing field.
He's currently working at the Viewpoints Research Institute, where they're doing some classic PARC-style research: trying to do for software what Moore's Law did for hardware. A decent explanation by Alan Kay himself is available here (wmv). For specifics, you might want to check out the recent PhD thesis of Alessandro Warth, one of Alan's students.
One of the greatest lessons I've personally learned from Alan is just how important computing history is in order to understand the context of inventions. One of Alan's greatest heroes is J.C.R. Licklider (a.k.a. "Lick"). Our discussions a few months ago led me to read "The Dream Machine" and write a post about it.
One consequence of studying the history well is that you'll notice a ton of the really cool and interesting stuff was developed in the ARPA->PARC era, and that progress has slowed since. I'd assume that's why he's curious about anything from after PARC's peak days (e.g. 1980 onward).
I'd say that Alan firmly believes that the "Computer Revolution Hasn't Happened Yet" (still) even though he's been talking about it for decades.
- see his '97 talk at OOPSLA.
- and this video from last month at the 40th anniversary of Engelbart's "Mother of all Demos".
Speculating from our discussions, I'd say the problem he sees is that computers should help us become better thinkers rather than serve as a way of "distracting/entertaining ourselves to death." Alan likes to use the example that our "pop culture" is more concerned with "air guitar" and "Guitar Hero" than with appreciating the genuine beauty and expressiveness of real instruments (even though they take a bit longer to master). Check out 1:03:40 of this video from Program for the Future. In effect, we're selling our potential short.
I think that's my biggest takeaway from Alan about computing: computers can do so much more than we're using them for now (e.g. provide "a teacher for every learner").
Hope this helps provide some context.
Indeed it does, Jeff. If you'd like to get a sense of what Alan is about and the things he's working on, I recommend this Conversation with Alan Kay from the ACM.
It's not that people are completely stupid, but if there's a big idea and you have deadlines and you have expedience and you have competitors, very likely what you'll do is take a low-pass filter on that idea and implement one part of it and miss what has to be done next. This happens over and over again. If you're using early-binding languages as most people do, rather than late-binding languages, then you really start getting locked in to stuff that you've already done. You can't reformulate things that easily.
Let's say the adoption of programming languages has very often been somewhat accidental, and the emphasis has very often been on how easy it is to implement the programming language rather than on its actual merits and features. For instance, Basic would never have surfaced because there was always a language better than Basic for that purpose. That language was Joss, which predated Basic and was beautiful. But Basic happened to be on a GE timesharing system that was done by Dartmouth, and when GE decided to franchise that, it started spreading Basic around just because it was there, not because it had any intrinsic merits whatsoever.
This happens over and over again. The languages of Niklaus Wirth have spread wildly and widely because he has been one of the most conscientious documenters of languages and one of the earlier ones to do algorithmic languages using p-codes (pseudocodes) -- the same kinds of things that we use. The idea of using those things has a common origin in the hardware of a machine called the Burroughs B5000 from the early 1960s, which the establishment hated.
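Kay's point about early- versus late-binding languages is easy to see in miniature. Here's a small sketch of my own (the class and names are invented for illustration, not drawn from Kay's work): in a late-bound language like Python, method lookup happens at call time, so a running system can be reformulated without rebuilding it.

```python
# Late binding in miniature: method lookup happens when you call,
# not when you compile, so existing objects pick up new behavior.

class Greeter:
    def greet(self):
        return "hello"

g = Greeter()
print(g.greet())  # prints "hello"

# Redefine the method after the instance already exists. Because
# lookup is late-bound, the very same object uses the new definition.
Greeter.greet = lambda self: "bonjour"
print(g.greet())  # prints "bonjour"
```

In an early-bound language, the equivalent change would typically mean editing the source and recompiling; the lock-in Kay describes is exactly this difficulty of reformulating things after the fact.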
To me, the quintessential Alan Kay presentation is Doing with Images Makes Symbols: Communicating With Computers.
As the video illustrates, computers are almost secondary to most of Alan's work; that's the true brilliance of it. The real goal is teaching and learning. I'm reminded of a comment Andrew Stuart, a veteran software development recruiter, once sent me via email:
One subtle but interesting observation that I would make - your article points out that "what software developers do best is learn" - this is close to the mark, though I would rearrange the words slightly to "what the best software developers do is learn." Not all software developers learn, but the best ones certainly do.
And this, I think, lies at the heart of everything Alan does -- computing not as an end in itself, but as a vehicle for learning how to learn.