Let the IDE do it

On Bruce Eckel’s Static vs. Dynamic [typing]:

Despite this, I’ve had some leanings back in the direction of static type checking. As you point out, the goal is to create solid components – the question is how to accomplish that? In a dynamic language you have the flexibility to do rapid experimentation which is highly productive, but to ensure that your code is airtight you must be both proficient and diligent at unit testing. In a language that leans towards static type checking, the compiler will ensure that certain things will not slip through the cracks, and this is helpful, although the resulting language will typically make you work harder for a desired result, and the reader must also work harder to understand what you’ve done. I think the impact of this is much greater than we imagine.

In addition, I think that statically typed languages give the illusion of program correctness. In fact, they can only go so far in determining the correctness of a program, by checking the syntax. But I think such languages encourage people to think everything is OK, when in fact the requirement for unit testing is just as important. I also suspect that the extra effort required to run the gauntlet of the compiler saps some of the energy required to do the unit testing. And you bring up an interesting question in suggesting that a dynamically-typed language may require more unit testing than a statically typed language. Of this I am not convinced; I suspect the amount may be roughly the same and if I am correct it implies that the extra effort required to jump through the static type-checking hoops may be less fruitful than we might believe.

What is the best of both worlds? In my own experience, it’s very helpful to create models in a dynamic language, because there is a very low barrier to redesigning as you learn. Possibly more important, you’re able to quickly try out your ideas to see how they work with actual data, to get some real feedback about the veracity of the model, and change the model rapidly to conform to your new understanding.

Bruce goes on to halfheartedly propose modeling in one [dynamic] language and developing in another [static] language.* Is this a workable solution? Personally, I don’t see why we can’t have our cake and eat it too – in a single language. If we tightly couple the IDE and the language, the IDE could warn us about static typing errors, but give us the flexibility to treat the types dynamically when we compile. Give the IDE some functionality traditionally assigned to the compiler: that’s the true leap of faith, and the only way to deliver the best of both worlds.
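As a rough sketch of what that might look like in practice (using Python's optional type hints as a stand-in, with a hypothetical function purely for illustration), an IDE or a checker such as mypy can flag the annotated mismatch, while the interpreter ignores the annotations entirely and runs the code dynamically:

```python
def repeat_label(label: str, times: int) -> str:
    # The annotations are visible to the IDE and static checkers only;
    # at runtime Python never enforces them.
    return label * times

# A static checker or IDE flags this call (an int where a str is expected),
# but the interpreter runs it anyway: dynamically, 3 * 4 is simply 12.
print(repeat_label(3, 4))  # prints 12 -- no compile-time gate, just a warning
```

The warning comes from the tooling rather than the runtime, which is roughly the division of labor being proposed here.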

I have also been remiss in not mentioning Wesner Moise’s posts on next-generation IDE designs:

Religious beliefs about key bindings will disappear, because most of those key combinations will become completely irrelevant in the new world. Yes, Don Box will finally kiss his Emacs goodbye. The new paradigm actually eliminates syntax errors (at least, they will be flagged immediately by the new graphical editor) and many semantic errors, because each edit operation directly alters, adds or deletes a node in the in-memory parse tree.

This also supports the tight IDE and compiler coupling that I think will become the hallmark of next-generation languages over the next 5-10 years.
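To make the parse-tree idea concrete, here's a toy sketch (in Python, using its built-in ast module; not how such an editor would actually be implemented) of an edit expressed as a change to an in-memory syntax tree rather than to raw text, which is why it cannot produce a syntax error:

```python
import ast

# Parse a tiny program into its in-memory syntax tree.
tree = ast.parse("total = price * quantity")

# "Edit" by modifying a node directly instead of retyping source text:
# rename the variable 'price' to 'unit_price' wherever it appears.
for node in ast.walk(tree):
    if isinstance(node, ast.Name) and node.id == "price":
        node.id = "unit_price"

# The tree stays well-formed by construction, so the edit cannot introduce
# a syntax error; render it back to source to see the result.
print(ast.unparse(tree))  # total = unit_price * quantity
```

In a structured editor of the kind Moise describes, every edit operation would be something like this, applied as you type.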

*It’s difficult to imagine reading a more unenthusiastic endorsement of static typing. Although it’s true that static typing hasn’t killed any puppies. Yet.
