Coding Horror

programming and human factors

Inherits Nothing

Have you ever noticed that new .NET developers have a tendency to use inheritance for... well, everything? On some level, this is understandable, since inheritance is used throughout the framework; everything in .NET inherits from a root object. There's one big difference, though: we're writing crappy business logic code, not a language. What is appropriate for a language developer may not be appropriate for simple business code that needs to be maintainable and easy to understand above all else.

Inheritance is a specialized tool, and should only be used for situations that truly warrant a parent-child relationship, and all the "hidden" behavior that entails. I'm sure I have lost some of the OO purists at this point, so lest you think I'm a lunatic who has completely abandoned OO principles, I'd like to point out that I am in good company: Dan Appleman also feels this way. Here's a little excerpt from his excellent (and still relevant) Moving to VB.NET: Strategies, Concepts and Code:

I've been a C++ programmer for longer than I've programmed in Visual Basic-- and I still program actively in both languages. I've been a firm advocate of object-oriented programming since I first understood the concept of a class back in 1977; and I've programmed in frameworks like ATL that use inheritance extensively and successfully.

But in terms of using inheritance in one of my own applications or components, in all of those years, I can think of maybe a half a dozen times, at most, where inheritance was the right choice.

So, yes, .NET uses inheritance-- it's built into the architecture. And yes, the code generated by the various designers will use inheritance to give you the framework on which you'll build your own code.

However, if you really understand inheritance, you may find yourself living the rest of your career without ever creating a single inheritable class or component.

He goes on to say exactly what I would: inheritance is only one of many ways to achieve code reuse. A simple object, without all the complex (and mostly hidden) rules of inheritance, gets you to the most important goal just as well.
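To make that concrete, here's a minimal sketch (in Java, with hypothetical names) of reuse through a plain contained object rather than a base class. Extending a collection class would expose its entire surface to callers; holding it privately reuses all of its code while keeping the hidden behavior hidden:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical example: a stack that reuses ArrayDeque's code by
// *containing* it, not inheriting from it. Extending a list class
// would leak every list method (insert at index, remove from the
// middle) and let callers break the stack contract.
public class SimpleStack<T> {
    private final Deque<T> items = new ArrayDeque<>(); // reused, not inherited

    public void push(T item) {
        items.addFirst(item);
    }

    public T pop() {
        if (items.isEmpty()) {
            throw new IllegalStateException("stack is empty");
        }
        return items.removeFirst();
    }

    public int size() {
        return items.size();
    }
}
```

The object's public surface is exactly three methods, and nothing about the internal deque can surprise a caller or a maintainer-- which is the whole point.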

As for the argument that "the .NET framework is built on inheritance, so we should be too!", my question to you is this: how many of you are writing programming languages? How many of you guys are writing operating system kernels? The reality is, those are extreme and rare circumstances, and they shouldn't be used as a model for anything other than writing languages or operating systems.

To me, the added baggage of inheritance-- like all added complexity-- is always guilty until proven innocent. Don't inherit unless you have a very compelling and specific set of reasons to inherit.
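What does a compelling reason look like? The framework code Appleman mentions is the classic case: a base class that owns the control flow and calls down into your code. Here's a minimal sketch (hypothetical names, Java) of that template-method shape-- the one situation where the parent-child relationship is the point, not baggage:

```java
// Hypothetical sketch of the rare case where inheritance earns its
// keep: the base class fixes the order of steps, and subclasses fill
// in exactly one well-defined extension point.
abstract class ReportJob {
    // The framework calls run(); the control flow lives here and
    // is deliberately final so subclasses can't rearrange it.
    public final String run() {
        return "HEADER\n" + body() + "\nFOOTER";
    }

    // The single step a subclass must supply.
    protected abstract String body();
}

class SalesReport extends ReportJob {
    @Override
    protected String body() {
        return "sales figures go here";
    }
}
```

Notice the relationship is inverted from typical business code: the base class calls the child, not the other way around. If you're not building that kind of framework, you probably don't have the compelling reason.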

Written by Jeff Atwood

Indoor enthusiast. Co-founder of Stack Exchange and Discourse. Disclaimer: I have no idea what I'm talking about. Find me here: http://twitter.com/codinghorror