Coding Horror

programming and human factors

The Positive Impact of Negative Thinking

In Waltzing with Bears: Managing Risk on Software Projects, DeMarco and Lister outline the dangers of penalizing negative thinking:

Once you've identified and quantified these risks, they can be managed just like the others. But getting them out on the table can be a problem. The culture of our organizations sometimes makes it impossible to talk about a really worrisome risk. We are like a primitive tribe that tries to hold the devil at bay by refusing to say his name.

Why didn't [technicians present at the 1986 Challenger launch] speak up [about the known risks of launching the shuttle in subzero weather conditions]? Their reasons were the same ones that stop people from articulating risks at companies everywhere. They take the form of unwritten rules, built into the corporate culture:

  1. Don't be a negative thinker.
  2. Don't raise a problem unless you have a solution for it.
  3. Don't say something is a problem unless you can prove it is.
  4. Don't be the spoiler.
  5. Don't articulate a problem unless you want its immediate solution to become your responsibility.

Healthy cultures attach great value to the concept of a team. Being judged a "team player" is enormously important, and not being one can be fatal to a career. Articulating a risk shouldn't be seen as anti-team, but it often is. These unwritten rules are not very discriminating; they don't make much distinction between speaking up responsibly and whining. And because the rules are never openly discussed, they never get adjusted for changing circumstances.

We are all enjoined to adopt a can-do mentality in our work. And there's the rub. Naming the risk is an exercise in can't-do. Risk discovery is profoundly at odds with this fundamental aspect of our organizations.

Waltzing with Bears is very clear on this point: the biggest risks on any software project are the ones you haven't considered. You can't know the unknown, of course, but you'll do a lot better at risk management if you encourage a culture of responsible risk assessment instead of mindless can-do heroics.

Personally, I love it when developers come to me with potential problems in our applications. Far from being negative, this has all kinds of positive implications:

  • Deep knowledge of the application. The developer knows enough about the entire app to feel confident there's a problem.
  • Concern for quality of workmanship. A less concerned developer would shrug this off as "not their problem". They get paid either way, right?
  • Team player. If a developer is bringing up problems in a proactive way, that means they also (consciously or not) understand why it's important for the entire team not to fall prey to the Broken Windows syndrome.

The only way to truly manage risk on a software development project is to solicit input from every team member on what could go wrong-- not only at the start of the project but also throughout its lifecycle. If you do, you'll have a far more predictable development schedule. And a much better product.

Written by Jeff Atwood

Indoor enthusiast. Co-founder of Stack Overflow and Discourse. Disclaimer: I have no idea what I'm talking about. Find me here: https://infosec.exchange/@codinghorror