Culture of Negligence – Learn What NASA Failed to Learn

However mindful and careful you might be behind the wheel, at some point you will inevitably do something dangerous. A near miss with another car while overtaking. Powering through a corner that turns out to be covered with black ice. Forcing the right of way at an intersection. A moment of mindlessness, an unseen risk, a lack of experience – each of these can be potentially disastrous. The thing is, in the vast majority of cases, nothing happens. Somehow you recover, only to burst out in nervous laughter once the adrenaline rush stops. Then, inevitably and perhaps subconsciously, you draw some conclusions. You might become more careful or, oddly, start to care less. The latter is more likely if similar experiences in the past ended the same way, with no serious consequences. Some action is potentially deadly – but it always works out, somehow. You might start tolerating it.

And the phenomenon is not limited to individuals. Teams do it. Societies do it. Even large, knowledge-based organizations do it. Ever heard of NASA?

Back in the 1970s, a new kind of space transportation was developed: the Space Shuttle, for many years the most complicated piece of machinery ever constructed by mankind. In 1986, over eight years after its first free flight, with no serious problems emerging along the way, the unthinkable happened.

An incorrectly designed O-ring (in layman’s terms, a kind of toroidal gasket) failed during lift-off. It caused a breach in a solid rocket booster joint, allowing pressurized burning gas from within the engine to escape. Within moments, the booster violently separated from the shuttle, causing the structural failure of the external tank in the process. Aerodynamic forces broke up the orbiter (the shuttle vehicle itself), and the cockpit plummeted back to Earth. The external tank carried liquid hydrogen fuel and liquid oxygen oxidizer, which erupted into a huge fireball. Everyone on board died, with the nightmarish investigation hinting that they might have been alive until the cockpit hit the ocean.

And the scary thing is, NASA management had been aware of the potentially flawed O-ring design since 1977. Yet for years, over the course of many launches, nothing had happened. The threat became tolerable, then acceptable, then just a normal fact.

Now, one would expect that NASA, the organization that put men on the Moon, would learn its lesson, right? Well, have you ever heard of Space Shuttle Columbia? In 2003, a piece of foam insulation hit the heat shield during take-off. Its severity went unrecognized. On re-entry, the damaged shield allowed hot atmospheric gases to penetrate one of the wings. The orbiter, blasting through the air at over twenty times the speed of sound, soon became unstable and slowly broke apart. The debris was scattered over an area the size of a small country. Sadly, yet again, everyone on board died.

Unbelievably, yet again, the foam shedding was nothing new. Among documented cases, it had happened six times before, each time with potentially serious consequences. The threat became tolerable, then acceptable, then just a normal fact.

Sociologist Diane Vaughan, in her book about the Challenger launch decision process, called this thinking the “normalization of deviance”. It’s the same thing we do with reckless driving. Worryingly, it’s the same thing that happens in corporate environments. Think of a project that’s flawed from the beginning. Pivot charts all red, unhappy staff, conflicting priorities – you know the drill. Astonishingly, with enormous effort and dedication, many of these projects deliver what was expected. What is then said about such a project?

“GREAT SUCCESS!”

Then there’s another project, and another, and another one after that. Things go wrong, management practices are irrational, priorities are miscommunicated – all that’s bad and ugly. One by one, the warning lights decay. People start neglecting them and accepting potential disasters as mere facts. It’s not far from taking a lift from someone who, you know, drives like an idiot on a regular basis, expecting that, once again, nothing bad will happen. The threat becomes tolerable, then acceptable, then just a normal fact.

Eventually, with continuous blunting of senses, a culture of negligence forms.

Whatever project you’re in now, look at the list of risks you have. It’s likely a painfully generic Excel sheet that you’re used to simply creating, storing and forgetting. Don’t. Things that could take down your project, your team or even your company might be in there. Sure, none of them has ever materialized.

So far.

Add your comment!