One of the concepts in the book Managing the Unexpected is "Weak Signals of Failure". There is a lot to this: in operations, small things are always going wrong; this is the "Unexpected" in the title. Risk management is not just about reducing or avoiding risk but also about having a plan for coping when things go wrong.
In medicine, when poor outcomes result from the failure to recognize and properly deal with weak signals of failure, the term used is "failure to rescue".
From this post at The Epicurean Dealmaker - 50 Ways to Leave Your Lover:
The point of risk management is not to prevent failure, for that is impossible. The point is to have a plan ready to manage and control failure when it inevitably comes.
The post links to a commencement speech by Atul Gawande. The speech uses a medical example of a woman who had surgery for one problem but had a second hidden, unexpected problem, which was discovered by investigating a weak signal of failure.
From the speech -
This may in fact be the real story of human and societal improvement. We talk a lot about “risk management”—a nice hygienic phrase. But in the end, risk is necessary. Things can and will go wrong. Yet some have a better capacity to prepare for the possibility, to limit the damage, and to sometimes even retrieve success from failure.
When things go wrong, there seem to be three main pitfalls to avoid, three ways to fail to rescue. You could choose a wrong plan, an inadequate plan, or no plan at all. Say you’re cooking and you inadvertently set a grease pan on fire. Throwing gasoline on the fire would be a completely wrong plan. Trying to blow the fire out would be inadequate. And ignoring it—“Fire? What fire?”—would be no plan at all.
Also mentioned, however, was the Deepwater Horizon disaster -
In the BP oil disaster in the Gulf of Mexico two years ago, all of these elements came into play, leading to the death of eleven men and the spillage of five million barrels of oil over three months. According to the official investigation, there had been early signs that the drill pipe was having problems and was improperly designed, but the companies involved did nothing. Then, on the evening of April 20, 2010, during a routine test of the well, the rig crew detected a serious abnormality in the pressure in the drill pipe. They watched it and took more measurements, which revealed a number of other abnormalities that signal a "kick"—an undetected pressure buildup. But it was two hours before they recognized the seriousness of the situation—two hours without a plan of action. Then, when they did recognize the trouble, they sent the flow through a piece of equipment that can't handle such pressures. The kick escalated to a blowout, and the mud-gas mix exploded. At that point, emergency crews went into action. But for twelve minutes, no one sounded a general alarm to abandon the rig, leading directly to the loss of eleven lives in a second explosion.

Failure to Rescue and Weak Signals of Failure - the same concepts.
I highly recommend Managing the Unexpected; it's a good, solid book. I like being able to recognize these concepts when I run into them elsewhere.
K.C.
I posted about Managing the Unexpected here: Thinking Like a Mariner - Managing the Unexpected
and here: At Sea
A good review from Harvard Business Review is here.
For terminology, this is a good reference (PDF): Normal Accident Theory, from NASA.
3 comments:
Regarding the NASA study reference: it is dated five years before the shuttle Columbia disaster, in which NASA basically did nothing to investigate damage sustained during the launch. Apparently NASA quickly forgot its own warnings.
KC,
Are you OK? Just away at work, I expect - left a good meaty post to keep us busy in the meantime.
Bless you and come home safe!
Reid
Hey Reid,
Thanks for the comment. Been at sea. AIW.
K.C.