Normal Accidents
Wed, Oct 1, 2014
I’m just finishing Eric Schlosser’s Command and Control, a frightening account of a Titan II missile accident intermingled with other terrifying Tritium-related tidbits. You may have read an excerpt from it about a close call involving a crashed B-52 and two hydrogen bombs.
My uncle was a Cold War-era B-52 navigator. As a kid, I never really understood what he did. I rarely saw him, but when I did, I remember him patiently answering my questions about airplanes and reviewing the (in)accuracy of my latest airplane designs. I thought about him, and his hazardous occupation, a lot as I read this book.
The epilogue paraphrases Charles Perrow’s Normal Accidents: Living with High-Risk Technologies (just added to my Kindle list). If you are a software developer, I hope this quote rings true:
Certain patterns and faults seemed common to all of them. The most dangerous systems had elements that were “tightly coupled” and interactive. They didn’t function in a simple, linear way, like an assembly line. When a problem arose on an assembly line, you could stop the line until a solution was found. But in a tightly coupled system, many things occurred simultaneously — and they could prove difficult to stop. If those things also interacted with each other, it might be hard to know exactly what was happening when a problem arose, let alone know what to do about it. The complexity of such a system was bound to bring surprises. “No one dreamed that when X failed, Y would also be out of order,” Perrow gave as an example, “and the two failures would interact so as to both start a fire and silence the fire alarm.”