Book Club: Normal Accidents

Welcome to Risk Book Club! Where I read and summarize arguments about risk so you don't have to. Today we're reading Normal Accidents by Charles Perrow (1984).


The bar summary: Most of us go through life trying to avoid accidents, but Charles Perrow says industrial society is so complex that we might as well get used to them!


Ok but really tell me what this book is about: Perrow, writing in the 1980s, observes that we live in a world of more and more “high-risk technologies”: nuclear power plants, airplanes, dams, and genetic engineering all make his scary list.


Complex systems cause failures because of the way their parts interrelate: Perrow argues that two or more component failures can combine into a catastrophe when we haven’t anticipated how the components of our system might interact.


Examples of this abound. The Boeing 737 MAX crashes are a good example of a system that is too “tightly coupled,” in Perrow’s terms. The planes relied on a single sensor to gauge their angle of attack, and when this sensor failed, there was no backup to alert pilots to the accurate angle. Which would have been fine, if pilots had been trained to recognize this error and correct it. But they hadn’t been. So the interaction between the sensor failure and the human component led to catastrophe (side note: this article is excellent, but I really need to stop reading about plane crashes so I can enjoy travel again).
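If it helps to see the design difference in miniature, here’s a toy sketch of the single-sensor problem versus a cross-checked design. This is my illustration, not Boeing’s actual software and not Perrow’s; the readings and the tolerance value are invented.

```python
# Toy sketch of tight vs. looser coupling in a sensor design.
# All values here are invented for illustration; this is not real avionics code.

def single_sensor_pitch(sensor_a: float) -> float:
    """Tightly coupled: one bad reading silently drives the whole decision."""
    return sensor_a

def cross_checked_pitch(sensor_a: float, sensor_b: float, tolerance: float = 5.0):
    """Looser coupling: a second reading plus a disagreement alert gives the
    humans in the system time and information to step in."""
    if abs(sensor_a - sensor_b) > tolerance:
        return None, "SENSOR DISAGREE -- hand control back to the pilots"
    return (sensor_a + sensor_b) / 2, "ok"

print(single_sensor_pitch(74.5))        # a wildly wrong reading goes unquestioned
print(cross_checked_pitch(74.5, 2.1))   # the mismatch gets caught and surfaced
```

(Real avionics is obviously far more complicated than this; the sketch just shows why a second, cross-checked input plus an alert counts as “space” in Perrow’s sense.)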

Best out of context quote: “This grant allowed me to put together a toxic and corrosive group of graduate research assistants who argued with me and each other for a year” (p. viii). I guess when the research team is bad, Charles gets his revenge in print.


Why should I care? In your workplace, this might look like a computer system going down on the very day that the only person in the office who knows the ins and outs of that system is out sick (a relevant concern during COVID). Just that person being out, or just the system being down, wouldn’t be a big deal on its own, but because these two components rely on each other, together they can cause a crisis. This is what Perrow means by “tight coupling” in systems.


For Perrow, the best way forward when dealing with risky systems is to reduce the chain reactions caused by tight coupling. In other words, ask yourself, “How can I create space in this system?”
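If you like to see ideas as code, here’s a toy sketch of what “creating space” can look like, using the workplace example above. None of this comes from the book; the function names, the cached copy, and the runbook are made up, but the point is that a single redundancy keeps one failure from turning into two.

```python
# Toy sketch: the workplace example above, with and without slack.
# Everything here (names, data, the ConnectionError) is invented for illustration.

CACHED_COPY = {"invoice_42": "processed"}  # a redundant, slightly stale record

def ask_the_one_system(key: str) -> str:
    # Stand-in for the single system (or single person) everything depends on.
    raise ConnectionError("system down, and the only person who knows it is out sick")

def tightly_coupled_lookup(key: str) -> str:
    # No slack: any failure upstream immediately becomes everyone's failure.
    return ask_the_one_system(key)

def loosely_coupled_lookup(key: str) -> str:
    # "Creating space": a fallback means one failure no longer cascades.
    try:
        return ask_the_one_system(key)
    except ConnectionError:
        return CACHED_COPY.get(key, "unknown -- follow the written runbook")

print(loosely_coupled_lookup("invoice_42"))  # still answers despite the outage
```

The redundancy isn’t free, of course: someone has to keep the cached copy and the runbook current, which is exactly the cost trade-off the bottom line below gets at.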


I tell my students this all the time when we talk about “controlling the controllables.” Your car not starting one morning isn’t a big deal unless you need to get to the library to submit an assignment that is due that very morning. If the assignment is already turned in, or you follow a rigorous car maintenance schedule due to your fear of driving (this is not an autobiographical detail), those two components wouldn’t interact on the morning the assignment was due.


But, in life, this is pretty privileged advice. Maybe you can’t take your car to the mechanic until your next paycheck comes in. Maybe your at-home laptop is broken or your wifi is down. Preparing for risk, and creating that space in our systems, requires a basic level of security that many people just don’t have. That’s why personalizing risks like natural disasters is so dang frustrating. “Just make a preparedness kit with 72 hours of food and water!” the government says, “and buy flood insurance!” But if you don’t have the extra resources to do those things, the system is already very tightly coupled when disaster hits.


The bottom line: Nonetheless, many companies do have the resources to create space in their systems but sometimes don’t because of the cost. Perrow is saying that an accident is “normal” when, given the complexity of the system, failures are unimagined yet inevitable. And if you’re risk-averse and trying to save your workplace a bunch of bad press, lost money and time, or worse, you should work on imagining the cascading effects of these systems and create redundancies that reduce risk.