Normal Accidents

Author: Charles Perrow
Publisher: Basic Books
Publication date: 1984
Normal Accidents: Living with High-Risk Technologies is a 1984 book by Yale sociologist Charles Perrow, which provides a detailed analysis of complex systems from a social sciences perspective. It was the first to "propose a framework for characterizing complex technological systems such as air traffic, marine traffic, chemical plants, dams, and especially nuclear power plants according to their riskiness". Perrow argues that multiple and unexpected failures are built into society's complex and tightly coupled systems. Such accidents are unavoidable and cannot be designed around.

Perrow's argument rests on three observations: human error is pervasive, big accidents tend to escalate from small ones, and technology is not the problem, the organizations are. Each of these principles is still relevant today.

"Normal" accidents, or system accidents, are so called by Perrow because such accidents are inevitable in extremely complex systems. Given the characteristics of the systems involved, multiple failures that interact with each other will occur despite efforts to avoid them. Perrow said that operator error is a very common problem, that many failures relate to organizations rather than technology, and that big accidents almost always have very small beginnings. Such events appear trivial at first, before unpredictably cascading through the system to create a large event with severe consequences.

Normal Accidents contributed key concepts to a set of intellectual developments in the 1980s that revolutionized the conception of safety and risk. It made the case for examining technological failures as the product of highly interacting systems, and highlighted organizational and management factors as the main causes of failures. Technological disasters could no longer be ascribed to isolated equipment malfunction, operator error or acts of God.

Perrow identifies three conditions that make a system likely to be susceptible to normal accidents: the system is complex, the system is tightly coupled, and the system has catastrophic potential.

The inspiration for Perrow's book was the 1979 Three Mile Island accident, where a nuclear accident resulted from an unanticipated interaction of multiple failures in a complex system. The event was an example of a normal accident because it was "unexpected, incomprehensible, uncontrollable and unavoidable".

Perrow concluded that the failure at Three Mile Island was a consequence of the system's immense complexity. Such modern high-risk systems, he realized, were prone to failures however well they were managed. It was inevitable that they would eventually suffer what he termed a 'normal accident'. Therefore, he suggested, we might do better to contemplate a radical redesign, or if that was not possible, to abandon such technology entirely.
