“If I took a different way, would it have changed our fates a little?” Asking the “what if” questions is just something we do when, looking back after a tragedy, we try to collectively understand what went wrong, what might have been prevented and what we might have done differently. In the case of Japan’s earthquake-tsunami-nuclear disaster, the crisis hasn’t yet passed.
For now the world waits and watches the Fukushima Daiichi reactors, their cooling mechanisms and the small crew of workers still on site struggling to stave off the worst forms of disaster. But while we wait we ponder – how far can we trust ourselves, and the systems we devise, to resist both predictable and unpredictable catastrophes? The particular combination of earthquake, tsunami, nuclear accident and bad weather was unpredictable, but each of these hazards is studied in Japan’s disaster response planning.
Indeed, at the request of the Japanese government, in 2009 the OECD carried out a review of Japan’s risk management policies concerning large-scale floods and earthquakes. Among other recommendations, the report states that: “Industries that can trigger special harm in case of flood accidents, such as chemical and nuclear industries, should be required by law to move to safer areas”.
In fact, often after a disaster strikes, it turns out that warnings went unheeded, or terrible mistakes were made. At Bhopal, safety systems were disabled to save money. At Three Mile Island, an indicator on a valve gave ambiguous information. In the world’s worst plane crash, at Tenerife airport in 1977, a pilot took off without clearance. It can also be the case, as with Deepwater Horizon, that neither machines nor people behave as expected.
And at the Fukushima plant, emergency generators were in place, but not on high enough ground. So the backup system itself was vulnerable, making catastrophic failure possible. The spent fuel is stored next to the reactor, so damage to the reactor can damage the stored rods too. The “comparatively smaller and less expensive containment structure” makes the reactor more vulnerable than other designs.
What are we missing about the interaction of complex systems (the human mind/will/choice patterns being one of them) that allows these huge catastrophic events to occur in spite of multiple redundant systems, failsafes and billions of dollars directed at preventing them? What would allow us to improve risk assessment? A better understanding of human perception of risk, for a start: why we tend to minimise it, why we are overly optimistic about our ability to manage it and deal with events when they arise, why our love for the elegance of technology and engineering leaves us blind to its weaknesses.
We have to pay attention to the combination of human factors, economic opportunism and political influences that characterise decision making in a practical context. Rather than deploring these exogenous influences, they could be integrated into the process – by assuming that operators will always look for ways to save time, money and effort, or that industry will resist backtracking on a technology once it has been adopted. We have to learn to factor the costs of catastrophe – environmental, economic and social – into decisions around prevention and maintenance.
Otherwise, we end up with grim advice such as that given by Yuli Andreyev, former head of the agency tasked with cleaning up after Chernobyl, who told the Guardian yesterday that the Japanese authorities “had to be willing to sacrifice nuclear response workers for the good of the greater public”.