Hinkley Point B is an aging power plant overlooking the Bristol Channel. The location was once designed to welcome visiting schoolchildren but is now defended against terrorists by a maze of checkpoints and perimeter fencing. At the heart of the site, which I visited on a mizzling, unseasonable day in late July, looms a vast grey slab of a building containing a pair of nuclear reactors.
Hinkley Point B began operating shortly before the doomed TMI-2 reactor at Three Mile Island in Pennsylvania, and is due to be decommissioned in 2016, after 40 years of service. As parts of the plant are showing the industrial equivalent of crow's feet, it runs at 70 percent capacity to minimize further wear and tear. But when I asked EDF Energy, a subsidiary of one of the world's largest nuclear energy companies, whether I could visit a nuclear facility to talk about safety, Hinkley Point B was the site they volunteered.
It might have seemed a strange choice on their part, but I was on a strange mission. I hadn't come to Hinkley Point B to learn about the safety of nuclear energy. I'd come because I wanted to learn about the safety of the financial system.
The connection between banks and nuclear reactors is not obvious to most bankers, nor to banking regulators. But to the men and women who study industrial accidents such as Three Mile Island, Deepwater Horizon, Bhopal or the Challenger shuttle – engineers, psychologists and even sociologists – the connection is obvious. James Reason, a psychologist who studies human error in aviation, medicine, shipping and industry, uses the downfall of Barings Bank as a favourite case study. "I used to speak to bankers about risk and accidents and they thought I was talking about people banging their shins," he told me. "Then they discovered what a risk is. It came with the name of Nick Leeson."
Peter Higginson, the head of safety at Hinkley Point B, also thinks there is a parallel. An earnest physicist in a dark-blue shirt and tan slacks, he breaks off from a long safety briefing to muse about banking. "I have done my own thinking about the financial crisis," he says. "Could they have learned something from us? I ask the question."
One catastrophe expert who has no doubt about the answer is Charles Perrow, emeritus professor of sociology at Yale. He is convinced that bankers and banking regulators could and should have been paying attention to ideas in safety engineering and safety psychology. Perrow published a book, Normal Accidents, after Three Mile Island and before Chernobyl, which explored the dynamics of disasters and argued that in a certain kind of system, accidents were inevitable – or "normal".
For Perrow, the dangerous combination is a system that is both complex and "tightly coupled". Harvard University is a complex system. A change in US student visa policy; or a new government scheme to fund research; or the appearance of a fashionable book in economics, or physics, or anthropology; or an internecine academic row – all could have unpredictable consequences for Harvard and trigger a range of unexpected responses. But Harvard is not tightly coupled: those responses unfold slowly, and there is enough slack in the system to absorb them.
A domino-toppling display is not especially complex, but it is tightly coupled. So is a loaf of bread rising in the oven. For Perrow, the defining characteristic of a tightly coupled process is that once it starts, it's difficult or impossible to stop.
Nuclear power stations are both complex and tightly coupled systems. They contain a bewildering array of mechanisms designed to start a nuclear reaction, slow it down, use it to generate heat, use the heat to generate electricity, intervene if there is a problem, or warn operators of odd conditions inside the plant. At Three Mile Island, the site of the most famous nuclear accident in US history, four or five safety systems malfunctioned within the first 13 seconds. Dozens of separate alarms were sounding in the control room. The fundamental story of the accident was that the operators were trying to stabilise a nuclear reactor whose behaviour was, at the time, utterly mysterious. That is the nature of accidents in a complex and tightly coupled system.
Perrow believes that finance is a perfect example of a complex, tightly coupled system. In fact, he says, its complexity "exceeds the complexity of any nuclear plant I ever studied".
So if the bankers and their regulators did start paying attention to the unglamorous insights of industrial safety experts, what might they learn?
. . .
It might seem obvious that the way to make a complex system safer is to install some safety measures. Engineers have long known that life is not so simple. In 1638, Galileo described an early example of unintended consequences in engineering. Masons would store stone columns horizontally, lifted off the soil by two piles of stone. The columns often cracked in the middle under their own weight. The "solution" – a third pile of stone in the centre – didn't help. The two end supports would often settle a little, and the column, balanced like a see-saw on the central pile, would then snap as the ends sagged.
Galileo had found a simple example of a profound point: a new safety measure or reinforcement often introduces unexpected ways for things to go wrong. This was true at Three Mile Island. It was also true during the horrific accident on the Piper Alpha oil and gas platform in 1988, which was aggravated by a safety device designed to prevent vast seawater pumps from starting automatically and killing the rig's divers. The death toll was 167.
In 1966, at the Fermi nuclear reactor near Detroit, a partial meltdown put the lives of 65,000 people at risk. Several weeks after the plant was shut down, the reactor vessel had cooled enough to identify the culprit: a zirconium filter the size of a crushed beer can, which had been dislodged by a surge of coolant in the reactor core and then blocked the circulation of the coolant. The filter had been installed at the last moment for safety reasons, at the express request of the nuclear safety regulators of the day.