Suzanne is an industrial engineer who once worked for the Canadian Nuclear Safety Commission, the government agency responsible for regulating nuclear energy and the use of radioactive material. Her specialty is human factors: the design of complex systems to optimize human potential and minimize human error. As well as studying technical topics such as mechanics, she studied cognitive psychology, including how people perceive and process information and how they react to events.
The safety of nuclear plants can be jeopardized by many complex interacting factors, and there is no doubt in her mind that those include human error. As an example, she points to the 1979 nuclear accident at Three Mile Island in Pennsylvania, where she cites human error as a significant factor. That said, she cringes when the blame for a breakdown is laid on humans, because all too often complex systems are designed without taking human strengths and weaknesses, such as the limits of short-term memory, into consideration.
“The mind is not a computer; we are not very good at remembering a lot of detailed data, such as long lists of machine part numbers. Humans are, however, very good at matching a complex pattern to a similar one they have previously experienced: ‘You look very much like my Aunt Martha.’ This is something that computers struggle to do well. Any complex system – nuclear plants, airplanes and even cars – needs to be designed to leverage human strengths while minimizing reliance on human weaknesses.
“The problem with the Three Mile Island accident was that you had a lot of things happening at once, which is common in a serious accident. It is rarely just one thing but a whole bunch of things that line up to cause the failure. But all I hear in the media is that it was human error. Ultimately, it was poor consideration of the human in the system. Some engineers would like to take the human out of the system, but that is not realistic,” she says.
Most nuclear power plants have many automated and manual safety checks, yet the operator is often not given a good overview of what is going on. Remote supervision of a system is very challenging, says Suzanne. “It’s not like riding a bike, where you can see what’s ahead of you down the road. In a nuclear plant, the operator views the reactor through a software screen in a control room. The operator can’t actually see for himself.” And this was critical in the Three Mile Island accident, she says, because so many of the system displays were not coordinated but were independent gauges, and hundreds of different alarms were all going off at the same time.
“Some of those alarms were not visible to the operator, and some had been malfunctioning for weeks. This, combined with the fact that they were not designed to enable good ‘pattern matching’ for the operator, resulted in the operator not recognizing the exact nature of the problem,” she says.
Attitudes among engineers changed in a big way after the accident at Chernobyl. They became more cautious and diligent about safety processes, says Suzanne. But again she points to the human error involved in the world’s biggest nuclear accident. The test that set off the accident was conducted at three in the morning, when people’s performance is at its worst, and there was huge political and management pressure to complete the test successfully. As a result, operators felt they had to override safety systems to complete it.
Since Chernobyl, there is now a “strong culture of not interfering with automated shutdowns” in Ontario. There is a tendency to let the system shut down and then figure out what went wrong. Before, it was left to the discretion of the operators, who had to weigh the high cost of lost revenue and of getting a nuclear power plant up and running again.
“Could we have an accident like the one at Three Mile Island, with lots of bells and whistles going off? Yes, we could,” says Suzanne, answering her own question. “You never get zero probability of an accident. There is always some risk, and the only thing to do is try to minimize the risk and decide whether the benefits outweigh it.”
The vision of nuclear power 40 years ago was of cheap energy, but that didn’t turn out to be true. In fact, Suzanne confirms, nuclear power is quite expensive. She doesn’t think any more reactors should be built. However, she is not optimistic that they won’t be. She thinks the politicians “see centralized mega projects as sexy and haven’t got their heads around small, highly distributed technical solutions, which are typical of renewable energy projects.” And there are always vested interests.
Thanks to Bill Curry for the info.