Sonja Schmid on Chernobyl

We discussed the Chernobyl disaster with nuclear energy expert Sonja Schmid on the occasion of our special Chernobyl issue (May 2016). Schmid, Associate Professor in the Department of Science and Technology in Society at Virginia Tech, is the author of “Producing Power” (MIT Press), a history of the Soviet nuclear industry before Chernobyl that draws on interviews and archival work to weave a multi-strand narrative of the catastrophe’s origins. Schmid speaks about the multiple causes of Chernobyl, the politics of nuclear energy and the myth of a unified Chernobyl experience.


In your book you say that the explanations usually given to account for the Chernobyl catastrophe are insufficient. What are those explanations and why do they fail?

The first explanation for Chernobyl was given by the Soviet delegation that reported to the International Atomic Energy Agency in Vienna in August of 1986 – when the clean-up effort was really still underway – and it was that the operators in the control room made mistakes that directly caused the disaster. In other words, the first explanation given to the outside world was that the disaster was due to human error. Much more sophisticated explanations were given within the Soviet Union at the time, both officially and behind closed doors, which also mentioned design flaws in this particular reactor, the RBMK, and there was obviously also criticism of management directly at the nuclear power station as well as more generally in the nuclear industry and perhaps even in the Soviet Union as a whole. So over the years, these alternative explanations – design flaws and management issues – were added to the initial human error account. Today, when you look up some of the more reliable sources on the accident, they will typically list those three reasons: human error, design issues and systemic causes such as corruption or failures in training and in assigning people to the right jobs.


But you wouldn’t deny that all these explanations play a part in explaining the disaster? Your contention is not that no mistakes were made, that Soviet corruption didn’t exist, or that these factors were irrelevant to the disaster.

My argument is that each one of these explanations, on its own, is not enough to explain the accident. I believe that they interacted in a very unfortunate way to produce this disaster. If you think about it, we won’t ever be able to completely eliminate human error, or human judgement for that matter. And maybe we don’t want to, because automation, the other extreme of that spectrum, is also fallible, and sometimes you want a human expert operator to interact with the technology and to override some automated features. It depends on the situation. Operator mistakes alone can rarely cause a disaster of this magnitude. They are usually part of it, obviously, but it’s also more complicated than saying that the operators deliberately pushed the wrong button or that they ran a risky experiment, which is what is often assumed and asserted. There is a long controversy, which I try to chronicle in my book, about whether or not the operators in the control room acted against their instructions. If you take seriously the possibility that they did in fact follow their instructions, it takes you to an even more troubling conclusion, namely that the people who wrote those instructions made a mistake. So it spirals into an array of questions about who is responsible and how those responsibilities interact.

But to return to the question of single explanations: I do think that human error played a part in Chernobyl. I just don’t think it’s enough; if it were, that would be a pretty bad sign for any technology or industry. But a design flaw should also not be enough to produce such a disaster, because if you think about it, any technology we produce is in some way deficient and imperfect, and we work around these imperfections and develop strategies for coping with them. Constance Perin has conducted field research in the control rooms of nuclear power plants in the US and described what she calls “operating as experimenting”: the way operators constantly maneuver around design deficiencies, permanently making up for what the design was not built to handle and for the ways it fails them in particular instances. The Soviets were very proud of their operator training, and when you go back in history you see why that was necessary. One of the reasons for that extra training is that their instrumentation and control technology was really unreliable, so you couldn’t necessarily assume that when an alarm went off, it was a real alarm. Operators needed expert judgement to assess whether it was a false alarm or not. The man-machine question is really important when you’re designing a technology: what do you want your operators to do, and where do you want to eliminate human action?


Were the operators at Chernobyl punished for improvisation, even though improvisation was crucial for dealing with a technology as dangerous and as complex as nuclear energy?

Well, the issue of improvisation is a tricky one, and I want to be clear that I am not suggesting that coping with or handling a technology necessarily means coming up with creative solutions that weren’t foreseen in any manual. The kind of improvisation I am talking about is a skilled, experience-based expert improvisation that considers ways of achieving the same goal with different means, because the means you were supposed to use either malfunctioned or are not available. And that takes us to the third explanation that I think is not enough to understand the disaster, which is sometimes referred to as the system explanation – “the system was corrupt, the industry was badly managed” and so on. The Soviet system, on the outside at least, looked very rigid, suggesting that there was no room for maneuvering, but as a matter of fact you had to maneuver constantly. Even buying groceries required creativity: you had to barter, trade and figure out what was sold where and at what time. So there was a pervasive culture of improvising in daily life, and it is then strange to imagine an industry where you pluck people out, put them in a control room and expect them to suddenly abide by all the rules to the letter. Of course they would assume – whether consciously or not – that you make do with what you have, and that if you don’t have what you need, you figure out a different way. In that sense I think that improvisational skills were more available in the Soviet system than in cultures that don’t rely on improvisation the way people in a dysfunctional context do.


This contrasts very starkly with the Japanese context, which is often thought of as overly rigid. And yet, as you suggested early on, Fukushima showed some striking similarities to that catastrophe. Indeed, three TEPCO executives were indicted, design flaws were identified, and Japan is on its way towards returning to nuclear energy on the grounds that those presumably isolated flaws have been fixed. Do you again recognize the three narratives you detected in the accounts given for Chernobyl? Has history repeated itself?

It is very tempting to follow the three narratives that I’ve laid out. I’m really not saying that history has repeated itself, because the disasters are so different and the contexts are so different. But there were similarities, and they were striking. The script of first blaming the operators was followed, and in this case the charge was indeed not that they conducted an experiment but, on the contrary, that they followed the rules too strictly and should have broken them. It was a twist, but it was still the operator argument that came first. Pretty soon thereafter the rest of the script was validated, as design flaws and the regulatory system were subsequently blamed. There were conversations about whistleblowers in the US who had warned about this reactor design and its particular features back in the 1970s. And finally the Japanese nuclear industry was accused of having too cozy a relationship with the regulatory agency, so that – again recalling the Soviet Union after Chernobyl – one of the first reactions to Fukushima was to rearrange and reorganize the nuclear regulatory structure. This déjà vu was remarkable. It is just important to note the differences – notably, at Chernobyl it was one reactor, at Fukushima three reactors suffered meltdowns, and in the latter case the natural disaster was occurring all around, making it really difficult even to access the site. So it is really difficult to compare them directly.


Another striking similarity seems to be the fact that the notion of difference is again being invoked – the idea that Japan, like the Soviet Union before it, is completely different from the West and cannot be compared to it. What do you tell skeptics who read your book as a narrative of being under a spell, and who think that the Japanese and the Soviets merely “thought” they were safe, whereas we in fact are?

I think you’re right that the Soviet Union and Japan were both portrayed as “other” and “different”. Even in Japan itself the cultural explanation has been pushed, with people saying that it happened there because “the Japanese always follow the rules”. I find that untrue to what actually happened: there were a lot of very creative people whose actions did not correspond to that stereotype. In fact, I think it was easier to argue that way in the case of Chernobyl, where there was a different reactor design as well as a distinct political, economic and ideological system. Japan is harder to keep at bay as “other” because it is such a high-tech, industrialized and well-organized society. And to bring to mind the cliché of the Japanese nuclear industry: they were famous for building seismically resistant reactors, so everybody thought the plants could pretty much withstand the apocalypse. And the reactors themselves were of Western design. In a way, the cultural explanation is really the only thing left, which I think is why it has been invoked so many times and why Fukushima sent a shockwave through the nuclear industry.


There is a tension between your claim that nuclear technology is always only as safe as we understand it to be and your constructive suggestions aimed at devising better emergency responses. Why shouldn’t one just throw up one’s hands and say that nuclear energy will never be safe enough? What hope is there that emergency responses won’t also always be only as good as we take them to be?

Well, this is a complicated question. First of all, we’re stuck with nuclear plants whether we like it or not. And unlike solar panels and wind turbines, you can’t just dismantle reactors, put them away and be done with it. You can see this at Chernobyl, where it’s already thirty years on – or more precisely sixteen years after they shut down the last operating reactor – and where there are still thousands of people working on the site, who will continue to work there literally for the foreseeable future. What Germany is proposing – to phase out nuclear energy entirely – I don’t think solves much. Whether we shut down nuclear power plants worldwide or expand aggressively because, say, we think nuclear energy is a good way to combat climate change, I think we need to think more broadly. We need to think not merely about preventing disasters, but to accept the possibility of them happening again and prepare for that possibility. There are some new ideas on the table right now, many of which sound downright pedestrian – the kind where you ask yourself why they weren’t implemented a long time ago, for instance keeping extra pumps and generators at a safe location close by. The US is already doing that, though it’s obviously a different geographical situation than in Europe or Japan, where everything is crammed together. But what I think is being neglected, and what I’m trying to get at in my new project, is the skill-based training and mental preparation that you need to respond to a nuclear disaster. We missed the boat with Chernobyl, and that’s a huge miss. With this arrogance of thinking it could never happen to us, we missed the opportunity to learn from it.

When it comes to the other question you raised – whether nuclear technology will ever be safe enough – I think it is an illusion that we will ever have a hundred percent safe technology. It is also the responsibility of our political leaders not to pretend that this is an achievable goal. We need to find a way of determining what it is we can reasonably achieve, and decide whether we want to achieve that in a democratic or a technocratic way. In a democracy, however flawed, we should all agree on how much risk we are willing to accept. And that’s a moving target, something that cannot be decided once and for all. Something we accept as safe enough today may not be safe enough in ten years.


Japan is a good example of the way corporations and governments tend to withhold information from the public. But there are obvious risks attached to swinging too far towards democratic participation in technological matters. I think that especially today many Europeans feel that the popular will sways in uninformed and in many cases dangerous directions, and that it is not necessarily wise to involve the public too deeply in matters that require expert attention.

I’m not suggesting you hold a referendum every time you make a technical decision. But I think leaving decisions of this magnitude to a technical or political elite is also problematic. Risk has this funny effect on political decision-making: authorities are reluctant to take any decisions that involve risk, so those decisions get delegated further and further outwards until they paralyze entire institutions. At the very least, a more honest and open discussion about risks and their inevitability should be achievable. We can’t choose whether we want to live with risks or not, but we can choose which risks we want to live with. I’m a firm believer in technical expertise, but in expertise that exposes itself to public criticism.


In films about Chernobyl originating from the three countries arguably most affected by the disaster – Belarus, Russia and Ukraine – there seems to be no genuine appreciation of each other’s difficult experience: Belarusian films mostly restrict their attention to how Belarus was affected by the catastrophe, and similarly for Russian and Ukrainian films. This suggests that, judging by the example of cinema, there is no such thing as a unified experience of the Chernobyl catastrophe. Is this something you would corroborate from the perspective of your research?

I think it would be a myth to suggest that there is anything coming close to a unified Chernobyl experience. It is deeply fragmented, maybe along national lines but also along generational lines. I noticed this when I gave talks to students who were born after Chernobyl and have no recollection of what it was like. They may well have read something about the catastrophe, but it doesn’t mean anything to them. Whereas if you talk to people who remember that day or period, you recognize a completely different attitude towards, and experience of, the catastrophe. I’m not an expert in film, but a lot of the related material I’ve seen caters to the emotional experience – the invisible danger and the allure of returning to the exclusion zone. Atomic Ivan, a recent Russian film set around a nuclear plant, and in fact filmed at Russian nuclear plants, plays with the imagery of cooling towers and complicated control rooms. To me, this seemed almost to normalize nuclear energy, because in my childhood I experienced Chernobyl as a threat that lingered in the berries and mushrooms even as the years passed. But talking to people who lived close to or worked at power plants, you get the impression that for them nuclear energy became normal long ago and that Chernobyl didn’t change much about that. This is just another wrinkle in the idea that there is such a thing as “the” Chernobyl experience.


As you say, the emotional experience of an invisible danger comes up in many films about Chernobyl, and it is no less a theme in Svetlana Alexievich’s interviews about the catastrophe. Given that you believe we’re always only as safe as we think we are, would you say that the metaphor of “living in a burning house without being aware of it” is also a fitting description of our situation?

The way you pose the question seems to suggest that, if I say yes, I’m catering to this alarmist notion, which I don’t want to do. But I also don’t want to cater to the other extreme and say that we’ve always lived with radiation and pretend that there’s no problem. I think we’ve decided to just live with certain technologies, and I don’t just mean nuclear power here. There is something special about nuclear risk, which has something to do with the fact that we cannot perceive it with our human senses, that we need technology to mediate even the perception of this risk. But let’s not forget that nuclear energy is an offspring of nuclear weapons, which still surround us and are a risk regardless of whether there will ever be an actual nuclear war. In those terms, I think a little more normalization wouldn’t hurt – accepting that this is the world we live in. In this sense, the – I want to say – “hysteria” about getting nuclear power out of one’s own country is overblown because, especially in Europe, everything is so close together. Even if Germany phases out, there are nuclear plants all around, and if something happens at the Belgian, Czech or even French plants, not having plants of its own won’t save Germany. So, even though I myself oscillate between these two extremes, I think that a little more pragmatism wouldn’t hurt the debate.


Thank you for the interview.