Gayle Greene
It is one of the marvels of our time that the nuclear industry managed to resurrect itself from its ruins at the end of the last century, when it crumbled under its costs, inefficiencies, and mega-accidents. Chernobyl released hundreds of times the radioactivity of the Hiroshima and Nagasaki bombs combined, contaminating more than 40% of Europe and the entire Northern Hemisphere. But along came the nuclear lobby to breathe new life into the industry, passing off as “clean” this energy source that polluted half the globe. The “fresh look at nuclear”—in the words of a New York Times makeover piece (May 13, 2006)—paved the way to a “nuclear Renaissance” in the United States that Fukushima has by no means brought to a halt.
“No Radioactivity in Hiroshima Ruin,” proclaimed a New York Times headline on September 13, 1945. “Survey Rules out Nagasaki Dangers,” ran another on October 7, 1945: “Radioactivity after atomic bomb is only 1000th of that from luminous dial watch.”
There were powerful political incentives to downplay radiation risk. As State Department Attorney William H. Taft asserted, the “mistaken impression” that low-level radiation is hazardous has the “potential to be seriously damaging to every aspect of the Department of Defense’s nuclear weapons and nuclear propulsion programs…it could impact the civilian nuclear industry… and it could raise questions regarding the use of radioactive substances in medical diagnosis and treatment.” A pamphlet issued by the Atomic Energy Commission in 1953 “insisted that low-level exposure to radiation ‘can be continued indefinitely without any detectable bodily change.’” The AEC was paying the salaries of the Atomic Bomb Casualty Commission (ABCC) scientists and monitoring them “closely—some felt too closely,” writes Susan Lindee in Suffering Made Real, which documents the political pressures that shaped radiation science…. The ABCC scientists calculated that by 1950, when the commission began its investigations, the death rate from all causes except cancer had returned to “normal” and that the cancer deaths were too few to cause alarm.
“It’s nonsense, it’s rubbish!” protested epidemiologist Dr. Alice Stewart, an early critic—and victim—of the Hiroshima studies. Stewart discovered, in 1956, that x-raying pregnant women doubled the chance of a childhood cancer: this put her on a collision course with ABCC/RERF (Radiation Effects Research Foundation) data, which found no excess of cancer in children exposed in utero to the blasts. Nobody in the 1950s wanted to hear that a fraction of the radiation dose “known” to be safe could kill a child. During the Cold War, officials were assuring us we could survive all-out nuclear war by ducking and covering under desks, and the U.S. and U.K. governments were pouring lavish subsidies into “the friendly atom.” Stewart was defunded and defamed.
Continue reading at Science with a Skew: The Nuclear Power Industry After Chernobyl and Fukushima