On May 9, workers discovered a 20-foot-diameter hole where the roof had collapsed on a makeshift nuclear waste site: a tunnel, sealed in 1965, encasing old railroad cars and equipment contaminated by years of plutonium processing. Radiation levels were potentially high enough that some workers were told to shelter in place while others donned respirators and protective suits to repair the hole.
The Hanford complex, which dates back to 1943, produced the plutonium for the atomic bomb dropped on Nagasaki. Half the size of Rhode Island, it is often described as the most contaminated place in the United States.
It would be nice to say that Hanford is a unique canker on the US nuclear landscape, but it is not. It may be the most contaminated, but it is far from alone. At the Rocky Flats facility outside Denver, where workers fashioned Hanford’s plutonium into cores (or “pits”) for nuclear weapons, there were major fires in 1957 and 1969; each sent plutonium-laced plumes of smoke over nearby communities. Enough plutonium dust gathered in the facility’s ductwork that some worried about a spontaneous criticality event—that is, an accidental and uncontrolled nuclear chain reaction. Eventually President George H.W. Bush closed Rocky Flats in 1992 after an FBI investigation found that the facility was secretly (and illegally) burning nuclear waste in the middle of the night.
At Ohio’s Fernald plant, which processed uranium for the weapons complex, operators dumped radioactive waste into makeshift pits, where it contaminated local groundwater, and blew uranium dust particles out of the smokestacks when the filters failed, as they did with some regularity. Similar stories could be told of the nuclear weapons facilities at Savannah River in South Carolina and Oak Ridge in Tennessee, which hushed up criticality accidents while contaminating nearby air and water.
There are three reasons these Cold War nuclear facilities turned into such environmental catastrophes. First, the Cold War American state, fixated on winning the arms race, put a premium on beating the Soviets at all costs. Producing uranium, plutonium, and weapons components was a higher priority than protecting the health of nearby residents or the workers at the plants, a disproportionate number of whom died of cancer. Ironically, since 1945, American nuclear weapons, intended to keep the country safe, have mainly killed Americans.
A second factor was state secrecy. As leading Cold War public intellectuals such as Daniel Patrick Moynihan and Edward Shils argued, abuse thrives in the dark, and Cold War secrecy provided much cover of darkness to places like Hanford. For decades, government officials and the contractors that ran the plants were able to deflect civilian regulators, nosy journalists, local citizens, even congressmen, by hiding behind the skirts of national security. Officials defined vital nuclear secrets expansively, to include not just the design and deployment details of weapons, but also the secret harms inflicted on Americans through their production. […]
Finally, we should not underestimate how novel and complex nuclear technology was in the early decades of the Cold War. Physicists, engineers, and technicians were still learning how the technology worked, how esoteric radioactive materials behaved in a range of conditions, and how toxic waste products were absorbed into the environment. As in any endeavor, you learn by making mistakes. Unfortunately, those mistakes left a legacy of contaminated Cold War production sites around the country that are beginning to look like a permanent archipelago of national sacrifice zones.