The atomic age began 75 years ago this Dec. 2, when a group of scientists gathered at the University of Chicago to determine whether they could achieve a self-sustaining nuclear chain reaction. Their experiment, conducted in an unheated squash court underneath a defunct football stadium, was pursued in deepest secrecy. Allegedly, not even the university’s president was told. If things went awry, the demonstration would wreak havoc on the stadium, the campus and the city of Chicago.
The iconic broadcaster Edward R. Murrow would later describe what transpired: “The story of the lighting of the first atomic furnace will be told as long as stories can be listened to, for it was certainly one of the most dramatic moments in the unfolding of human knowledge.” The success of the experiment did not, however, elicit wild cheers from the assembled scientists. They understood the full import of their sobering breakthrough.
The experiment centered on a crude pile — a 20-foot-high structure made of close to 40,000 graphite bricks, weighing 20 pounds each and embedded with a total of almost 100,000 pounds of uranium. Thirteen-foot control rods, ready to be pushed in or out depending on the neutron count, protruded from the pile. Enrico Fermi, the physicist who led the experiment, remained cool and collected throughout, giving orders from the balcony above the squash court.
Government, academia and industry formed a remarkable secret partnership. In the space of a few years, more than 100,000 people were employed at three remote sites — Los Alamos, N.M.; Hanford, Wash.; and Oak Ridge, Tenn. — each with distinct tasks but united in a common mission to build an atomic bomb before the Germans did.
The motivation of the leading scientists of the Manhattan Project was palpable. Many were refugees who fled Nazism and fascism in Europe. They often were Jews or, like Fermi, married to Jews. There was no question that America, their new home, needed to triumph over forces of evil.
The argument continues to this day as to whether the dropping of the atomic bombs was justified. As for the future of nuclear weaponry, there is little doubt that some other country would have discovered how to build a bomb if America had not done so first. The march of science and technology goes relentlessly forward. It is a miracle that we have not witnessed a repetition, on an even larger scale, of the suffering the citizens of Hiroshima and Nagasaki endured. Though we have come close several times, catastrophe has always been avoided. Cold War threats were confined to brinkmanship by the world’s two reigning superpowers. The looming risk of mutually assured destruction always pulled them back.
The current political climate, unstable and ominous, has jolted us out of complacency. The proliferation of nuclear weapons, many in the hands of national leaders whose judgment we have reason to fear, poses an existential threat to our planet and our very survival. In 1942, the first nuclear furnace generated half a watt of power; current nuclear reactors can produce 2 billion watts. The explosive power of the largest bomb ever tested was more than 3,000 times greater than that of the bomb dropped on Hiroshima. In a talk Fermi delivered in 1952, he expressed his hope that “man will soon grow sufficiently adult to make good use of the powers he acquires over nature.”
This year, on the 75th anniversary of the lighting of the first nuclear furnace, there sadly seems to be little evidence of such maturity.
Read more at Commentary: The first atomic furnace