Seventy-five years ago today, on August 6, 1945, President Harry S. Truman issued the order to drop an atomic bomb on Hiroshima, Japan.
The number of Japanese people who were immediately killed is estimated to be between 70,000 and 140,000, with longer-term estimates of deaths, including radiation illnesses and cancer, extending up to 220,000.
“How easily we learn to justify violence in the name of some higher cause,” President Barack Obama said when he visited Hiroshima in 2016. “Technological progress without an equivalent progress in human institutions can doom us. The scientific revolution that led to the splitting of an atom requires a moral revolution, as well.”
Obama’s use of the present tense is telling: It’s not at all clear that our grasp of nuclear technology fits a moral framework, even now, three-quarters of a century after the U.S. deployed nuclear bombs on Hiroshima and Nagasaki. But in 2020, the nuclear power industry claims to have much more progressive ideas about applications for nuclear energy, the safety and security of reactors, and even ways to make nuclear power more accessible on a smaller scale, not less.
Could we be on the cusp of a true nuclear golden age that does responsible good for the world at last? This is a hard question, and the answer isn’t clear.
Atomic Power for War and Peace
In October 1945, following the mass killings at Hiroshima and Nagasaki, Popular Mechanics detailed the technical specifications of what it meant to use nuclear energy to make war. But in the same article, our magazine also listed the ways the same technology could produce plentiful energy, power cars for millions of miles, and send people around the world at unprecedented speeds.
“Diverted into peacetime applications, atomic power possesses tremendous possibilities in providing cheap power and heat that promise to revolutionize our present concepts of living,” Popular Mechanics explained. “A power plant the size of a typewriter, in its heart a one-pound package of uranium containing the same amount of power extracted from 250,000 gallons of gasoline.”
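The magazine’s 1945 comparison holds up surprisingly well to a back-of-the-envelope check. Using rough modern figures (assumptions, not from the original article: complete fission of U-235 releases about 8.2×10¹³ joules per kilogram, and gasoline carries roughly 1.3×10⁸ joules per gallon), one pound of uranium and 250,000 gallons of gasoline land within the same order of magnitude:

```python
# Approximate energy-density figures (assumed values, not from the 1945 article)
U235_FISSION_J_PER_KG = 8.2e13    # complete fission of U-235, approx.
GASOLINE_J_PER_GALLON = 1.3e8     # ~34 MJ/L * 3.785 L/gal, approx.
POUND_TO_KG = 0.4536

# One pound of uranium, fully fissioned
uranium_energy = 1 * POUND_TO_KG * U235_FISSION_J_PER_KG

# 250,000 gallons of gasoline, fully burned
gasoline_energy = 250_000 * GASOLINE_J_PER_GALLON

print(f"uranium (1 lb):        {uranium_energy:.2e} J")
print(f"gasoline (250k gal):   {gasoline_energy:.2e} J")
print(f"ratio:                 {uranium_energy / gasoline_energy:.2f}")
```

The ratio comes out near 1, so the magazine’s claim was at least physically plausible, even if the typewriter-sized power plant never was.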
Even as American progressives and libertarians alike argued that dropping two atomic bombs on Japanese cities constituted war crimes, the so-called Nuclear Age in America was only beginning in 1945. With the Atomic Energy Act of 1946, the U.S. started planning to harness atomic reactions for power generation. In 1957, the first commercial reactor went online, and the first protests against a nuclear power plant happened the same year. No commercial power plant has ever been the size of a typewriter; in fact, reactors have only grown larger in the decades since the first plants.
In the Atomic Age
During the height of the “Atomic Age,” the health risks of unprotected exposure to radiation were poorly understood. Radioactive elements had been used to fight cancer for decades before the development of the atomic bomb, and even experts didn’t yet widely grasp the danger radiation posed. Its apparent success against cancer triggered a wave of radiation treatments and supplements that brought poison into everyday life, echoing an earlier fad for mercury tonics.
That extended to consumer goods. Decades after the “Radium Girls” won court settlements over the radiation poisoning they suffered painting watch dials with radium, Ford modeled a nuclear concept car called the Nucleon. A popular 1950s children’s chemistry kit contained jars of uranium ore with a mealymouthed warning not to open them. Today, a handful of consumer goods still contain radioactive materials for specific reasons, but none is as careless as a children’s toy.
Nuclear power, too, fell out of favor after high-profile disasters at Three Mile Island and Chernobyl. Even before 2011’s meltdown at the Fukushima Daiichi plant in Okuma, Japan, much of the public hated and feared nuclear power. That event, in which a tsunami triggered a cascading failure that melted down three reactors, showed that an “act of God” (in insurance parlance) could destroy a plant that could pass inspections and otherwise seem safe. And that wasn’t the end of the discussion: the operating utility, Tokyo Electric Power Company (TEPCO), had confessed to falsifying inspection records in the decade prior.
In the 1970s, Germany became a beacon for nuclear energy advocacy, building power plants during a boom spurred by the 1973 oil crisis. But after the Chernobyl disaster in 1986, the country began to sour on nuclear power, a turn made all the more striking by its earlier enthusiasm.
Within the same time frame, France built up its own nuclear energy infrastructure. And though it, too, shied away from nuclear after Chernobyl, part of a worldwide political cooling toward the technology, the two countries have since diverged: today, Germany is shutting down its entire nuclear infrastructure, while France continues to build nuclear capacity and technology at the global cutting edge.
A Future in Fusion
One representative next-generation nuclear project that advocates say could help usher in a new nuclear golden age is ITER, the International Thermonuclear Experimental Reactor, an almost unfathomably large fusion experiment assembled from about a million components, many of them never built before. Even Germany is part of ITER, not only as a member of the European Union, but with a representative at the recent groundbreaking for ITER’s assembly phase.
This points to a seeming disconnect between decades of fission reactors (the ones operating around the world today, many dating to the 1970s or earlier) and nuclear fusion, which carries symbolic weight for science, the energy sector, and the worldwide effort against climate change.
ITER is, in a sense, too big to fail. It represents decades of work and planning, enormous financial investment from dozens of countries, and a proof of concept for fusion power at commercial scale. It also faces the classic paradox of long-distance spaceflight: scientists must decide whether launching a ship today is worthwhile when it will be overtaken by a faster ship built 20 years from now.
Critics say ITER is the slower ship, so to speak, as work on much smaller fusion reactors continues in university, government, and even corporate startup laboratories around the world. ITER isn’t slated to achieve sustained fusion until the 2030s at the earliest, leaving a decade or more for a smaller reactor to beat it there. But the investment in ITER arguably helped those smaller projects take off.
Indeed, ITER and its smaller counterparts represent a broader divide in nuclear energy heading into the 2020s. The power plants being built right now are mostly gigantic, traditional fission reactors, built to power China’s growing cities, for example, or to replace retiring reactors in established nuclear nations like France.
But at the research level, the most exciting work lies at the opposite end of the scale. Scientists are designing smaller reactors that use less fuel and carry more carefully engineered safety protocols. Instead of huge, custom-designed plants, sited under strict geographic regulations miles from the nearest city or town, they envision contained reactors that occupy a city block or sit at the edge of the small town they supply with power.
The Next Steps in Nuclear
There’s no updated nuclear consumer car, but scientists today have carried the legacy of nuclear spaceflight forward. Elon Musk has, credibly or not, suggested using nuclear warheads to terraform Mars and a small nuclear reactor to propel SpaceX’s Starships on the way there. The argument is that a shorter exposure to nuclear radiation is preferable to a longer exposure to deadly cosmic radiation.
NASA has researched its own nuclear engines and is soliciting proposals for a nuclear plant that will work on the moon. A raft of “moonshot” plans for private and public spaceflight can only work with nuclear power, because fossil fuels are too heavy or inefficient and alternatives like hydrogen aren’t developed enough to plan around.
The long shadow of nuclear war falls over any conversation about nuclear energy. Using hundreds of thousands of Japanese lives as the proving ground for the technology that followed was an atrocity in every sense of the word, and it still seems naive to believe such a dangerous nuclear reaction can be turned toward the scientific or humanitarian good Obama suggested.
Even Truman made the decision to bomb Hiroshima less than a month after the technology was first demonstrated, and he later vowed not to use the atomic bomb in future conflicts. No nation has used an atomic bomb in war since, but even the most cutting-edge nuclear power plant in use today still contains the ingredients for a violent meltdown. We can’t call this a legacy of nuclear power, because it’s still the present.