Editorial: The risk of nuclear war waned after the Cold War. It's back with a vengeance.

Chicago Tribune, July 14, 2025
When the first nuclear bomb test took place on this date 80 years ago, the scientists who gathered to observe the explosion in the New Mexico desert recognized they were playing with fire.
Physicist Enrico Fermi tried to break the tension by taking bets on whether the bomb would ignite the atmosphere and destroy the world. J. Robert Oppenheimer wagered $10 the bomb wouldn't work at all, and Edward Teller conspicuously applied sunscreen in the predawn darkness, offering to pass it around.
The bomb exploded in a fireball hotter than the surface of the sun, producing far more destructive power than the scientists anticipated. Within weeks, the U.S. nuked the Japanese cities of Hiroshima and Nagasaki, hastening the end of World War II while killing more than 200,000 civilians.
The bomb hasn't been used since, apart from test blasts, and after the Cold War ended in 1991, the risk of nuclear war mercifully declined. Now the risk is back on the rise, as an alarming new nuclear age dawns.
This week, the University of Chicago will host what it's billing as a 'Nobel Laureate Assembly for the Prevention of Nuclear War.' The conference will take place near the campus location where Fermi oversaw the first self-sustaining nuclear chain reaction in the run-up to that fateful July 16 bomb test.
The conference agenda alone makes for alarming reading.
Panel One will explore how a public once acutely aware of nuclear arms' catastrophic effects has largely forgotten those Cold War-era fears and lost its focus on avoiding nuclear war at all costs.
Panel Two outlines how artificial intelligence and cybersecurity breaches stand to increase the likelihood of nuclear war. Subsequent panels cover the alarming history of nuclear 'close calls,' the weaponization of space and how the disarmament efforts of 30 years ago have fizzled — which brings us to what one of the organizers calls today's 'uniquely dangerous moment.'
Unfortunately, the nuclear landscape is changing for the worse. For starters, the main players are no longer two global superpowers. During the Cold War, the U.S. and the Soviet Union largely controlled the potential for conflict, which made the risks relatively straightforward to analyze.
These days, the politics of nuclear arms have become more complicated and unpredictable. Nine nations are said to possess the weapons today, including the rogue state of North Korea, and others could build them quickly. Most people have forgotten that South Africa once developed a bomb but gave up its program voluntarily. Iraq and Libya also had active nuclear-weapon programs that were stopped under intense international pressure.
At the moment, the focus is on Iran's nuclear program, which the U.S., alongside Israel, bombed on June 22. The U.S. launched its attack even though Iran was still pursuing diplomacy over its nuclear ambitions.
Iran may conclude that it needs a nuclear capability for self-defense, to deter future attacks. The same could be said for other states threatened by nuclear-armed rivals. Consider Ukraine, which voluntarily gave up the nuclear arms based on its soil after the fall of the Soviet Union. Would Russia's 2022 invasion still have occurred against a Ukraine bristling with doomsday weapons? Doubtful.
Besides the chilling political calculations, the weapons used to deliver nuclear warheads have become more dangerous. Hypersonic glide missiles could elude defense systems before striking their targets with practically no warning, while smaller, low-yield nukes threaten to blur the lines between conventional and nuclear warfare, making all-out war more likely.
Defense spending is soaring across the globe, and, with it, faster and deadlier weapons are likely to be deployed. At the same time, treaties restricting nuclear arms are in decline. The most impactful of them — the Treaty on the Non-Proliferation of Nuclear Weapons — was undermined in 2003 when North Korea withdrew from it and built an atomic arsenal.
It's time for the targets of these terrible weapons — us, that is — to rise up and say, 'No!'
The 1980s witnessed mass demonstrations demanding a nuclear freeze. Today, the threat of nuclear war is beginning to enter the public consciousness again.
The 2023 movie 'Oppenheimer,' about the Trinity bomb test 80 years ago, was a box-office hit. The 2024 book 'Nuclear War: A Scenario' became a bestseller. Star movie director James Cameron has committed to making 'Ghosts of Hiroshima,' a Japan-set movie said to be a nightmarish look at the A-bomb blasts.
During the Cold War, pop culture helped convince everyday people to stand against the march toward Armageddon, and here's hoping it can do so again. At the same time, events like the University of Chicago conference can help to get actionable recommendations into the hands of global decision-makers.
For 80 years, the world has lived with the threat of nuclear destruction. Let's act now to curb it, before it's too late.

Related Articles

Russia and China tick Doomsday Clock toward midnight as Hiroshima bombing hits 80 years

Fox News · a day ago

Wednesday marks the 80th anniversary of when the U.S. dropped the first nuclear bomb ever used in war on the Japanese city of Hiroshima, followed by the bombing of Nagasaki three days later on Aug. 9. But despite eight decades of lessons learned, nuclear warfare remains a significant threat.

"This is the first time that the United States is facing down two nuclear peer adversaries – Russia and China," Rebeccah Heinrichs, nuclear expert and senior fellow at the Hudson Institute, told Fox News Digital. Heinrichs explained that not only are Moscow and Beijing continuing to develop new nuclear capabilities and delivery systems, but they are increasingly collaborating with one another in direct opposition to the West, and more pointedly, the U.S.

"It's a much more complex nuclear threat environment than what the United States even had to contend with during the Cold War, where we just had one nuclear peer adversary in the Soviet Union," she said. "In that regard, it's a serious problem, especially when both China and Russia are investing in nuclear capabilities and at the same time have revanchist goals."

Despite the known immense devastation that would accompany an atomic war between two nuclear nations, concern has been growing that the threat of nuclear war is on the rise. The bombings of Hiroshima and Nagasaki – which collectively killed some 200,000 people, not including the tens of thousands who later died from radiation poisoning and cancer – are credited with bringing an end to World War II. But the bombs did more than end the deadliest war in human history: they forever changed military doctrine, sparked a nuclear arms race and cemented the concept of deterrence through the theory of mutually assured destruction.

Earlier this year, the Bulletin of the Atomic Scientists moved the "Doomsday Clock" forward by one second, pushing it closer to "midnight," the symbolic point of global catastrophe, than ever before.
In January, the board of scientists and security officials in charge of the 78-year-old clock, which is used to measure the threat level of nuclear warfare, said that moving the clock to 89 seconds to midnight "signals that the world is on a course of unprecedented risk, and that continuing on the current path is a form of madness."

Despite the escalated nuclear threats coming out of North Korea, and international concern over the Iranian nuclear program, the threat level largely came down to the three biggest players in the nuclear arena: Russia, the U.S. and China. The increased threat level was attributed to Russia's refusal to comply with international nuclear treaties amid its continuously escalating war in Ukraine and its hostile opposition to NATO nations, as well as China's insistence on expanding its nuclear arsenal.

But the Bulletin, which was founded by scientists on the Manhattan Project in 1945 to inform the public of the dangers of atomic warfare, also said the U.S. has a role in the increased nuclear threat level.

"The U.S. has abdicated its role as a voice of caution. It seems inclined to expand its nuclear arsenal and adopt a posture that reinforces the belief that 'limited' use of nuclear weapons can be managed," the Bulletin said. "Such misplaced confidence could have us stumble into a nuclear war."

But Heinrichs countered the "alarmist" message and argued that deterrence remains a very real protectant against nuclear warfare, even as Russia increasingly threatens Western nations with atomic use. "I do think that it's a serious threat. I don't think it's inevitable that we're sort of staring down nuclear Armageddon," she said.

Heinrichs argued the chief threat is not the number of nuclear warheads a nation possesses, but how nations threaten to employ their capabilities.
"I think that whenever there is a threat of nuclear use, it's because adversaries, authoritarian countries, in particular Russia, is threatening to use nuclear weapons to invade another country. And that's where the greatest risk of deterrence failure is," she said. "It's not because of the sheer number of nuclear weapons."

Heinrichs said Russia is lowering the nuclear threshold by routinely threatening to employ nuclear weapons in a move to coerce Western nations to capitulate to its demands, as in the case of capturing territory in Ukraine and attempting to deny it NATO access. Instead, she argued that the U.S. and its allies need to improve their deterrence by not only staying on top of their capabilities but expanding their nuclear reach in regions like the Indo-Pacific.

"The answer is not to be so afraid of it or alarmed that you capitulate, because you're only going to beget more nuclear coercion if you do that," she said. "The answer is to prudently, carefully communicate to the Russians they are not going to succeed through nuclear coercion, that the United States also has credible response options.

"We also have nuclear weapons, and we have credible and proportional responses, and so they shouldn't go down that path," Heinrichs said. "That's how we maintain the nuclear peace. That's how we deter conflict. And that's how we ensure that a nuclear weapon is not used."

Nuclear Experts Say Mixing AI and Nuclear Weapons Is Inevitable

WIRED · a day ago

Aug 6, 2025

Human judgement remains central to the launch of nuclear weapons. But experts say it's a matter of when, not if, artificial intelligence will get baked into the world's most dangerous systems.

Photograph: The University of Chicago campus in Chicago, Illinois, on Tuesday, May 27, 2025.

The people who study nuclear war for a living are certain that artificial intelligence will soon power the deadly weapons. None of them are quite sure what, exactly, that means.

In the middle of July, Nobel laureates gathered at the University of Chicago to listen to nuclear war experts talk about the end of the world. In closed sessions over two days, scientists, former government officials, and retired military personnel enlightened the laureates about the most devastating weapons ever created. The goal was to educate some of the most respected people in the world about one of the most horrifying weapons ever made and, at the end of it, have the laureates make policy recommendations to world leaders about how to avoid nuclear war. AI was on everyone's mind.

'We're entering a new world of artificial intelligence and emerging technologies influencing our daily life, but also influencing the nuclear world we live in,' Scott Sagan, a Stanford professor known for his research into nuclear disarmament, said during a press conference at the end of the talks. It's a statement that takes as given the inevitability of governments mixing AI and nuclear weapons—something everyone I spoke with in Chicago believed in.

'It's like electricity,' says Bob Latiff, a retired US Air Force major general and a member of the Bulletin of the Atomic Scientists' Science and Security Board. 'It's going to find its way into everything.' Latiff is one of the people who helps set the Doomsday Clock every year.

The conversation about AI and nukes is hampered by a couple of major problems.
'The first is that nobody really knows what AI is,' says Jon Wolfsthal, a nonproliferation expert who's the director of global risk at the Federation of American Scientists and was formerly a special assistant to Barack Obama.

'What does it mean to give AI control of a nuclear weapon? What does it mean to give a [computer chip] control of a nuclear weapon?' asks Herb Lin, a Stanford professor and Doomsday Clock alum. 'Part of the problem is that large language models have taken over the debate.'

First, the good news. No one thinks that ChatGPT or Grok will get nuclear codes anytime soon. Wolfsthal tells me that there are a lot of 'theological' differences between nuclear experts, but that they're united on that front. 'In this realm, almost everybody says we want effective human control over nuclear weapon decisionmaking,' he says.

Still, Wolfsthal has heard whispers of other concerning uses of LLMs in the heart of American power. 'A number of people have said, "Well, look, all I want to do is have an interactive computer available for the president so he can figure out what Putin or Xi will do and I can produce that dataset very reliably. I can get everything that Xi or Putin has ever said and written about anything and have a statistically high probability to reflect what Putin has said,"' he says.

'I was like, "That's great. How do you know Putin believes what he's said or written?" It's not that the probability is wrong, it's just based on an assumption that can't be tested,' Wolfsthal says. 'Quite frankly, I think very few of the people who are looking at this have ever been in a room with a president. I don't claim to be close to any president, but I have been in the room with a bunch of them when they talk about these things, and they don't trust anybody with this stuff.'

Last year, Air Force General Anthony J. Cotton, the military leader in charge of America's nukes, gave a long speech at a conference about the importance of adopting AI.
He said the nuclear forces were 'developing artificial intelligence or AI-enabled, human led, decision support tools to ensure our leaders are able to respond to complex, time-sensitive scenarios.'

What keeps Wolfsthal up at night is not the idea that a rogue AI will start a nuclear war. 'What I worry about is that somebody will say we need to automate this system and parts of it, and that will create vulnerabilities that an adversary can exploit, or that it will produce data or recommendations that people aren't equipped to understand, and that will lead to bad decisions,' he says.

Launching a nuclear weapon is not as simple as one leader in China, Russia, or the US pushing a button. Nuclear command and control is an intricate web of early warning radar, satellites, and other computer systems monitored by human beings. If the president orders the launch of an intercontinental ballistic missile, two human beings must turn keys in concert with each other in an individual silo to launch the nuke. The launch of an American nuclear weapon is the end result of a hundred little decisions, all of them made by humans.

What will happen when AI takes over some of that process? What happens when an AI is watching the early warning radar and not a human? 'How do you verify that we're under nuclear attack? Can you rely on anything other than visual confirmation of the detonation?' Wolfsthal says.

US nuclear policy requires what's called 'dual phenomenology' to confirm that a nuclear strike has been launched: An attack must be confirmed by both satellite and radar systems to be considered genuine. 'Can one of those phenomena be artificial intelligence? I would argue, at this stage, no.'

One of the reasons is basic: We don't understand how many AI systems work. They're black boxes. Even if they weren't, experts say, integrating them into the nuclear decisionmaking process would be a bad idea. Latiff has his own concerns about AI systems reinforcing confirmation bias.
'I worry that even if the human is going to remain in control, just how meaningful that control is,' he says. 'I've been a commander. I know what it means to be accountable for my decisions. And you need that. You need to be able to assure the people for whom you work there's somebody responsible. If Johnny gets killed, who do I blame?'

Just as AI systems can't be held responsible when they fail, they're also bound by guardrails, training data, and programming. They cannot see outside themselves, so to speak. Despite their much-hyped ability to learn and reason, they are trapped by the boundaries humans set.

Lin brings up Stanislav Petrov, a lieutenant colonel of the Soviet Air Defence Forces who saved the world in 1983 when he decided not to pass an alert from the Soviets' nuclear warning systems up the chain of command. 'Let's pretend, for a minute, that he had relayed the message up the chain of command instead of being quiet … as he was supposed to do … and then world holocaust ensues. Where is the failure in that?' Lin says. 'One mistake was the machine. The second mistake was the human didn't realize it was a mistake. How is a human supposed to know that a machine is wrong?'

Petrov didn't know the machine was wrong. He guessed based on his experiences. His radar told him that the US had launched five missiles, but he knew an American attack would be all or nothing. Five was a small number. The computers were also new and had worked faster than he'd seen them perform before. He made a judgement call.

'Can we expect humans to be able to do that routinely? Is that a fair expectation?' Lin says. 'The point is that you have to go outside your training data. You must go outside your training data to be able to say: "No, my training data is telling me something wrong." By definition, [AI] can't do that.'

Donald Trump and the Pentagon have made it clear that AI is a top priority, and have invoked the nuclear arms race to do it.
In May, the Department of Energy declared in a post on X that 'AI is the next Manhattan Project, and the UNITED STATES WILL WIN.' The administration's 'AI Action Plan' depicted the rush towards artificial intelligence as an arms race, a competition against China that must be won. 'I think it's awful,' Lin says of the metaphors. 'For one thing, I knew when the Manhattan Project was done, and I could tell you when it was a success, right? We exploded a nuclear weapon. I don't know what it means to have a Manhattan Project for AI.'

The true cost of abandoning science

Los Angeles Times · a day ago

Any trip to the dark night skies of our Southern California deserts reveals a vista full of wonder and mystery — riddles that astrophysicists like myself spend our days unraveling. I am fortunate to study how the first galaxies formed and evolved over the vast span of 13 billion years into the beautiful structures that fill those skies. NASA's crown jewel, the James Webb Space Telescope, has delivered measurements of early galaxies so puzzling that, more than three years after its launch, we are still struggling to understand them.

My work on ancient galaxies may seem to have no relevance to the enormous challenges that confront our nation every day. But if we look back over the last 80 years, ever since World War II turned America into the epicenter of global science, curiosity-driven investigation — in astronomy, quantum materials, evolutionary biology and more — has been a pillar of American progress.

But science in America is now under dire threat. President Trump's administration is laying waste to both national laboratories and federal support for academic science. Scientific staff is being sharply reduced from the National Park Service to the National Science Foundation and everywhere in between. Looking at the president's science funding proposals across many agencies, the 2026 fiscal year budget calls for a 34% cut to basic research. The plan slashes NASA's budget to the lowest amount since human space flight began more than 60 years ago, canceling or defunding dozens and dozens of NASA missions. Already, the NSF has halved support for the most promising American graduate students.

Scientists are speaking up against this destruction, of course. There are strong practical reasons to back science: It is a powerful engine for economic growth, and it is essential for understanding and mitigating the dangers of the natural world — whether they be the Los Angeles wildfires (which my family fled in January) or the tragic floods in Texas last month.
As important as these pragmatic arguments are, their focus on quantifiable, short-term benefits undervalues the true worth of the scientific enterprise. Occasionally, curiosity-driven inquiry — basic science — rapidly enables new technology, but more often its first impact is the wonder we experience at novel measurements, whether contemplating ripples in space-time generated by colliding black holes, underwater ecosystems that draw energy from geothermal vents rather than the sun, or the relic microwave radiation of the Big Bang.

The practical impacts that follow are unpredictable; if the goal is to explore the unknown, then the benefits are also unknown. (Let us not forget that even Columbus was sorely mistaken about what his journey would uncover!) Only through hard work to understand and unpack new discoveries do their full benefits become clear, and that can take decades, as with how Einstein's theory of relativity (published from 1905 to 1915) eventually enabled GPS technology.

Government support is essential in this process. Although Hollywood often portrays scientific discovery as the work of lone geniuses, far more often it is an incremental process, inching ahead through insights from disparate research groups leveraging cutting-edge infrastructure (such as Arctic research facilities and orbiting telescopes), which can only be built through the focused resources of government investment.

Every American taxpayer has helped enable innumerable scientific advancements because they are largely due to our nation's investments in the public goods of people and facilities. Of course, these advances have cost money, and we must always ask how best to balance the long-term benefits of science against our country's other urgent needs. (The enormously popular James Webb Space Telescope, for example, was massively over budget, which led to budget-estimation reforms at NASA.)
In 2024, the total science budget, outside of medical research (and its obvious practical benefits), was about $28 billion. This is a large number, but it is still just over one-half of 1% of all spending outside of Social Security and Medicare: For every $1,000 in spending, about $6 — one tall Starbucks Caffè Mocha or Big Mac in California — supports fundamental scientific inquiry.

Yet the current administration has chosen to hack away at budgets rather than do the hard work of self-examination and improvement. American science, and especially the emerging generation of young scientists, will not survive these cuts. If implemented, the administration's framework will choke off new technologies while they are still only half an idea, leave fundamental questions about the universe unanswered and chase a generation of scientists to other countries.

By any measure, American science is the envy of the world, and we now face a choice: to remain at the vanguard of scientific inquiry through sound investment, or to cede our leadership and watch others answer the big questions that have confounded humanity for millennia — and reap the rewards and prestige. Only one of those options will make the future America great.

Steven R. Furlanetto is a professor of physics and astronomy at UCLA.
