
Latest news with #Heisenberg

Quantum Computing: Journey from bits to qubits still has far to go

Indian Express

14 hours ago



'Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical.' — Richard Feynman, 1981

With that blunt provocation, the legendary physicist threw down a gauntlet that still challenges science today. If the universe runs on the strange rules of quantum mechanics — with particles existing in multiple states at once and influencing each other instantaneously across space — why are we using computers built on classical logic to understand it? Wouldn't a quantum world be best understood by a quantum machine? That simple idea planted the seed for one of the most radical technologies in the making: the quantum computer. But first, how we got to that point.

For much of the 20th century, computing meant tinkering with mechanical contraptions — from slide rules and punch cards to room-sized mainframes wired with vacuum tubes. These machines solved problems by following step-by-step instructions, manipulating electric signals or gears to simulate logic and arithmetic. The real revolution came in the 1950s, with the arrival of digital computing. Suddenly, everything could be broken down into bits, tiny switches that could be either on (1) or off (0). These humble 0s and 1s gave us a universal language: one machine, given the right code, could simulate anything from weather patterns to word processors.

As these digital systems grew in power, scientists naturally wondered: how far could this go? Could we simulate the behavior of nature itself — atoms, molecules, and the building blocks of reality? That's when they hit a wall. Classical computers, no matter how fast, struggled to model the weirdness of quantum systems. Every additional particle increased the complexity. Even the most powerful supercomputers couldn't keep up. That is when Feynman posed his provocative question: if nature is quantum mechanical, why are we trying to simulate it with classical machines? What if we built a computer that itself obeyed the rules of quantum physics?

To understand that vision, we need to grasp how quantum objects differ from the familiar ones around us. A classical object — a coin, a car, a bit in your laptop — has definite properties that can be measured without changing them. A quantum object, like an electron or a photon, behaves differently. It can exist in a superposition of states, meaning it can be in multiple configurations at once, and its properties become definite only when observed. What's more, it can be entangled with others, so that measuring one instantly affects the other, no matter how far apart they are. These strange behaviors aren't just curiosities. They're powerful. If harnessed correctly, they could unlock new kinds of computation — not just faster, but fundamentally different. That was Feynman's vision: a machine that speaks nature's own language.

The Heisenberg uncertainty principle, part of the bedrock of quantum mechanics, tells us that certain pairs of properties — such as position and momentum — cannot both be known exactly at the same time. This fuzziness gives rise to superposition, where a quantum system exists in a blend of states simultaneously. For a qubit, superposition means it can be 0 and 1 at once, like a spinning coin undecided until it lands. Only upon measurement does its state 'collapse' into either 0 or 1, enabling parallel exploration of possibilities.
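The spinning-coin picture is easy to play with numerically. Here is a minimal sketch (our illustration, not from the article) that represents a qubit as a two-component vector, puts it in an equal superposition with a Hadamard gate, and samples repeated measurements to show the 50/50 collapse statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

# A qubit's state is a 2-component complex vector; |0> is [1, 0].
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of 0 and 1 —
# the "spinning coin" described above.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ ket0                      # amplitudes (1/sqrt2, 1/sqrt2)

# Measurement: the Born rule gives outcome probabilities |amplitude|^2,
# and each measurement collapses the state to a definite 0 or 1.
probs = np.abs(state) ** 2
outcomes = rng.choice([0, 1], size=1000, p=probs)
print("counts of 0s and 1s:", np.bincount(outcomes))  # roughly [500, 500]
```

Each individual run yields a definite 0 or 1; only the statistics over many runs reveal the underlying superposition.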
Even more astonishing is entanglement, a uniquely quantum link between qubits. When qubits become entangled, their individual states have no independent meaning; you can only describe the system as a whole. Measuring one qubit instantly determines its partner's state, no matter the distance between them — a phenomenon Albert Einstein dubbed 'spooky action at a distance.'

The challenge is to harness the potential of these quantum states for use in computing. Quantum states are exquisitely fragile. Tiny disturbances — thermal vibrations, stray fields, or cosmic rays — can collapse superpositions in a process called decoherence. Today's qubits remain coherent for just 10⁻⁵ to 10⁻⁴ seconds before errors arise, whereas classical memory holds data intact for milliseconds to years. To combat decoherence, researchers cool qubits to near absolute zero, isolate them in vacuum, and use error-correction schemes that trade many physical qubits for one robust 'logical' qubit. These logical qubits can detect and correct small quantum errors on the fly, preserving the fragile quantum information long enough for useful computation.

Despite these hurdles, milestone demonstrations have arrived. In 2019, Google's Sycamore processor executed a special sampling task in 200 seconds — an operation estimated to take a classical supercomputer 10,000 years. While that benchmark was a contrived problem with no immediate practical use, it proved the principle of 'quantum advantage.' Since then, other companies and research groups have made steady progress: IBM has built devices with over 100 qubits and is pursuing a 1,000-qubit machine; China's Jiuzhang photonic quantum computer has performed similar advantage demonstrations using light; and startups like IonQ and PsiQuantum are exploring alternative qubit architectures with an eye on scalability.

If successfully developed, quantum computers could transform industries across the board:
  • In pharmaceuticals and materials science, they could revolutionize molecular design by simulating chemical reactions and protein folding with atomic precision, paving the way for faster drug discovery and novel materials.
  • In logistics, transportation, and finance, quantum optimization algorithms could deliver vastly improved solutions for traffic management, supply chain efficiency, and portfolio risk balancing.
  • In cybersecurity, quantum communication promises virtually unhackable networks through quantum key distribution, a technology already being tested in several countries.
  • In high-precision sensing, they could enable ultraprecise clocks, gravity detectors for mineral exploration, and next-generation medical imaging.

They could also threaten today's security. Shor's (quantum) algorithm can factor large numbers exponentially faster than classical methods, putting the public-key systems (RSA, ECC) that secure internet banking, e-commerce, and government communications at risk. When large, error-corrected quantum computers arrive, they could decrypt decades of digital traffic overnight. This has spurred a global push toward post-quantum cryptography: new codes believed safe even against quantum attacks.

A nominal 100-qubit system can, in theory, represent 2¹⁰⁰ (≈1×10³⁰) states simultaneously, requiring some 10³¹ numbers to emulate on a classical machine. Yet with current error rates and no full error correction, those 100 physical qubits effectively yield fewer than 5 fully reliable logical qubits – enough to hold only 2⁵=32 basis states in superposition.
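The article's arithmetic is easy to reproduce. A short calculation (a sketch, assuming the standard dense statevector representation of 16 bytes per complex amplitude) shows why brute-force classical emulation breaks down somewhere in the mid-40s of qubits:

```python
# How much classical memory does it take to store an n-qubit state vector?
# Assumes the standard dense representation: 2**n complex amplitudes,
# 16 bytes each (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16

def statevector_bytes(n_qubits: int) -> float:
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 30, 45, 100):
    print(f"{n:>3} qubits -> 2^{n} amplitudes -> {statevector_bytes(n):.3e} bytes")

# 10 qubits:  ~16 KB, trivial.
# 30 qubits:  ~17 GB, a large workstation.
# 45 qubits:  ~563 TB, beyond any single machine.
# 100 qubits: ~2e31 bytes, far more memory than exists on Earth.
```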
By contrast, a typical laptop's 1 TB drive stores about 8×10¹² classical bits reliably for years. Today's devices host tens to a few hundred qubits but suffer from limited coherence and high error rates, so only small-scale algorithm demonstrations are possible.

Total public and private investment in quantum technologies has surpassed US $55 billion over the last decade. China leads with over $15 billion in public spending, the U.S. follows with about $4 billion, and the EU's €1 billion Quantum Flagship rounds out the top three. Each nation seeks both technological leadership and safeguards against quantum-enabled threats.

In India, the 2020 National Quantum Mission committed ₹8,000 crore (≈US $1 billion) over five years. Research groups at the IITs, IISc, and TIFR, along with several startups, operate 5–10-qubit systems today and aim for 50–100 qubits by 2030 — enough to begin tackling more complex problems and cement India's role in the quantum ecosystem. India's initial funding injection places it among the top five investors, alongside the U.K., Canada, and Japan.

A fully fault-tolerant quantum computer, with millions of physical qubits supporting error-corrected logical qubits, remains years or decades away. Yet today's work on improving qubit stability, scaling control electronics, and rolling out quantum-safe encryption lays the groundwork. Quantum machines will not replace classical computers but will augment them, tackling specialised problems – the computationally toughest subroutines, like simulating quantum materials, solving large-scale optimization problems, and breaking cryptographic codes – that classical systems struggle with. As this new paradigm matures, we stand on the brink of an era defined not by what's possible with bits, but by what we can achieve with qubits.

Shravan Hanasoge is an astrophysicist at the Tata Institute of Fundamental Research.

No One Owns Quantum Science: The First Principle of the International Year of Quantum

Associated Press

06-05-2025



By Jenn Mullen

As we celebrate 2025—the International Year of Quantum (IYQ)—we embark on a journey through the eight guiding principles that have shaped a century of quantum discovery. This series begins with perhaps the most foundational principle: 'No one owns quantum science.'

The Declaration

The IYQ's official declaration of this principle states: 'No individual, society, country, institution, or discipline can claim ownership of the past or future of quantum science; it is knowledge that should be free to all. IYQ recognizes those who put effort into studying, developing, investigating, using, and teaching quantum science and technology, while seeking to solicit and answer questions from anyone who wants to learn more about it.' This statement isn't merely aspirational—it reflects the very essence of how quantum mechanics emerged and continues to evolve. But what does it mean in practice, and why does it matter as we commemorate 100 years of quantum mechanics?

Born from Collective Genius

When we trace quantum science to its origins, we find not a single inventor but a tapestry of brilliant minds wrestling with the universe's most profound mysteries. In July 1925, Werner Heisenberg published his groundbreaking paper on quantum theoretical reinterpretation, followed shortly by the seminal "Three-Man Paper" with Max Born and Pascual Jordan that September. Yet these works didn't emerge from isolation. They built upon the quantum theories of Max Planck and Albert Einstein's work on the photoelectric effect. They were influenced by Niels Bohr's atomic model and Louis de Broglie's matter waves. The mathematics drew from the work of numerous mathematicians across Europe. No single genius 'invented' quantum mechanics—it evolved through conversation, correspondence, and spirited debate among physicists from Germany, Denmark, Austria, France, the Netherlands, and beyond.

The famous Solvay Conferences, beginning in 1911, exemplified this spirit of international collaboration. Scientists gathered across national boundaries—sometimes even as their countries were in political conflict—to grapple with quantum's deepest questions. The iconic 1927 Fifth Solvay Conference photo captures this perfectly: 29 scientists from different countries and backgrounds united by a shared quest to understand the quantum world.

Today's Quantum Commons

A century later, this collaborative ethos thrives in initiatives like the European Quantum Flagship, the US National Quantum Initiative, China's national quantum projects, and international research partnerships that span continents. While nations may compete in quantum advancement, the underlying science remains part of our shared human heritage. Modern quantum computing companies—despite operating in a competitive landscape—have fostered remarkable openness. Many maintain open-source tools and frameworks that invite global contribution. A number of companies and platforms provide cloud access to quantum processors, allowing researchers, educators, and curious minds worldwide to run quantum experiments without needing to build multi-million-dollar hardware. Repositories of quantum algorithms and educational resources created by and for developers from every corner of the globe are widely available. Graduate students in Bangalore collaborate with professors in Berkeley. Researchers in Beijing build upon discoveries made in Boston. A high school student in a rural area can access the same quantum computing platforms as scientists at elite institutions.
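As a concrete taste of that openness, here is a minimal sketch using the open-source Qiskit SDK and its Aer simulator (our illustrative choice of framework; the article names no specific tool). The same few lines run on a free laptop simulator or, with a cloud account, on real quantum hardware:

```python
# A minimal sketch of the kind of freely accessible experiment described
# above, using open-source Qiskit (illustrative choice, not the article's).
# Requires: pip install qiskit qiskit-aer
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# Build a 2-qubit Bell-state circuit: superposition plus entanglement.
qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits out

# Run 1,000 shots on a local simulator; the same circuit can be
# submitted to cloud-hosted quantum processors.
counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)   # expect roughly {'00': ~500, '11': ~500} and nothing else
```

The striking part of the output is the absence of '01' and '10': the entangled qubits always agree, the correlation Einstein called 'spooky action at a distance.'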
Why It Matters

This principle—that no one owns quantum science—isn't merely philosophical; it's pragmatic. Quantum mechanics tackles questions so profound and complex that they require diverse perspectives. The field advances most rapidly when knowledge flows freely, when insights from condensed matter physics inform quantum computing, when theoretical mathematics inspires experimental breakthroughs. As quantum technologies approach potential commercial applications—from secure communications to powerful computing paradigms and precise sensors—economic pressures might tempt some to enclose quantum knowledge behind proprietary walls. The IYQ's first principle reminds us that while specific implementations may be owned, the foundational science belongs to humanity.

This openness also ensures that quantum science doesn't become the exclusive domain of wealthy nations or institutions. When knowledge is freely shared, brilliant minds everywhere can contribute, regardless of geography or resources. Some of tomorrow's most transformative quantum breakthroughs may come from unexpected places if we maintain this commitment to open access.

The Centennial Invitation

As we mark the 100th anniversary of quantum mechanics' formalization, the principle that 'no one owns quantum science' serves as both a celebration of the field's collaborative history and a recommitment to its open future. It invites all of us—regardless of background—to engage with quantum concepts, to ask questions, to contribute where we can. Whether you're a seasoned quantum physicist, a student just beginning to explore wave functions, or someone simply curious about the strange and beautiful rules that govern our reality at its smallest scales, the quantum world belongs to you too. That's the promise and the challenge of the International Year of Quantum's first principle: this revolutionary science—with all its wonder and potential—is our collective inheritance and responsibility.

In the coming weeks, we'll explore the remaining seven principles that guide the IYQ celebration, each illuminating different facets of quantum science's past, present, and promising future. For now, remember that quantum science has no single owner because it belongs to us all.

Learn with Keysight

Keysight is committed to empowering the next generation of engineers and innovators. Explore Learn to find a rich and growing library of resources spanning a range of technical areas, industries, and specialties. Explore Learn now.

This is the first in an eight-part series exploring the guiding principles of the 2025 International Year of Quantum. Next week: 'Everyone is invited—Making quantum science accessible to all.' Visit 3BL Media to see more multimedia and stories from Keysight Technologies.

MIT captures first image of free-range atoms, can help visualize quantum phenomena

Yahoo

06-05-2025



Scientists from the Massachusetts Institute of Technology (MIT) in the U.S. have made a groundbreaking achievement after they captured the first images of individual atoms freely interacting in space. The images, which show interactions between free-range particles that had only been theorized until now, will reportedly allow the scientists to directly observe quantum phenomena in real space.

To capture detailed images of the atomic interactions, the team, led by Martin Zwierlein, PhD, an MIT physicist and lead author of the study, developed a novel technique that allows the atoms to move freely before briefly freezing and illuminating them to capture their positions. The team used the technique to observe clouds of various atom types, capturing several groundbreaking images for the first time. "We are able to see single atoms in these interesting clouds of atoms and what they are doing in relation to each other, which is beautiful," Zwierlein said.

Exploring the cloud

Atoms are among the tiniest building blocks of the universe, each just one-tenth of a nanometer wide, or roughly a million times thinner than a strand of human hair. They additionally follow the strange rules of quantum mechanics, making their behavior incredibly difficult to observe and understand. It's impossible to know both an atom's exact position and its speed at the same time - a fundamental principle of quantum physics known as the Heisenberg uncertainty principle. This uncertainty has long challenged scientists trying to observe atomic behavior directly, and traditional imaging methods, such as absorption imaging, provide only a blurry view, capturing the overall shape of an atom cloud but not the atoms themselves.

Now, to overcome the challenge, the team developed a new approach called atom-resolved microscopy, which begins by allowing a cloud of atoms to move and interact freely within a loose laser trap.

[Image: Top: Atoms are frozen by an optical lattice and imaged with Raman … Bottom: Images show a ²³Na condensate, single-spin ⁶Li, and paired fermions in a Fermi mixture. Credit: MIT / Courtesy of the researchers]

The researchers then switch on a lattice of light to freeze the atoms in place and use a finely tuned laser to illuminate them, causing the atoms to fluoresce - a process in which an atom or molecule emits light as it relaxes to its ground state after being excited - and reveal their exact positions. Capturing this light without disturbing the delicate system was no small feat. "You can imagine if you took a flamethrower to these atoms, they would not like that," Zwierlein explained. "So, we've learned some tricks through the years on how to do this."
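For reference, the uncertainty principle invoked above has a compact textbook statement (the general form, not anything specific to this study): the spread in a particle's position and the spread in its momentum obey

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

so pinning an atom's position down more sharply (small Δx) necessarily broadens its momentum spread (large Δp), which is part of why imaging cold atoms without disturbing them is so delicate.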
According to the physicist, what truly makes the technique more powerful than previous methods is that it's the first time they've done it in situ, freezing atoms' motion as they strongly interact and observing them one after another.

Quantum snapshots

Zwierlein and his colleagues used their new imaging technique to capture quantum interactions between two fundamental types of particles: bosons and fermions. Bosons - a family that includes photons, gluons, the Higgs boson, and the W and Z bosons - tend to attract, and they were observed bunching together in a cloud of sodium atoms at low temperatures, forming a Bose-Einstein condensate (BEC) where all particles share the same quantum state. This confirmed a long-standing prediction that boson bunching is a direct result of their ability to share one quantum wave - the matter wave first hypothesized by Louis de Broglie, which helped spark the rise of modern quantum mechanics. "We understand so much more about the world from this wave-like nature," Zwierlein stated. "But it's really tough to observe these quantum, wave-like effects. However, in our new microscope, we can visualize this wave directly."

The researchers also imaged a cloud with two types of lithium atoms, each a fermion that typically repels others of its kind but can strongly interact with specific other fermion types. They then captured these opposite fermions pairing up, revealing a key mechanism behind superconductivity.

They now plan to apply the technique to explore more complex and less investigated quantum states, including the puzzling behaviors seen in quantum Hall physics. These include scenarios where interacting electrons exhibit unusual correlated behaviors under the influence of a magnetic field. "That's where theory gets really hairy - where people start drawing pictures instead of being able to write down a full-fledged theory, because they can't fully solve it," Zwierlein concludes in a press release. "Now we can verify whether these cartoons of quantum Hall states are actually real. Because they are pretty bizarre states."

The study has been published in the journal Physical Review Letters.
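The 'quantum wave' at play here sets a concrete length scale. A back-of-envelope sketch (our illustrative numbers, not the study's: sodium-23 atoms at an assumed 100 nanokelvin and a typical ultracold-gas density) estimates the thermal de Broglie wavelength and compares it with the spacing between atoms; when the wavelength exceeds the spacing, the atomic waves overlap and effects like Bose-Einstein condensation set in:

```python
import math

# Physical constants (SI units)
h   = 6.62607015e-34   # Planck constant, J*s
k_B = 1.380649e-23     # Boltzmann constant, J/K
u   = 1.66053907e-27   # atomic mass unit, kg

# Assumed illustrative values (not from the article):
m = 23 * u             # mass of a sodium-23 atom
T = 100e-9             # temperature: 100 nanokelvin
n = 1e20               # density: atoms per cubic metre

# Thermal de Broglie wavelength: lambda = h / sqrt(2*pi*m*k_B*T)
lam = h / math.sqrt(2 * math.pi * m * k_B * T)

# Mean interparticle spacing ~ n^(-1/3)
spacing = n ** (-1 / 3)

print(f"de Broglie wavelength: {lam*1e6:.2f} micrometres")    # ~1.2 um
print(f"interparticle spacing: {spacing*1e6:.2f} micrometres") # ~0.2 um
print("quantum-degenerate regime:", lam > spacing)             # True
```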

How Scientists Can Be Good Citizens

Yahoo

04-05-2025



On July 3, 1945, ten German scientists who had worked on Germany's nuclear program were interned by the Allies at a country mansion called Farm Hall, in Godmanchester, England, about 20 miles northwest of Cambridge. The purpose of incarcerating the physicists was to find out how close Nazi Germany had been to building an atomic bomb, and possibly also to keep them from falling into the hands of the Russians. The scientists included Otto Hahn, who in 1938 had discovered that uranium could fission and who had received the Nobel Prize in 1944; Werner Heisenberg, one of the inventors of quantum mechanics and a Nobel Prize winner in 1932; and Carl Friedrich von Weizsäcker, who made important contributions to the physics of energy production in stars.

Roughly one month later, in the afternoon of August 6, 1945, the German scientists learned that an atomic bomb had been dropped on Hiroshima. At first, they didn't believe the news, as they had previously concluded that the construction of such a weapon would be prohibitively expensive. Then, as more information began trickling in, they accepted that it was true. Otto Hahn would later describe feeling enormous guilt that 'his greatest scientific discovery now bears the taint of unimaginable horror.'

A remarkable conversation followed between Heisenberg and von Weizsäcker about the ethics of science and the responsibilities of scientists, one that took place during the incarceration at Farm Hall. 'The word guilt does not really apply,' Heisenberg said to von Weizsäcker, 'even though all of us were links in the causal chain that led to this great tragedy. Otto Hahn and all of us have merely played our part in the development of modern science … We know from experience that it can lead to good or to evil.'

Then von Weizsäcker responded: There will, of course, be quite a few who will contend that science has gone far enough … They may, of course, be right, but all those who think like them fail to grasp that, in the modern world, man's life has come to depend on the development of science. If we were to turn our backs on the continuous extension of knowledge, the number of people inhabiting the earth in the fairly near future would have to be cut down radically … For the present, the development of science is a vital need of all mankind, so that any individual contributing toward it cannot be called guilty. Our task, now as in the past, is to guide this development toward the right ends, to extend the benefits of knowledge to all mankind, not to prevent the development itself. Hence the correct question is: What can the individual scientist do to help in this task; what are the precise obligations of the scientific research worker? What is more, we must probably make a clear distinction between the discoverer and the inventor. As a rule, the former cannot predict the practical consequences of his contribution before he actually makes it, the less so as many years may go by before it can be exploited.

Heisenberg then replied that whether discoverer or inventor, 'the individual tackling a scientific or technical task must nevertheless try to think of the broader issues. And, indeed, if he did not, why did he exert himself in the first place?' And von Weizsäcker again: 'In that case, if [the scientist] wants to act for the best and not just leave it at noble thoughts, he will probably have to play a more deliberate part in public life, try to have a greater say in public affairs.
Perhaps we should welcome this trend, for inasmuch as scientific and technical advances serve the good of society, those responsible for them will be given a greater say than they currently enjoy. Obviously, this does not mean that physicists or technicians could make better political decisions than the politicians themselves. But their scientific work has taught them to be objective and factual, and, what is more important, to keep the wider context in view.'

The ethics of science and the responsibilities of scientists do not have simple formulations or prescriptions. Yet the questions that animated Heisenberg and von Weizsäcker 80 years ago are as urgent as ever today. The role of scientists in their society is especially relevant when science and evidence-based thinking are under attack, and scientists are sometimes portrayed as driven by financial or political interests.

Heisenberg said that modern science can lead to good or to evil. But sometimes defining the 'good' is not easy. For example, is it morally justified to build a weapon to kill people, if by killing a few, we can save the lives of many? Is it morally justified to alter the DNA of human embryos in order to make the resulting human beings smarter or more athletic? Should a scientist stop working on a fundamental research problem, such as how memory is stored in the brain or the behavior of solid matter under extreme pressure, if she thinks that it might lead to harmful applications?

Our view is that science and the technology resulting from science do not have values in themselves. It is we human beings who possess values. And we should employ those values in how we use science and technology. (In this view, we disagree with the AI entrepreneur Mustafa Suleyman, who argues in his recent book The Coming Wave that technology is inherently political.) The 'good' referred to by Heisenberg probably meant—as it does for many people—increasing the well-being (happiness and quality of life) of the largest number of people. And the 'bad' diminishes that well-being.

We further suggest that scientists, as citizens of their society, have a responsibility to ensure that their discoveries and innovations are used for good and not for bad. Such a responsibility, of course, means that scientists will have to take some time away from their lab benches and equations to engage with the public and with policy makers. We also suggest that scientists, as citizens of the world, share a responsibility to help relieve the world's economic inequalities, including the global South's relative lack of access to energy, food, health care, and technology.

As von Weizsäcker said, scientists are not policy makers, nor do they have the required skills. But their special expertise and evidence-based thinking should be resources for policy makers to improve the lives of everyone. And, because we live in a scientific and technological age, buffeted by rapid developments in biotechnology, artificial intelligence, and many other areas, scientists have a responsibility to educate the public in scientific matters. Policy makers may often be motivated by self-interest, but ultimately, in democratic societies, they must answer to the public.

In our view, the areas of science and technology now posing the greatest ethical dilemmas and challenges are artificial intelligence, biotechnology and 'synthetic biology,' advanced medical procedures, and climate change.
Artificial intelligence is already revolutionizing many aspects of our lives, including health care, banking, transportation, information exchange, and even warfare. New computer programs are able to learn things by themselves, as well as utilize vast data banks, and will someday become fully autonomous, operating without human input.

Biotechnology—the manipulation of biological processes and the DNA of microorganisms to produce novel products—is already being used to create such things as batteries, drugs, improved fertilizer and other agricultural products, and new engineering devices. This rapidly developing field began with the understanding of the structure of DNA in the 1950s.

Advanced medical procedures include the ability to edit the DNA of human embryos, extend the lives of permanently bedridden patients, and rapidly sequence and analyze a person's full DNA, revealing psychological tendencies, origins of personality, and potential illnesses.

People sometimes use the word science to include both science and technology, but there is certainly a distinction between 'pure science,' dedicated to learning the nature of the physical world, and technology, which is the production of materials designed to improve the lives of human beings and solve their problems. (We will later raise the question of whether all technology actually improves the lives of human beings.) Technology might also be called 'applied science.'

Certainly there is not always a clear demarcation between pure and applied science. Many discoveries in pure science later lead to applications, such as the invention of the transistor in 1947 (used in electronic equipment and telecommunication devices), the unravelling of the structure of DNA in 1953 (now used to identify pathogens, in the treatment of cancer, and in other applications), the discovery of mRNA in 1961 (the basis for the COVID vaccines), and the discovery of carbon nanotubes in 1991 (used to make plastics with enhanced electrical conductivity, for delivering drugs, and for the regeneration of nerve cells).

Today we live in a world more dependent on technology than ever before, and ever more vulnerable to its failures or misdirection. To be at ease in this fast-changing world, and to be effective citizens, everyone needs at least a basic grasp of science's concepts and discoveries. Scientific education and communication aren't just for scientists.

Obviously pandemics, climate change, and AI have been at the forefront of our minds recently, but policies on health, energy, and the environment all have a scientific dimension. To understand their essence isn't so difficult: Most of us appreciate music even if we can't compose or even perform it. Likewise, the key ideas of science can be accessed and enjoyed by almost everyone; the technicalities may be daunting, but these are less important for most of us, and can be left to the specialists.

In this respect, one of the most frightening outcomes from the recent populist movements across the globe has been the death of facts. In today's 'post-truth' era, there is little agreement on what defines reliable sources.

The occupational risk to scientists of their deliberate focus on biotechnology, solid state physics, and artificial intelligence is that they forget that these narrow problems are worthwhile only insofar as they are steps toward answering some big questions. And that is why it is good for scientists to engage with general audiences.
In fact, when one discusses the 'great unknowns,' there is less of a gap between the specialist and the audience. When even the experts haven't much of a clue, they are in a sense in the same position as the public. Even if we scientists explain ourselves badly, we benefit from exposure to general audiences who focus on the big questions and remind us how much we still don't know.

Robert Wilson, the radio engineer who made the serendipitous discovery of the cosmic background radiation—which clinched the case for a Big Bang—said that he himself didn't fully appreciate the import of his momentous work until he read an article in The New York Times headlined 'The Afterglow of Creation.' Good journalists offer a breadth and critical perspective that can, in professional scientists, atrophy through overspecialization, so their work benefits specialists as well as the wider public.

The interconnectedness of today's world, by virtue of global trade, the internet, and global challenges such as climate change, requires scientists to engage with the international community, not only their own society. Our interconnected world depends on elaborate networks: electric-power grids, air-traffic control, international finance, just-in-time delivery, and so forth. Unless these are highly resilient, their manifest benefits could be outweighed by catastrophic (albeit rare) breakdowns cascading through the system. Pandemics can spread at the speed of jet aircraft, causing maximal havoc in the shambolic but growing megacities of the developing world. Social media can spread psychic contagion—rumors and panic—literally at the speed of light.

These issues impel us to plan internationally. For example, whether a pandemic gets a global grip may hinge on how quickly a Vietnamese poultry farmer can report any strange sickness. And many other challenges—energy and climate change, for instance—involve multi-decade timescales, plainly far outside the concern and 'comfort zone' of most politicians.

Nevertheless, politicians need the best 'in house' scientific advice in forming their policies. But more than that, these issues should be part of a wide public debate, and such debate must be leveraged by 'scientific citizens'—engaging, from all political perspectives, with the media, and with a public attuned to the scope and limits of science. Scientists can act through campaigning groups, via blogging and journalism, or through political activity. There is a role for national academies too. Politicians, informed by their scientific advisers, should aim to lift long-term global issues higher on the political agenda, where they are habitually trumped by the urgent and parochial.

Scientists should present policy options based on a consensus of expert opinion; but if they engage in advocacy, they should recognize that on the economic, social, and ethical aspects of any policy, they speak as citizens and not as experts. Likewise, scientists shouldn't be indifferent to the fruits of their ideas—their creations. They should try to foster benign spin-offs—commercial or otherwise. They should resist, so far as they can, dubious or threatening applications of their work and alert politicians when appropriate. We need to foster a culture of 'responsible innovation,' especially in fields such as biotech and advanced AI.

Of course, scientists have special obligations over and above their responsibility as citizens.
Obviously, ethical obligations confront scientific research itself: avoiding experiments that have even the tiniest risk of leading to catastrophe and respecting a code of ethics when research involves animals or human subjects. But less tractable issues arise when research has ramifications beyond the laboratory and a potential social, economic, and ethical impact that concerns all citizens—or when it reveals a serious but still-unappreciated threat.

One can highlight some fine exemplars from the past: for instance, the atomic scientists who developed the first nuclear weapons during World War II. Fate had assigned them a pivotal role in history. Many of them—men such as Joseph Rotblat, Hans Bethe, Rudolf Peierls, and John Simpson—returned with relief to peacetime academic pursuits. But for them the ivory tower wasn't a sanctuary. They continued not just as academics but as engaged citizens—promoting efforts to control the power they had helped unleash through national academies, the Pugwash movement (aimed at ridding the world of weapons of mass destruction), and other public forums. They were the alchemists of their time, possessors of secret specialized knowledge.

Nuclear physics was 20th-century science. But other technologies now have implications just as momentous as nuclear weapons. In contrast to the 'atomic scientists,' those engaged with the new challenges span almost all the sciences, are broadly international, and work in the commercial sector as well as in academia and government. Their findings and concerns need to inform planning and policy.

So how is this best done? Direct ties forged with politicians and senior officials can help—and links with NGOs and the private sector too. But many experts who serve as government advisers have frustratingly little influence. Politicians are, however, influenced by their inbox, and by the press. Scientists can sometimes achieve more as 'outsiders' and activists, leveraging their message via widely read books, campaigning groups, blogging and journalism, or through political activity. If their voices are echoed and amplified by a wide public and by the media, long-term global causes will rise on the political agenda. Rachel Carson and Carl Sagan, for instance, were both preeminent in their generation as exemplars of the concerned scientist—and they had immense influence through their writings and speeches. And that was before the age of social media.

A special responsibility resides with scientists in academia and with self-employed entrepreneurs. They have more freedom to engage in public debate than those in government service or in industry. And those of us who are academics have a special privilege to influence successive generations of students. We should try to sensitize them to the issues that will confront them in their careers. Indeed, polls show, unsurprisingly, that young people are more engaged and anxious about long-term and global issues than those in earlier generations.

Although this is an extraordinarily difficult time to be a scientist, there are grounds for optimism. For most people in most nations, there's never been a better time to be alive. The innovations driving economic advancement can boost the developing as well as the developed world. Creativity in science and the arts is nourished by a wider range of influences—and is accessible to many more people worldwide than in the past.
We're becoming embedded in a cyberspace that can link anyone, anywhere, to all the world's information and culture and to most other people on the planet. Twenty-first-century technologies have the potential to offer everyone a lifestyle comparable to what Europeans enjoy today, while being environmentally benign and making lower demands on energy.

More should be done to assess and then minimize the risks and challenges we've discussed here. But we can be technological optimists, even though many leaders in technology need redirection. And that redirection must be guided by values that science itself can't provide. Once again, science and technology do not have values in themselves. It is we human beings who have values. And it is the responsibility of scientists and technologists, both as specialists and as citizens of the world, to help advise policy makers and governments.

There are certainly difficulties. Politicians look to their voters and the next election. Stockholders expect a payoff in the short run. We downplay what's happening even now in faraway countries. And we discount too heavily the problems we'll leave for new generations. Without a broader perspective—without realizing that we're all on this crowded planet together—governments won't properly prioritize projects that are long-term from a political perspective, even if a mere instant in the history of Earth. Knowing all we owe to past generations, it would be shameful if we weren't 'good ancestors' and left a depleted heritage and damaged planet to our descendants.

Today's young people are coming of age in a world that is at once wondrous and challenged, exhilarating and frightening, a world of potential and uncertainty, trembling, majestic, unpredictable and predictable, mysterious. A world to celebrate, to understand, and to preserve.

This article was adapted from Alan Lightman and Martin Rees's forthcoming book, The Shape of Wonder: How Scientists Live, Work, and Think. Article originally published at The Atlantic.

