
Latest news with #VannevarBush

8 Times Taxpayer Money Led to Historic Leaps in Medical Care

Medscape

5 days ago

  • Health
  • Medscape

8 Times Taxpayer Money Led to Historic Leaps in Medical Care

Since World War II, a quiet partnership between the US government and academic researchers has helped shape the course of modern medicine. Public funding has underwritten discoveries that changed how we detect, treat, and prevent disease — sometimes in ways that were barely imaginable when the research began.

This relationship traces its roots to the 1945 report Science, The Endless Frontier, written by Vannevar Bush, who was then the head of the wartime Office of Scientific Research and Development. Bush argued that continued investment in basic research — the kind driven by curiosity, not short-term profit — was essential not only for national security but also for public health and economic growth. 'Basic research is the pacemaker of technological progress,' Bush wrote. His report helped shape the creation of the National Science Foundation and guided peacetime funding efforts at agencies like the National Institutes of Health (NIH), which would go on to support generations of US scientists.

In 2023, the federal government spent nearly $200 billion on research and development (R&D), much of it through NIH and other science-focused agencies. That money supports everything from molecular biology to drug development to health data infrastructure, often with payoffs that take decades to emerge.

But this investment model is now under threat. The Trump administration's proposed 2026 federal budget calls for sharp reductions in R&D spending, including 40% less for NIH (though a Senate committee has rejected that proposal, calling instead for an increase in funding for the NIH for next year). Experts warn this could impede medical breakthroughs, slow the development of new treatments, and increase the burden of preventable disease.

'It's hard to even comprehend what's lost when federal funding dries up,' says Christopher Worsham, MD, a critical care physician and researcher at Harvard Medical School, Boston, and coauthor of Random Acts of Medicine: The Hidden Forces That Sway Doctors, Impact Patients, and Shape Our Health. 'There are the obvious setbacks — ongoing projects shut down, discoveries delayed by years. But there are also the invisible losses. Labs that never form. Scientists who never get trained. A career's worth of discovery, gone before it began.'

The eight breakthroughs highlighted below were selected with guidance from Worsham; David Jones, MD, PhD, a physician and professor of the culture of medicine at Harvard University, Boston; and Anupam Jena, MD, PhD, a physician and health economist at Harvard Medical School. But they're just a sample of how federal research support shaped the landscape of modern medicine.

1. The Framingham Heart Study
A landmark, long-term investigation into cardiovascular disease and its risk factors.

With funding from what is now the National Heart, Lung, and Blood Institute, researchers began tracking the health of more than 5000 residents in Framingham, Massachusetts. The goal was to understand the root causes of heart disease, which at the time was the leading cause of death in the US but poorly understood. The study followed participants over decades, collecting information on blood pressure, cholesterol, smoking habits, physical activity, and more. It provided the first conclusive evidence linking high blood pressure and high cholesterol to cardiovascular illness. It also helped establish the role of smoking, obesity, and lack of exercise in heart disease.
This led to the widely used Framingham Risk Score, which estimates a person's 10-year risk of developing cardiovascular disease. Jena says this first epidemiologic effort 'helped steer the development of both preventive guidelines and treatments.'

Now in its 77th year, the Framingham Heart Study continues to follow the children and grandchildren of the original participants. Its scope has broadened to include genetics, dementia, cancer, and social determinants of health — making it one of the longest-running and most influential population studies in medical history.

2. The Surgeon General's Report on Smoking and Health
The official wake-up call on tobacco's deadly toll.

On January 11, 1964, Surgeon General Dr. Luther Terry delivered a message that would reverberate across the nation: 'Cigarette smoking is a health hazard of sufficient importance to the US to warrant remedial action.' The Report of the Advisory Committee to the Surgeon General of the Public Health Service marked the first time the US government formally linked cigarette smoking to serious disease. Previous warnings didn't carry the weight of this 387-page document, published under the authority of the US Public Health Service and backed by decades of evidence — much of it supported, directly or indirectly, by federal research funding.

At the time, 42% of American adults smoked cigarettes daily. Tobacco advertising was ubiquitous, and tobacco companies were politically powerful. But the report flipped a switch: Within a year, Congress mandated warning labels on cigarette packages. The findings helped lay the groundwork for tobacco control policies that led to dramatic declines in smoking rates and prevented millions of premature deaths. Jones calls it 'likely the most important public health innovation of the post-World War II era.'

The report established a precedent for rigorous, government-backed assessments of environmental and behavioral health risks. Subsequent Surgeon General reports would expand on the dangers of secondhand smoke, the effects of nicotine addiction, and more.

Dr. Luther Terry with the landmark Surgeon General report on smoking and health, funded by the US Public Health Service and informed by federally supported research.

3. Oral Rehydration Therapy
A simple sugar-and-salt solution that has saved tens of millions of lives.

In the late 1960s, cholera remained a deadly global threat. The disease, which causes severe diarrhea, could kill patients within hours by rapidly draining the body of water and essential salts. At the time, intravenous fluids were the standard treatment, but access was limited, particularly in the poorer countries where cholera outbreaks were most severe.

Enter Dr. Richard Cash, a young physician who joined the NIH during the Vietnam War as an alternative to military service. The NIH sent him to what was then East Pakistan (now Bangladesh), where he and colleagues helped develop and test a stunningly simple solution: a mixture of water, salt, and glucose that patients could drink themselves. Plain water can't reverse cholera's rapid dehydration. Cash and his team showed that this precisely balanced oral formula could enable the body to absorb both water and electrolytes through the intestinal wall. Even patients in critical condition could recover — so long as they were conscious and able to drink.

The impact was staggering. 'Oral rehydration therapy, pioneered by Richard Cash and others, has saved tens of millions of lives globally,' says Jones.
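For a concrete sense of what 'precisely balanced' means, the figures below are the reduced-osmolarity oral rehydration salts recipe that WHO and UNICEF have recommended since the early 2000s (the original trial mixture used somewhat different proportions). The short Python snippet is only an illustrative check of the arithmetic, not part of the original research.

# WHO/UNICEF reduced-osmolarity ORS recipe: grams of each solute per litre of
# clean water, with a quick check of the resulting solute concentrations.
# The recipe figures are the published ones; the script itself is illustrative.
components = {
    # name:                         (grams/L, molar mass g/mol, particles per formula unit)
    "glucose (anhydrous)":           (13.5, 180.16, 1),
    "sodium chloride":               (2.6,   58.44, 2),   # Na+ and Cl-
    "potassium chloride":            (1.5,   74.55, 2),   # K+ and Cl-
    "trisodium citrate (dihydrate)": (2.9,  294.10, 4),   # 3 Na+ and citrate
}

osmolarity = 0.0
for name, (grams, molar_mass, particles) in components.items():
    mmol_per_litre = grams / molar_mass * 1000            # millimoles of the compound per litre
    osmolarity += mmol_per_litre * particles
    print(f"{name:31s} {mmol_per_litre:5.1f} mmol/L")

print(f"approximate osmolarity: {osmolarity:.0f} mOsm/L")  # about 244, close to the WHO target of 245

The tally works out to roughly 75 mmol/L each of sodium and glucose, the near one-to-one pairing that lets the gut's sodium-glucose co-transport pull both salt and water across the intestinal wall, which is the mechanism Cash's team exploited.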
Families can be trained to administer it at home. It doesn't require refrigeration, a sterile environment, or high-tech equipment.

Field trials in the 1970s showed a 93% effectiveness rate. The Lancet in 1978 called it 'potentially the most significant medical advance of the century.'

4. CRISPR Gene-Editing Technology
A revolutionary tool for editing DNA.

CRISPR emerged through decades of federally funded research into bacterial immune systems, molecular biology, and the intricate machinery of DNA repair. Today, it's among the most promising medical technologies of the 21st century — a gene-editing technique that could treat or even cure a wide range of genetic diseases.

The foundation was laid in 2008, when researchers Erik Sontheimer and Luciano Marraffini identified CRISPR as a general-purpose gene-editing mechanism. But the breakthrough came in 2012, when Emmanuelle Charpentier and Jennifer Doudna showed that CRISPR-Cas9 could be used to precisely cut DNA in a test tube. Doudna, a Nobel laureate in chemistry and professor of biochemistry and molecular biology at the University of California, Berkeley, says the potential now exists to 'cure genetic disease, breed disease-tolerant crops, and more.'

'CRISPR is a great example of the success of the long-standing US model for supporting science,' Doudna says. 'The NSF and DOE supported the early, curiosity-driven research that led to the discovery of CRISPR gene editing, and later funding from the NIH supported the development of applications of CRISPR in human health.'

5. Vaccines for Measles, Polio, and COVID-19
Immunizations have nearly eliminated devastating infectious diseases.

Over the past century, publicly funded vaccine development has helped eradicate polio from most of the world, curb measles transmission in the Americas, and sharply reduce the global toll of COVID-19. 'Is there any doubt about the value of those vaccines?' says Jones. 'Polio was a massive source of fear, with summer epidemics shutting down pools, movie theaters, and other public spaces across the US…. Now polio has been nearly eradicated from Earth.' Measles, meanwhile, was declared eliminated from the Western Hemisphere in 2016 (though recent outbreaks are raising concerns about that status).

Public investment was crucial to the development of these vaccines. The measles vaccine, developed by John Enders and his team at Harvard, was made possible through NIH-supported research into how to culture the virus — a critical step toward producing a safe and effective vaccine, licensed in 1963. It laid the groundwork for the combination MMR (measles, mumps, and rubella) vaccine developed in 1971. In 2005, the varicella (chickenpox) vaccine was added, creating the now-standard MMRV shot for children.

The polio vaccine emerged from a public fundraising campaign that started when President Franklin D. Roosevelt (a polio survivor) and Basil O'Connor founded the National Foundation for Infantile Paralysis — later renamed the March of Dimes — which channeled donations into research and care. Their support enabled Dr. Jonas Salk to develop the first inactivated polio vaccine at the University of Pittsburgh in the early 1950s, leading to mass immunization efforts that would all but eliminate the disease from most of the world.

The COVID-19 pandemic spurred the fastest large-scale vaccine development in history.
Within 12 months of the SARS-CoV-2 genome being published, researchers — backed by tens of billions in US public funding — had developed multiple highly effective vaccines. That NIH investment (estimated at just shy of $32 billion) helped accelerate development and manufacturing, allowing the US to lead a global vaccination effort. Over 13 billion COVID-19 vaccine doses have since been administered worldwide.

'The evidence is quite good that COVID vaccines saved lives and reduced suffering,' says Jones. A new study from JAMA Health Forum offered one of the most comprehensive and conservative estimates to date: COVID-19 vaccines averted 2.5 million deaths in the US between 2020 and 2024 — reinforcing the enormous public health return, even under modest assumptions.

6. The Agency for Healthcare Research and Quality
The federal agency is quietly making healthcare safer, smarter, and more efficient.

Despite a modest staff of around 300 people and a budget of just 0.02% of total federal healthcare spending, the Agency for Healthcare Research and Quality (AHRQ) has a far-reaching impact on American medicine. AHRQ plays a critical role in improving the quality, safety, and effectiveness of healthcare delivery.

AHRQ was established by a law signed in 1999 by President Bill Clinton, succeeding an agency created in 1989. The need was obvious following two landmark reports from the Institute of Medicine: To Err Is Human (1999), which revealed that medical error was a leading cause of death in the US, and Crossing the Quality Chasm (2001), which called for systemic reform. Since then, AHRQ has become the backbone of the patient safety and quality improvement movement in the US, supporting thousands of research projects and building essential infrastructure for analyzing healthcare delivery.

One example: An AHRQ-funded study evaluated the use of a standardized sterile checklist to prevent central line infections in ICU patients. As hospitals adopted these practices, 'infection rates plummeted,' a study showed. 'There was no new technology,' Worsham says, 'just a change in practice behavior.'

AHRQ has also helped bring data science into modern health services research, giving researchers access to standardized, national healthcare data.

7. The Human Genome Project
A global effort that decoded the blueprint of human life — and revolutionized medicine.

On June 26, 2000, President Bill Clinton declared the completion of 'the most important, most wondrous map ever produced by humankind.' He was referring to the successful first draft of the human genome: a complete survey of the genetic code that underlies all human biology.

The Human Genome Project began in 1988 as a joint initiative of the US Department of Energy and the NIH, with an initial investment of $3 billion. Over the next 15 years, it evolved into a massive international collaboration that delivered the first full sequence in 2003. The work laid the foundation for modern genomics and enabled entirely new approaches to understanding, diagnosing, and treating disease.

Dr. Francis Collins, who led the project between 1993 and 2008, told the White House gathering, 'We have caught the first glimpse of our own instruction book, previously known only to God.' Collins, the former director of the National Human Genome Research Institute, told NPR this summer that he knew then 'this would become fundamental to pretty much everything we would do in the future in human biology.
And I was also convinced as a physician that this was going to open the door to much better ways to diagnose, treat, and prevent a long list of diseases that we didn't understand very well.'

The impact has been profound. The project sparked advances in personalized medicine, cancer genomics, and rare disease diagnostics. It led to the creation of tools that are now standard in medical research and enabled a generation of scientists to ask more precise, data-driven questions about human health.

Francis Collins (alongside Craig Venter, CEO of Celera Genomics) announces the first draft of the human genome — a $3 billion federal investment — at the White House, June 26, 2000.

8. Protease Inhibitors for HIV/AIDS
Antiretroviral drugs that turned HIV into a manageable chronic illness.

By 1994, AIDS had become the leading cause of death for Americans aged 25-44 years. Treatment options were limited, and a diagnosis often meant a sharply shortened life expectancy. That changed in 1995, when a new class of drugs — protease inhibitors — was introduced as part of a novel treatment approach known as highly active antiretroviral therapy. The results were immediate and dramatic.

Protease inhibitors work by targeting an enzyme called HIV protease, which is essential to the virus's ability to replicate. The drugs disrupt the virus's life cycle, reducing viral loads to undetectable levels when taken consistently. The first FDA-approved protease inhibitor, saquinavir, was quickly followed by others, including ritonavir, indinavir, and nelfinavir.

The scientific foundation for these breakthroughs was laid by researchers at the National Cancer Institute, the federal agency that played a central role in both mapping the structure of the HIV protease enzyme and designing early versions of the drugs. Jones says protease inhibitors have 'saved tens of millions of lives.' Globally, the number of new HIV infections has fallen by more than 60% since the mid-1990s. UNAIDS officials have warned that without continued investment, particularly from major funders like the US, the world could see a dramatic resurgence in HIV-related deaths and infections.

The forgotten 80-year-old machine that scientists say could be the key to surviving AI

Daily Mail

16-07-2025

  • Science
  • Daily Mail

The forgotten 80-year-old machine that scientists say could be the key to surviving AI

Today's youngsters will never know the painstaking task of going to a library and searching for an article or a particular book. This tedious undertaking involved hours upon hours of trawling through drawers filled with index cards – typically sorted by author, title or subject. An explosion in research publications during the 1940s made it particularly time-consuming to locate what you wanted, especially as this was before the invention of the internet.

Now, an expert has lifted the lid on the man and the device that changed everything – and it could also be the key to surviving AI. Dr Martin Rudorfer, a lecturer in Computer Science at Aston University, said an American engineer called Vannevar Bush first came up with a solution, dubbed the 'memex'. 'He could see that science was being drastically slowed down by the research process, and proposed a solution that he called the "memex",' Dr Rudorfer wrote in an article for The Conversation.

This revolutionary invention was billed as a personal device built into a desk that could store large numbers of documents. Some say the hypothetical design – which never quite made it to production lines – laid the foundation for the internet. Dr Rudorfer believes it could also teach us valuable lessons about AI – and how to avoid machines taking over our lives.

'[The memex] would rely heavily on microfilm for data storage, a new technology at the time,' he explained. 'The memex would use this to store large numbers of documents in a greatly compressed format that could be projected onto translucent screens.' At the time, microfilm was a relatively new invention and was a method of storing miniature photographic reproductions of documents and books.

One of the most important parts of the memex design was a form of indexing that would allow the user to click on a code number alongside a document and jump to a linked document or view them at the same time – without needing to sift through an index. In an influential essay titled 'As We May Think', published in The Atlantic in July 1945, Mr Bush acknowledged that this kind of keyboard click-through wasn't yet technologically feasible. However, he believed it wasn't far off, citing existing systems for handling data such as punched cards as potential forerunners.

His idea was that a user would create connections between items as they developed their personal research library with 'associative trails' running through them – much like today's Wikipedia. 'Bush thought the memex would help researchers to think in a more natural, associative way that would be reflected in their records,' Dr Rudorfer said. 'He is thought to have inspired the American inventors Ted Nelson and Douglas Engelbart, who in the 1960s independently developed hypertext systems, in which documents contained hyperlinks that could directly access other documents.

'These became the foundation of the world wide web as we know it.'

When Mr Bush reflected on his vision in 1970, he said that in the last 25 years he had witnessed technological advances in computing that were bringing his invention closer to reality. However, he felt the crux of his vision – to enhance human reasoning and creativity – was being missed. 'In 1945 I dreamed of machines that would think with us,' he wrote in his book Pieces of the Action. 'Now, I see machines that think for us – or worse, control us.' These concerns, written down more than 50 years ago, still feel 'strikingly relevant' today, Dr Rudorfer said.
'While it's great that we do not need to search for a book by flipping through index cards in chests of drawers, we might feel more uneasy about machines doing most of the thinking for us,' he wrote. 'Is this technology enhancing and sharpening our skills, or is it making us lazy?'

He warned that the danger is that we end up losing skills as machines continue to exercise them for us. Meanwhile, the younger generations may not even get the opportunity to learn them in the first place. The memex may help save us from AI, he said, because it reminds us to try and protect our creativity and reasoning at the same time as developing technology.

AI systems rely on artificial neural networks (ANNs), which try to simulate the way the brain works in order to learn. ANNs can be trained to recognise patterns in information – including speech, text data, or visual images – and are the basis for a large number of the developments in AI over recent years. Conventional AI uses input to 'teach' an algorithm about a particular subject by feeding it massive amounts of information. Practical applications include Google's language translation services, Facebook's facial recognition software and Snapchat's image-altering live filters.

The process of inputting this data can be extremely time-consuming, and is limited to one type of knowledge. A new breed of ANNs called Adversarial Neural Networks pits the wits of two AI bots against each other, which allows them to learn from each other. This approach is designed to speed up the process of learning, as well as refining the output created by AI systems.

The forgotten 80-year-old machine that shaped the internet – and could help us survive AI

Yahoo

11-07-2025

  • Science
  • Yahoo

The forgotten 80-year-old machine that shaped the internet – and could help us survive AI

Many years ago, long before the internet or artificial intelligence, an American engineer called Vannevar Bush was trying to solve a problem. He could see how difficult it had become for professionals to research anything, and saw the potential for a better way.

This was in the 1940s, when anyone looking for articles, books or other scientific records had to go to a library and search through an index. This meant drawers upon drawers filled with index cards, typically sorted by author, title or subject. When you had found what you were looking for, creating copies or excerpts was a tedious, manual task. You would have to be very organised in keeping your own records. And woe betide anyone who was working across more than one discipline. Since every book could physically only be in one place, they all had to be filed solely under a primary subject. So an article on cave art couldn't be in both art and archaeology, and researchers would often waste extra time trying to find the right location.

This had always been a challenge, but an explosion in research publications in that era had made it far worse than before. As Bush wrote in an influential essay, As We May Think, in The Atlantic in July 1945:

There is a growing mountain of research. But there is increased evidence that we are being bogged down today as specialisation extends. The investigator is staggered by the findings and conclusions of thousands of other workers – conclusions which he cannot find time to grasp, much less to remember, as they appear.

Bush was dean of the school of engineering at MIT (the Massachusetts Institute of Technology) and president of the Carnegie Institution. During the second world war, he had been the director of the Office of Scientific Research and Development, coordinating the activities of some 6,000 scientists working relentlessly to give their country a technological advantage. He could see that science was being drastically slowed down by the research process, and proposed a solution that he called the 'memex'.

The memex was to be a personal device built into a desk that required little physical space. It would rely heavily on microfilm for data storage, a new technology at the time. The memex would use this to store large numbers of documents in a greatly compressed format that could be projected onto translucent screens.

Most importantly, Bush's memex was to include a form of associative indexing for tying two items together. The user would be able to use a keyboard to click on a code number alongside a document to jump to an associated document or view them simultaneously – without needing to sift through an index. Bush acknowledged in his essay that this kind of keyboard click-through wasn't yet technologically feasible. Yet he believed it would be soon, pointing to existing systems for handling data such as punched cards as potential forerunners.

He envisaged that a user would create the connections between items as they developed their personal research library, creating chains of microfilm frames in which the same document or extract could be part of multiple trails at the same time. New additions could be inserted either by photographing them on to microfilm or by purchasing a microfilm of an existing document.
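To make the idea concrete, here is a loose sketch in Python (with invented code numbers and example documents) of what an associative trail amounts to in modern data-structure terms: documents filed under code numbers, plus user-built chains of those codes that can share documents. It illustrates the concept rather than reconstructing Bush's actual design.

# A loose sketch of Bush's "associative trails" (illustrative names only):
# documents are stored under code numbers, and a trail is a user-built chain
# of those codes. The same document can sit on several trails at once.

class Memex:
    def __init__(self):
        self.documents = {}   # code number -> document text
        self.trails = {}      # trail name  -> ordered list of code numbers

    def add_document(self, code, text):
        self.documents[code] = text

    def link(self, trail, code):
        """Append a document to a named trail, creating the trail if needed."""
        self.trails.setdefault(trail, []).append(code)

    def follow(self, trail):
        """Replay a trail: the documents in the order the user associated them."""
        return [self.documents[code] for code in self.trails.get(trail, [])]


memex = Memex()
memex.add_document("A12", "Field notes on cave paintings at Lascaux")
memex.add_document("B07", "Survey of Upper Palaeolithic pigments")
memex.add_document("C33", "Excavation report, Dordogne, 1940")

# The cave-art notes sit on both an 'art' trail and an 'archaeology' trail.
memex.link("art", "A12")
memex.link("art", "B07")
memex.link("archaeology", "C33")
memex.link("archaeology", "A12")

print(memex.follow("archaeology"))

Following the 'archaeology' trail retrieves the cave-art notes alongside the excavation report, even though the same notes also sit on the 'art' trail. That is the cross-filing a physical card index could not offer, and a direct ancestor of the hyperlink.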
Indeed, a user would be able to augment their memex with vast reference texts. 'New forms of encyclopedias will appear,' said Bush, 'ready-made with a mesh of associative trails running through them, ready to be dropped into the memex.' Fascinatingly, this isn't far from today's Wikipedia.

Bush thought the memex would help researchers to think in a more natural, associative way that would be reflected in their records. He is thought to have inspired the American inventors Ted Nelson and Douglas Engelbart, who in the 1960s independently developed hypertext systems, in which documents contained hyperlinks that could directly access other documents. These became the foundation of the world wide web as we know it.

Beyond the practicalities of having easy access to so much information, Bush believed that the added value in the memex lay in making it easier for users to manipulate ideas and spark new ones. His essay drew a distinction between repetitive and creative thought, and foresaw that there would soon be new 'powerful mechanical aids' to help with the repetitive variety. He was perhaps mostly thinking about mathematics, but he left the door open to other thought processes. And 80 years later, with AI in our pockets, we're automating far more thinking than was ever possible with a calculator.

If this sounds like a happy ending, Bush did not sound overly optimistic when he revisited his own vision in his 1970 book Pieces of the Action. In the intervening 25 years, he had witnessed technological advances in areas like computing that were bringing the memex closer to reality. Yet Bush felt that the technology had largely missed the philosophical intent of his vision – to enhance human reasoning and creativity:

In 1945, I dreamed of machines that would think with us. Now, I see machines that think for us – or worse, control us.

Bush would die just four years later at the age of 84, but these concerns still feel strikingly relevant today. While it's great that we do not need to search for a book by flipping through index cards in chests of drawers, we might feel more uneasy about machines doing most of the thinking for us. Is this technology enhancing and sharpening our skills, or is it making us lazy? No doubt everyone is different, but the danger is that whatever skills we leave to the machines, we eventually lose, and younger generations may not even get the opportunity to learn them in the first place.

The lesson from As We May Think is that a purely technical solution like the memex is not enough. Technology still needs to be human-centred, underpinned by a philosophical vision. As we contemplate a great automation in human thinking in the years ahead, the challenge is to somehow protect our creativity and reasoning at the same time.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Martin Rudorfer does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Science cuts endanger research that improves the economy, national security and your life

Yahoo

12-06-2025

  • Science
  • Yahoo

Science cuts endanger research that improves the economy, national security and your life

Since its creation, the National Science Foundation has partnered with agencies across the government, including those dealing with national security and space exploration. During this 1969 mission to the moon's surface, astronaut Buzz Aldrin deploys instruments to measure the distance to the Earth, as well as to analyze the chemical composition of solar wind and measure the moon's seismic activity. (NASA photo)

Look closely at your mobile phone or tablet. Touch-screen technology, speech recognition, digital sound recording and the internet were all developed using funding from the U.S. National Science Foundation. No matter where you live, NSF-supported research has also made your life safer. Engineering studies have reduced earthquake damage and fatalities through better building design. Improved hurricane and tornado forecasts reflect NSF investment in environmental monitoring and computer modeling of weather. NSF-supported resilience studies reduce risks and losses from wildfires.

Using NSF funding, scientists have done research that amazes, entertains and enthralls. They have drilled through mile-thick ice sheets to understand the past, visited the wreck of the Titanic and captured images of deep space.

NSF investments have made America and American science great. At least 268 Nobel laureates received NSF grants during their careers. The foundation has partnered with agencies across the government since it was created, including those dealing with national security and space exploration. The Federal Reserve estimates that government-supported research from the NSF and other agencies has had a return on investment of 150% to 300% since 1950, meaning for every dollar U.S. taxpayers invested, they got back between $1.50 and $3.

However, that funding is now at risk. Since January, layoffs, leadership resignations and a massive proposed reorganization have threatened the integrity and mission of the National Science Foundation. Hundreds of research grants have been terminated. The administration's proposed federal budget for fiscal year 2026 would cut NSF's funding by 55%, an unprecedented reduction that would end federal support for science research across a wide range of disciplines.

At my own geology lab, I have seen NSF grants catalyze research and the work of dozens of students who have collected data that's now used to reduce risks from earthquakes, floods, landslides, erosion, sea-level rise and melting glaciers. I have also served on advisory committees and review panels for the NSF over the past 30 years and have seen the value the foundation produces for the American people.

In the 1940s, with the advent of nuclear weapons, the space race and the intensification of the Cold War, American science and engineering expertise became increasingly critical for national defense. At the time, most basic and applied research was done by the military. Vannevar Bush, an electrical engineer who oversaw military research efforts during World War II, including development of the atomic bomb, had a different idea. He articulated an expansive scientific vision for the United States in Science: The Endless Frontier. The report was a blueprint for an American research juggernaut grounded in the expertise of university faculty, staff and graduate students.

On May 10, 1950, after five years of debate and compromise, President Harry Truman signed legislation creating the National Science Foundation and putting Bush's vision to work.
Since then, the foundation has become the leading funder of basic research in the United States. NSF's mandate, then as now, was to support basic research and spread funding for science across all 50 states. Expanding America's scientific workforce was and remains integral to American prosperity. By 1952, the foundation was awarding merit fellowships to graduate and postdoctoral scientists from every state.

There were compromises. Control of NSF rested with presidential appointees, disappointing Bush. He wanted scientists in charge to avoid political interference with the foundation's research agenda.

Today, American tax dollars supporting science go to every state in the union. The states with the most NSF grants awarded between 2011 and 2024 include several that voted Republican in the 2024 election – Texas, Florida, Michigan, North Carolina and Pennsylvania – and several that voted Democratic, including Massachusetts, New York, Virginia and Colorado. More than 1,800 public and private institutions, scattered across all 50 states, receive NSF funding. The grants pay the salaries of staff, faculty and students, boosting local employment and supporting college towns and cities. For states with major research universities, those grants add up to hundreds of millions of dollars each year. Even states with few universities each see tens of millions of dollars for research.

As NSF grant recipients purchase lab supplies and services, those dollars support regional and national economies. When NSF budgets are cut and grants are terminated or never awarded, the harm trickles down and communities suffer. Initial NSF funding cuts are already rippling across the country, affecting both national and local economies in red, blue and purple states alike. An analysis of a February 2025 proposal that would cut about US$5.5 billion from National Institutes of Health grants estimated the ripple effect through college towns and supply chains would cost $6.1 billion in GDP, or total national productivity, and over 46,000 jobs.

America's scientific research and training enterprise has enjoyed bipartisan support for decades. Yet, as NSF celebrates its 75th birthday, the future of American science is in doubt. Funding is increasingly uncertain, and politics is driving decisions, as Bush feared 80 years ago.

A list of grants terminated by the Trump administration, collected both from government websites and scientists themselves, shows that by early May 2025, NSF had stopped funding more than 1,400 existing grants, totaling over a billion dollars of support for research, research training and education. Most terminated grants focused on education – the core of science, technology and engineering workforce development critical for supplying highly skilled workers to American companies. For example, NSF provided 1,000 fewer graduate student fellowships in 2025 than in the decade before – a 50% drop in support for America's best science students.

American scientists are responding to NSF's downsizing in diverse ways. Some are pushing back by challenging grant terminations. Others are preparing to leave science or academia. Some are likely to move abroad, taking offers from other nations to recruit American experts. Science organizations and six prior heads of the NSF are calling on Congress to step up and maintain funding for science research and workforce development.
If these losses continue, the next generation of American scientists will be fewer in number and less well prepared to address the needs of a population facing the threat of more extreme weather, future pandemics and the limits to growth imposed by finite natural resources and other planetary limits. Investing in science and engineering is an investment in America. Diminishing NSF and the science it supports will hurt the American economy and the lives of all Americans. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Medical and Scientific Research Makes America Great

Yahoo

19-05-2025

  • Politics
  • Yahoo

Medical and Scientific Research Makes America Great

Exterior view of the National Institutes of Health in Bethesda, Maryland. Credit: Grandbrothers—Getty Images

While the Trump Administration's sharp increases in tariffs have received much of the political and economic attention in our public discourse, there is another subject that could have a far more profound and longer-term negative effect on America and which deserves equal, if not greater, attention. This is the serious threat to America's basic, early-stage medical and scientific research.

During World War II, Dr. Vannevar Bush took a leadership role in ensuring U.S. preeminence in science and research by creating the Office of Scientific Research and Development, which led to the creation of the National Science Foundation (NSF). Other similar scientific research institutions followed. And our academic, government, and business innovation collaboration has over time attracted some of the best talent—among both our own citizens and top scientists worldwide.

But now, support for basic science and medical research is in danger of flatlining. We have witnessed a decline in trust for basic research in recent years. This has been accompanied by significant cuts in government financial support, leading to sharp cuts in vital personnel in critical scientific and medical research programs at universities, laboratories, and highly regarded medical and scientific centers throughout the country. These cuts do not discriminate. They are occurring in red and blue states alike—in Middle America as well as along the East and West Coasts. And this is not just occurring in highly prestigious and heavily funded institutions, but also on a broader scale. There is also a second-order effect: innovation and advanced research in this country are not, and should not be, the monopoly of a few schools and institutions, and we run the risk that these cuts could also be imposed on smaller universities and research institutions in a wide range of cities and regions.

Over a decade ago, as Undersecretary of State responsible for overseeing the Department's international economic policy, I wrote about the role of innovative, basic science in sustaining America's global economic power, enhancing domestic prosperity, producing lifesaving new medicines, modernizing the technology used by our military, and creating breakthroughs in many transformational technologies. Advancements in these areas have been critical to the prosperity and wellbeing of our society, as they are now with accelerating advances in AI and quantum computing.

The ultimate benefits of these advances—if sustained—could last for generations to come, potentially providing the basis for many successful new businesses in our country, attracting the world's top research talent to our nation, and generating millions of high-quality jobs for those creating and applying new technologies. Sustaining this progress will be especially vital to future domestic medical advances and the prosperity of millions of Americans in future generations. It could also enable this country to maintain a strong and technologically preeminent military—an ever more critical need today given the formidable technological competition from China—and shape the future of international scientific cooperation.

Preeminence in science, advanced technology, and medicine is also a key element of America's soft power, strengthening friendships and alliances with countries that want to develop closer collaboration with U.S. universities, companies, and research centers.
It benefits not only Americans but also provides better health and faster economic progress for others around the world. Preeminence is also needed for continued U.S. leadership in setting rules, norms, and common international practices to ensure that the results of this research are used for the good of our citizens—as opposed to doing harm to our national or global interests.

This dynamic created institutions, and an environment, in the U.S. which supports the best and brightest scientists and researchers from within this country and attracts many thousands who come here from around the world. Today, approximately 16% of all scientists and other STEM workers in the U.S. are foreign born. This is why leaders on both sides of the aisle have strategically prioritized research, as well as the cultivation and attraction of scientific and technological talent, during much of the post-war period.

Many German scientists who could, fled here before or during World War II. After the war, one of our most important innovative initiatives was the highly clandestine "Operation Paperclip," which brought German scientists here. And during the Cold War era, many of these scientists were critical to our successes in space, nuclear, and broader military achievements. Albert Einstein is a prime example of those who taught at America's advanced universities, but large numbers also taught in small colleges and junior colleges. And over the past several decades, talented scientists from places such as India, Israel, Japan, China, South Korea and Eastern Europe came to the U.S. as well. Many of them became students and researchers across the country. Some, such as Andy Grove and Sergey Brin, founded, built, and ultimately ran America's most successful tech companies.

It was in part because of our pro-science, pro-innovation environment and openness to foreign talent that so many brilliant scholars and researchers came to the United States. They did so for freedom and the opportunity to do innovative research. One did not see such foreign-born scientists falling over one another to go to the Soviet Union, a nation characterized by a heavy-handed and oppressive environment that was not conducive to this. In the end, many of these scientists—and the investment in American science in general—played instrumental roles in winning the Cold War and the Space Race.

But now, support for vital basic early-stage research is falling sharply. Recent, large, and sudden cuts seriously endanger American scientific preeminence and thus America's prosperity, social wellbeing, and national security. For example, as the result of NSF and NIH funding being dramatically cut, or placed on hold, we are seeing a reversal of talent out of the United States. Companies and governments in many countries are actively advertising to attract researchers being let go by our government institutions, our companies, and our universities due to funding and personnel cuts. Many are highly unlikely to return.

It is hardly encouraging, considering how vital AI is to our future, that, as Nvidia's CEO recently reminded us, roughly half of all AI researchers in the world now are Chinese. We used to be the magnet for talent from around the world. Now China, Singapore, other countries in Asia, and several European countries as well aim to be—and to create a reverse brain drain.
Those who care about America's global research and technology leadership should be alarmed by the deep cuts we are witnessing in research funding, the firing of personnel with enormous scientific expertise, and a sharp shift away from the attraction of foreign scientists—many of whom are critical to our healthcare system, technological innovation, and entrepreneurial success. For example, recent budget proposals call for a 40% cut in funds for NIH, which would lead to similarly large terminations in vital medical research projects, and an over 50% cut in NSF funding, which will mean terminations in critical STEM education programs. Fortunately, business has significantly increased research and development over the last two decades, but only a small fraction of that goes to critical early-stage, basic research.

I am a great believer in reducing this nation's debt and deficits, which endanger our economy and place an enormous burden on future generations. Cutting unproductive personnel from some parts of the government also makes sense, and I commend those in both parties aiming to do this. These cuts, however, must be strategic, justified, and well placed to be effective. Large and sudden across-the-board cuts inevitably end up cutting or killing essential research funding and forcing out highly talented people. They frequently come at a cost to the health outcomes and economic wellbeing of large numbers of Americans now and for generations to come. Lastly, they threaten our country's research and technological preeminence on which our strategic and defense capabilities depend—which is especially troubling given the rapid rise of Chinese technical and military prowess.

These are not isolated events that have occurred only in the recent months of this administration or even in recent years. This broad erosion has been taking place for several decades during periods of both Democratic and Republican leadership. Large numbers of Americans have become skeptical about, and some openly hostile to, advanced science and basic research. Some groups are especially skeptical, in most cases due to a plethora of inaccurate information over social media, about immunizations and therapies produced by modern medicine.

This sadly stands in sharp contrast with the virtually unanimous social and bipartisan support and sense of national pride in America's scientific achievements in decades past, which have led to remarkable breakthroughs: the dramatically successful Salk polio vaccine, various lifesaving HIV/AIDS therapies, mapping the human genome, remarkable U.S. feats in the space race, military technologies that helped win the Cold War, computer and wireless communications technologies that have dramatically boosted business, and countless new medicines developed in labs throughout our country that have cured or helped prevent, as in Operation Warp Speed, so many diseases in the last several decades.

We need to ensure renewed American leadership and dynamism in these areas. To do so we must reverse this downward course and ensure sustained U.S. leadership in basic medical and scientific research and 21st-century technology development. President Donald Trump has an enormous opportunity to exercise world-class leadership in this area, as American leaders did after World War II. The opportunity is now and the need is urgent, especially while the budget is being debated in Washington.
Making and keeping "America great" will depend heavily on a robust, well-funded, and multifaceted basic research effort by our institutions and scientists, sustaining America's already remarkable technological and medical advances. By shifting their emphasis from sharply cutting back support for basic research to enthusiastically funding it now, our leaders in Washington can transcend any one period, party, or administration and instead benefit large numbers of Americans now and for generations to come, who could enjoy opportunities for a longer, healthier, and higher quality of life, better jobs, and more defense security. If America is to have a new "golden age," investing in research is an indispensable path to achieving it.
