
Brain-repair research 2.0

Politico

11-07-2025



THE LAB

The Department of Health and Human Services unveiled a program Thursday to advance brain-repair research and develop treatments for brain injuries and devastating neurological diseases like stroke and Alzheimer's.

Why it matters: The program, called the Functional Repair of Neocortical Tissue, or FRONT, aims to find cures for neurodegenerative diseases and traumatic injuries long regarded as irreversible by leveraging stem-cell technology to regenerate brain tissue and restore brain function.

It's part of a line of research scientists have explored for years: Can they develop early-stage cells into brain or other normal tissue to replace disrupted tissue? The agency is expected to turn in a solution summary on Aug. 18.

The research comes as the number of Americans afflicted by neurodegenerative diseases increases. A study published in Nature in January found that dementia prevalence among Americans over 55 could double by 2060, but scientists haven't yet discovered technology that fully repairs damaged brain tissue or restores lost brain function.

The intensifying caseload has left millions of Americans bearing 'the overwhelming costs of brain damage, a crisis that drains the U.S. health care system by over a trillion dollars annually,' Jason Roos, acting director of HHS' Advanced Research Projects Agency for Health, which will oversee the program, said in a news release.

And … The initiative comes as the federal government pares back its research and public health infrastructure by shrinking funding and its workforce. The directives were part of the Trump administration's move to eliminate research it identified as wasteful or related to gender ideology and diversity, equity and inclusion. Those efforts have also affected projects on HIV prevention and cancer rates among firefighters. The agency's announcement illuminates the type of research the administration aims to emphasize.
In a statement, HHS cast FRONT as particularly beneficial to military personnel, a key Trump constituency that contends with pronounced rates of traumatic brain injuries. 'This initiative will provide direct support to our nation's servicemen and women, ensuring they receive the care they deserve for their sacrifice,' the agency wrote. 'This program will provide new hope to millions who have suffered severe brain damage and now rely on caregivers for daily living.'

WELCOME TO FUTURE PULSE

This is where we explore the ideas and innovators shaping health care.

Earlier this year, the Trump administration slashed federal research funding for Johns Hopkins University, one of the largest recipients of government grants. Now, the university is lobbying power brokers on the Hill and adopting 'an urgent plea: Keep funding us because we're actually a good bang for the buck,' The Baltimore Banner's Ellie Wolfe and Meredith Cohn report.

Share any thoughts, news, tips and feedback with Danny Nguyen at dnguyen@ Carmen Paun at cpaun@ Ruth Reader at rreader@ or Erin Schumaker at eschumaker@

Want to share a tip securely? Message us on Signal: Dannyn516.70, CarmenP.82, RuthReader.02 or ErinSchumaker.01.

TECH MAZE

It turns out that AI chatbots are not good therapists. Researchers have found that chatbots expressed stigma toward people with mental health conditions and responded inappropriately to some common and critical conditions in therapeutic settings, including by encouraging delusional thinking.

The study examined the ability of chatbots to replicate the relationship between therapists and clients. The researchers prompted artificial intelligence chatbots to respond to questions about how they would evaluate and respond to someone's mental state. Then, they compared the responses with a benchmark of 'good therapy' to determine whether AI stigmatized patients, enabled suicidal ideation and reinforced hallucinations, among other things.
The researchers ran the experiments on two large language models, OpenAI's GPT-4o and Meta's Llama, and on commercially available therapy bots. The study found that even the newer versions of these chatbots offered inappropriate responses, 'indicating that current safety practices may not address these gaps,' the researchers wrote.

They also found AI couldn't replicate human characteristics critical to building a healthy therapeutic relationship, such as the stakes that make a therapist responsible for their suggestions and solutions and that serve as a check on shoddy, unempathetic answers. Unlike human therapists, chatbots lack the essential ability to challenge their clients' perspectives and provide reality checks when needed. Chatbots are designed to be 'compliant and sycophantic,' the researchers found.

The study was conducted by researchers at Stanford University, Carnegie Mellon University, the University of Minnesota, Twin Cities, and the University of Texas at Austin.

Why this matters: The findings come as Americans increasingly turn to chatbots for therapy and advice because of dwindling access to mental health care. Some programs aren't marketed or designed to offer therapy, but people still report positive experiences after using them for therapeutic purposes, the researchers acknowledge.
