22-05-2025
Organ-chips not ready to replace animal studies
EXAM ROOM
One of the cutting-edge technologies the Food and Drug Administration wants to use to replace animal studies might not be ready for a solo performance.
Organ-on-a-chip technology, which uses human cells on microfluidic chips to mimic the structure and function of organs in a laboratory setting, can't yet replace animal tests, according to a new Government Accountability Office report.
Standing in the way: Challenges include cost, availability of materials, a time-intensive process and the need for highly trained staff to operate the technology. OOCs aren't standardized, which makes reproducibility difficult. The National Institute of Standards and Technology told the GAO that standards are needed, particularly for multi-organ chips, but the technology is evolving too rapidly to set them.
The report also highlights a lack of agreed-upon benchmarks for OOCs and validation studies.
However, OOCs could work alongside animal studies, particularly for exploring toxicity, the GAO said. It also found that OOCs could be used in lieu of animal studies for certain standardized tests, for example, to assess skin damage from a compound.
Some recommendations: GAO called for policies that:
— Increase access to diverse, high-quality human cells
— Create standards around the technology
— Encourage more research and validation studies
— Provide regulatory guidance
Notably, it said companies were confused about FDA guidance regarding OOCs. And as of the end of last year, the agency hadn't qualified an OOC for use in regulatory review. However, the FDA's Innovative Science and Technology Approaches for New Drugs pilot program accepted a letter of intent for an OOC that would eventually predict drug-induced liver injury.
What's next: 'Body-on-a-chip' is coming. Instead of chips with single organs, the next generation of OOCs will link multiple organs, including intestines, livers and kidneys, to understand how they interact.
WELCOME TO FUTURE PULSE
This is where we explore the ideas and innovators shaping health care.
Kids advocacy group Fairplay and the Electronic Privacy Information Center are asking the Federal Trade Commission to investigate whether a new kid-focused release of Google's AI chatbot Gemini violates children's privacy laws. Google says the technology is available through parent-supervised accounts and that parents are free to disable it.
Share any thoughts, news, tips and feedback with Danny Nguyen at dnguyen@, Carmen Paun at cpaun@, Ruth Reader at rreader@ or Erin Schumaker at eschumaker@.
Want to share a tip securely? Message us on Signal: Dannyn516.70, CarmenP.82, RuthReader.02 or ErinSchumaker.01.
AROUND THE NATION
States are increasingly interested in making Apple and Google responsible for protecting kids from online harms.
Texas is poised to become the second state to require app stores, like Apple's App Store and Google's Google Play store, to verify their users' ages and, if they're minors, get parental consent before they download apps. In March, Utah became the first state to enact an app store age-verification law.
The bill sailed through the Texas House with support from 80 percent of the state Legislature and passed the Senate by voice vote last week. It now awaits Gov. Greg Abbott's signature.
In practice: App stores must verify a user's age. If the user is a minor, the app store must obtain parental consent for each app download. The app store would then relay this information to the app developer, since some apps provide different experiences based on age. Certain apps, however, such as crisis hotlines and emergency services, won't require parental consent.
Pushback: Google isn't happy about the bill's advancement, and Apple also opposes the legislation. In particular, Google says there's no commercially reasonable way to verify who a child's parent is. 'Will they need to show a birth certificate or custody document to demonstrate that they have the legal authority to make decisions on behalf of a child?' asked Kareem Ghanem, Google's senior director of government affairs and public policy.
Google prefers a targeted approach: Send 'an age signal' with explicit parental consent only to developers whose apps pose risks to minors.
But such picking and choosing could open this legislation up to legal scrutiny.
Longtime concerns: Doctors, including former Surgeon General Vivek Murthy, as well as parents and even kids, are frustrated with the state of online media. For years, growing evidence has suggested that social media apps wear on kids' mental health.
But social media platforms enjoy protection under Section 230, a decades-old law that shields them from being sued over content posted on their platforms.
And states like California and Maryland that have tried to put guardrails on social media have been sued on free speech grounds.
Legal challenges: Requiring app stores to verify ages isn't likely to run into First Amendment issues. What's more, the policy rests on a fairly well-established legal foundation: contract law. For years, app stores have required minors to sign lengthy contracts, the ones most people don't read, before creating accounts. Legally, that doesn't hold up: minors can sign contracts, but those contracts aren't enforceable. App store age-verification laws, however, require sign-off from a legal guardian.
Supporters hope app store accountability laws will provide a first line of defense, funneling more kids into parent-linked app store accounts. The laws could also make the 1998 Children's Online Privacy Protection Act, which limits the data that apps and websites can collect on children under 13, more enforceable. They wouldn't, however, change social media itself or the risks associated with those platforms.
What's next: As more states take up app-store age verification, federal lawmakers considering similar legislation are likely to feel more pressure to prioritize it.