Studies in US, UK warn of flaws in AI-powered health guidance

CoinGeek
Two recently published studies have revealed that generative artificial intelligence (AI) tools, including large language models (LLMs) such as ChatGPT and Google's Gemma, can produce misinformation and exhibit bias when used for medical information and healthcare decision-making.
In the United States, researchers from the Icahn School of Medicine at Mount Sinai published a study on August 2 showing that LLMs were highly vulnerable to repeating and elaborating on 'false facts' and medical misinformation.
Meanwhile, across the Atlantic, the London School of Economics and Political Science (LSE) published a study shortly afterward that found AI tools used by more than half of England's councils are downplaying women's physical and mental health issues, creating a risk of gender bias in care decisions.
Medical AI
LLMs, such as OpenAI's ChatGPT, are AI-based computer programs that generate text using large datasets of information on which they are trained.
The power and performance of such technology have increased exponentially over the past few years, with billions of dollars being spent on research and development in the area. LLMs and AI tools are now being deployed across almost every industry, to different extents, not least in the medical and healthcare sector.
In the medical space, AI is already being used for various functions, such as reducing the administrative burden by automatically generating and summarizing case notes, assisting in diagnostics, and enhancing patient education.
However, LLMs are prone to the 'garbage in, garbage out' problem: they depend on accurate, factual training data, and where that data contains errors or bias, they may reproduce them. They are also prone to what are often known as 'hallucinations,' the generation of content that is irrelevant, made up, or inconsistent with the input data.
In a medical context, these hallucinations can include fabricated information and case details, invented research citations, or made-up disease details.
US study shows chatbots spreading false medical information
Earlier this month, researchers from the Icahn School of Medicine at Mount Sinai published a paper titled 'Multi-model assurance analysis showing large language models are highly vulnerable to adversarial hallucination attacks during clinical decision support.'
The study aimed to test a subset of AI hallucinations that arise from 'adversarial attacks,' in which made-up details embedded in prompts lead the model to reproduce or elaborate on the false information.
'Hallucinations pose risks, potentially misleading clinicians, misinforming patients, and harming public health,' said the paper. 'One source of these errors arises from deliberate or inadvertent fabrications embedded in user prompts—an issue compounded by many LLMs' tendency to be overly confirmatory, sometimes prioritizing a persuasive or confident style over factual accuracy.'
To explore this issue, the researchers tested six LLMs (DeepSeek Distilled, GPT-4o, llama-3.3-70B, Phi-4, Qwen-2.5-72B, and gemma-2-27b-it) on 300 pieces of text resembling clinical notes written by doctors, each containing a single fabricated laboratory test, physical or radiological sign, or medical condition. Each model was run under 'default' (standard) settings as well as with 'mitigating prompts' designed to reduce hallucinations, generating 5,400 outputs in total. If a model elaborated on the fabricated detail, the case was classified as a 'hallucination.'
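As a rough illustration of how such a planted-error test can be wired up, here is a minimal Python sketch; the mitigating-prompt wording, the data format, and the keyword-based hallucination check are hypothetical stand-ins, not the study's actual protocol.

```python
# Illustrative sketch only: the mitigating-prompt wording, the JSONL data
# format, and the substring-based hallucination check are assumptions,
# not the Mount Sinai study's actual code or classification method.
import json

# Assumed wording; the paper's exact mitigating prompt may differ.
MITIGATING_PROMPT = (
    "Summarize this clinical note. If any detail cannot be verified or "
    "appears fabricated, flag it explicitly instead of elaborating on it."
)

def is_hallucination(output: str, planted_detail: str) -> bool:
    """Count a case as a hallucination if the model repeats or elaborates
    on the planted fake detail (naive substring check for illustration;
    the study classified outputs far more carefully)."""
    return planted_detail.lower() in output.lower()

def hallucination_rate(model_call, notes_path: str, mitigate: bool) -> float:
    """Run one model over every note and return the fraction of cases in
    which it reproduced the planted error."""
    hits = total = 0
    with open(notes_path) as f:
        for line in f:
            # Each line is assumed to hold one case:
            # {"note": "...", "planted_detail": "..."}
            case = json.loads(line)
            prompt = case["note"]
            if mitigate:
                prompt = MITIGATING_PROMPT + "\n\n" + prompt
            output = model_call(prompt)  # e.g. a call to one of the six LLMs
            hits += is_hallucination(output, case["planted_detail"])
            total += 1
    return hits / total
```

Running a loop like this for each model and prompting condition over the 300 notes would yield the sort of per-model, per-setting hallucination rates the paper reports.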
The results showed hallucination rates ranging from 50% to 82% across all models and prompting methods. Mitigating prompts lowered the average hallucination rate, but only from 66% to 44%.
'We find that the LLM models repeat or elaborate on the planted error in up to 83% of cases,' reported the researchers. 'Adopting strategies to prevent the impact of inappropriate instructions can half the rate but does not eliminate the risk of errors remaining.'
They added that 'our results highlight that caution should be taken when using LLM to interpret clinical notes.'
According to the paper, the best-performing model was GPT-4o, whose hallucination rates declined from 53% to 23% when mitigating prompts were used.
However, with even the best-performing model producing potentially harmful hallucinations in almost a quarter of cases, even with mitigating prompts, the researchers concluded that AI models cannot yet be relied upon to provide accurate medical data.
'LLMs are highly susceptible to adversarial hallucination attacks, frequently generating false clinical details that pose risks when used without safeguards,' said the paper. 'While prompt engineering reduces errors, it does not eliminate them… Adversarial hallucination is a serious threat for real‑world use, warranting careful safeguards.'
The Mount Sinai study isn't the only recent paper published in the U.S. medical space to call into question the use of AI.
In another damaging example, on August 5, the Annals of Internal Medicine journal reported a case of a 60-year-old man who developed bromism, also known as bromide toxicity, after consulting ChatGPT on how to remove salt from his diet. According to advice from the LLM, the man swapped sodium chloride (table salt) for sodium bromide, which was used as a sedative in the early 20th century, resulting in the rare condition.
But it's not just stateside that AI advice is taking a PR hit.
UK study finds gender bias in LLMs
While U.S. researchers were finding less-than-comforting results when testing whether LLMs reproduce false medical information, across the pond a United Kingdom study was turning up equally troubling results related to AI bias.
On August 11, a research team from LSE, led by Dr Sam Rickman, published their paper, 'Evaluating gender bias in large language models in long-term care,' in which they examined summaries of long-term care records generated with two open-source LLMs: Meta's (NASDAQ: META) Llama 3 and Google's (NASDAQ: GOOGL) Gemma.
In order to test this, the study created gender-swapped versions of long-term care records for 617 older people from a London local authority and asked the LLMs to generate summaries of male and female versions of the records.
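A minimal Python sketch of how such a gender-swap comparison could look appears below; the naive pronoun map and the generic summarize() callable are assumptions for illustration, and the LSE team's actual preprocessing and statistical tests are more sophisticated.

```python
# Illustrative sketch only: the pronoun map, the severity-term counting,
# and the summarize() callable are stand-ins, not the LSE study's code.
import re
from collections import Counter

# Naive swap map; it glosses over her/his-vs-him ambiguity, names, and
# other gendered language a real pipeline would need to handle.
SWAPS = {"he": "she", "she": "he", "him": "her", "her": "his",
         "his": "her", "hers": "his", "mr": "mrs", "mrs": "mr"}

def swap_gender(text: str) -> str:
    """Return a counterfactual version of a care record with gendered
    terms swapped, preserving capitalization."""
    def repl(match):
        word = match.group(0)
        swapped = SWAPS[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    pattern = r"\b(" + "|".join(SWAPS) + r")\b"
    return re.sub(pattern, repl, text, flags=re.IGNORECASE)

# Terms the study found skewed toward descriptions of men.
SEVERITY_TERMS = ("disabled", "unable", "complex")

def severity_counts(records, summarize):
    """Summarize each record and its gender-swapped twin, then count
    severity terms in each version. For simplicity, every input record
    is assumed to already be in its male form."""
    counts = {"male": Counter(), "female": Counter()}
    for record in records:
        for label, text in (("male", record), ("female", swap_gender(record))):
            summary = summarize(text).lower()
            for term in SEVERITY_TERMS:
                counts[label][term] += summary.count(term)
    return counts
```

Comparing the male and female term counts across many records (the study used 617) is what surfaces the kind of disparity described below.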
While Llama 3 showed no gender-based differences across any metrics, Gemma displayed significant differences.
Specifically, male summaries focused more on physical and mental health issues. Language used for men was also more direct, while women's needs were 'downplayed' more often than men's. For example, when Google's Gemma was used to generate and summarize the same case notes for men and for women, language such as 'disabled,' 'unable,' and 'complex' appeared significantly more often in descriptions of men than women.
In other words, the study found that similar care needs in women were more likely to be omitted or described in less severe terms by specific AI tools, and that this downplaying of women's physical and mental health issues risked creating gender bias in care decisions.
'Care services are allocated on the basis of need. If women's health issues are underemphasized, this may lead to gender-based disparities in service receipt,' said the paper. 'LLMs may offer substantial benefits in easing administrative burden. However, the findings highlight the variation in state-of-the-art LLMs, and the need for evaluation of bias.'
Despite the concerns raised by the study, the researchers also highlighted the benefits AI can provide to the healthcare sector.
'By automatically generating or summarizing records, LLMs have the potential to reduce costs without cutting services, improve access to relevant information, and free up time spent on documentation,' said the paper.
It went on to note that 'there is political will to expand such technologies in health and care.'
Despite flaws, UK's all-in on AI
British Prime Minister Keir Starmer recently pledged £2 billion ($2.7 billion) to expand Britain's AI infrastructure, with the funding targeting data center development and digital skills training. This included committing £1 billion ($1.3 billion) of funding to scale up the U.K.'s compute power by a factor of 20.
'We're going to bring about great change in so many aspects of our lives,' said Starmer, speaking to London Tech Week on June 9. He went on to highlight health as an area 'where I've seen for myself the incredible contribution that tech and AI can make.'
'I was in a hospital up in the Midlands, talking to consultants who deal with strokes. They showed me the equipment and techniques that they are using – using AI to isolate where the clot is in the brain in a micro-second of the time it would have taken otherwise. Brilliantly saving people's lives,' said the Prime Minister. 'Shortly after that, I had an incident where I was being shown AI and stethoscopes working together to predict any problems someone might have. So whether it's health or other sectors, it's hugely transformative what can be done here.'
It's unclear how, or if, the LSE study and its equally AI-critical U.S. counterparts may affect such commitments from the government, but for now the U.K. at least seems set on pursuing the advantages AI tools such as LLMs can provide across the public and private sectors.
For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, allowing it to keep data safe while also guaranteeing the immutability of data. Check out CoinGeek's coverage on this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.
Watch: Demonstrating the potential of blockchain's fusion with AI
title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen="">
Orange background

Try Our AI Features

Explore what Daily8 AI can do for you:

Comments

No comments yet...

Related Articles

Action demanded as North Shields dental surgery limits NHS care
Action demanded as North Shields dental surgery limits NHS care

BBC News

timean hour ago

  • BBC News

Action demanded as North Shields dental surgery limits NHS care

Concern over a dental surgery's decision to stop taking on NHS patients has prompted calls for a solution to be found "urgently".Verne Road Dental Practice in North Shields blamed financial and staffing strains for its move to limit NHS access to children, vulnerable adults and those in acute North East and North Cumbria Integrated Care Board (ICB) said it was working with the company, which had seen three dentists leave in the past two a letter to North Tyneside mayor Karen Clark, Conservative opposition leader Liam Bones said: "Given the urgency of the situation, I am calling on you to immediately convene the North Tyneside Dental Taskforce." Bones said the meeting "should bring together local dentists, NHS England representatives, public health officials, and councillors from all parties".The practice has informed patients it would use its "small NHS contract" to prioritise the selected group with everyone else invited to sign up for a private said it was facing challenges including "funding, increasing staff and material costs as well as recruitment difficulties" and was having to make "crucial decisions to ensure our practice survives". Reassuring dental patients Many patients had expressed their confusion and concerns online, the Local Democracy Reporting Service Labour MP Sir Alan Campbell urged the surgery to provide urgent treatments at commenting on his social media post said the changes were "devastating" and it was "impossible to find NHS dentist in the area".North Tyneside Council director of public health, Wendy Burke, said she was concerned about access to NHS dental services in the area and about the impact of the decision "now and in the future". ICB chief procurement and contracting officer David Gallagher said the practice was in a difficult situation but "they have not asked to end their contract and they remain an NHS service provider"."We are working with the provider with a view to fully understanding the issues, offering support where possible and to provide clarity and reassurance to patients," he said. Follow BBC North East on X, Facebook, Nextdoor and Instagram.

‘I'd rather die than go back there': How a vulnerable teenager was failed by a scandal-hit hospital
‘I'd rather die than go back there': How a vulnerable teenager was failed by a scandal-hit hospital

The Independent

timean hour ago

  • The Independent

‘I'd rather die than go back there': How a vulnerable teenager was failed by a scandal-hit hospital

Traumatised by months of forced feeding and isolation from her family on a secure psychiatric ward, 14-year-old Ruth Szymankiewicz fled her family home through a toilet window and flung herself into a freezing pond in a desperate attempt to get someone to listen. 'I'd rather die than go back to Huntercombe,' the teenager had warned just hours before her father Mark, dragged her from the icy water near their Berkshire home just after Christmas 2021. Two months later, Ruth was dead. The teenager died in hospital two days after having self-harmed when she was left unsupervised at Huntercombe Hospital in Maidenhead, where she should have had constant supervision. Her death sparked a three-year-long police investigation, which found the agency worker responsible for her care was working under a fake name and had completed just a day or a day and a half of online training before his first shift on the understaffed ward Ruth was on. Police were not able to question the worker, known as Ebo Achempong, as he had fled the country. Last week, a jury at her inquest made a rare ruling that her death amounted to an unlawful killing after a litany of failings in her care were uncovered at the 10-day hearing. Ruth, who had an eating disorder, Tourette syndrome and a tic condition, was placed at Huntercombe in October 2021 because her mental health had declined and she required feeding through a tube after refusing to eat. But while there, she was denied therapy to help her manage the often painful and traumatic tube feeding that kept her alive, she was left without regular access to psychologists and was only allowed to see her family twice a week. On a much-anticipated trip home for Christmas in 2021, the teenager was happy to be reunited with her family. But she soon became anxious about her return to Huntercombe and protested over being sent back. The night before she was to return, she self-harmed, and when it was time to leave, she squeezed herself out of a small downstairs toilet window and ran. Her father, Mark, followed her and watched on, horrified, as she entered the "icy bog", pleading with her to come out. When she refused, he pulled her out because she was becoming critically hypothermic, her inquest heard. He carried her back to the house, where the family warmed her up, but when the secure transport finally arrived to take her back to Huntercombe, the situation that unfolded left them 'absolutely broken'. In a statement read at the inquest, Ruth's mother Kate recounted the horrifying ordeal: 'We had to watch our daughter handcuffed and her legs taped together so that they could restrain her to get her into the transport. Our daughter was screaming throughout the whole ordeal. This was a breaking point for Mark and I. 'It absolutely broke us to see our daughter being treated this way and seeing her so distressed, but there was no other option given to us apart from for her to be sectioned and taken to Huntercombe, as nowhere else could help. We felt helpless. I think that Ruth also felt that she could see no way out of this.' Ruth's despair at her time spent on the Thames ward at Huntercombe was laid bare further in a handwritten note found after her death. Addressing it to 'important people', she complained about a lack of therapy for patients like her at the hospital, which she said had an 'unsafe number of staff'. The note read: 'I don't really know who this is really directed to... Huntercombe, it doesn't deserve a capital H. 'It is the s***test mental health institution you could get... 
the unsafe number of staff, how the place makes you worse, the staff literally sleep on their shifts. I don't want this to happen to any other patients ever. My suggestion is, shut this place down.' Eventually, more than a year after Ruth's death, Huntercombe Maidenhead, or Taplow Manor as it was renamed, was shut after a series of investigations by The Independent and Sky News, which revealed allegations of 'systemic abuse'. Patients sent to Huntercombe Group hospitals revealed how they felt like 'caged animals', with claims they were subjected to painful, bruising restraints, medicated so heavily they felt like 'zombies', and isolated from their families. Ruling that Ruth's death amounted to 'unlawful killing', jurors said there had been systematic failings not only at Huntercombe but by the NHS mental health system that funded her placement at the private equity-owned hospital. Clutching Ruth's stuffed Giraffe as they listened to harrowing evidence of their daughter's time at Huntercombe, her parents described how just a few years earlier, she had been happy. They recounted how their first-born child – a little girl 'with a head of bright red hair' - was a fiery and determined character with a 'huge heart', a 'deep passion for life' and a love of animals and the outdoors. But in December 2020, she suddenly developed physical and vocal tics, and the family faced an 18-month wait to access specialist care. In 2021, she developed an eating disorder and by August, she was admitted to Salisbury Hospital before she was sectioned under the Mental Health Act. Eventually, NHS officials decided she needed to be admitted to a specialist children's mental health unit, and she was taken to Huntercombe. Her parents told the inquest they felt pressured into agreeing to the move, and were soon 'trapped' in a system that was meant to care for Ruth, but instead 'locked her away and harmed her'. During her time at the hospital, the vulnerable teenager was allowed unfettered access to her mobile phone, which she was later found to have used to search for methods of self-harm. Throughout the inquest, multiple staff voiced concerns over short staffing in the unit, with a visibly distressed senior support worker, Michelle Hance, breaking down as she spoke detailed the pressures on workers. Dr Gillian Combe, a senior NHS doctor working for the Thames Valley provider collaborative, which was responsible for Ruth's admission to Huntercombe, admitted that the NHS did not do enough for the 14-year-old. She told the inquest the NHS was aware that the hospital was understaffed daily, and that there were concerns over the care it provided, but there were no other suitable choices available. She appeared to make a plea for more money to build the beds it 'desperately needed'. Standing outside of the Berkshire court room after the inquest concluded, Mark and Kate made a heartfelt plea for Ruth's story to matter: 'Remember Ruth's story. Remember her in the faces of the young people who look to you for help and support.' 'What happened to Ruth is shocking, tragic and harrowing. Whilst there is much more to be said, if change can come from her story, it can make a tangible difference to others.' In response to the ruling, Active Care Group, formerly known as The Huntercombe Group, said: 'We extend our heartfelt condolences to Ruth's family, friends, and all those affected by her passing. 
We deeply regret the tragic event that occurred, and we are truly sorry for the distress this has caused and recognise the profound impact it has had on everyone who knew her.' The group said it was disappointed that a third-party company it had hired had breached its terms of contract, though it did not state what the breach of contract was. It also said it had made improvements to the quality and safety of its services since.

What is MRSA? Symptoms and how to avoid deadly infection after rise in UK cases
What is MRSA? Symptoms and how to avoid deadly infection after rise in UK cases

The Independent

time2 hours ago

  • The Independent

What is MRSA? Symptoms and how to avoid deadly infection after rise in UK cases

There has been a sharp rise in cases of the superbug MRSA being contracted outside hospitals across the UK. Britons have been urged to avoid sharing items such as towels or razors, particularly in gyms and leisure centres where the bacteria has been spreading. Figures show that 175 people were infected with MRSA in the community between January and March this year – a 47 per cent increase on the 119 cases recorded during the same period in 2019. At the same time, those contracting MRSA have been getting younger. Nearly a quarter of community-onset cases in 2023–24 were recorded in people under 45, compared with just one in 10 in 2007 to 2008. The UK Health Security Agency (UKHSA) said it was 'too early' to know if this rise represents a lasting change, but it added that infection rates are being closely tracked. Here, The Independent takes a look at what MRSA is, what the symptoms are, and how to get treated for it: What is MRSA? According to the NHS, MRSA (meticillin-resistant Staphylococcus aureus) is a type of bacteria that usually lives harmlessly on the skin. However, if it gets inside the body, it can cause a serious infection that requires immediate treatment with antibiotics. The UKHSA explains that Staphylococcus aureus is commonly found on human skin and mucosa (the moist, inner lining of some organs and body cavities). In many cases, it causes no problems. But when it enters the body, through broken skin or a medical procedure, it can lead to illnesses ranging from infected eczema and abscesses to pneumonia, joint infections, or bloodstream infections. Most strains of S. aureus can be treated with standard antibiotics, the NHS says, but MRSA is resistant to meticillin and often requires alternative drugs. What are the symptoms of MRSA? The NHS says that many people carry MRSA on their skin without showing symptoms. Problems only arise if the bacteria cause an infection. If MRSA spreads deeper into the body, it can cause more severe symptoms such as: High temperature Chills Dizziness or confusion Breathing difficulties How is MRSA treated? The treatment of MRSA depends on how serious the infection is. Mild MRSA infections may be managed with antibiotic tablets, the NHS says. However, for more severe infections, hospital treatment may be needed. This often involves antibiotics delivered through an injection or a drip. Courses of antibiotics can last from several days to several months, depending on the severity of the infection, according to the NHS. In cases where abscesses or collections of pus form, surgery may be required to drain the infected area. How can MRSA be prevented? The NHS advises that people staying in hospitals or care homes face a higher risk of MRSA, especially if they are undergoing surgery. Visitors are urged to follow strict hygiene instructions, including washing or sanitising their hands. The spread outside of hospital and care settings can be reduced through everyday hygiene measures: Before surgery, patients may be offered an MRSA screening test, the NHS says. This involves taking swabs from the nostrils, mouth or groin. If MRSA is detected, a short course of antibacterial cream, shampoo and body wash is usually prescribed to clear the bacteria before the procedure. The NHS says MRSA can affect anyone, but some people face a higher risk, including those who: The UKHSA warns that while most infections can be treated, resistant strains such as MRSA present more challenges, making prevention and monitoring vital.

DOWNLOAD THE APP

Get Started Now: Download the App

Ready to dive into a world of global content with local flavor? Download Daily8 app today from your preferred app store and start exploring.
app-storeplay-store