Man develops rare condition after ChatGPT query about giving up salt

The Guardian · 15 hours ago
A US medical journal has warned against using ChatGPT for health information after a man developed a rare condition following an interaction with the chatbot about removing table salt from his diet.
An article in the Annals of Internal Medicine reported a case in which a 60-year-old man developed bromism, also known as bromide toxicity, after consulting ChatGPT.
The article described bromism as a 'well-recognised' syndrome in the early 20th century that was thought to have contributed to almost one in 10 psychiatric admissions at the time.
The patient told doctors that after reading about the negative effects of sodium chloride, or table salt, he consulted ChatGPT about eliminating chloride from his diet and started taking sodium bromide over a three-month period. This was despite reading that 'chloride can be swapped with bromide, though likely for other purposes, such as cleaning'. Sodium bromide was used as a sedative in the early 20th century.
The article's authors, from the University of Washington in Seattle, said the case highlighted 'how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes'.
They added that because they could not access the patient's ChatGPT conversation log, it was not possible to determine the advice the man had received.
Nonetheless, when the authors consulted ChatGPT themselves about what chloride could be replaced with, the response also included bromide, did not provide a specific health warning and did not ask why the authors were seeking such information – 'as we presume a medical professional would do', they wrote.
The authors warned that ChatGPT and other AI apps could 'generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation'.
ChatGPT's developer, OpenAI, has been approached for comment.
The company announced an upgrade of the chatbot last week and claimed one of its biggest strengths was in health. It said ChatGPT – now powered by the GPT-5 model – would be better at answering health-related questions and would also be more proactive at 'flagging potential concerns', such as serious physical or mental illness. However, it stressed that the chatbot was not a replacement for professional help.
The journal's article, which was published last week before the launch of GPT-5, said the patient appeared to have used an earlier version of ChatGPT.
While acknowledging that AI could be a bridge between scientists and the public, the article said the technology also carried the risk of promoting 'decontextualised information' and that it was highly unlikely a medical professional would have suggested sodium bromide when a patient asked for a replacement for table salt.
As a result, the authors said, doctors would need to consider the use of AI when checking where patients obtained their information.
The authors said the bromism patient presented himself at a hospital and claimed his neighbour might be poisoning him. He also said he had multiple dietary restrictions. Despite being thirsty, he was noted as being paranoid about the water he was offered.
He tried to escape the hospital within 24 hours of being admitted and, after being sectioned, was treated for psychosis. Once the patient stabilised, he reported having several other symptoms that indicated bromism, such as facial acne, excessive thirst and insomnia.

Related Articles

Man, 60, poisoned himself after taking medical advice from ChatGPT
Daily Mail · 2 hours ago

A man was left fighting for his sanity after replacing table salt with a chemical more commonly used to clean swimming pools, following AI advice. The 60-year-old American spent three weeks in hospital suffering from hallucinations, paranoia and severe anxiety after taking dietary tips from ChatGPT.

Doctors revealed in a US medical journal that the man had developed bromism - a condition virtually wiped out since the 20th century - after he embarked on a 'personal experiment' to cut salt from his diet. Instead of using everyday sodium chloride, he swapped it for sodium bromide, a toxic compound once sold in sedative pills but now mostly found in pool-cleaning products. Symptoms of bromism include psychosis, delusions, skin eruptions and nausea, and in the 19th century it was linked to up to eight per cent of psychiatric hospital admissions.

The bizarre case took a disturbing turn when the man turned up at an emergency department insisting his neighbour was trying to poison him. He had no previous history of mental illness. Intrigued and alarmed, doctors tested ChatGPT themselves. The bot, they said, still recommended sodium bromide as a salt alternative, with no mention of any health risk.

The case, published in the Annals of Internal Medicine, warns that the rise of AI tools could contribute to 'preventable adverse health outcomes' - a chilling reminder of how machine-generated 'advice' can go horribly wrong. AI chatbots have been caught out before: last year, a Google bot told users they could stay healthy by 'eating rocks' - advice seemingly scraped from satirical websites.

OpenAI, the Silicon Valley giant behind ChatGPT, last week announced that its new GPT-5 update is better at answering health questions. A spokesman told The Telegraph: 'You should not rely on output from our services as a sole source of truth or factual information, or as a substitute for professional advice.' The Daily Mail has approached OpenAI for comment.

It comes after clinical psychologist Paul Losoff warned that dependency on AI is becoming a huge risk, cautioning against getting too close to ChatGPT. 'One might come to depend and rely on AI so [much] that they don't seek out human interactions,' he said. He explained that this could be especially detrimental for those already struggling with anxiety or depression: by using AI, they may worsen their conditions and experience cognitive symptoms like chronic pessimism, distorted thinking or cloudy thinking - and that in itself could create further issues.

'Because of these cognitive symptoms, there is a risk that an individual turning to AI may misinterpret AI feedback leading to harm,' he said. For people in crisis, this may only exacerbate matters: Dr. Losoff said there is always a risk that AI will make mistakes and provide harmful feedback at crucial mental health moments. 'There also is a profound risk for those with acute thought disorders such as schizophrenia in which they would be prone to misinterpreting AI feedback,' he said.

5-year-old boy had his cancer treatments interrupted by his deportation as his family files suit against ICE
The Independent · 2 hours ago

The family of a five-year-old boy with a 'rare and aggressive' form of kidney cancer has sued ICE after his treatments were interrupted when the agency deported him. The boy, referred to by the pseudonym Romeo, was deported along with his seven-year-old sister and their 25-year-old mother on April 25, according to a federal lawsuit filed in Louisiana. The children are U.S. citizens born in Louisiana, and the boy was diagnosed with his condition at the age of two. 'Romeo needs regular specialized care and follow-up treatment to this day,' the lawsuit states.

The case is being brought by the National Immigration Project on behalf of the family, as well as another mother who also has two children who are U.S. citizens. It claims the families were deported 'without even a semblance of due process'. Documents seen by The Independent allege that ICE violated its own policy and multiple federal laws when officers secretly detained the families in hotel rooms and denied them the opportunity to speak to family and to make decisions about, or arrangements for, their minor children. The suit also alleges that the families were denied access to counsel and deported within less than a day in one case and just over two days in the other.

'In the early morning hours of Friday, April 25, 2025, the United States government illegally deported three U.S. citizen children, along with their non-U.S. citizen family members, to Honduras,' the suit states. 'Earlier that week, Plaintiffs 'Julia' and … 'Rosario' – two mothers of U.S. citizen children – attended what they believed to be regularly scheduled check-ins with a U.S. Immigration and Customs Enforcement contractor. In violation of the government's own directive, Julia and Rosario were never given a choice as to whether their children should be deported with them and were prohibited from contacting their counsel or having meaningful contact with their families to arrange for the care of their children. Instead, they were held effectively incommunicado with their children and illegally deported without even a semblance of due process.'

'ICE's actions in this case are not only unlawful, they are cruel and show a complete disregard for family values and the well-being of children,' said Sirine Shebaya, Executive Director of the National Immigration Project. 'No government agency should have the power to disappear families, ignore medical needs, and disregard its own policies and constitutional rights simply in order to achieve a goal of unfettered enforcement. Without accountability, violations like this will only happen more frequently. Through this lawsuit, we seek justice, accountability, and the immediate safe return of these families to the United States.'

Speaking via the NIP, Rosario said the ordeal had been 'scary and overwhelming'. 'After so many years in the United States, it has been devastating to be sent to Honduras. Life in Honduras is incredibly hard. I don't have the resources to care for my children the way they need.'

The lawsuit demands that ICE be held accountable for the 'unlawful deportation of U.S. citizens and its disregard for the rights and safety of children'. The plaintiffs seek immediate return to the U.S., recognition of their right to make custodial decisions for their children, and compensation for the harms the families have endured.

Medical hand soap brand recalls four types of its products
The Independent · 3 hours ago

Medical hand soap brand DermaRite has recalled four types of its products sold across the United States and Puerto Rico. The recall was initiated after the detection of Burkholderia cepacia, a potentially deadly and antibiotic-resistant bacterium, in the soap products. These over-the-counter antiseptic soaps are primarily used by healthcare professionals in settings such as hospitals and nursing homes. The bacterium poses a significant risk, particularly to immunosuppressed individuals and those with chronic lung diseases, and can lead to life-threatening infections such as sepsis. DermaRite has instructed distributors and customers to immediately examine stock and destroy affected products; no adverse events have been reported to date.
