
Latest news with #AnnalsofInternalMedicine:ClinicalCases

Man Asked ChatGPT For A Healthier Salt Alternative, Got Toxic Tip, Landed In ICU

News18

16 hours ago

  • Health

The patient, hospitalised for three weeks, sought AI advice on salt alternatives; the chatbot suggested bromide, which he consumed without further research.

In a startling incident, a man narrowly escaped death after following dietary advice from an artificial intelligence platform. The individual, who often relied on AI for health guidance, landed in the Intensive Care Unit with severe complications. The crisis began when he asked ChatGPT for alternatives to table salt. Among the suggestions was sodium bromide, a toxic compound. Following this advice led to a life-threatening case of bromide poisoning, a rare and alarming first linked to AI recommendations.

3-Month Sodium Bromide Intake Causes Poisoning

The case, reported by doctors from the University of Washington in the Annals of Internal Medicine: Clinical Cases, revealed that the man consumed sodium bromide for three months. ChatGPT had told him that sodium bromide was a safe substitute for chloride. Historically, bromide compounds were used to treat insomnia and anxiety but were discontinued because of their numerous side effects. Today, bromide is mainly found in veterinary medicines and some industrial products, which makes bromide toxicity rare.

Hospitalised For 3 Weeks After Poisoning

Over time, he experienced confusion and other severe issues, including paranoia. He suspected his neighbour of poisoning him and became increasingly mentally unstable, even refusing to drink water despite being thirsty. His condition worsened, leading to his hospitalisation. Doctors administered intravenous fluids and antipsychotic medication, which gradually improved his symptoms. Mental health treatment was also provided. A week later, he was able to communicate and explained the cause of his illness.

AI Not Reliable For Health Advice

Although the original chat log was not available, doctors replicated the query to ChatGPT, which again suggested bromide without warning about its dangers for human consumption. After three weeks of intensive treatment, the man was discharged. Experts caution against using AI for health-related advice, emphasising that AI often fails to mention the side effects of its recommendations. For critical health matters, professional medical advice is paramount: a symptom like weight loss, for example, can point to many different diseases, not just cancer. One should therefore always consult a doctor for health concerns rather than risk potentially harmful guidance from AI.

ChatGPT salt swap advice lands man in hospital with bromide poisoning

Business Standard

a day ago

  • Health

A 60-year-old man landed in the hospital after unknowingly poisoning himself by following diet advice from ChatGPT. According to the case report 'A case of bromism influenced by use of artificial intelligence', published in Annals of Internal Medicine: Clinical Cases, the US resident swapped regular table salt (sodium chloride) for sodium bromide, a chemical once used in old medicines but now known to be toxic in high doses. Three months later, he landed in the hospital with severe neuropsychiatric symptoms, paranoia, and skin changes, revealing the hidden dangers of relying solely on AI for health guidance.

What exactly happened in this case?

The man, with no prior medical or psychiatric history, was admitted to the hospital after coming to believe his neighbour was poisoning him. At first, he did not reveal any unusual dietary habits, but further questioning uncovered that he had been on a highly restrictive vegetarian diet, distilled his own water, and had been consuming sodium bromide instead of salt for three months.

He had got the idea of swapping regular table salt for sodium bromide after asking ChatGPT how to remove salt from his diet. While the AI did mention that 'context matters', it suggested bromide as a chloride substitute without providing a health warning. He then bought the chemical online, unaware of its dangers.

What is bromism poisoning?

Bromism is a form of poisoning caused by excessive bromide intake. In the early 1900s, bromide salts were common in over-the-counter remedies for anxiety, insomnia, and hysteria. But over time, doctors realised bromide could cause neurological, psychiatric, and skin problems, from hallucinations and paranoia to acne and coordination issues. The US banned bromide in ingestible products between 1975 and 1989. However, with bromide-containing substances easily available online, rare cases have reappeared in recent years.

During his hospital stay, the man in this case developed:

  • Severe paranoia and hallucinations
  • Insomnia and fatigue
  • Acne and cherry angiomas on his face
  • Ataxia (coordination problems)
  • Extreme thirst

Lab tests revealed dangerously high bromide levels of 1,700 mg/L, more than 200 times the upper end of the normal range of 0.9 to 7.3 mg/L. He also had abnormal electrolyte readings, including hyperchloremia and a negative anion gap, which eventually helped doctors suspect bromism (a short worked example of this calculation follows the article).

According to the case report, the treatment at the University of Washington in Seattle involved stopping bromide intake immediately and giving intravenous fluids to flush it out of his system. His electrolyte levels and mental state gradually improved over three weeks. He was weaned off psychiatric medication before discharge and remained stable during follow-up.

Role of AI in this medical emergency

The case authors tested ChatGPT themselves, asking what could replace chloride. The AI also suggested bromide, without explaining its health risks or questioning why the user wanted the substitution. This highlights a critical limitation: AI can present technically correct but contextually dangerous information without medical oversight.

The case report notes that while AI tools can be helpful for general education, they are not a replacement for professional medical advice. It warns that AI can generate inaccurate or decontextualised health suggestions because it lacks the critical judgment of a trained medical professional, and it stresses that self-experimentation based on AI advice can have serious health consequences.
If you are considering dietary or lifestyle changes, especially those involving chemicals or supplements, consult a qualified doctor first.
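
To make the anion-gap clue mentioned above concrete, here is a minimal worked example. The standard formula, anion gap = Na⁺ − (Cl⁻ + HCO₃⁻), is textbook chemistry, but the electrolyte numbers below are hypothetical, chosen only to illustrate the mechanism; the articles report only the patient's bromide level, not his full panel. The key point is that many laboratory chloride assays also register bromide, so heavy bromide intake falsely inflates the reported chloride and can push the gap negative.

```python
# Minimal sketch: why heavy bromide intake can produce a negative anion gap.
# All electrolyte values below are hypothetical illustrations, not the
# patient's actual results.

def anion_gap(na: float, cl: float, hco3: float) -> float:
    """Standard serum anion gap in mEq/L: Na+ - (Cl- + HCO3-)."""
    return na - (cl + hco3)

# Normal-looking chemistry: the gap lands in the usual ~8-12 mEq/L range.
print(anion_gap(na=140, cl=102, hco3=26))   # -> 12

# With bromism, chloride assays read bromide as chloride, so the reported
# chloride is falsely high (pseudohyperchloremia). The same sodium and
# bicarbonate now yield a negative gap:
print(anion_gap(na=140, cl=125, hco3=26))   # -> -11
```

A negative anion gap is unusual enough in routine practice that it serves as a classic laboratory clue for bromism, which is consistent with how the doctors in this case arrived at the diagnosis.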

Man Hospitalised After Following Harmful Diet Tips Allegedly Given By ChatGPT

India.com

2 days ago

  • Health

In a rare and alarming case, a man in the United States developed life-threatening bromide poisoning after following diet advice given by ChatGPT. Doctors believe this could be the first known case of AI-linked bromide poisoning, according to a report by Gizmodo.

The case was detailed by doctors at the University of Washington in 'Annals of Internal Medicine: Clinical Cases'. They said the man consumed sodium bromide for three months, thinking it was a safe substitute for chloride in his diet. This advice reportedly came from ChatGPT, which did not warn him about the dangers.

Bromide compounds were once used in medicines for anxiety and insomnia, but they were banned decades ago due to severe health risks. Today, bromide is mostly found in veterinary drugs and some industrial products. Human cases of bromide poisoning, also called bromism, are extremely rare.

The man first went to the emergency room believing his neighbour was poisoning him. Although some of his vitals were normal, he showed paranoia, refused water despite being thirsty, and experienced hallucinations. His condition quickly worsened into a psychotic episode, and doctors had to place him under an involuntary psychiatric hold. After receiving intravenous fluids and antipsychotic medicines, he began to improve.

Once stable, he told doctors that he had asked ChatGPT for alternatives to table salt. The AI allegedly suggested bromide as a safe option, advice he followed without knowing it was harmful. Doctors did not have the man's original chat records, but when they later asked ChatGPT the same question, it again mentioned bromide without warning that it was unsafe for humans. Experts say this shows how AI can provide information without proper context or awareness of health risks.

The man recovered fully after three weeks in hospital and was in good health at a follow-up visit. Doctors have warned that while AI can make scientific information more accessible, it should never replace professional medical advice, and, as this case shows, it can sometimes give dangerously wrong guidance.
