Man who asked ChatGPT about cutting out salt from his diet was hospitalized with hallucinations
Three physicians published a report on the case in the Annals of Internal Medicine earlier this month. According to the report, the man had no prior psychiatric history when he arrived at the hospital "expressing concern that his neighbor was poisoning him."
The man shared that he had been distilling his own water at home, and the report noted he seemed "paranoid" about water he was offered. Bromism, toxicity caused by elevated bromide levels in the body, was considered after lab results and a consultation with poison control, the report said.
"In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability," the case report said.
Once his condition improved, the man shared that he had taken it upon himself to conduct a "personal experiment" to eliminate table salt from his diet after reading about its negative health effects. The report said he did this after consulting with the chatbot ChatGPT.
He self-reported that the replacement went on for three months.
The three physicians, all from the University of Washington, noted in the report that they did not have access to the patient's conversation logs with ChatGPT. However, they themselves asked ChatGPT 3.5 what chloride could be replaced with.
According to the report, the response they received included bromide.
"Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do," the report said.
A representative for OpenAI, the company that created ChatGPT, did not immediately respond to a request for comment. The company noted in a statement to Fox News that its terms of service state that the bot is not to be used in the treatment of any health condition.
"We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance," the statement said.
Bromide toxicity was far more common in the early 1900s, the report said, as bromide was present in a number of over-the-counter medications and sedatives. It was believed to contribute to 8% of psychiatric admissions at the time, according to the report.
Bromide salt, an inorganic compound, is now typically only used in veterinary medicine as an anti-epileptic medication for cats and dogs, according to the National Library of Medicine.
It's a rare syndrome, but cases have re-emerged recently "as bromide-containing substances have become more readily available with widespread use of the internet," the report said.
This article was originally published on NBCNews.com