Latest news with #sodiumbromide
Yahoo
4 days ago
- Health
- Yahoo
Man took diet advice from ChatGPT, ended up hospitalized with hallucinations
A man was hospitalized for weeks and suffered from hallucinations after poisoning himself based on dietary advice from ChatGPT.

A case study published Aug. 5 in the Annals of Internal Medicine, an academic journal, says the 60-year-old man decided he wanted to eliminate salt from his diet. To do so, he asked ChatGPT for an alternative to salt, or sodium chloride, and the AI chatbot suggested sodium bromide, a compound historically used in pharmaceuticals and manufacturing. Though the journal noted that doctors were unable to review the original AI chat logs and that the bot likely suggested the substitution for another purpose, such as cleaning, the man purchased sodium bromide and used it in place of table salt for three months.

As a result, he ended up in the hospital emergency room with paranoid delusions, despite having no history of mental health problems. Convinced that his neighbor was poisoning him, the man was reluctant even to accept water from the hospital, despite reporting extreme thirst. He continued to experience increasing paranoia, as well as auditory and visual hallucinations, which eventually landed him on an involuntary psychiatric hold after he tried to escape during treatment.

What was happening to the man?

Doctors determined that the man was suffering from bromide toxicity, or bromism, which can result in neurological and psychiatric symptoms, as well as acne and cherry angiomas (bumps on the skin), fatigue, insomnia, subtle ataxia (clumsiness) and polydipsia (excessive thirst). Other symptoms of bromism can include nausea and vomiting, diarrhea, tremors or seizures, drowsiness, headache, weakness, weight loss, kidney damage, respiratory failure and coma, according to iCliniq.

Bromism was once far more common because of bromide salts in everyday products. In the early 20th century, bromides were used in over-the-counter medications, often resulting in neuropsychiatric and dermatological symptoms, according to the study's authors. Such poisonings declined sharply after the Food and Drug Administration phased out the use of bromides in pharmaceuticals between the mid-1970s and the late 1980s.

The man was treated at the hospital for three weeks, over which time his symptoms progressively improved.

USA TODAY reached out to OpenAI, the maker of ChatGPT, for comment on Aug. 13 but had not received a response. The company provided Fox News Digital with a statement saying: "Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice. We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance."

AI can 'fuel the spread of misinformation,' doctors say

Doctors involved in the case study said they suspected that the patient had used ChatGPT version 3.5 or 4.0, the former of which they tested in an attempt to replicate the answers the man received. The study's authors noted they couldn't know exactly what the man was told without the original chat log, but they did receive a suggestion for bromide as a replacement for chloride in their tests.

"Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do," said study authors Dr. Audrey Eichenberger, Dr. Stephen Thielke and Dr. Adam Van Buskirk.

AI carries the risk of providing information without context, according to the doctors. For example, it is unlikely that a medical expert would have mentioned sodium bromide at all if a patient asked for a salt substitute. "Thus, it is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation," the study said.

This article originally appeared on USA TODAY: Man hospitalized after taking ChatGPT diet advice, study says
Yahoo
5 days ago
- Health
- Yahoo
Man who asked ChatGPT about cutting out salt from his diet was hospitalized with hallucinations
A 60-year-old man spent three weeks being treated at a hospital after replacing table salt with sodium bromide following a consultation with the popular artificial intelligence bot ChatGPT. Three physicians published a report on the case in the Annals of Internal Medicine earlier this month.

According to the report, the man had no prior psychiatric history when he arrived at the hospital "expressing concern that his neighbor was poisoning him." The man shared that he had been distilling his own water at home, and the report noted he seemed "paranoid" about water he was offered. Bromism, or high levels of bromide, was considered after a lab report and a consultation with poison control, the report said.

"In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability," the case report said.

Once his condition improved, the man shared that he had taken it upon himself to conduct a "personal experiment" to eliminate table salt from his diet after reading about its negative health effects. The report said he did this after consulting the chatbot ChatGPT. He self-reported that the replacement went on for three months.

The three physicians, all from the University of Washington, noted in the report that they did not have access to the patient's conversation logs with ChatGPT. However, they asked ChatGPT 3.5 on their own what chloride could be replaced with. According to the report, the response they received included bromide. "Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do," the report said.

A representative for OpenAI, the company that created ChatGPT, did not immediately respond to a request for comment. The company noted in a statement to Fox News that its terms of service state that the bot is not to be used in the treatment of any health condition. "We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance," the statement said.

Bromide toxicity was a more common toxic syndrome in the early 1900s, the report said, as bromide was present in a number of over-the-counter medications and sedatives. It was believed to contribute to 8% of psychiatric admissions at the time, according to the report. Bromide salt, an inorganic compound, is now typically used only in veterinary medicine, as an anti-epileptic medication for cats and dogs, according to the National Library of Medicine. It is a rare syndrome, but cases have re-emerged recently "as bromide-containing substances have become more readily available with widespread use of the internet," the report said.