What If Everything We Think About Obesity Is Wrong?

Bloomberg · 11 hours ago
The US government is approaching the study of obesity all wrong.
According to leadership at the Department of Health and Human Services, food companies are to blame — they're engineering their snacks, fast food and sweets to addict us the same way cocaine and nicotine do. Others assessing the epidemic say people overeat because they lack willpower and could stop if they really wanted to.

Related Articles

60-Year-Old Gave Himself Early 20th Century Psychosis After He Went To ChatGPT For Diet Advice

Yahoo · 3 minutes ago

A 60-year-old man gave himself a now-rare psychiatric disorder after asking ChatGPT for diet advice, according to a case published Tuesday by the American College of Physicians Journals. The man, who remained anonymous in the case study, told doctors he had eliminated sodium chloride, commonly known as table salt, from his diet after reading about its negative health effects. He said he could only find sources explaining how to reduce salt intake, not how to eliminate it completely.

Inspired by his nutrition studies in college, the man decided to remove sodium chloride from his diet entirely as a personal experiment, with consultation from ChatGPT, researchers wrote. He maintained multiple dietary restrictions and even distilled his own water at home. "For 3 months, he had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning," the case study read. While excess sodium can raise blood pressure and increase the risk of health issues, a certain amount is still necessary in the diet.

The man, who had no psychiatric history, eventually ended up at the hospital, worried that his neighbor was poisoning him. He told doctors he was very thirsty but paranoid about the water he was offered. "In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability," the study read.

Doctors concluded that the man was suffering from bromism, or bromide toxicity, a condition that is rare today but was more common in the early 20th century, when bromide was found in several over-the-counter medicines and contributed to up to 8% of psychiatric admissions. The hospital treated the man for psychosis and discharged him weeks later.
His case highlights the potential pitfalls of using AI to seek medical tips. Dr. Margaret Lozovatsky, a pediatrician, warned last year that AI often misses crucial context. "Even if the source is appropriate, when some of these tools are trying to combine everything into a summary, it's often missing context clues, meaning it might forget a negative," she told the American Medical Association. "So, it might forget the word 'not' and give you the opposite advice."

Luigi Mangione's 120-page healthcare history was accidentally shared by Aetna and his own lawyers, prosecutor says

Yahoo · 13 minutes ago

Luigi Mangione last month accused New York prosecutors of fraudulently acquiring his Aetna health history. On Friday, prosecutors blamed Aetna, saying the insurer over-responded to a lawful, limited DA subpoena. "Mistakes do occur," including on the part of the defense, the prosecutor wrote.

Mangione's confidential, 120-page medical history was accidentally emailed to his New York prosecutors not once but twice: first by Aetna and then by his own defense lawyers, according to a new court filing. Prosecutors took "appropriate measures" both times, forwarding the confidential health records to the judge and deleting their own copy, the lead assistant district attorney, Joel Seidemann, wrote Friday in describing the double snafu.

"Mistakes do occur," Seidemann wrote in his three-page filing, meaning on the part of defense lawyers and Aetna, but not himself. "Aetna erroneously sent us materials," he wrote. "Like Aetna, the defense then erred, compounding Aetna's mistake. Defense counsel sent the People an email attaching the entire Aetna file she now complains about."

"Once again, we complied with our ethical obligations by asking counsel if she intended to send us the file," Seidemann wrote. "When she indicated that she did not and asked that we delete it, we complied with her request and did not take advantage of her error."

Aetna, meanwhile, defended its own role in the records relay, saying through a spokesman that it received a subpoena and answered it. "Our response is the same as before," wrote Phil Blando, executive director for communications for Aetna's parent company, CVS Health. "Aetna received a subpoena for certain medical records, and we provided them appropriately."
It's the latest round of finger-pointing in a month-long battle between state prosecutors and defense attorneys over the confidential medical records of Mangione, the 27-year-old Maryland native accused in the December shooting murder of UnitedHealthcare CEO Brian Thompson. The records included "different diagnoses as well as specific medical complaints made by Mr. Mangione," his lawyers wrote in their own filing last month.

Both sides agree that Seidemann's May 14 subpoena asked Aetna for very limited data: just Mangione's health insurance account number and the period of time he was covered. Beyond that small patch of common ground, they diverge widely. The defense, led by attorney Karen Friedman Agnifilo, wrote last month that Seidemann should never have asked directly for Mangione's health insurance account number, arguing that it is protected under HIPAA, the federal Health Insurance Portability and Accountability Act. "The requested information does not appear to be protected by HIPAA, since it did not relate to a condition, treatment, or payment for health care," Seidemann countered in Friday's filing.

The sides also differ on what happened once Aetna attached Mangione's entire healthcare history, in four files, to its June 12 subpoena-response email to Seidemann. Seidemann wrote in Friday's filing that his subpoena "was lawful and properly drafted," and that, as required, it directed Aetna to return the requested materials to the judge. The defense accuses Seidemann of sitting on the sensitive records for 12 days before forwarding them to the judge. The defense also wants to know how Aetna wound up sending the records directly to the prosecutor, and has asked the judge, New York Supreme Court Justice Gregory Carro, to order "a full evidentiary hearing" to determine possible penalties, including kicking Seidemann off the case.
They've asked that the hearing include sworn testimony and the surrender of correspondence between prosecutors and Aetna. By late Friday afternoon, the judge had not issued a decision on calling such a hearing. A defense spokesperson declined to comment on Friday's filing.

In addition to the state case, Mangione is charged with murder in a federal indictment that seeks the death penalty. In another, more behind-the-scenes battle, prosecutors in both venues have said they intend to bring Mangione to trial first; the order of trials has yet to be worked out. State court has an advantage in that Mangione's case is proceeding more quickly there, given the lack of complicated capital-punishment issues. The feds have their own advantage in that Mangione is in federal custody, giving them physical control of where he goes. Judges in both venues have said they hope to bring him to trial in 2026.

Read the original article on Business Insider.

Man Follows Diet Advice From ChatGPT, Ends Up With Psychosis

Gizmodo · an hour ago

A case study out this month offers a cautionary tale ripe for our modern times. Doctors detail how a man experienced poison-induced psychosis after he followed AI-guided dietary advice. Doctors at the University of Washington documented the real-life Black Mirror episode in the Annals of Internal Medicine: Clinical Cases. The man reportedly developed poisoning from the bromide he had ingested for three months on ChatGPT's recommendation. Thankfully, his condition improved with treatment, and he successfully recovered.

Bromide compounds were commonly used in the early 20th century to treat various health problems, from insomnia to anxiety. Eventually, though, people realized bromide could be toxic in high or chronic doses and, ironically, cause neuropsychiatric issues. By the 1980s, bromide had been removed from most drugs, and cases of bromide poisoning, or bromism, dropped along with it. Still, the ingredient remains in some veterinary medications and other consumer products, including dietary supplements, and the occasional case of bromism does occur even today. This incident, however, might be the first case of bromide poisoning fueled by AI.

According to the report, the man visited a local emergency room and told staff that he was possibly being poisoned by his neighbor. Though some of his physical exam results were normal, the man grew agitated and paranoid, refusing to drink the water he was offered even though he was thirsty. He also experienced visual and auditory hallucinations and soon developed a full-blown psychotic episode. In the midst of his psychosis, he tried to escape, after which doctors placed him in an "involuntary psychiatric hold for grave disability." Doctors administered intravenous fluids and an antipsychotic, and he began to stabilize. They suspected early on that bromism was to blame for the man's illness, and once he was well enough to speak coherently, they found out exactly how it had ended up in his system.
The man told the doctors that he had started taking sodium bromide intentionally three months earlier. He had read about the negative health effects of having too much table salt (sodium chloride) in your diet. When he looked into the literature, though, he only came across advice on how to reduce sodium intake. "Inspired by his history of studying nutrition in college," the doctors wrote, the man instead decided to try removing chloride from his diet. He consulted ChatGPT for help and was apparently told that chloride could be safely swapped with bromide. With the all-clear from the AI, he began consuming sodium bromide bought online.

Given the timeline of the case, the man had likely been using ChatGPT 3.5 or 4.0. The doctors didn't have access to the man's chat logs, so we'll never know exactly how his fateful consultation unfolded. But when they asked ChatGPT 3.5 what chloride can be replaced with, it came back with a response that included bromide. It's possible, even likely, that the man's AI was referring to examples of bromide replacement that had nothing to do with diet, such as for cleaning. The doctors' ChatGPT did note in its reply that the context of the replacement mattered, they wrote. But the AI never warned about the dangers of consuming bromide, nor did it ask why the person was interested in the question in the first place.

As for the man himself, he slowly recovered from his ordeal. He was eventually taken off antipsychotic medication and discharged from the hospital three weeks after admission, and at a two-week follow-up, he remained in stable condition. The doctors wrote that while tools like ChatGPT can "provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information."
With some admirable restraint, they added that a human medical expert probably wouldn't have recommended switching to bromide for someone worried about their table salt consumption. Honestly, I'm not sure any living human today would give that advice. And that's why having a decent friend to bounce our random ideas off should remain an essential part of life, no matter what the latest version of ChatGPT is.
