
A 26-Year-Old With Abnormal Eye Movement and Agitated Delirium
The Case Challenge series includes difficult-to-diagnose conditions, some of which are not frequently encountered by most clinicians, but are nonetheless important to accurately recognize. Test your diagnostic and treatment skills using the following patient scenario and corresponding questions. If you have a case that you would like to suggest for a future Case Challenge, please email us at ccsuggestions@medscape.com with the subject line "Case Challenge Suggestion." We look forward to hearing from you.
Background
A 26-year-old woman presents to the emergency department (ED) with confusion, agitation, profuse sweating (diaphoresis), and abnormal involuntary eye movement (ocular clonus). The symptoms started the previous week; they were mild at first, and the patient attributed them to a week-long heat wave in the city.
The patient felt worse as the week went on and decided to come to the ED because she was not feeling better that morning. She started sweating profusely in the middle of the night, felt her leg muscles stiffen, and became increasingly nauseated. Her fiancé insisted she go to the ED as early as possible because she seemed agitated and confused, pacing around the apartment instead of taking a shower.
The patient's history is reviewed with her fiancé. She has a history of depression, for which she takes fluoxetine; her dose was increased by her psychiatrist 1 month ago. For migraines, she takes sumatriptan and ondansetron. Because her headaches have recently worsened, her neurologist started her on tramadol. She also has had a cough, for which she self-medicated with dextromethorphan. The remainder of her medical history is noncontributory.
She does not smoke, denies drug or alcohol use, and has no allergies or family history of other significant illness.
Physical Examination and Workup
Upon physical examination, her blood pressure is 165/105 mm Hg, pulse 128 beats/min, respiratory rate 20 breaths/min, pulse oximetry 98% on room air, and temperature 100.6°F (38.1°C).
The patient is a thin woman who seems agitated and restless but complies with the examination. The lung examination reveals clear breath sounds in all fields. Her heart has a regular rhythm, and no murmur is appreciated.
Upon neurologic evaluation, she is alert and oriented to date but not to time or place. Her speech is clear and fluent, with good repetition, comprehension, and naming. She recalls 1 out of 3 objects at 5 minutes. No tenderness or signs of trauma are found over the scalp and neck. No proptosis, lid swelling, conjunctival injection, or chemosis is observed.
The patient is able to identify a pen and a clock. She can count fingers and has an intact bitemporal visual field. Extraocular muscles are intact upon examination; she is able to look from right to left as well as up and down. Spontaneous right ocular clonus is observed. Her pupils are 2 mm and are reactive to light. Sensory examination of her face is unremarkable. Her tongue and uvula are midline, with a positive gag reflex. Her hearing test findings are symmetric. Shoulder-shrug findings are equal on both sides. Strength is 5/5 in the upper and lower extremities.
Sensory examination findings reveal symmetry to light touch, pinprick, temperature, vibration, and proprioception. The patient's reflexes are 2+, except in the lower extremities, where bilateral hyperreflexia is observed. She is able to perform rapid alternating movements. The remainder of her physical examination findings are unremarkable.
An ECG reveals normal sinus rhythm at a rate of 128 beats/min, without ST-T wave changes. Head CT is performed, and an example similar to the findings in this case is shown below (Figure 1).
Figure 1.
A urine pregnancy test result is negative. Laboratory analyses performed in the ED include a CBC, metabolic panel, hepatic panel with lipase, and troponin level. Laboratory test findings are remarkable for a WBC count of 12.4 × 103 cells/µL (reference range, 4.2-11.0 × 103 cells/µL), with 69% segmented neutrophils (reference range, 54%-62%) and 2% band forms (reference range, 3%-5%), and a hemoglobin level of 11.6 g/dL (reference range for women, 12-15 g/dL). Her troponin level is 0.5 ng/mL (reference range, 0-0.4 ng/mL). The remainder of the laboratory test findings, including a toxicology screen and creatine phosphokinase level, are within normal limits. Interpretation of the CT scan is normal.
The patient was given acetaminophen for the low-grade fever. Intravenous fluids were started, and she was admitted to the medicine floor.
Discussion
Serotonin syndrome is a potentially life-threatening condition that occurs secondary to serotonin toxicity in the central and peripheral nervous systems. It can result from a combination of serotonergic agents, an increase in the therapeutic dose of a serotonergic agent, an overdose, or an inadvertent interaction between serotonergic agents. Serotonin syndrome is a clinical diagnosis; therefore, a careful and thorough history and physical and neurologic examinations are essential, as is a high level of suspicion.[1]
Serotonin syndrome results from excessive stimulation or agonist activity at postsynaptic serotonin receptors; excessive binding at the serotonin 2A (5-HT2A) and serotonin 1A (5-HT1A) receptor subtypes is most often implicated and may be the predominant cause of symptoms.[2] Presenting symptoms can vary widely and range from mild to life-threatening.
Serotonin is produced in the neurons from L-tryptophan, and its concentration is regulated through feedback loops controlling its reuptake and metabolism. Serotonin receptors in the central nervous system regulate attention, behavior, temperature, the sleep/wake cycle, appetite, and muscle tone.[3]
Serotonin receptors are also located in the peripheral nervous system; peripheral serotonin is produced by intestinal enterochromaffin cells and is involved in the regulation of gastrointestinal motility, uterine contraction, bronchoconstriction, and vascular tone. In addition, serotonin in platelets promotes their aggregation.
No specific laboratory test is indicated to diagnose serotonin syndrome, and serotonin levels do not correlate with the severity of symptoms. The Hunter criteria are the most accurate diagnostic set available to diagnose serotonin syndrome, with 84% sensitivity and 97% specificity. The criteria require that a patient be taking a serotonergic agent and meet at least one of the following conditions[4]:
Spontaneous clonus
Inducible clonus with agitation or diaphoresis
Ocular clonus with agitation or diaphoresis
Tremor and hyperreflexia
Hypertonia, temperature > 100.4°F (38°C), and ocular or inducible clonus
A thorough history and physical and neurologic examinations are essential for diagnosis because no specific laboratory test can confirm serotonin syndrome; notably, serotonin levels do not correlate with the severity of symptoms. The Hunter criteria, outlined above, are the most accurate diagnostic set available (84% sensitivity and 97% specificity), and the qualifying conditions are independent of one another; only one must be met, not all.
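For illustration, the Hunter decision rule lends itself to a direct boolean encoding. The sketch below is hypothetical (the function name and parameters are mine, and it is not a validated clinical tool); it simply restates the requirement of a serotonergic agent plus at least one of the five qualifying conditions:

```python
def meets_hunter_criteria(
    on_serotonergic_agent: bool,
    spontaneous_clonus: bool,
    inducible_clonus: bool,
    ocular_clonus: bool,
    agitation: bool,
    diaphoresis: bool,
    tremor: bool,
    hyperreflexia: bool,
    hypertonia: bool,
    temp_c: float,
) -> bool:
    """Return True if the Hunter criteria for serotonin syndrome are met.

    The patient must be taking a serotonergic agent AND satisfy at least
    one of the five qualifying conditions listed in the text above.
    """
    if not on_serotonergic_agent:
        return False
    conditions = [
        spontaneous_clonus,
        inducible_clonus and (agitation or diaphoresis),
        ocular_clonus and (agitation or diaphoresis),
        tremor and hyperreflexia,
        hypertonia and temp_c > 38.0 and (ocular_clonus or inducible_clonus),
    ]
    return any(conditions)
```

Applied to the patient in this case (spontaneous ocular clonus with agitation and diaphoresis while taking multiple serotonergic agents), the rule is satisfied by the third condition alone.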
Differential Diagnoses
Besides serotonin syndrome, other differential diagnoses were considered but excluded in this case. Neuroleptic malignant syndrome (NMS) is an idiosyncratic drug reaction to antipsychotics that has a presentation similar to that of serotonin syndrome; however, NMS presents with bradyreflexia, hyperpyrexia, and lead-pipe rigidity.[5] Myoclonus is rarely seen with NMS, and symptoms typically resolve in days, compared with 24 hours after removal of the offending agent in serotonin syndrome.[6] In addition, patients with NMS have a history of taking a neuroleptic agent (eg, haloperidol, chlorpromazine), atypical antipsychotics, or antiemetic drugs. Vital signs in persons with NMS typically are similar to those in persons with serotonin syndrome and can include hyperthermia, tachycardia, tachypnea, and hypertension.[5] For NMS, dantrolene is the most effective, evidence-based drug treatment available,[6] whereas no evidence-based drug treatments are available for serotonin syndrome.
Malignant hyperthermia is a disorder of skeletal muscle that occurs in genetically susceptible patients after exposure to halogenated inhalational anesthetics (eg, halothane, sevoflurane, desflurane, isoflurane), depolarizing muscle relaxants (eg, succinylcholine), or, rarely, stressors (eg, vigorous exercise, heat exposure).[7] Malignant hyperthermia is a hypermetabolic response of skeletal muscle, and affected patients may present with hyperthermia, tachycardia, tachypnea, increased carbon dioxide production or oxygen consumption, acidosis, hyperkalemia, muscle rigidity, and rhabdomyolysis. It is treated with dantrolene, a specific antagonist that should be available wherever general anesthesia is administered.
Anticholinergic toxicity results from an overdose of an anticholinergic agent and may present with hyperthermia, agitation, altered mental status, mydriasis, dry mucous membranes, urinary retention, and decreased bowel sounds.[8] In contrast to serotonin syndrome, anticholinergic poisoning produces normal muscle tone and reflexes; the treatment is physostigmine.
Patients with meningitis often have a history of headache, photophobia, neck stiffness, vomiting, and diplopia; they may also present with convulsions, abnormal movements, and/or posturing.
Serotonin syndrome may be distinguished from other causes of agitated delirium on the basis of neuromuscular findings such as clonus and hyperreflexia; patients with sympathomimetic toxicity or infections of the central nervous system typically lack these findings. All of the differential diagnoses mentioned can be associated with significant morbidity and mortality without prompt and appropriate treatment; therefore, differentiation based on clinical findings and a high index of suspicion is imperative.
DDx Case Study
A 30-year-old man presents with a 7-month history of worsening hallucinations and delusions. Management initiated by his psychiatrist 1 month ago included risperidone, lithium, olanzapine, and lorazepam. He was brought to the ED because he had been in bed for 3 days in a row and was feeling sluggish, with a temperature of 106°F (41.1°C). Upon examination, the patient is arousable but not oriented to date, time, or place. All extremities are rigid, and reflexes are decreased. His heart rate is 110 beats/min, respiratory rate is 24 breaths/min, and blood pressure is 130/80 mm Hg. No history of illicit drug use is reported.
NMS is an idiosyncratic drug reaction to antipsychotics that has a presentation similar to that of serotonin syndrome and is characterized by bradyreflexia, hyperpyrexia, and lead-pipe rigidity. Symptoms typically resolve in days, compared with 24 hours after removal of the offending agent in serotonin syndrome. Patients with NMS have a history of taking a neuroleptic agent (eg, haloperidol, chlorpromazine), atypical antipsychotics, or antiemetic drugs. For NMS, dantrolene is the most effective, evidence-based drug treatment available.
Medications Most Commonly Involved
The medications most commonly involved in serotonin syndrome include selective serotonin reuptake inhibitors (SSRIs), serotonin norepinephrine reuptake inhibitors (SNRIs), monoamine oxidase inhibitors (MAOIs), opioids, cough medications (eg, dextromethorphan), and antibiotics (eg, linezolid).[9]
Specific drugs that have the potential to cause serotonin syndrome are as follows[1,6-12]:
SSRIs: citalopram, fluoxetine, fluvoxamine, olanzapine/fluoxetine, paroxetine
SNRIs: duloxetine, sibutramine, venlafaxine
Triptans: almotriptan, eletriptan, frovatriptan, naratriptan, rizatriptan, sumatriptan, zolmitriptan
Miscellaneous: buspirone, carbamazepine, cocaine, cyclobenzaprine, dextromethorphan, ergot alkaloids, fentanyl, 5-hydroxytryptophan, linezolid, lithium, L-tryptophan, meperidine, methadone, methamphetamine, methylene blue, metoclopramide, mirtazapine, ondansetron, phenelzine, selegiline, St John's wort, tramadol, tranylcypromine, trazodone, tricyclic antidepressants, valproic acid
Avoid prescribing the following opioids, because they can precipitate or worsen serotonin syndrome in patients already receiving SSRIs or MAOIs:
Tramadol
Methadone
Meperidine
Fentanyl
Opioids that have not been linked to serotonin syndrome include morphine, codeine, and hydrocodone; these are preferred when opioid analgesia is required.[10]
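This patient illustrates how easily serotonergic agents accumulate across prescribers. As a sketch of medication-list screening, the snippet below checks a patient's medications against a lookup set drawn from the lists above (the set is deliberately partial and illustrative; a real implementation would rely on a curated drug-interaction database, not a hand-typed set):

```python
# Partial, illustrative set of serotonergic agents taken from this article;
# not a complete or authoritative interaction database.
SEROTONERGIC_AGENTS = {
    "citalopram", "fluoxetine", "fluvoxamine", "paroxetine",      # SSRIs
    "duloxetine", "sibutramine", "venlafaxine",                   # SNRIs
    "sumatriptan", "rizatriptan", "zolmitriptan",                 # triptans
    "tramadol", "meperidine", "methadone", "fentanyl",            # high-risk opioids
    "dextromethorphan", "ondansetron", "linezolid", "lithium",
    "phenelzine", "tranylcypromine",                              # MAOIs
}

def flag_serotonergic(medications):
    """Return, alphabetically, the drugs on a patient's medication list
    that appear in the serotonergic-agent set (case-insensitive)."""
    return sorted(m for m in (med.lower() for med in medications)
                  if m in SEROTONERGIC_AGENTS)

# The patient in this case was taking five serotonergic agents at once:
flag_serotonergic(["Fluoxetine", "Sumatriptan", "Ondansetron",
                   "Tramadol", "Dextromethorphan", "Acetaminophen"])
```

A check like this would flag the fluoxetine/tramadol/sumatriptan/ondansetron/dextromethorphan combination at the point of prescribing, before symptoms develop.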
SSRIs, SNRIs, MAOIs, certain opioids (eg, fentanyl, tramadol), cough medications (eg, dextromethorphan), and antibiotics (eg, linezolid) are the medication types most commonly involved in serotonin syndrome. Agitation and tremors associated with serotonin syndrome can be treated with benzodiazepines.
Treatment
Most cases of serotonin syndrome are mild and can be treated by withdrawal of the offending agent and supportive care, with complete resolution of the presenting symptoms.[1] Most cases of serotonin syndrome present for care within 6-24 hours of symptom onset and resolve within the following 24 hours.
Agitation and tremors can be treated with benzodiazepines; however, in severe cases, patients may require neuromuscular paralysis, sedation, or intubation. Hyperthermia > 106°F (41.1°C) usually is associated with a poor prognosis. Patients presenting with hyperthermia and severe muscle rigidity should be managed with antipyretics, neuromuscular paralysis, sedation, or intubation as indicated.[11]
Serotonin syndrome may be complicated by rhabdomyolysis, disseminated intravascular coagulation (DIC), hepatic or renal dysfunction, and lactic acidosis. Therefore, obtaining urinalysis, renal and hepatic function measurement, and a DIC profile should be part of management. Elicit from the patient a confirmation or denial of illicit or recreational drug use, especially in cases of intentional overdose, because this may complicate the clinical picture and delay diagnosis.
Cyproheptadine, a histamine-1 receptor antagonist with anticholinergic and antiserotonergic properties, is the recognized therapy for serotonin syndrome. It is given orally at an initial dose of 4-12 mg, repeated every 2 hours, and is discontinued if the maximum dose of 32 mg is reached without symptom improvement.[12]
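The dosing scheme described above (initial dose, repeat doses every 2 hours, hard stop at the 32-mg maximum) can be sketched as a simple cumulative-dose calculation. This is an illustrative arithmetic aid only, with hypothetical parameter choices, and is not dosing guidance:

```python
def cyproheptadine_schedule(initial_mg, repeat_mg, max_total_mg=32.0):
    """Sketch the cumulative oral dosing described in the text: an initial
    dose, then repeat doses every 2 hours, stopping before the 32-mg
    maximum would be exceeded.  Illustrative only; not dosing guidance.

    Returns a list of (hour, dose_mg) pairs and the cumulative total.
    """
    doses = [(0, initial_mg)]
    total = initial_mg
    hour = 0
    while total + repeat_mg <= max_total_mg:
        hour += 2
        total += repeat_mg
        doses.append((hour, repeat_mg))
    return doses, total

# Example: a 12-mg initial dose followed by hypothetical 4-mg repeat doses
# reaches the 32-mg ceiling after five repeats (at hour 10).
schedule, total = cyproheptadine_schedule(initial_mg=12, repeat_mg=4)
```

The repeat-dose size is a free parameter here because the article specifies only the initial range and the ceiling; the point is that the 32-mg cap, not a fixed number of doses, terminates the schedule.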
Serotonin syndrome resolves over time when promptly diagnosed and appropriately managed, highlighting the importance of a timely and accurate diagnosis. Polypharmacy increases the risk for serotonin syndrome; therefore, reconciling a patient's medications is important whenever serotonin syndrome is suspected. Remember that medications such as fluoxetine have a long half-life and may require 5-8 weeks to be cleared from the system; thus, additional serotonergic medications should be added with caution.
Cyproheptadine is the recognized therapy for serotonin syndrome, given at an initial dose of 4-12 mg, repeated every 2 hours, not to exceed 32 mg. Physostigmine is the first-line treatment option for anticholinergic toxicity. Serotonin syndrome typically resolves within 24 hours of a patient presenting for care. Typical cases of serotonin syndrome are mild and can be treated by withdrawal of the offending agent and supportive care; neuromuscular paralysis, sedation, or intubation may be considered in severe cases with severe muscle rigidity and hyperthermia.
The patient in this case was successfully managed by discontinuing the inciting agents and was treated with cyproheptadine and supportive care. After complete resolution of all symptoms, the patient was discharged (2 days after admission).
Historical Footnote
Although serotonin syndrome is rare, the 1984 case of Libby Zion was instrumental in changing medicine in an unprecedented way. Zion was a patient who had been taking phenelzine, an antidepressant.[13] The therapeutic effects of phenelzine may persist for as long as 2 weeks after discontinuation. Zion was given meperidine for agitation, which precipitated fatal serotonin syndrome. The case led to reforms of the grueling duty hours of medical residents across the United States.