Alaska woman dies from rampant STD after infection spreads throughout her organs


Daily Mail · 04-07-2025
An Alaska woman has died after a rare and severe complication of gonorrhea, health officials report.
The unnamed woman, who was in her 50s, died this spring from disseminated gonococcal infection (DGI), which occurs when the sexually transmitted infection gonorrhea invades the bloodstream and travels to vital organs.
According to the Alaska Department of Health, the woman arrived at her local emergency department in Anchorage in heart failure and septic shock, the body's extreme overreaction to an infection.
She had contracted gonorrhea, which affects 700,000 Americans a year, at some point within the previous six months.
It is unclear whether she had any health conditions other than opioid addiction, or whether she contracted gonorrhea from a long-term partner.
The diagnosis of DGI came only after her death because she declined so quickly. Sepsis and heart failure were the immediate causes of death.
The woman's death comes as Alaska records the second-highest rate of STIs in the country, only falling behind Mississippi. Experts believe this is due to weak public health infrastructure and high rates of substance abuse, among other factors.
The latest data shows 25 people per 100,000 Alaska residents have gonorrhea, and cases of syphilis have surged 20-fold since 2016.
The woman in the report was one of eight Alaskans to be identified with DGI between January and May of this year, the health department said in a bulletin.
They ranged in age from 32 to 59, and five of them were women. The average age was 40. There were no other recorded deaths from DGI.
None of the patients in the report are thought to be connected to one another.
The woman who died had been treated twice in the prior six months for opioid addiction, but there was no record of gonorrhea testing.
Gonorrhea is an STI caused by the bacteria Neisseria gonorrhoeae, which spreads through bodily fluids like semen and vaginal fluids.
It can move from person to person through oral sex, intercourse or sharing sex toys with an infected person.
Most people with gonorrhea are between ages 15 and 24 and don't have symptoms, though the infection can cause unusual genital discharge, pain during sex, pain during urination, lower abdominal pain, itching, testicular pain in men and bleeding in between periods for women.
In DGI, gonorrhea infections travel to the bloodstream and infect organs throughout the body due to the infection going untreated.
DGI is thought to occur in just 0.5 percent of all gonorrhea cases.
Health officials writing the Alaska report said risk factors for DGI, based on the patients' medical records, were methamphetamine and opioid use, alcoholism, injected drug use, homelessness and having multiple sexual partners within a year.
Cases of STIs in the US have spiked 90 percent in the last 20 years, but a recent slowdown has been observed.
A 2024 CDC report showed reported gonorrhea cases falling for a second consecutive year, down seven percent from 2022 to below pre-pandemic levels.
Alaska's health department recommends adults be tested for gonorrhea if they have at least one of the following risk factors: being under 25 years old, having a new partner, having more than one partner, previous STIs, a history of prostitution or a history of being incarcerated.
And people who are sexually active and have a new partner, history of drug use or past STI should be tested every three to six months.

Related Articles

Gonorrhoea vaccine becomes available at sexual health clinics in England

The Independent


A vaccine for gonorrhoea is now available at sexual health clinics in England as part of a world-first scheme. The vaccination programme is expected to save the NHS £7.9 million over the next decade and combat increasing levels of antibiotic-resistant strains of the disease.

The move aims to tackle rising levels of the sexually transmitted infection (STI) after cases in England topped 85,000 in 2023, the highest since records began in 1918. The free jab will be on offer from Monday to patients at the highest risk of infection, including gay and bisexual men with a recent history of multiple sexual partners or a bacterial STI.

The vaccine is an existing jab, known as 4CMenB, currently used to protect people against meningococcal B disease, a serious bacterial infection that can cause meningitis and sepsis. It is used in the routine childhood programme and given to babies at eight weeks, 16 weeks and one year.

The programme is targeted at those most at risk and could prevent up to 100,000 cases. Gonorrhoea disproportionately impacts specific communities, such as those in deprived areas, people of black Caribbean ethnicity, and gay, bisexual and other men who have sex with men, according to the Joint Committee on Vaccination and Immunisation (JCVI).

Patients getting the gonorrhoea vaccine will also be offered jabs for mpox, human papillomavirus (HPV), and hepatitis A and B at their appointment.

Ashley Dalton, the minister for public health and prevention, said: 'Rolling out this world-leading gonorrhoea vaccination programme in sexual health clinics in England represents a major breakthrough in preventing an infection that has reached record levels.

'This government's world-first vaccination programme will help turn the tide on infections, as well as tackling head-on the growing threat of antibiotic resistance.
'I strongly encourage anyone who is eligible to come forward for vaccination, to protect not only yourselves but also your sexual partners.'

Using Generative AI for therapy might feel like a lifeline – but there's danger in seeking certainty in a chatbot

The Guardian


Tran* sat across from me, phone in hand, scrolling. 'I just wanted to make sure I didn't say the wrong thing,' he explained, referring to a recent disagreement with his partner. 'So I asked ChatGPT what I should say.'

He read the chatbot-generated message aloud. It was articulate, logical and composed – almost too composed. It didn't sound like Tran. And it definitely didn't sound like someone in the middle of a complex, emotional conversation about the future of a long-term relationship. Nor did it mention any of Tran's own behaviours that had contributed to the relationship strain, which he and I had been discussing.

Like many others I've seen in therapy recently, Tran had turned to AI in a moment of crisis. Under immense pressure at work and facing uncertainty in his relationship, he'd downloaded ChatGPT on his phone 'just to try it out'. What began as a curiosity soon became a daily habit: asking questions, drafting texts, and even seeking reassurance about his own feelings.

The more Tran used it, the more he began to second-guess himself in social situations, turning to the model for guidance before responding to colleagues or loved ones. He felt strangely comforted, like 'no one knew me better'. His partner, on the other hand, began to feel like she was talking to someone else entirely.

ChatGPT and other generative AI models present a tempting accessory, or even alternative, to traditional therapy. They're often free, available 24/7 and can offer customised, detailed responses in real time. When you're overwhelmed, sleepless and desperate to make sense of a messy situation, typing a few sentences into a chatbot and getting back what feels like sage advice can be very appealing.

But as a psychologist, I'm growing increasingly concerned about what I'm seeing in the clinic: a silent shift in how people are processing distress and a growing reliance on artificial intelligence in place of human connection and therapeutic support.
AI might feel like a lifeline when services are overstretched – and make no mistake, services are overstretched. Globally, in 2019 one in eight people were living with a mental illness, and we face a dire shortage of trained mental health professionals. In Australia, a growing mental health workforce shortage is impacting access to trained professionals. Clinician time is one of the scarcest resources in healthcare. It's understandable (even expected) that people are looking for alternatives.

Turning to a chatbot for emotional support isn't without risk, however, especially when the lines between advice, reassurance and emotional dependence become blurred. Many psychologists, myself included, now encourage clients to build boundaries around their use of ChatGPT and similar tools. Their seductive 'always-on' availability and friendly tone can unintentionally reinforce unhelpful behaviours, especially for people with anxiety, OCD or trauma-related issues.

Reassurance-seeking, for example, is a key feature of OCD, and ChatGPT, by design, provides reassurance in abundance. It never asks why you're asking again. It never challenges avoidance. It never says, 'let's sit with this feeling for a moment, and practice the skills we have been working on'.

Tran often reworded prompts until the model gave him an answer that 'felt right'. But this constant tailoring meant he wasn't just seeking clarity; he was outsourcing emotional processing. Instead of learning to tolerate distress or explore nuance, he sought AI-generated certainty. Over time, that made it harder for him to trust his own instincts.

Beyond psychological concerns, there are real ethical issues. Information shared with ChatGPT isn't protected by the same confidentiality standards that bind registered Ahpra professionals. Although OpenAI states that data from users is not used to train its models unless permission is given, the sheer volume of fine print in user agreements often goes unread.
Users may not realise how their inputs can be stored, analysed and potentially reused. There's also the risk of harmful or false information. These large language models are autoregressive; they predict the next word based on previous patterns. This probabilistic process can lead to 'hallucinations': confident, polished answers that are completely untrue.

AI also reflects the biases embedded in its training data. Research shows that generative models can perpetuate and even amplify gender, racial and disability-based stereotypes – not intentionally, but unavoidably. Human therapists also possess clinical skills; we notice when a client's voice trembles, or when their silence might say more than words.

This isn't to say AI can't have a place. Like many technological advancements before it, generative AI is here to stay. It may offer useful summaries, psycho-educational content or even support in regions where access to mental health professionals is severely limited. But it must be used carefully, and never as a replacement for relational, regulated care.

Tran wasn't wrong to seek help. His instincts to make sense of distress and to communicate more thoughtfully were logical. However, leaning so heavily on AI meant that his skill development suffered. His partner began noticing a strange detachment in his messages. 'It just didn't sound like you', she later told him. It turned out: it wasn't. She also became frustrated by the lack of accountability in his messages to her, and this caused more relational friction and communication issues between them.

As Tran and I worked together in therapy, we explored what led him to seek certainty in a chatbot. We unpacked his fears of disappointing others, his discomfort with emotional conflict and his belief that perfect words might prevent pain. Over time, he began writing his own responses, sometimes messy, sometimes unsure, but authentically his.

Good therapy is relational.
It thrives on imperfection, nuance and slow discovery. It involves pattern recognition, accountability and the kind of discomfort that leads to lasting change. A therapist doesn't just answer; they ask and they challenge. They hold space, offer reflection and walk with you, while also offering up an uncomfortable mirror.

For Tran, the shift wasn't just about limiting his use of ChatGPT; it was about reclaiming his own voice. In the end, he didn't need a perfect response. He needed to believe that he could navigate life's messiness with curiosity, courage and care – not perfect scripts.

*Name and identifying details changed to protect client confidentiality

Carly Dober is a psychologist living and working in Naarm/Melbourne

In Australia, support is available at Beyond Blue on 1300 22 4636, Lifeline on 13 11 14, and at MensLine on 1300 789 978. In the UK, the charity Mind is available on 0300 123 3393 and Childline on 0800 1111. In the US, call or text Mental Health America at 988 or chat
