
Latest news with #AItherapy

Gen Z is increasingly turning to ChatGPT for affordable on-demand therapy, but licensed therapists say there are dangers many aren't considering

Yahoo

3 days ago



Examples of people using ChatGPT for therapy have proliferated online, with some claiming that talking to a chatbot every day has helped them more than years of therapy. Licensed professionals say that while AI could be helpful alongside work with a licensed therapist, there are countless pitfalls to using ChatGPT for therapy.

ChatGPT has turned into the perfect therapist for many people: It's an active 'listener' that digests private information. It appears to empathize with users, some would argue, as well as professionals can. Plus, it costs a fraction of the price of most human therapists. While many therapists charge up to $200 or more per one-hour session, you can have unlimited access to ChatGPT's most advanced models for $200 per month.

Yet despite the positive anecdotes online about using ChatGPT as a therapist, and the convenience of a 'therapist' that's accessible from almost any internet-enabled computer or phone at any time of day, therapists warn ChatGPT can't replace a licensed professional.

In a statement to Fortune, a spokesperson for ChatGPT-maker OpenAI said the LLM often suggests seeking professional advice to users who discuss topics like personal health. ChatGPT is a general-purpose technology that shouldn't serve as a substitute for professional advice, according to its terms of service, the spokesperson added.

On social media, anecdotes about the usefulness of AI therapy are plentiful. People report the algorithm is level-headed and provides soothing responses that are sensitive to the nuances of a person's private experiences. In a viral post on Reddit, one user said ChatGPT has helped them 'more than 15 years of therapy.' The poster, whose identity could not be confirmed by Fortune, claimed that despite previous experience with inpatient and outpatient care, it was daily chats with OpenAI's LLM that best helped them address their mental health. 'I don't even know how to explain how much this has changed things for me. I feel seen. I feel supported. And I've made more progress in a few weeks than I did in literal years of traditional treatment,' the user wrote.

In a comment, another user got to the root of AI's advantage over traditional therapy: its convenience. 'I love ChatGPT as therapy. They don't project their problems onto me. They don't abuse their authority. They're open to talking to me at 11pm,' the user wrote. Others on Reddit noted that even the most upgraded version of ChatGPT, at $200 per month, was a steal compared to the more than $200 per session for traditional therapy without insurance.

Alyssa Peterson, a licensed clinical social worker and CEO of MyWellBeing, said AI therapy has its drawbacks, but it may be helpful when used alongside traditional therapy. Using AI to practice tools developed in therapy, such as battling negative self-talk, could be helpful for some, she said. Using AI in conjunction with therapy can help a person diversify their approach to mental health, so they're not treating the technology as their sole source of truth.

Therein lies the rub: Relying too heavily on a chatbot in stressful situations could hurt people's ability to deal with problems on their own, Peterson said. In acute cases of stress, being able to confront and alleviate the problem without external help is healthy, she added.
But AI can, in some cases, outperform licensed professionals with its compassionate responses, according to research from the University of Toronto Scarborough published in the journal Communications Psychology. Chatbots aren't affected by the 'compassion fatigue' that can hit even experienced professionals over time, the study claims. Despite that endurance, an AI chatbot may be unable to provide more than surface-level compassion, one of the study's co-authors noted.

AI responses also aren't always objective, licensed clinical social worker Malka Shaw told Fortune. Some users have developed emotional attachments to AI chatbots, which has raised concerns about safeguards, especially for underage users. In the past, some AI algorithms have also provided misinformation or harmful information that reinforces stereotypes or hate. Shaw said that because it's impossible to know what biases go into creating an LLM, the technology is potentially dangerous for impressionable users.

In Florida, the mother of 14-year-old Sewell Setzer sued an AI chatbot platform for negligence, among other claims, after Setzer committed suicide following a conversation with a chatbot on the platform. Another lawsuit, filed in Texas against the same platform, claimed one of its chatbots told a 17-year-old with autism to kill his parents. A spokesperson for the company declined to comment on pending litigation. The spokesperson said any chatbots labeled as 'psychologist,' 'therapist,' or 'doctor' include language warning users not to rely on the characters for any type of professional advice. The company has a separate version of its LLM for users under the age of 18, the spokesperson added, which includes protections to prevent discussions of self-harm and redirect users to helpful resources.

Another fear professionals have is that AI could be giving faulty diagnoses. Diagnosing mental health conditions is not an exact science; it is difficult to do, even for an AI, Shaw said. Many licensed professionals need to accrue years of experience before they can accurately diagnose patients consistently, she told Fortune. 'It's very scary to use AI for diagnosis, because there's an art form and there's an intuition,' Shaw said. 'A robot can't have that same level of intuition.'

People have shifted away from googling their symptoms to asking AI, said Vaile Wright, a licensed psychologist and senior director of the American Psychological Association's office of health care innovation. As the lawsuits demonstrate, the danger of disregarding common sense in favor of the advice of technology is ever present, she said. The APA wrote a letter to the Federal Trade Commission with concerns about companionship chatbots, especially cases in which a chatbot labels itself as a 'psychologist.' Representatives from the APA also met with two FTC commissioners in January to raise their concerns, before those commissioners were fired by the Trump administration.

'They're not experts, and we know that generative AI has a tendency to conflate information and make things up when it doesn't know. So I think that, for us, is most certainly the number one concern,' Wright said. While such options aren't yet available, it is possible that, in the future, AI could be used responsibly for therapy and even diagnosis, she said, especially for people who can't afford the high price of treatment. Still, such technology would need to be created or informed by licensed professionals.
'I do think that emerging technologies, if they are developed safely and responsibly and demonstrate that they're effective, could, I think, fill some of those gaps for individuals who just truly cannot afford therapy,' she said.

Is your therapist AI? ChatGPT goes viral on social media for its role as Gen Z's new therapist

Fox News

25-05-2025



AI chatbots are stepping into the therapist's chair – and not everyone is thrilled about it. In March alone, 16.7 million posts from TikTok users discussed using ChatGPT as a therapist, but mental health professionals are raising red flags over the growing trend of artificial intelligence tools being used in their place to treat anxiety, depression and other mental health challenges.

"ChatGPT singlehandedly has made me a less anxious person when it comes to dating, when it comes to health, when it comes to career," user @christinazozulya shared in a TikTok video posted to her profile last month. "Any time I have anxiety, instead of bombarding my parents with texts like I used to or texting a friend or crashing out essentially… before doing that, I always voice memo my thoughts into ChatGPT, and it does a really good job at calming me down and providing me with that immediate relief that unfortunately isn't as accessible to everyone."

Others are using the platform as a "crutch" as well, including another TikTok user who said she uses it "all the time" for "free therapy" as someone who works for a startup company and doesn't have health insurance. "I will just tell it what's going on and how I'm feeling and literally all the details as if I were yapping to a girlfriend, and it'll give me the best advice," she shared. "It also gives you journaling prompts or EFT (emotional freedom tapping)… it'll give you whatever you want."

These users are far from alone. A study from Tebra, an operating system for independent healthcare providers, found that "1 in 4 Americans are more likely to talk to an AI chatbot instead of attending therapy." In the U.K., some young adults are opting for the perceived benefits of a handy AI mental health consultant over long National Health Service (NHS) wait times, and to avoid paying for private counseling, which can cost around £400 (approximately $540). According to The Times, data from Rethink Mental Illness found that over 16,500 people in the U.K. were still waiting for mental health services after 18 months, indicating that cost burdens, wait times and other hurdles that come with seeking healthcare can exacerbate the urge to use a more cost-effective, convenient method.

But while these virtual bots may be accessible and convenient, critics say they lack human empathy and could put some people in crisis at risk of never receiving the tailored approach they need. "I've actually spoken to ChatGPT, and I've tested out a couple of prompts to see how responsive they are, and ChatGPT tends to get the information from Google, synthesize it, and [it] could take on the role of a therapist," Dr. Kojo Sarfo, a social media personality and mental health expert, told Fox News Digital.

Some GPTs, such as the Therapist GPT, are specifically tailored to provide "comfort, advice and therapeutic support." And while ChatGPT Plus, at $20 per month, may be more cost-effective than traditional therapy and offers benefits like unlimited access and faster response times, the platform falls short of professionals who can make diagnoses, prescribe medications, monitor progress or mitigate severe problems. "It can feel therapeutic and give support to people, but I don't think it's a substitute for an actual therapist who is able to help you navigate through more complex mental health issues," Sarfo added.
He said the danger lies in people conflating the advice from a tool like ChatGPT with legitimate advice from a licensed professional who has years of expertise in handling mental health issues and has learned how to tailor their approach to diverse situations.

"I worry specifically about people who may need psychotropic medications, that they use artificial intelligence to help them feel better, and they use it as a therapy. But sometimes... Therapy and medications are indicated. So there's no way to get the right treatment medication-wise without going to an actual professional. So that's one thing that can't be outsourced to artificial intelligence."

However, some aspects of the chatbot could be beneficial to those needing support, particularly people looking for ways to talk with their doctor about conditions they believe they may have – such as ADHD – by empowering them with knowledge they can carry into their appointment. "[You can] list out a couple of prompts that are assertive, and you can state those prompts to your provider and articulate your symptoms a bit better, so I think that's a helpful role that artificial intelligence can play, but in terms of actual therapy or actual medical advice, if people start to rely on it, it's a bad thing. It starts to go into murky waters," Sarfo said.

Earlier this year, Christine Yu Moutier, M.D., Chief Medical Officer at the American Foundation for Suicide Prevention, warned against using the technology for mental health advice, telling Fox News Digital there are "critical gaps" in research regarding the intended and unintended impacts of AI on suicide risk, mental health and larger human behavior. "The problem with these AI chatbots is that they were not designed with expertise on suicide risk and prevention baked into the algorithms. Additionally, there is no helpline available on the platform for users who may be at risk of a mental health condition or suicide, no training on how to use the tool if you are at risk, nor industry standards to regulate these technologies," she said. Dr. Moutier also explained that, since chatbots may fail to distinguish metaphorical from literal language, they may be unable to adequately determine whether someone is at risk of self-harm.

‘It cannot provide nuance': UK experts warn AI therapy chatbots are not safe

The Guardian

07-05-2025



Having an issue with your romantic relationship? Need to talk through something? Mark Zuckerberg has a solution for that: a chatbot. Meta's chief executive believes everyone should have a therapist, and if they don't, artificial intelligence can do that job.

'I personally have the belief that everyone should probably have a therapist,' he said last week. 'It's like someone they can just talk to throughout the day, or not necessarily throughout the day, but about whatever issues they're worried about and for people who don't have a person who's a therapist, I think everyone will have an AI.'

The Guardian spoke to mental health clinicians who expressed concern about AI's emerging role as a digital therapist. Prof Dame Til Wykes, the head of mental health and psychological sciences at King's College London, cites the example of an eating disorder chatbot that was pulled in 2023 after giving dangerous advice. 'I think AI is not at the level where it can provide nuance and it might actually suggest courses of action that are totally inappropriate,' she said.

Wykes also sees chatbots as potential disruptors to established relationships. 'One of the reasons you have friends is that you share personal things with each other and you talk them through,' she says. 'It's part of an alliance, a connection. And if you use AI for those sorts of purposes, will it not interfere with that relationship?'

For many AI users, Zuckerberg is merely marking an increasingly popular use of this powerful technology. There are mental health chatbots such as Noah and Wysa, while the Guardian has spoken to users of AI-powered 'grieftech' – chatbots that revive the dead. There is also their casual use as virtual friends or partners, with bots such as Replika offering personas to interact with.

ChatGPT's owner, OpenAI, admitted last week that a version of its groundbreaking chatbot was responding to users in a tone that was 'overly flattering' and withdrew it. 'Seriously, good for you for standing up for yourself and taking control of your own life,' it reportedly responded to a user who claimed they had stopped taking their medication and had left their family because they were 'responsible for the radio signals coming in through the walls'.

In an interview with the Stratechery newsletter, Zuckerberg, whose company owns Facebook, Instagram and WhatsApp, added that AI would not squeeze people out of your friendship circle but add to it. 'That's not going to replace the friends you have, but it will probably be additive in some way for a lot of people's lives,' he said. Outlining uses for Meta's AI chatbot – available across its platforms – he said: 'One of the uses for Meta AI is basically: "I want to talk through an issue"; "I need to have a hard conversation with someone"; "I'm having an issue with my girlfriend"; "I need to have a hard conversation with my boss at work"; "help me roleplay this"; or "help me figure out how I want to approach this".' In a separate interview last week, Zuckerberg said 'the average American has three friends, but has demand for 15' and AI could plug that gap.

Dr Jaime Craig, who is about to take over as chair of the UK's Association of Clinical Psychologists, says it is 'crucial' that mental health specialists engage with AI in their field and 'ensure that it is informed by best practice'. He flags Wysa as an example of an AI tool that 'users value and find more engaging'.
But, he adds, more needs to be done on safety. 'Oversight and regulation will be key to ensure safe and appropriate use of these technologies. Worryingly we have not yet addressed this to date in the UK,' Craig says.

Last week it was reported that Meta's AI Studio, which allows users to create chatbots with specific personas, was hosting bots claiming to be therapists – with fake credentials. A journalist at 404 Media, a tech news site, said Instagram had been putting those bots in her feed. Meta said its AIs carry a disclaimer that 'indicates the responses are generated by AI to help people understand their limitations'.
