Can we rely on AI chatbots for our mental well-being?

Yahoo · 27 March 2025
In psychology, empathy is crucial because it fosters understanding, compassion, and strong relationships. It allows individuals to connect emotionally and create a more supportive and harmonious society. Alfred Adler, founder of the School of Individual Psychology, once described empathy as 'Seeing with the eyes of another, listening with the ears of another, and feeling with the heart of another.'
Today, artificial intelligence (AI) chatbots are being explored by psychologists to make therapy more accessible for patients, improve interventions, and aid in training new clinicians.
However, despite AI's potential, there is cause for concern. In tests, chatbots have spread misinformation, professed their inner desires, and even sexually harassed patients, all of which have prompted leaders in tech and science to call for a pause.
The traditional approach in therapy refers to the established methods and practices used in the field of psychology and mental health treatment. This often involves face-to-face sessions, using psychological theories and techniques to address mental health issues and focus on an individual's thoughts, feelings, and behaviours.
Traditional therapy demonstrates significant effectiveness in improving mental health. Many studies consistently show that psychotherapy, including Cognitive Behavioral Therapy (CBT) and other forms of face-to-face therapy, can be highly effective in treating a wide range of mental health issues.
Studies indicate that face-to-face therapy is effective for many mental health issues. According to a survey by the American Psychological Association, 50% of clients showed improved symptoms within eight sessions of face-to-face mental health treatment, and 75% showed improvement within six months.
Ultimately, human connection is the foundation of the relationship between the therapist and patient. When a patient feels connected to their therapist, they are likely to engage in therapy and see improvements in their mental health.
Although traditional therapy works, there is a shortage of mental health practitioners around the world. In February 2024, the US Health Resources and Services Administration estimated that 122 million Americans lived in areas with a shortage of mental healthcare providers. It is estimated that the country needs about 6,000 clinicians to cover the gap.
Therapists' workloads and patient loads have risen in response to increased demand. According to the American Psychological Association, the percentage of therapists working overtime grew from 31% in 2020 to 38% in 2022. Amid this increasing workload, more psychologists are failing to meet the treatment demands of their patients.
Across the pond, in 2022, the NHS reported a shortage of 2,000 qualified therapists in the UK. A general practitioner in the UK stated in response to a British Medical Association survey: 'Mental healthcare in this country is dysfunctional. It's broken.'
In response to the shortage of practitioners, many therapists and patients are resorting to AI chatbots for therapy and mental health support.
AI chatbots are large language models (LLMs) that can provide mental health support through automated conversations and therapeutic exercises. Apps like Woebot, Youper, and Character.ai had over a million downloads in 2024. These chatbots have been used to support people dealing with mild depression, loneliness, anxiety, and other mental issues. When people come to them with a problem, these bots respond in ways a therapist might—they ask questions and suggest coping mechanisms.
While AI chatbots might seem like a useful and cost-effective way of addressing mental health issues, there is one hurdle they will likely never overcome: a chatbot will never possess human emotions, no matter how convincingly it mimics them.
Emotions in humans are complex phenomena, deeply intertwined with our sensory and motor systems, influencing our decisions and behaviors. In contrast, AI systems lack an intrinsic emotion module, which fundamentally differentiates AI from human intelligence.
AI also depends on the context and pre-existing data it has access to, meaning that any biases present in that data will manifest in its responses. As a result, integrating AI chatbots raises the possibility of racist, sexist, ageist, and otherwise biased or inappropriate responses finding their way into conversations.
Ultimately, while AI offers human-like responses, chatbots will never understand and express human emotion. This poses a substantial challenge when applied in psychology and therapy, which are built on human interaction, trust, emotional intelligence, and a sense of mutual understanding. Educators will need to consider the accuracy and reliability of AI, as well as privacy and data security in psychology.
"Can we rely on AI chatbots for our mental well-being?" was originally created and published by Medical Device Network, a GlobalData owned brand.

Related Articles

Illinois becomes third state to restrict use of artificial intelligence in mental health industry as experts warn about 'AI psychosis'

New York Post · 4 hours ago

Illinois passed a bill banning therapists from employing artificial intelligence chatbots for assistance with mental health therapy, as experts countrywide warn against people's ever-growing reliance on the machines. The 'Therapy Resources Oversight' legislation prohibits licensed mental health professionals in Illinois from using AI for treatment decisions or communication with clients. It also bans companies from recommending chatbot therapy tools as a be-all alternative to traditional therapy.

Enforcement of the bill will rely on complaints from the public, which the Illinois Department of Financial and Professional Regulation will investigate. Anyone determined to be violating the ban could face a civil penalty of up to $10,000, according to the legislation text. Utah and Nevada, two Republican-run states, previously passed similar laws limiting AI's capacity in mental health services in May and late June, respectively.

Unregulated chatbots can take harmless conversations in any direction, sometimes incidentally leading people to divulge sensitive information, or pushing people who are already in vulnerable situations to do something drastic, like take their own life, experts have warned. A Stanford University study released in June found that many chatbots, which are programmed to respond enthusiastically to users, fail to sidestep concerning prompts, including requests for high bridges in specific locations to jump off of.

Whereas chatbots affirm unequivocally regardless of the circumstance, therapists provide support and the means to help their patients improve, Vaile Wright, senior director for the office of health care innovation at the American Psychological Association, told the Washington Post. 'Therapists are validating, but it's also our job to point out when somebody is engaging in unhealthy thoughts, feelings, behaviors and then help somebody challenge those and find better options,' Wright told the outlet.

The bans, though, are difficult to enforce effectively, and they cannot prevent everyday people from turning to AI for mental health assistance on their own. New research released in early August found that many bots like ChatGPT are inducing 'AI psychosis' in unwitting users with no history of mental illness. Roughly 75% of Americans have used some form of AI in the last six months, with 33% reporting daily usage for anything from help on homework to desperate romantic connections. This deep engagement is breeding psychological distress in heavy users, according to the digital marketing study. Many youth, in particular, are falling down the chatbot rabbit hole and turning to machines to supplement human interaction.

Character.AI, a popular platform where users can create and share chatbots usually based on fictional characters, had to place a warning clarifying that anything the bots say 'should not be relied upon as fact or advice' after a Florida teen fell in love with his 'Game of Thrones' AI character and took his own life. The platform is still dealing with a lawsuit filed against the company for the teen's death. Despite repeated attempts to dismiss it on First Amendment grounds, a federal judge ruled that the suit could move forward in August. Another Texas family sued after a chatbot on the app named 'Shonie' encouraged their autistic son to cut himself.

FDA may revoke Pfizer COVID-19 shot for young, healthy children

USA Today · 8 hours ago

Parents won't be able to vaccinate their healthy children 6 months to 5 years old if the FDA pulls Pfizer's COVID-19 vaccine authorization. The Food and Drug Administration may revoke authorization for Pfizer's COVID-19 vaccine for healthy children under age 5, the pharmaceutical company confirmed, which would limit parents' vaccine options ahead of the winter respiratory virus season.

The possibility comes several months after President Donald Trump's Department of Health and Human Services began placing limits on COVID-19 vaccines. For the last four years, updated COVID-19 vaccines have been made available in the fall for most Americans before the cold sets in.

The federal agency told Pfizer that it might not renew the emergency use authorization, or EUA, for the Pfizer-BioNTech COVID-19 vaccine Comirnaty for children ages 6 months through 4 years, according to a statement sent to USA TODAY. 'We are currently in discussions with the agency on potential paths forward and have requested that the EUA for this age group remain in place for the 2025-2026 season,' a company spokesperson said. 'It is important to note that these deliberations are not related to the safety and efficacy of the vaccine, which continues to demonstrate a favorable profile.'

HHS spokesperson Andrew Nixon declined to predict what the agency might do. 'We do not comment on potential, future regulatory changes,' Nixon said. 'Unless officially announced by HHS, discussion about future agency action should be regarded as pure speculation.'

In July, Moderna received full FDA approval for its COVID-19 vaccine for children 6 months to 11 years old who are at increased risk of contracting COVID-19. The vaccine, Spikevax, is expected to be available for eligible populations in the 2025-26 respiratory virus season.

The Centers for Disease Control and Prevention still recommends older patients get vaccinated against COVID-19, as well as people whose immune systems have been weakened by illness or medical treatments such as chemotherapy. But if Pfizer loses its EUA, parents won't have the option to vaccinate their healthy young children.

HHS Secretary Robert F. Kennedy Jr. has long sown doubt about a wide range of vaccines, while saying parents should be able to choose which vaccines their children should get. In late May, he announced that the COVID-19 vaccine for healthy children and healthy pregnant women was removed from the Centers for Disease Control and Prevention's immunization schedule. 'There's no evidence healthy kids need it today,' Dr. Marty Makary, FDA commissioner, said in the May 27 video announcement posted to X.

The American Academy of Pediatrics pushed back, saying at the time that the removal 'ignores independent medical experts and leaves children at risk.' Ending access to vaccination for healthy young children would strip families of choice, said Dr. Sean O'Leary, chair of the academy's committee on infectious diseases, in a May statement. 'Those who want to vaccinate may no longer be able to, as the implications for insurance coverage and access remain unclear,' he said.

Wegmans recalls cheese sold across 10 US states over listeria risk

USA Today · 8 hours ago

Grocery store chain Wegmans is recalling its Medium Camembert Soft Ripened Cheese, as well as products that contain it, due to a potential listeria contamination, the U.S. Food and Drug Administration (FDA) said in a notice.

In the FDA alert posted on Wednesday, Aug. 13, Wegmans said the items affected by the recall were sold between July 1 and Aug. 12 in stores located in Connecticut, Delaware, Maryland, Massachusetts, New Jersey, New York, North Carolina, Pennsylvania, Virginia and Washington, D.C. The company said the cheese was supplied to the store by Estancia Holdings from Cumming, Georgia.

As of now, no illnesses have been reported in connection with the cheese recall, according to the alert. The FDA said Wegmans has placed automated phone calls to customers who purchased the cheese using their Shoppers Club card. Customers who bought the affected cheese can return it to the store's service desk for a full refund, per the notice.

What products are being recalled?

The FDA provided a list of products affected by the recall:

Listeria poisoning symptoms

Listeriosis, or listeria poisoning, is a foodborne bacterial infection most commonly caused by the bacterium Listeria monocytogenes, according to the Centers for Disease Control and Prevention (CDC). It is considered a serious condition and can be dangerous or life-threatening, especially to older adults, people with weak immune systems and pregnant women. Listeria is the third leading cause of death from foodborne illness in the U.S., according to the CDC. The agency estimates that the disease impacts 1,600 Americans each year, with approximately 260 people dying from those infections. Per the CDC, symptoms include:

USA TODAY's Saman Shafiq and Natalie Neysa Alund contributed to this report. Fernando Cervantes Jr. is a trending news reporter for USA TODAY. Reach him at and follow him on X @fern_cerv_.
