March 23, 2025
The robot empathy divide
A new digital divide is growing between people who trust AI for emotional support and those who don't.
Why it matters: AI startups are pushing their tools not just as enterprise productivity enhancers, but also as therapists, companions and life coaches.
Driving the news: Two new studies from OpenAI, in partnership with MIT Media Lab, found that users are turning to bots to help cope with difficult situations because, they say, the AI can "display human-like sensitivity."
The studies found that ChatGPT "power users" are likely to consider the bot a "friend" and find it more comfortable to interact with the bot than with people.
The big picture: On one hand, more than half (55%) of 18-to-29-year-old Americans feel comfortable chatting with AI about mental health concerns, according to a 2024 YouGov survey.
On the other, many mental health professionals and experts view reliance on bot-based therapy as a poor substitute for the real thing.
"We know we can feel better from writing in a diary or talking aloud to ourselves or texting with a machine. That is not therapy," Hannah Zeavin, author of "The Distance Cure: A History of Teletherapy," told the Washington Post in 2023.
AI can't effectively substitute for a human therapist because "a therapeutic relationship is about ... forming a relationship with another human being who understands the complexity of life," argues sociologist Sherry Turkle, who has been studying digital culture for decades.
Between the lines: Lucas LaFreniere, an assistant professor of psychology at Skidmore College who recently taught a seminar called "My Therapist is a Robot," says there are two kinds of people — those who are willing to suspend disbelief to accept that a chatbot could help them with personal problems and those who aren't.
"You can tell in the first five minutes of talking with somebody, whether they think it's all going to be bullshit, or they are really open to it, think it has a lot of potential and could be cool, and can relate to it," LaFreniere told Axios.
Empathy is in the eye of the beholder, he said: "If the client is perceiving empathy, they benefit from the empathy."
But he says there are a lot of people who simply don't feel that empathy — or if they do feel it, it will disappear at the first glitch. "That just kind of reminds the user very starkly that they're talking to software," he said.
The other side: Some experts argue that generative AI can help with thorny emotional questions because it's been trained, in part, on literature.
"Works of art, Shakespeare's plays" and similar works give generative AI the ability to help humans with emotions — or at least to make them feel less alone, Chris Mattmann, chief data and AI officer at UCLA, told Axios.
The fiction in LLM training data includes "characters that don't exist, but they're inherently human. And they mirror our properties, including empathy," he says.
Yes, but: LLMs have also been trained on Reddit, Facebook, Twitter and 4chan.
Even as chatbots get better at "empathetic" and human-like communication, some people will never accept them as companions or therapists, because doing so is too hard to square with their own ideas about what it means to be human.
In 1950, computing pioneer Alan Turing described what he called the "heads in the sand" objection to the prospect of artificial intelligence: "The consequences of machines thinking would be too dreadful. Let us hope and believe that they cannot do so."
This argument, Turing wrote, "is likely to be quite strong in intellectual people, since they value the power of thinking more highly than others, and are more inclined to base their belief in the superiority of Man on this power."
The intrigue: Turing was talking about intelligence, not empathy, but some types of empathy are closer to intelligence than others.
Cognitive empathy is the ability to understand what another person is thinking and why they're thinking it.
"All of that is sort of informational and knowledge-based," LaFreniere says. "And AI may actually be able to crunch the words and numbers a lot better than a person could. And cognitive empathy does matter."