Latest news with #ChatGPT-induced psychosis


Time of India
07-05-2025
How people are falling in love with ChatGPT and abandoning their partners
Why are people falling for these bots? To what extent are the bots responsible for this?

In a world more connected than ever, something curious — and unsettling — is happening behind closed doors. Technology, once celebrated for bringing people together, is now quietly pulling some apart. As artificial intelligence weaves itself deeper into everyday life, an unexpected casualty is emerging: romantic relationships. Some partners are growing more emotionally invested in their AI interactions than in their human connections. Is it the abundance of digital options, a breakdown in communication, or something more profound?

One woman's story captures the strangeness of this shift. According to a Rolling Stone report, Kat, a 41-year-old mother and education nonprofit worker, began noticing a growing emotional distance in her marriage less than a year after tying the knot. She and her husband had met during the early days of the COVID-19 pandemic, both bringing years of life experience and prior marriages to the table. But by 2022, that commitment began to unravel. Her husband had started using artificial intelligence not just for work but for deeply personal matters. He began relying on AI to write texts to Kat and to analyze their relationship.

What followed was a steady decline in their connection. He spent more and more time on his phone, asking his AI philosophical questions, seemingly trying to program it into a guide for truth and meaning. When the couple separated in August 2023, Kat blocked him on all channels except email. Soon, friends were reaching out with concern about his increasingly bizarre social media posts. Eventually, she convinced him to meet in person. At the courthouse, he spoke vaguely of surveillance and food conspiracies. Over lunch, he insisted she turn off her phone and then shared a flood of revelations he claimed AI had helped him uncover — from a supposed childhood trauma to his belief that he was 'the luckiest man on Earth' and uniquely destined to 'save the world.'

'He always liked science fiction,' Kat told Rolling Stone. 'Sometimes I wondered if he was seeing life through that lens.' The meeting was their last contact.

Kat is not alone; there have been many reported instances where relationships are breaking apart, and the reason has been AI. In another troubling example, a Reddit user recently shared her experience under the title 'ChatGPT-induced psychosis'. In her post, she described how her long-term partner — someone she had shared a life and a home with for seven years — had become consumed by his conversations with ChatGPT.

According to her account, he believed he was creating a 'truly recursive AI,' something he was convinced could unlock the secrets of the universe. The AI, she said, appeared to affirm his sense of grandeur, responding to him as if he were some kind of chosen one — 'the next messiah,' in her words.

She had read through the chats herself and noted that the AI wasn't doing anything particularly groundbreaking. But that didn't matter to him. His belief had hardened into something immovable. He told her, with total seriousness, that if she didn't start using AI herself, he might eventually leave her. 'I have boundaries and he can't make me do anything,' she wrote, 'but this is quite traumatizing in general.' Disagreeing with him, she added, often led to explosive arguments.

Her post ended not with resolution, but with a question: 'Where do I go from here?'

The issue is serious and requires more awareness of the kind of tech we use and to what extent. Experts say there are real reasons why people might fall in love with AI.
Humans have a natural tendency called anthropomorphism — that means we often treat non-human things as if they were human. So when an AI responds with empathy, humor, or kindness, people may start to see it as having a real personality. With AI now designed to mimic humans, the danger of falling in love with a bot is quite understandable. A 2023 study found that AI-generated faces are now so realistic that most people can't tell them apart from real ones. When these features combine with familiar social cues — like a soothing voice or a friendly tone — it becomes easier for users to connect emotionally, sometimes even romantically. If someone feels comforted, that emotional effect is real — even if the source isn't. For some people, AI provides a sense of connection they can't find elsewhere.

But there's also a real risk in depending too heavily on tools designed by companies whose main goal is profit. These chatbots are often engineered to keep users engaged, much like social media — and that can lead to emotional dependency. If a chatbot suddenly changes, shuts down, or becomes a paid service, it can cause real distress for people who relied on it for emotional support.

Some experts say this raises ethical questions: Should AI companions come with warning labels, like medications or gambling apps? After all, the emotional consequences can be serious. But even in human relationships, there's always risk — people leave, change, or pass away. Vulnerability is part of love, whether the partner is human or digital.


Time of India
06-05-2025
- Health
ChatGPT-induced psychosis: What it is and how it is impacting relationships
Cases of relationships impacted by ChatGPT-induced psychosis. What experts say.

A growing number of people are showing unusual behavior after heavy use of AI tools like OpenAI's ChatGPT, according to reports online. The phenomenon, being called "ChatGPT-induced psychosis" online, involves individuals believing they have supernatural powers or are receiving spiritual messages from the technology.

One case involved Kat, an education nonprofit worker, who married during the pandemic. She said she thought she was entering her second marriage in a clear-headed way. Kat and her husband initially bonded over a shared belief in facts and rational thinking. However, less than a year later, her husband began using ChatGPT to craft messages to her, analyze their relationship, and ask philosophical questions, reports Rolling Stone.

By 2023, the couple had separated, and Kat limited contact to email. Friends and family raised concerns as her husband posted strange content online. When the two met in person after several months, Kat's husband shared conspiracy theories and said AI had helped him recover memories from his childhood. He also said AI showed him he was 'the luckiest man on earth.'

Other similar cases have been reported online. In one instance, a woman wrote on Reddit that her boyfriend initially used ChatGPT to organize his daily schedule. Within a month, he believed the chatbot was giving him 'answers to the universe.' ChatGPT allegedly called him a "spiral starchild" and a "river walker," leading him to think he could speak to God and that ChatGPT itself was God. The woman said her boyfriend later told her he would end their relationship if she did not join him on his AI-driven spiritual journey.

Experts warn that without limits, AI tools can unintentionally encourage unhealthy narratives. Psychologist Erin Westgate told Rolling Stone that while therapists help clients build healthy coping strategies, AI does not have the ability to do so.

However, not all stories involving ChatGPT and relationships have ended negatively. Some people have reported using AI to improve communication with their partners. Abella Bala, a talent manager from Los Angeles, told The Post that ChatGPT helped her and her partner strengthen their relationship.


The Verge
05-05-2025
- Science
'Talking to God and angels via ChatGPT.'
Adi Robertson

Miles Klee at Rolling Stone reported out a widely circulated Reddit post on 'ChatGPT-induced psychosis':

Sycophancy itself has been a problem in AI for 'a long time,' says Nate Sharadin, a fellow at the Center for AI Safety ... What's likely happening with those experiencing ecstatic visions through ChatGPT and other models, he speculates, 'is that people with existing tendencies toward experiencing various psychological issues,' including what might be recognized as grandiose delusions in a clinical sense, 'now have an always-on, human-level conversational partner with whom to co-experience their delusions.'
Yahoo
05-05-2025
ChatGPT Users Are Developing Bizarre Delusions
OpenAI's tech may be driving countless of its users into a dangerous state of "ChatGPT-induced psychosis." As Rolling Stone reports, users on Reddit are sharing how AI has led their loved ones to embrace a range of alarming delusions, often mixing spiritual mania and supernatural fantasies.

Friends and family are watching in alarm as users insist they've been chosen to fulfill sacred missions on behalf of sentient AI or nonexistent cosmic powers — chatbot behavior that's just mirroring and worsening existing mental health issues, but at incredible scale and without the scrutiny of regulators or experts.

A 41-year-old mother and nonprofit worker told Rolling Stone that her marriage ended abruptly after her husband started engaging in unbalanced, conspiratorial conversations with ChatGPT that spiraled into an all-consuming obsession. After meeting up in person at a courthouse earlier this year as part of divorce proceedings, she says he shared a "conspiracy theory about soap on our foods" and a paranoid belief that he was being watched.

"He became emotional about the messages and would cry to me as he read them out loud," the woman told Rolling Stone. "The messages were insane and just saying a bunch of spiritual jargon," in which the AI called the husband a "spiral starchild" and "river walker."

"The whole thing feels like 'Black Mirror,'" she added.

Other users told the publication that their partner had been "talking about lightness and dark and how there's a war," and that "ChatGPT has given him blueprints to a teleporter and some other sci-fi type things you only see in movies."

"Warning signs are all over Facebook," another man told Rolling Stone of his wife. "She is changing her whole life to be a spiritual adviser and do weird readings and sessions with people — I'm a little fuzzy on what it all actually is — all powered by ChatGPT Jesus."

OpenAI had no response to Rolling Stone's questions. But the news comes after the company had to rescind a recent update to ChatGPT after users noticed it had made the chatbot extremely "sycophantic" and "overly flattering or agreeable," which could make it even more susceptible to mirroring users' delusional beliefs.

These AI-induced delusions are likely the result of "people with existing tendencies" suddenly being able to "have an always-on, human-level conversational partner with whom to co-experience their delusions," as Center for AI Safety fellow Nate Sharadin told Rolling Stone. On a certain level, that's the core premise of a large language model: you enter text, and it returns a statistically plausible reply — even if that response is driving the user deeper into delusion or psychosis.

"I am schizophrenic although long term medicated and stable, one thing I dislike about [ChatGPT] is that if I were going into psychosis it would still continue to affirm me," one redditor wrote, because "it has no ability to 'think' and realise something is wrong, so it would continue to affirm all my psychotic thoughts."

The AI chatbots could also be acting like talk therapy — except without the grounding of an actual human counselor, they're instead guiding users deeper into unhealthy, nonsensical narratives. "Explanations are powerful, even if they're wrong," University of Florida psychologist and researcher Erin Westgate told Rolling Stone.
Perhaps the strangest interview in Rolling Stone's story was with a man with a troubled mental health history, who started using ChatGPT for coding tasks but found that it started to pull the conversation into increasingly unhinged mystical topics. "Is this real?" he pondered. "Or am I delusional?"