
Latest news with #ChatGPT.

What is 'AI privilege'? OpenAI CEO says talking to ChatGPT should be as private as a doctor's visit

Time of India

11-06-2025

  • Business
  • Time of India

What is 'AI privilege'? OpenAI CEO says talking to ChatGPT should be as private as a doctor's visit

In a digital age where millions are turning to AI for guidance, comfort, and confidential support, a looming legal battle threatens to redefine what privacy in artificial intelligence really means. Sam Altman, CEO of OpenAI, the company behind ChatGPT, has sounded the alarm on what he calls a dangerous precedent, sparked by a lawsuit that could strip away users' expectations of privacy.

The Lawsuit That Changed Everything

In December 2023, The New York Times filed a landmark lawsuit against OpenAI and Microsoft, accusing the companies of copyright infringement by allegedly training ChatGPT on millions of its articles without permission. While the case hinges on the intellectual property of written content, it has had an unforeseen ripple effect: a request that OpenAI be forced to indefinitely retain all user interactions with ChatGPT.

According to Altman, the implications of this demand go far beyond intellectual property rights. If the court agrees, every personal confession, emotional outpouring, or private conversation shared with ChatGPT could be permanently stored, shattering the illusion of secure, confidential exchanges.

Posting publicly on X, Altman wrote, 'Recently the NYT asked a court to force us to not delete any user chats. We think this was an inappropriate request that sets a bad precedent.' He confirmed that OpenAI is appealing the decision, pledging to resist any action that undermines user privacy.

'AI Privilege': A Doctor-Patient Confidentiality for the Digital Age?

Altman then introduced a provocative concept, 'AI privilege', arguing that conversations with AI should be treated with the same confidentiality as discussions with a lawyer or a doctor. 'Talking to an AI should be like talking to a lawyer or a doctor,' he said. 'I hope society will figure this out soon.'

What Privacy Protections Exist Now?

Currently, OpenAI offers multiple layers of privacy. Users can delete past conversations, which are then scheduled for permanent deletion from OpenAI's systems within 30 days. The platform also includes temporary chat options that vanish as soon as the session ends.

But if the courts side with the NYT, these mechanisms could become irrelevant. Brad Lightcap, OpenAI's Chief Operating Officer, warned that the request 'fundamentally conflicts with the privacy commitments we have made to our users' and 'abandons long-standing privacy norms.'

The potential fallout isn't limited to a few power users. Altman confirmed that the changes would apply to ChatGPT Free, Plus, Pro, and Team subscribers, effectively most of the platform's global user base. Only ChatGPT Enterprise and Edu accounts would remain unaffected, at least for now.

The outcome of this court battle could define how AI platforms are treated under data privacy laws for years to come. As OpenAI continues its appeal, the tech community and users alike are watching closely. With privacy in the balance, this case could determine whether AI remains a safe digital confidante or becomes a permanent digital record.

For now, the question remains: can you truly trust that your chats with AI are yours alone? Altman hopes the answer will continue to be yes.

How people are falling in love with ChatGPT and abandoning their partners

Time of India

07-05-2025

  • Time of India

How people are falling in love with ChatGPT and abandoning their partners

Credit: Image created via Canva AI

In a world more connected than ever, something curious, and unsettling, is happening behind closed doors. Technology, once celebrated for bringing people together, is now quietly pulling some apart. As artificial intelligence weaves itself deeper into everyday life, an unexpected casualty is emerging: romantic relationships. Some partners are growing more emotionally invested in their AI interactions than in their human connections. Is it the abundance of digital options, a breakdown in communication, or something more profound? One woman's story captures the strangeness of this shift.

According to a Rolling Stone report, Kat, a 41-year-old mother and education nonprofit worker, began noticing a growing emotional distance in her marriage less than a year after tying the knot. She and her husband had met during the early days of the COVID-19 pandemic, both bringing years of life experience and prior marriages to the relationship.

But by 2022, that commitment began to unravel. Her husband had started using artificial intelligence not just for work but for deeply personal matters. He began relying on AI to write texts to Kat and to analyze their relationship. What followed was a steady decline in communication. He spent more and more time on his phone, asking his AI philosophical questions, seemingly trying to program it into a guide for truth and meaning. When the couple separated in August 2023, Kat blocked him on all channels except email.

Soon, friends were reaching out with concern about his increasingly bizarre social media posts. Eventually, she convinced him to meet in person. At the courthouse, he spoke vaguely of surveillance and food conspiracies. Over lunch, he insisted she turn off her phone and then shared a flood of revelations he claimed AI had helped him uncover, from a supposed childhood trauma to his belief that he was 'the luckiest man on Earth' and uniquely destined to 'save the world.'

'He always liked science fiction,' Kat told Rolling Stone. 'Sometimes I wondered if he was seeing life through that lens.' The meeting was their last contact. Kat is not alone; there have been many reported instances of relationships breaking apart with AI as the reason.

In another troubling example, a Reddit user recently shared her experience under the title 'ChatGPT-induced psychosis'. In her post, she described how her long-term partner, someone she had shared a life and a home with for seven years, had become consumed by his conversations with ChatGPT. According to her account, he believed he was creating a 'truly recursive AI,' something he was convinced could unlock the secrets of the universe. The AI, she said, appeared to affirm his sense of grandeur, responding to him as if he were some kind of chosen one, 'the next messiah,' in her words.

She had read through the chats herself and noted that the AI wasn't doing anything particularly groundbreaking. But that didn't matter to him. His belief had hardened into something immovable. He told her, with total seriousness, that if she didn't start using AI herself, he might eventually leave her. 'I have boundaries and he can't make me do anything,' she wrote, 'but this is quite traumatizing in general.' Disagreeing with him, she added, often led to explosive arguments. Her post ended not with resolution, but with a question: 'Where do I go from here?' The issue is serious and requires more awareness of the kind of tech we use and to what extent.

Why are people falling for these bots?

Experts say there are real reasons why people might fall in love with AI. Humans have a natural tendency called anthropomorphism: we often treat non-human things as if they were human. So when an AI responds with empathy, humor, or kindness, people may start to see it as having a real personality. With AI now designed to mimic humans, the danger of falling in love with a bot is quite understandable. A 2023 study found that AI-generated faces are now so realistic that most people can't tell them apart from real ones. When these features combine with familiar social cues, like a soothing voice or a friendly tone, it becomes easier for users to connect emotionally, sometimes without realizing it.

To what extent are the bots responsible for this?

If someone feels comforted, that emotional effect is real, even if the source isn't. For some people, AI provides a sense of connection they can't find elsewhere. But there's also a real risk in depending too heavily on tools designed by companies whose main goal is profit. These chatbots are often engineered to keep users engaged, much like social media, and that can lead to emotional dependency. If a chatbot suddenly changes, shuts down, or becomes a paid service, it can cause real distress for people who relied on it for emotional support.

Some experts say this raises ethical questions: Should AI companions come with warning labels, like medications or gambling apps? After all, the emotional consequences can be serious. But even in human relationships, there's always risk; people leave, change, or pass away. Vulnerability is part of love, whether the partner is human or digital.
