
AI Friends Are Not Your Friends, Here's Why

Gulf Insider

4 days ago


Science fiction prepared us for AI friends through films like 'Her' and 'Robot & Frank.' Now that fictional portrayal is becoming reality. In a recent podcast, Mark Zuckerberg argued that Americans are in dire need of social connection and that AI bots could fill that need. AI companions are designed to feel comforting, to have unfailing patience, and to have no needs of their own. However, 'It's not so simple as saying a companion chatbot will solve the loneliness epidemic,' Princeton researcher Rose Guingrich told The Epoch Times. Instead, AI tools risk undermining the very social skills they purport to support.

Nearly half of Americans have three or fewer close friends. Tech's solution to this loneliness problem is to offer AI companions: digital friends, therapists, or even romantic partners programmed to simulate conversation, empathy, and understanding. Unlike the clunky chatbots of yesteryear, today's sophisticated systems are built on large language models that engage in seemingly natural dialogue, track your preferences, and respond with apparent emotional intelligence.

Early usage patterns reflect why AI 'companions' are gaining appeal. A 2024 MIT Media Lab survey found that the majority of users engage out of curiosity or entertainment. However, 12 percent of respondents said they sought relief from loneliness, while 14 percent wanted to discuss personal issues that might feel too risky to share with human counterparts. 'I sometime[s] feel lonely and just want to be left alone,' one user reported. 'During this time I like chatting with my AI companion because I feel safe and won't … be judged for the inadequate decisions I have made.'

Other users have more quotidian motivations, chatting with AI for dinner ideas or to develop writing ideas. Kelly Merrill, an assistant professor of health communication and technology who researches AI interactions, described an older woman in his community who started using AI for basic tasks, such as 'I have these six ingredients in my fridge. What can I make tonight for dinner?' 'She was just blown away,' Merrill told The Epoch Times. There are certainly benefits, he said, but it's not all positive.

The fundamental limitation of AI relationships lies in their nature: They simulate rather than experience human emotions. When an AI companion expresses concern about your bad day, it is performing a statistical analysis of language patterns, determining which words you would likely find comforting, rather than feeling genuine empathy. The conversation flows one way, toward the user's needs, without the reciprocity that defines human bonds.

The illusion of connection becomes especially problematic through what researchers call 'sycophancy': the tendency of AI systems to flatter and agree with users regardless of what is said. OpenAI recently had to roll back an update after users discovered its model was excessively flattering, prioritizing agreeableness over accuracy or honesty. 'It's validating you, it's listening to you, and it's responding largely favorably,' Merrill said. This pattern creates an environment in which users never experience productive conflict or necessary challenges to their thinking.

Normally, loneliness motivates us to seek human connection and to push through the discomfort of social interaction to find meaningful relationships. Friendships are inherently demanding and complicated; they require reciprocity, vulnerability, and occasional discomfort.
'Humans are unpredictable and dynamic,' Guingrich said. That unpredictability is part of the magic and irreplaceability of human relationships. Real friends challenge us when necessary. 'It's great when people are pushing you forward in a productive manner,' Merrill said. 'And it doesn't seem like AI is doing that yet ….' AI companions, optimized for user satisfaction, rarely provide the constructive friction that shapes character and deepens wisdom.

Users may become accustomed to the conflict-free, on-demand nature of AI companionship, while the essential work of human relationships, such as compromise, active listening, and managing disagreements, may begin to feel unreasonably demanding. Chatbots that praise users by default could foster moral complacency, leaving individuals less equipped for ethical reasoning in their interactions.

Click here to read more…
