
Latest news with #AIcompanions

Teens increasingly turning to AI for friendship as national loneliness crisis deepens

Fox News

3 days ago

  • Entertainment
  • Fox News


A new study shows that a third of American teenagers prefer chatting with artificial intelligence companions over having real friends. Common Sense Media's report, titled "Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions," found that AI companion use is widespread among teens aged 13-17. The report explained further that the "use of AI companions is not a niche interest, but rather mainstream teen behavior" and that teens "find conversations with AI companions to be as satisfying or more satisfying than those with real-life friends."

"AI companions are emerging at a time when kids and teens have never felt more alone," Common Sense Media Founder and CEO James P. Steyer said in the press release. "This isn't just about a new technology — it's about a generation that's replacing human connection with machines, outsourcing empathy to algorithms, and sharing intimate details with companies that don't have kids' best interests at heart. Our research shows that AI companions are far more commonplace than people may have assumed — and that we have a narrow window to educate kids and families about the well-documented dangers of these products."

Although nearly half of teens used AI companions as a tool, the report also stated that 33% of teens use AI companions for social interactions and emotional support. For example, teens would use them for conversation practice, emotional support, role-playing, romantic interactions and friendship.

A writer at Daze who cited the study pointed to the loneliness epidemic among young people and warned that confiding in AI companions could put teens' privacy at risk. "Some teenagers are telling AI their most intimate problems and secrets, which poses another problem – it's not a good idea to entrust this information to tech companies, some of whom have an extremely lax approach to data privacy. Would you really want Sam Altman or Elon Musk to have access to the contents of your teenage diary?" James Greig wrote in Daze.

He added that it underscores a "larger crisis of youth loneliness," as teenagers have stopped hanging out at malls and going to the movies, "which has corresponded with rising rates of depression and anxiety." "Being able to speak to an AI companion might alleviate the feeling of loneliness, and some people may find it helpful, but if it's becoming a replacement for socializing in the real world, then it risks entrenching the problem," Greig added.

‘Obedient' AI partners partly to blame for Hong Kong's low births: lawmaker

South China Morning Post

7 days ago

  • Politics
  • South China Morning Post


A Hong Kong lawmaker has said the rise of 'obedient and caring' virtual partners generated by artificial intelligence is partly to blame for the city's low birth rate, noting such digital companions offer emotional support and lack interpersonal conflicts. Legislator William Wong Kam-fai, who is also a computational linguistics professor, called on the government to promote 'happy learning' in schools and lower the income threshold for public housing to encourage more people to have children.

Wong spoke at a motion debate during a Legislative Council meeting on Thursday that centred on how to encourage more births, conceding it was difficult to reverse trends of people staying unmarried and having no children. 'The younger generation would rather keep pets than have children, and now there is also the new challenge of AI companions,' he told the legislature. Wong, who is part of the Chinese University of Hong Kong's systems engineering and engineering management department, said that technology companies had launched 'obedient and caring' AI companions to address the needs of single people. 'These virtual partners not only know how to provide emotional value, but also save [the users] from the troubles of interpersonal conflicts. Their attraction [to people] is self-evident,' he said.

Teens Are Turning to AI for Emotional Support, and That Says a Lot

Yahoo

23-07-2025

  • Yahoo


A third of teens are leaning on AI 'friends' to vent, rehearse convos, or feel heard, but that doesn't mean they're ditching real life.

If you doubt your child has used an AI companion, you might want to reconsider that notion. A recent study by Common Sense Media digging into how and why teens turn to AI companions shows that 72 percent of teens aged 13 to 17 have used one at least once, and over 50 percent are regular users.

Of the teens surveyed, 33 percent say that they seek friendship, support, or practice with social interactions from these companions, even confiding in AI friends and finding the interactions equally or more satisfying than real-life conversations. But the majority of teens view these tools with some skepticism and spend more time with actual friends.

More concerning is that approximately one-quarter of teens reported sharing personal details, such as names and secrets, with AI companions, giving the platforms access to that information for commercial use. While not wholly discouraging, the report reaffirms the guidance that AI companion use by teens requires vigilance and that those under 18 should avoid it.

Read the original article on Lifewire.

New Study Shows Teens Are Increasingly Relying on AI Chatbots for Social Interaction

Yahoo

22-07-2025

  • Yahoo


This story was originally published on Social Media Today.

Yeah, this seems like it's going to be a problem in the future, though maybe that's considered the cost of progress?

Last week, Common Sense Media published a new report which found that 72% of U.S. teens have already used an AI companion, with many of them now conducting regular social interactions with their chosen virtual friends. The study is based on a survey of 1,060 teens, so it's intended as an indicative measure, not as a definitive overview of AI usage. But the trends do point to some potentially significant concerns, particularly as platforms now look to introduce AI bots that can also serve as romantic partners in some capacity.

First off, as noted, the data shows that 72% of teens have tried AI companions, and 52% of them have become regular users of these bots. What's worth noting here is that AI bots aren't anywhere near where they're likely to be in a few more years' time, with the tech companies investing billions of dollars into advancing their AI bots to make them more relatable, more conversational, and better emulators of real human engagement.

But they're not human. These are bots, which respond to conversational cues based on the context that they have available, and whatever weighting system each company puts into their back-end process. So they're not an accurate simulation of actual human interaction, and they never will be, because they can't replicate the real mental and physical connection that person-to-person contact provides. Yet we're moving towards a future where this is going to become a more viable replacement for actual civic engagement. But what if a bot gets changed, gets infected with harmful code, gets hacked, shut down, etc.? The broader implications of enabling, and encouraging, such connection are not yet known, in terms of the mental health impacts that could come as a result. But we're moving forward anyway, with the data showing that 33% of teens already use AI companions for social interaction and relationships.

Of course, some of this may well end up being highly beneficial, in varying contexts. For example, the ability to ask questions that you may not be comfortable saying to another person could be a big help, with the survey data showing that 18% of AI companion users refer to the tools for advice. Nonjudgmental interaction has clear benefits, while 39% of AI companion users have also transferred social skills that they've practiced with bots over to real-life situations (notably, 45% of female users have done this, versus 34% of male users).

So there are definitely going to be benefits. But like social media before it, the question is whether those positives will end up outweighing the potential negatives of over-reliance on non-human entities for traditionally human engagement. 31% of survey participants indicated that they find conversations with AI companions as satisfying or more satisfying than those with real-life friends, while 33% have chosen AI over humans for certain conversations.

As noted, the fact that these bots can be skewed to answer based on ideological lines is a concern in this respect, as is the tendency for AI tools to 'hallucinate' and make incorrect assumptions in their responses, which they then state as fact. That could lead youngsters down the wrong path, which could then lead to potential harm, while again, the shift to AI companions as romantic partners opens up even more questions about the future of relationships.

It seems inevitable that this is going to become a more common usage for AI tools, and that our budding relationships with human simulators will lead to more people looking to take those understanding, non-judgmental relationships to another level. Real people will never understand you like your algorithmically-aligned AI bot can, and that could actually end up exacerbating the loneliness epidemic, as opposed to addressing it, as some have suggested. And if young people are learning these new relationship behaviors in their formative years, what does that do for their future concept of human connection, if indeed they feel they need that?

And they do need it. Centuries of studies have underlined the importance of human connection and community, and the need to have real relationships to help shape your understanding and perspective. AI bots may be able to simulate some of that, but actual physical connection is also important, as is human proximity, real-world participation, etc. We're steadily moving away from this over time, and you could argue that increasing rates of severe loneliness, which the WHO has declared a 'pressing global health threat,' are already having major health impacts. Indeed, studies have shown that loneliness is associated with a 50% increased risk of developing dementia and a 30% increased risk of incident coronary artery disease or stroke.

Will AI bots help that? And if not, why are we pushing them so hard? Why is every app now trying to make you chat with these non-real entities, and share your deepest secrets with their evolving AI tools? Is this more beneficial to society, or to the big tech platforms that are building these AI models?

If you lean towards the latter conclusion, then progress is seemingly the bigger focus, just as it was with social media before it. AI providers are already pushing for the European Union to relax its restrictions on AI development, while the looming AI development race between nations is also increasing the pressure on all governments to loosen the reins in favor of expediting innovation. But should we feel encouraged by Meta's quest for 'superintelligence,' or concerned at the rate at which these tools are becoming so common in areas of serious potential impact?

That's not to say that AI development in itself is bad, and there are many use cases for the latest AI tools that will indeed increase efficiency, innovation, opportunity, etc. But there do seem to be some areas in which we should probably tread more cautiously, due to the risks of over-reliance, and the impacts of such on a broad scale. That's seemingly not going to happen, but in ten years' time, we're going to be assessing this from a whole different perspective.

You can check out Common Sense Media's 'Talk, Trust, and Trade-Offs' report here.

More Than Half of Teens Surveyed Use AI for Companionship. Why That's Not Ideal

CNET

18-07-2025

  • Health
  • CNET


Is your teen using an artificial intelligence chatbot for companionship? If you don't know, it's time to find out.

Common Sense Media released a study this week that found more than half of teens surveyed regularly use AI companions. Nearly a third of the teens surveyed reported that conversations with AI were as satisfying as conversations with actual humans, if not more so. Researchers also found that 33% of teens surveyed use AI companions such as Nomi and Replika "for social interaction and relationships, including conversation practice, emotional support, role-playing, friendship or romantic interactions."

The study, which surveyed 1,060 teens aged 13 to 17 from across the US over the past year, distinguished between anthropomorphic AI bots and more assistance-oriented AI tools such as ChatGPT, Microsoft Copilot and Google's Gemini. Considering the widespread and growing use of AI companions among teens, the Common Sense Media researchers concluded that their findings supported limiting the use of AI among young people. "Our earlier recommendation stands: Given the current state of AI platforms, no one younger than 18 should use AI companions," the research team said.

For the past few years, generative AI has evolved at lightning speed, with new tools regularly becoming available across the world and disrupting business models, social practices and cultural norms. This, combined with an epidemic of social isolation exacerbated by the COVID-19 pandemic, puts teens at risk from technology that their young brains might not be able to handle adequately.

Why experts are worried about teens and AI

Amid the growing use of chatbots by people to discuss personal problems and get advice, it's important to remember that, while they might seem confident and reassuring, they're not mental health professionals.

A.G. Noble, a mental health therapist specializing in adolescents at Youth Eastside Services in Bellevue, Washington, says she isn't surprised by the Common Sense Media study. She pointed to a growing number of adolescents struggling with social skills and with feeling connected to their peers, which she calls a "perfect recipe for loneliness."

"What AI companions offer are low-risk 'social' interaction: privacy, no bullying, no worries about the awkwardness of ghosting the AI companion if the kids don't want to talk anymore," Noble said. "And I think everyone can empathize -- who wouldn't want a 'social relationship' without the minefield, especially in their teens?"

Debbi Halela, director of behavioral health services at Youth Eastside Services, says teens need to interact with humans in real life, especially in the aftermath of the pandemic of 2020. "Over-reliance on technology runs the risk of hindering the healthy development of social skills in young people," Halela said. "Youth are also still developing the ability to make decisions and think critically, therefore they may be vulnerable to manipulation and influence from information sources that are not always reliable, and this could inhibit the development of critical thinking skills."

The American Psychological Association warned earlier this year that "we have already seen instances where adolescents developed unhealthy and even dangerous 'relationships' with chatbots." The APA issued several recommendations, including teaching AI literacy to kids and having AI developers create systems that regularly remind teen users that AI companions are not actual humans.

Noble says virtual interactions "can trigger the dopamine and oxytocin responses of a real social interaction — but without the resulting social bond. Like empty calories coming from diet soda, it seems great in the moment but ultimately doesn't nourish." Parents need to encourage real-world activities that involve teens with other people, Noble said. "Real social interaction is the best buffer against the negative impacts of empty AI interactions."
