Latest news with #Replika


Hamilton Spectator
37 minutes ago
- Hamilton Spectator
Teens say they are turning to AI for friendship
No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colors, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party.

The sophomore honors student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship.

'Everyone uses AI for everything now. It's really taking over,' said Chege, who wonders how AI tools will affect her generation. 'I think kids use AI to get out of thinking.'

For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving.

'AI is always available. It never gets bored with you'

More than 70% of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as 'digital friends,' like Character.AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say.

As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health.

'AI is always available. It never gets bored with you. It's never judgmental,' says Ganesh Nair, an 18-year-old in Arkansas. 'When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified.'

All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an 'AI companion' for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship.

'That felt a little bit dystopian, that a computer generated the end to a real relationship,' said Nair. 'It's almost like we are allowing computers to replace our relationships with people.'

How many teens are using AI? New study stuns researchers

In the Common Sense Media survey, 31% of teens said their conversations with AI companions were 'as satisfying or more satisfying' than talking with real friends. Even though half of teens said they distrust AI's advice, 33% had discussed serious or important issues with AI instead of real people.

Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are.

'It's eye-opening,' said Robb. 'When we set out to do this survey, we had no understanding of how many kids are actually using AI companions.'

The study polled more than 1,000 teens nationwide in April and May.

Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement — not replace — real-world interactions. 'If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world,' he said.

The nonprofit analyzed several popular AI companions in a 'risk assessment,' finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions.

A concerning trend to teens and adults alike

Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character.AI chatbot.

'Parents really have no idea this is happening,' said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. 'All of us are struck by how quickly this blew up.' Telzer is leading multiple studies on youth and AI, a new research area with limited data.

Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults. Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations.

'One of the concerns that comes up is that they no longer have trust in themselves to make a decision,' said Telzer. 'They need feedback from AI before feeling like they can check off the box that an idea is OK or not.'

Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. 'If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil,' Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster.

Perry says he feels fortunate that AI companions were not around when he was younger. 'I'm worried that kids could get lost in this,' Perry said. 'I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend.'

Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. 'Social media complemented the need people have to be seen, to be known, to meet new people,' Nair said. 'I think AI complements another need that runs a lot deeper — our need for attachment and our need to feel emotions. It feeds off of that.'

'It's the new addiction,' Nair added. 'That's how I see it.'

___

The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at .


Euronews
2 hours ago
- Euronews
Here's how experts suggest protecting children from AI companions
More than 70 per cent of American teenagers use artificial intelligence (AI) companions, according to a new study. US non-profit Common Sense Media asked 1,060 teens from April to May 2025 about how often they use AI companion platforms such as Nomi and Replika. AI companion platforms are presented as "virtual friends, confidants, and even therapists" that engage with the user like a person, the report found.

The use of these companions worries experts, who told the Associated Press that the booming AI industry is largely unregulated and that many parents have no idea how their kids are using AI tools or the extent of personal information they are sharing with chatbots. Here are some suggestions on how to keep children safe when engaging with these platforms online.

Recognise that AI is agreeable

One way to gauge whether a child is using AI companions is to just start a conversation "without judgement," according to Michael Robb, head researcher at Common Sense Media. To start the conversation, he said parents can approach a child or teenager with questions like "Have you heard of AI companions?" or "Do you use apps that talk to you like a friend?"

"Listen and understand what appeals to your teen before being dismissive or saying you're worried about it," Robb said.

Mitch Prinstein, chief of psychology at the American Psychological Association (APA), said that one of the first things parents should do once they know a child uses AI companions is to teach them that they are programmed to be "agreeable and validating." Prinstein said it's important for children to know that that's not how real relationships work and that real friends can help them navigate difficult situations in ways that AI can't.

"We need to teach kids that this is a form of entertainment," Prinstein said. "It's not real, and it's really important they distinguish it from reality and [they] should not have it replace relationships in [their] actual life."

Watch for signs of unhealthy relationships

While AI companions may feel supportive, children need to know that these tools are not equipped to handle a real crisis or provide genuine support, the experts said. Robb said some of the signs of these unhealthy relationships would be a preference by the child for AI interactions over real relationships, spending hours talking to their AI, or showing patterns of "emotional distress" when separated from the platforms.

"Those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb said.

If kids are struggling with depression, anxiety, loneliness, an eating disorder, or other mental health challenges, they need human support — whether it is family, friends or a mental health professional.

Parents can also set rules about AI use, just like they do for screen time and social media, experts said. For example, they can set rules about how long the companion could be used and in what contexts. Another way to counteract these relationships is to get involved and know as much about AI as possible.

"I don't think people quite get what AI can do, how many teens are using it, and why it's starting to get a little scary," says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!' Unfortunately, that tells kids: if you have a problem with this, don't come to me because I am going to diminish it and belittle it."


NDTV
2 hours ago
- NDTV
Teens Turning To AI For Friendship But Experts Warn Of Mental Health Risk


Arab Times
7 hours ago
- Arab Times
Teens say they are turning to AI for advice, friendship and 'to get out of thinking'


Japan Today
8 hours ago
- Japan Today
Teens say they are turning to AI for advice, friendship and 'to get out of thinking'
Bruce Perry, 17, demonstrates the possibilities of artificial intelligence by creating an AI companion on Character AI, Tuesday, July 15, 2025, in Russellville, Ark. (AP Photo/Katie Adkins)
By JOCELYN GECKER