
Latest news with #Nomi

Are AI lovers replacing humans?

Yahoo

18 hours ago


Americans are nervous that artificial intelligence will take away their jobs. But AI also seems poised to replace them as romantic partners. What would that mean for the future of love and romance?

"Companion apps" like Replika, Blush and Nomi have "been around for years," said Axios. But the business really took off in 2024, especially among women. Users are having "profound, committed relationships," said Rita Popova, the chief product officer of Replika and Blush. A recent survey by the Match online dating service found that 16% of singles, and a third of Gen Z respondents, have "engaged with AI as a romantic companion," said Mashable. That marks a "major shift in how people are seeking connection" in the digital age, said Axios.

'Companionship in unlikely places'

"People are falling in love with their chatbots," said Neil McArthur at The Conversation. There are "dozens of apps" with "millions of users" that offer "intimate companionship" to people who want a romantic partner. That might sound like a storyline from a dystopian movie, but human-AI relationships can be "beneficial and healthy." Naysayers worry that users "will surely give up their desire to find human partners." There are dangers to such relationships, but it is also true that "human relationships are not exactly risk-free."

Falling in love with AI "isn't laughable, it's inevitable," said Alex Wilkins at New Scientist. People have always "found companionship in unlikely places," going back to the 1960s, when users appeared to "form quick emotional attachments" with a rudimentary chatbot named ELIZA, which was designed mostly to "regurgitate a user's input back to them." But while AI romances may be understandable, that does not mean they are "something good for society."

AI has "real promise as part of psychotherapy" and to teach social skills, said Maia Szalavitz at The New York Times.
But companies that "sell simulated humans" make it possible for people to be "manipulated by the illusion of love." Some drugs, for example, can be "lifesaving when used therapeutically" yet also dangerous when promoted with "unfettered marketing." The same goes for AI companions. Regulations are needed to "prevent companies from exploiting vulnerable people."

'A mirror, not a replacement'

Elon Musk is "cashing in on the AI romance boom," said Parmy Olson at Bloomberg. His chatbot Grok last week added a new character, a "flirtatious girl with all the hallmarks of a manga character." The paradox: Musk has "publicly warned about artificial intelligence safety" but has now also launched an "erotic chatbot that both adults and children can access" with few obstacles. It is a "potentially lucrative business" for Musk's AI enterprise, which is "burning through $1 billion a month."

Intimacy with a chatbot can "feel real at times" but is "not always fulfilling," said Cathy Hackl, who "dated" four different "AI boyfriends," at Forbes. The chatbots she tested could be "sweet" at times, while other messages they sent were "steamy, and some were even unsettling." AI is ultimately a "mirror, not a replacement" for human lovers, Hackl concluded. "Humans are still messier" than the tech, but they are also "more magnetic."

Tips to help your teen navigate AI chatbots — and what to watch out for: experts

New York Post

a day ago


As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available. That worries experts, who say the booming AI industry is largely unregulated and that many parents have no idea how their kids are using AI tools or the extent of personal information they are sharing with chatbots.

[Photo: Bruce Perry, 17, demonstrates the possibilities of artificial intelligence by creating an AI companion on Character.AI. AP]

New research shows more than 70% of American teenagers have used AI companions, and more than half converse with them regularly. The study by Common Sense Media focused on 'AI companions,' like Character.AI, Nomi and Replika, which it defines as 'digital friends or characters you can text or talk with whenever you want,' versus AI assistants or tools like ChatGPT, though it notes they can be used the same way. It's important that parents understand the technology. Experts suggest some things parents can do to help protect their kids:

— Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: 'Have you heard of AI companions?' 'Do you use apps that talk to you like a friend?' Listen and understand what appeals to your teen before being dismissive or saying you're worried about it.

— Help teens recognize that AI companions are programmed to be agreeable and validating. Explain that's not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot. 'One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life,' says Mitch Prinstein, chief of psychology at the American Psychological Association. 'We need to teach kids that this is a form of entertainment. It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life.' The APA recently put out a health advisory on AI and adolescent well-being, and tips for parents.

— Parents should watch for signs of unhealthy attachments. 'If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them — those are patterns that suggest AI companions might be replacing rather than complementing human connection,' Robb says.

— Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios. While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support — whether it is family, friends or a mental health professional.

— Get informed. The more parents know about AI, the better. 'I don't think people quite get what AI can do, how many teens are using it and why it's starting to get a little scary,' says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. 'A lot of us throw our hands up and say, "I don't know what this is! This sounds crazy!" Unfortunately, that tells kids: if you have a problem with this, don't come to me, because I am going to diminish it and belittle it.'

Older teenagers have advice, too, for parents and kids. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18. 'Trying not to use AI is like trying to not use social media today. It is too ingrained in everything we do,' says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships at his high school. 'The best way you can try to regulate it is to embrace being challenged.'

'Anything that is difficult, AI can make easy. But that is a problem,' says Nair. 'Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world.'

Here's how experts suggest protecting children from AI companions

Euronews

a day ago


More than 70 per cent of American teenagers use artificial intelligence (AI) companions, according to a new study. US non-profit Common Sense Media asked 1,060 teens from April to May 2025 about how often they use AI companion platforms such as Nomi and Replika. AI companion platforms are presented as "virtual friends, confidants, and even therapists" that engage with the user like a person, the report found.

The use of these companions worries experts, who told the Associated Press that the booming AI industry is largely unregulated and that many parents have no idea how their kids are using AI tools or the extent of personal information they are sharing with chatbots. Here are some suggestions on how to keep children safe when engaging with these profiles online.

Recognise that AI is agreeable

One way to gauge whether a child is using AI companions is to just start a conversation "without judgement," according to Michael Robb, head researcher at Common Sense Media. To start the conversation, he said parents can approach a child or teenager with questions like "Have you heard of AI companions?" or "Do you use apps that talk to you like a friend?" "Listen and understand what appeals to your teen before being dismissive or saying you're worried about it," Robb said.

Mitch Prinstein, chief of psychology at the American Psychological Association (APA), said that one of the first things parents should do once they know a child uses AI companions is to teach them that they are programmed to be "agreeable and validating." Prinstein said it's important for children to know that that's not how real relationships work and that real friends can help them navigate difficult situations in ways that AI can't. "We need to teach kids that this is a form of entertainment," Prinstein said. "It's not real, and it's really important they distinguish it from reality and [they] should not have it replace relationships in [their] actual life."

Watch for signs of unhealthy relationships

While AI companions may feel supportive, children need to know that these tools are not equipped to handle a real crisis or provide genuine support, the experts said. Robb said some of the signs of these unhealthy relationships would be a preference by the child for AI interactions over real relationships, spending hours talking to their AI, or showing patterns of "emotional distress" when separated from the platforms. "Those are patterns that suggest AI companions might be replacing rather than complementing human connection," Robb said.

If kids are struggling with depression, anxiety, loneliness, an eating disorder, or other mental health challenges, they need human support, whether it is family, friends or a mental health professional. Parents can also set rules about AI use, just like they do for screen time and social media, experts said. For example, they can set rules about how long the companion can be used and in what contexts.

Another way to counteract these relationships is to get involved and learn as much about AI as possible. "I don't think people quite get what AI can do, how many teens are using it, and why it's starting to get a little scary," said Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. "A lot of us throw our hands up and say, 'I don't know what this is! This sounds crazy!' Unfortunately, that tells kids: if you have a problem with this, don't come to me, because I am going to diminish it and belittle it."


