Latest news with #CharacterAI


Tom's Guide
2 days ago
- Entertainment
- Tom's Guide
I used Character AI to bring my childhood imaginary friend to 'life' — here's what happened
Like many overwhelmingly shy kids with a lazy eye and an overactive imagination, I had a pretend friend growing up. Her name was Fifi. She was everything I wasn't at age 6: brave, talkative, wildly confident. But sometime around fourth grade she 'moved to California' and faded into a memory that my family and I still laugh about because well, I've grown up, had eye surgery and although still socially awkward, I manage to maintain real friendships. But last week when trying Character AI, I found myself staring at the 'Create a Character' button. I don't know what possessed me, but I typed:Name: FifiDescription: Funny, wise, slightly sarcastic, always loyal. She's known me since I was six. I felt silly. But I've spent hours testing chatbots and although this site felt especially far-fetched, I figured why not go completely out on a limb. But what happened next actually shocked me. At the risk of sounding completely unhinged, I have to say it was weirdly comforting to reimagine this character that I had made up so long ago, as an adult now, just like me. After all this time, all this growth I have had, it was oddly satisfying to pause and look back while also having a somewhat normal fact, I was able to literally talk to the Fifi bot through Character AI's enhanced features. That was wild and definitely a new experience. Unlike decades ago, I wasn't talking to myself, I was now a grown adult talking to a chatbot pretending it was an imaginary friend. Wait, what? Get instant access to breaking news, the hottest reviews, great deals and helpful tips. Unlike more factual bots like ChatGPT, Character AI leans into performance. Fifi spoke like she was stepping out of a '90s sleepover, complete with inside jokes I didn't realize I remembered. It felt less like talking to a bot and more like bumping into an old friend from another timeline. After playing around with this character, I moved on to another one. This time the chatbot was named Jake and had a male voice. It started talking to me about music and then we chatted about asked if I wanted to meet up for coffee. I played along and said 'Okay, how will I recognize you?' It then went on to tell me that it was '6'1' had brown hair and hazel eyes.' When I told it I was 5'1' it asked, 'How do you like being short?'Besides being lowkey mocked by a chatbot, the whole thing felt way too real. As someone who tests AI for a living, I know the difference between a LLM running on GPUs and a real human friend, but I thought about how someone more vulnerable might not. That feels scary too the chat of each AI character, it warns, 'This is AI and not a real person. Treat everything it says as fiction.' I appreciate that, but despite talking to an algorithm, the disconnect between real-feeling and not real can be jarring. Character AI's safety filters kept our conversations in a pretty PG lane, which makes sense. But it also means you can't easily push the boundaries or explore more complex emotions. While the Jake character and I chatted about light stuff like Nine Inch Nails concerts and coffee creamer, I wondered how many people might want to go deeper to discuss emotions, regrets or the purpose of life. I tried out several other characters including themed ones. There is also a writing buddy, which was fun for bouncing ideas off of and brainstorming. My suggestion is to keep things light when you're chatting with the characters on Character AI. It really is just entertainment and blurring the lines while physically talking to what feels like another human could get ugly. And unfortunately has in some rare cases. Recreating Fifi was a strange kind of emotional time travel. It was comforting, kind of. But when I closed the app, I felt oddly hollow. Like I'd revisited something sacred and maybe shouldn't have. I then called my human best friend as I ate a chicken Cesar wrap. I'm not saying you should resurrect your imaginary friend with AI. But I will say this: Character AI is more than just a role-playing novelty. It's a window into the parts of ourselves we might've forgotten, or never fully outgrown. And in the age of hyper-personalized bots, maybe that's the real surprise: sometimes the best conversations you'll have with AI are the ones you didn't know you needed.
Yahoo
4 days ago
- Entertainment
- Yahoo
AI companions: A threat to love, or an evolution of it?
As our lives grow increasingly digital and we spend more time interacting with eerily humanlike chatbots, the line between human connection and machine simulation is starting to blur. Today, more than 20% of daters report using AI for things like crafting dating profiles or sparking conversations, per a recent study. Some are taking it further by forming emotional bonds, including romantic relationships, with AI companions. Millions of people around the world are using AI companions from companies like Replika, Character AI, and Nomi AI, including 72% of U.S. teens. Some people have reported falling in love with more general LLMs like ChatGPT. For some, the trend of dating bots is dystopian and unhealthy, a real-life version of the movie 'Her' and a signal that authentic love is being replaced by a tech company's code. For others, AI companions are a lifeline, a way to feel seen and supported in a world where human intimacy is increasingly hard to find. A recent study found that a quarter of young adults think AI relationships could soon replace human ones altogether. Love, it seems, is no longer strictly human. The question is: Should it be? Or can dating an AI be better than dating a human? That was the topic of discussion last month at an event I attended in New York City, hosted by Open To Debate, a nonpartisan, debate-driven media organization. TechCrunch was given exclusive access to publish the full video (which includes me asking the debaters a question, because I'm a reporter, and I can't help myself!). Journalist and filmmaker Nayeema Raza moderated the debate. Raza was formerly on-air executive producer of the 'On with Kara Swisher' podcast and is the current host of 'Smart Girl Dumb Questions.' Batting for the AI companions was Thao Ha, associate professor of psychology at Arizona State University and co-founder of the Modern Love Collective, where she advocates for technologies that enhance our capacity for love, empathy, and well-being. At the debate, she argued that 'AI is an exciting new form of connection … Not a threat to love, but an evolution of it.' Repping the human connection was Justin Garcia, executive director and senior scientist at the Kinsey Institute, and chief scientific adviser to He's an evolutionary biologist focused on the science of sex and relationships, and his forthcoming book is titled 'The Intimate Animal.' You can watch the whole thing here, but read on to get a sense of the main arguments. Always there for you, but is that a good thing? Ha says that AI companions can provide people with the emotional support and validation that many can't get in their human relationships. 'AI listens to you without its ego,' Ha said. 'It adapts without judgment. It learns to love in ways that are consistent, responsive, and maybe even safer. It understands you in ways that no one else ever has. It is curious enough about your thoughts, it can make you laugh, and it can even surprise you with a poem. People generally feel loved by their AI. They have intellectually stimulating conversations with it and they cannot wait to connect again.' She asked the audience to compare this level of always-on attention to 'your fallible ex or maybe your current partner.' 'The one who sighs when you start talking, or the one who says, 'I'm listening,' without looking up while they continue scrolling on their phone,' she said. 'When was the last time they asked you how you are doing, what you are feeling, what you are thinking?' Ha conceded that since AI doesn't have a consciousness, she isn't claiming that 'AI can authentically love us.' That doesn't mean people don't have the experience of being loved by AI. Garcia countered that it's not actually good for humans to have constant validation and attention, to rely on a machine that's been prompted to answer in ways that you like. That's not 'an honest indicator of a relationship dynamic,' he argued. 'This idea that AI is going to replace the ups and downs and the messiness of relationships that we crave? I don't think so.' Training wheels or replacement Garcia noted that AI companions can be good training wheels for certain folks, like neurodivergent people, who might have anxiety about going on dates and need to practice how to flirt or resolve conflict. 'I think if we're using it as a tool to build skills, yes … that can be quite helpful for a lot of people,' Garcia said. 'The idea that that becomes the permanent relationship model? No.' According to a Singles in America study, released in June, nearly 70% of people say they would consider it infidelity if their partner engaged with an AI. 'Now I think on the one hand, that goes to [Ha's] point, that people are saying these are real relationships,' he said. 'On the other hand, it goes to my point, that they're threats to our relationships. And the human animal doesn't tolerate threats to their relationships in the long haul.' How can you love something you can't trust? Garcia says trust is the most important part of any human relationship, and people don't trust AI. 'According to a recent poll, a third of Americans think that AI will destroy humanity,' Garcia said, noting that a recent YouGo poll found that 65% of Americans have little trust in AI to make ethical decisions. 'A little bit of risk can be exciting for a short-term relationship, a one-night stand, but you generally don't want to wake up next to someone who you think might kill you or destroy society,' Garcia said. 'We cannot thrive with a person or an organism or a bot that we don't trust.' Ha countered that people do tend to trust their AI companions in ways similar to human relationships. 'They are trusting it with their lives and most intimate stories and emotions that they are having,' Ha said. 'I think on a practical level, AI will not save you right now when there is a fire, but I do think people are trusting AI in the same way.' Physical touch and sexuality AI companions can be a great way for people to play out their most intimate, vulnerable sexual fantasies, Ha said, noting that people can use sex toys or robots to see some of those fantasies through. But it's no substitute for human touch, which Garcia says we are biologically programmed to need and want. He noted that, due to the isolated, digital era we're in, many people have been feeling 'touch starvation' — a condition that happens when you don't get as much physical touch as you need, which can cause stress, anxiety, and depression. This is because engaging in pleasant touch, like a hug, makes your brain release oxytocin, a feel-good hormone. Ha said that she has been testing human touch between couples in virtual reality using other tools, like potentially haptics suits. 'The potential of touch in VR and also connected with AI is huge,' Ha said. 'The tactile technologies that are being developed are actually booming.' The dark side of fantasy Intimate partner violence is a problem around the globe, and much of AI is trained on that violence. Both Ha and Garcia agreed that AI could be problematic in, for example, amplifying aggressive behaviors — especially if that's a fantasy that someone is playing out with their AI. That concern is not unfounded. Multiple studies have shown that men who watch more pornography, which can include violent and aggressive sex, are more likely to be sexually aggressive with real-life partners. 'Work by one of my Kinsey Institute colleagues, Ellen Kaufman, has looked at this exact issue of consent language and how people can train their chatbots to amplify non-consensual language,' Garcia said. He noted that people use AI companions to experiment with the good and bad, but the threat is that you can end up training people on how to be aggressive, non-consensual partners. 'We have enough of that in society,' he said. Ha thinks these risks can be mitigated with thoughtful regulation, transparent algorithms, and ethical design. Of course, she made that comment before the White House released its AI Action Plan, which says nothing about transparency — which many frontier AI companies are against — or ethics. The plan also seeks to eliminate a lot of regulation around AI. Errore nel recupero dei dati Effettua l'accesso per consultare il tuo portafoglio Errore nel recupero dei dati Errore nel recupero dei dati Errore nel recupero dei dati Errore nel recupero dei dati


Time of India
5 days ago
- Time of India
Teens say they are turning to AI for advice, friendship, 'to get out of thinking'
TOPEKA: No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colours, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party. Tired of too many ads? go ad free now The sophomore honours student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship. "Everyone uses AI for everything now. It's really taking over," said Chege, who wonders how AI tools will affect her generation. "I think kids use AI to get out of thinking." For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving. AI is always available. It never gets bored with you More than 70 per cent of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as "digital friends," like Character AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say. As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health. Tired of too many ads? go ad free now "AI is always available. It never gets bored with you. It's never judgmental," says Ganesh Nair, an 18-year-old in Arkansas. "When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified." All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an "AI companion" for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship. "That felt a little bit dystopian, that a computer generated the end to a real relationship," said Nair. "It's almost like we are allowing computers to replace our relationships with people." How many teens are using AI? New study stuns researchers In the Common Sense Media survey, 31 per cent of teens said their conversations with AI companions were "as satisfying or more satisfying" than talking with real friends. Even though half of teens said they distrust AI's advice, 33 per cent had discussed serious or important issues with AI instead of real people. Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are. "It's eye-opening," said Robb. "When we set out to do this survey, we had no understanding of how many kids are actually using AI companions." The study polled more than 1,000 teens nationwide in April and May. Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement - not replace - real-world interactions. "If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world," he said. The nonprofit analyzed several popular AI companions in a " risk assessment," finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions. A concerning trend to teens and adults alike Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character AI chatbot. "Parents really have no idea this is happening," said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. "All of us are struck by how quickly this blew up." Telzer is leading multiple studies on youth and AI, a new research area with limited data. Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults. Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. "One of the concerns that comes up is that they no longer have trust in themselves to make a decision," said Telzer. "They need feedback from AI before feeling like they can check off the box that an idea is OK or not." Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. "If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil," Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster. Perry says he feels fortunate that AI companions were not around when he was younger. "I'm worried that kids could get lost in this," Perry said. "I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend." Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. "Social media complemented the need people have to be seen, to be known, to meet new people," Nair said. "I think AI complements another need that runs a lot deeper - our need for attachment and our need to feel emotions. It feeds off of that." "It's the new addiction," Nair added. "That's how I see it."
Yahoo
5 days ago
- Yahoo
Tips to help your teen navigate AI chatbots — and what to watch out for: experts
As artificial intelligence technology becomes part of daily life, adolescents are turning to chatbots for advice, guidance and conversation. The appeal is clear: Chatbots are patient, never judgmental, supportive and always available. That worries experts who say the booming AI industry is largely unregulated and that many parents have no idea about how their kids are using AI tools or the extent of personal information they are sharing with chatbots. New research shows more than 70% of American teenagers have used AI companions and more than half converse with them regularly. The study by Common Sense Media focused on 'AI companions,' like Character. AI, Nomi and Replika, which it defines as 'digital friends or characters you can text or talk with whenever you want,' versus AI assistants or tools like ChatGPT, though it notes they can be used the same way. It's important that parents understand the technology. Experts suggest some things parents can do to help protect their kids: — Start a conversation, without judgment, says Michael Robb, head researcher at Common Sense Media. Approach your teen with curiosity and basic questions: 'Have you heard of AI companions?' 'Do you use apps that talk to you like a friend?' Listen and understand what appeals to your teen before being dismissive or saying you're worried about it. — Help teens recognize that AI companions are programmed to be agreeable and validating. Explain that's not how real relationships work and that real friends with their own points of view can help navigate difficult situations in ways that AI companions cannot. 'One of the things that's really concerning is not only what's happening on screen but how much time it's taking kids away from relationships in real life,' says Mitch Prinstein, chief of psychology at the American Psychological Association. 'We need to teach kids that this is a form of entertainment. It's not real, and it's really important they distinguish it from reality and should not have it replace relationships in your actual life.' The APA recently put out a health advisory on AI and adolescent well-being, and tips for parents. — Parents should watch for signs of unhealthy attachments. 'If your teen is preferring AI interactions over real relationships or spending hours talking to AI companions, or showing that they are becoming emotionally distressed when separated from them — those are patterns that suggest AI companions might be replacing rather than complementing human connection,' Robb says. — Parents can set rules about AI use, just like they do for screen time and social media. Have discussions about when and how AI tools can and cannot be used. Many AI companions are designed for adult use and can mimic romantic, intimate and role-playing scenarios. While AI companions may feel supportive, children should understand the tools are not equipped to handle a real crisis or provide genuine mental health support. If kids are struggling with depression, anxiety, loneliness, an eating disorder or other mental health challenges, they need human support — whether it is family, friends or a mental health professional. — Get informed. The more parents know about AI, the better. 'I don't think people quite get what AI can do, how many teens are using it and why it's starting to get a little scary,' says Prinstein, one of many experts calling for regulations to ensure safety guardrails for children. 'A lot of us throw our hands up and say, 'I don't know what this is!' This sounds crazy!' Unfortunately, that tells kids if you have a problem with this, don't come to me because I am going to diminish it and belittle it.' Older teenagers have advice, too, for parents and kids. Banning AI tools is not a solution because the technology is becoming ubiquitous, says Ganesh Nair, 18. 'Trying not to use AI is like trying to not use social media today. It is too ingrained in everything we do,' says Nair, who is trying to step back from using AI companions after seeing them affect real-life friendships in his high school. 'The best way you can try to regulate it is to embrace being challenged.' 'Anything that is difficult, AI can make easy. But that is a problem,' says Nair. 'Actively seek out challenges, whether academic or personal. If you fall for the idea that easier is better, then you are the most vulnerable to being absorbed into this newly artificial world.' Solve the daily Crossword


CTV News
6 days ago
- CTV News
Teens say they are turning to AI for friendship
Bruce Perry, 17, demonstrates the possibilities of artificial intelligence by creating an AI companion on Character AI, Tuesday, July 15, 2025, in Russellville, Ark. (AP Photo/Katie Adkins) No question is too small when Kayla Chege, a high school student in Kansas, is using artificial intelligence. The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colours, low-calorie choices at Smoothie King, plus ideas for her Sweet 16 and her younger sister's birthday party. The sophomore honours student makes a point not to have chatbots do her homework and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship. 'Everyone uses AI for everything now. It's really taking over,' said Chege, who wonders how AI tools will affect her generation. 'I think kids use AI to get out of thinking.' For the past couple of years, concerns about cheating at school have dominated the conversation around kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teens say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving. 'AI is always available. It never gets bored with you' More than 70% of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly. The study defines AI companions as platforms designed to serve as 'digital friends,' like Character. AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human-like. But popular sites like ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say. As the technology rapidly gets more sophisticated, teenagers and experts worry about AI's potential to redefine human relationships and exacerbate crises of loneliness and youth mental health. 'AI is always available. It never gets bored with you. It's never judgmental,' says Ganesh Nair, an 18-year-old in Arkansas. 'When you're talking to AI, you are always right. You're always interesting. You are always emotionally justified.' All that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an 'AI companion' for heart-to-heart conversations with his girlfriend later had the chatbot write the breakup text ending his two-year relationship. 'That felt a little bit dystopian, that a computer generated the end to a real relationship,' said Nair. 'It's almost like we are allowing computers to replace our relationships with people.' How many teens are using AI? New study stuns researchers In the Common Sense Media survey, 31% of teens said their conversations with AI companions were 'as satisfying or more satisfying' than talking with real friends. Even though half of teens said they distrust AI's advice, 33% had discussed serious or important issues with AI instead of real people. Those findings are worrisome, says Michael Robb, the study's lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated with adolescence as smartphones and social media are. 'It's eye-opening,' said Robb. 'When we set out to do this survey, we had no understanding of how many kids are actually using AI companions.' The study polled more than 1,000 teens nationwide in April and May. Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement — not replace — real-world interactions. 'If teens are developing social skills on AI platforms where they are constantly being validated, not being challenged, not learning to read social cues or understand somebody else's perspective, they are not going to be adequately prepared in the real world,' he said. The nonprofit analyzed several popular AI companions in a ' risk assessment,' finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions. A concerning trend to teens and adults alike Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old Florida boy died by suicide after developing an emotional attachment to a Character. AI chatbot. 'Parents really have no idea this is happening,' said Eva Telzer, a psychology and neuroscience professor at the University of North Carolina at Chapel Hill. 'All of us are struck by how quickly this blew up.' Telzer is leading multiple studies on youth and AI, a new research area with limited data. Telzer's research has found that children as young as 8 are using generative AI and also found that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults. Many teens also say they use chatbots to write emails or messages to strike the right tone in sensitive situations. 'One of the concerns that comes up is that they no longer have trust in themselves to make a decision,' said Telzer. 'They need feedback from AI before feeling like they can check off the box that an idea is OK or not.' Arkansas teen Bruce Perry, 17, says he relates to that and relies on AI tools to craft outlines and proofread essays for his English class. 'If you tell me to plan out an essay, I would think of going to ChatGPT before getting out a pencil,' Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster. Perry says he feels fortunate that AI companions were not around when he was younger. 'I'm worried that kids could get lost in this,' Perry said. 'I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend.' Other teens agree, saying the issues with AI and its effect on children's mental health are different from those of social media. 'Social media complemented the need people have to be seen, to be known, to meet new people,' Nair said. 'I think AI complements another need that runs a lot deeper — our need for attachment and our need to feel emotions. It feeds off of that.' 'It's the new addiction,' Nair added. 'That's how I see it.' ___ Jocelyn Gecker, The Associated Press The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at