What parents need to know about AI chatbots and kids as safety concerns grow

Yahoo · 22-05-2025

DENVER (KDVR) — There's growing concern about children interacting with social chatbots powered by artificial intelligence. Colorado Attorney General Phil Weiser issued a consumer alert warning parents of the dangers of social AI chatbots. It comes in response to a growing number of reports of children's interactions leading to risky behavior, including unhealthy online companionship and self-harm.

'AI is everywhere: in music, in video and film,' said Jackson Willhoit, a graduate of East High School. 'We have ChatGPT and things we can look up and use off of a browser.' Willhoit is no stranger to AI-powered chatbots. 'It really gives this illusion. It's almost like you're talking to a person, and, you know, I think when you spend enough time with that, you can kind of get lost in that kind of illusion,' he said.

It's that illusion that prompted Weiser to issue the warning to parents. 'Everyone is susceptible to this, especially when we become complacent, when we doomscroll, when we go on and on and on on social media. It becomes a lot harder to be aware of these things,' said Jackson House, a graduate of East High School.

Nikhil Krishnaswamy, an assistant professor of computer science at Colorado State University, explained what AI-powered chatbots are. 'Really, they are machines that are capable of having extended conversations with you,' said Krishnaswamy. 'So, even though they are not actually thinking, and there's no person behind the machine typing responses to you, the way that they behave creates the impression that they actually have thoughts, that they can reason, maybe even that they have feelings, and so people will tend to develop attachments to these.'

That's what experts are concerned about when it comes to children and teens. 'When we have particularly minors interacting with these machines … you know, you don't know what the machine is going to say,' said Krishnaswamy. 'You also don't know how a minor is going to react to that.'

Experts warn AI chatbots can generate disturbing content, including violence, explicit sexuality, self-harm and eating disorders. 'Parents need to understand that this is how the machine works and need to be able to talk to their children about that,' said Krishnaswamy. 'They also need to be aware of the general content they may be exposed to when interacting with these AI systems.'

The Attorney General's Office is offering the following tips to help parents familiarize themselves with social AI chatbots:

  • Social AI chatbot interactions can turn age-inappropriate even with innocuous prompts. Disturbing content may include violence, explicit sexuality, self-harm, and eating disorders.
  • Engaging with social AI chatbots can be addictive. The chatbots often mimic human emotions and can be manipulative.
  • Social AI chatbots can generate inaccurate and biased information, which should be examined carefully and critically.
  • Information shared with social AI chatbots may be shared with the platform's developers to train the AI, raising privacy concerns.
  • Parents should talk to their kids about their online experiences, including which online platforms they use and why.
  • Monitor their usage and adhere to age restrictions. Use available parental controls, including internet filters.
  • Be active in their online activities and supervise their tech use.
  • Teach children that social AI chatbots are not human; they are only designed to seem human.
  • Learn about the benefits and risks. Don't wait to talk to your kids about safe and responsible use of social AI chatbots and other AI tools.

If you believe social AI chatbot companies are violating the law, you can file a complaint with the Colorado Attorney General's Office.

Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
