
Latest news with #DonAustin

Teens are using AI to create fake nudes of their classmates — as a new form of bullying

New York Post • 21-04-2025

They've turned tech into a weapon — and no one's safe from the scandal. Teens are using artificial intelligence to whip up disturbingly realistic nude images of their classmates — and then share them like digital wildfire, sending shockwaves through schools and leaving experts fearing the worst.

The AI-powered tools, often dubbed 'nudify' apps, are as sinister as they sound. With just a headshot — often lifted from a yearbook photo or social media profile — these apps can fabricate explicit deepfake images that appear scarily real. And yes, it's already happening in schools.

These hyper-realistic images — forged with AI tools — are turning bullying into a high-tech nightmare.

'We're at a place now where you can be doing nothing and stories and pictures about you are posted online,' Don Austin, superintendent of the Palo Alto Unified School District, told Fox News Digital. 'They're fabricated. They're completely made up through AI and it can have your voice or face. That's a whole other world.'

This is a full-blown digital crisis. Last summer, the San Francisco City Attorney's office sued 16 so-called 'nudify' websites for allegedly violating laws around child exploitation and nonconsensual images. Those sites alone racked up more than 200 million visits in the first half of 2023.

But catching the tech companies behind these tools? That's like playing a game of Whac-A-Mole. Most have skated past current state laws, though some states — like Minnesota — are trying to pass legislation to hold the companies accountable for the havoc they're wreaking. Still, the tech moves faster than the law — and kids are getting caught in the crossfire.

Josh Ochs, founder of SmartSocial — an organization that trains families on online safety — told Fox News Digital that AI-generated nudes are causing 'extreme harm' to teens across the country.

'Kids these days will upload maybe a headshot of another kid at school and the app will recreate the body of the person as though they're nude,' Ochs told the outlet. 'This causes extreme harm to that kid that might be in the photo, and especially their friends as well and a whole family.'

He said parents need to stop tiptoeing around their children's digital lives — and start laying down some boundaries.

'Before you give your kids a phone or social media, it's time to have that discussion early and often. Hey, this is a loaner for you, and I can take it back at any time because you could really hurt our family,' Ochs said.

In February, the U.S. Senate unanimously passed a bill to criminalize publishing — or even threatening to publish — nonconsensual AI deepfake porn. It now awaits further action.

Austin said the only way to get ahead of the curve is to keep talking — with parents, teachers, students, and anyone else who will listen. 'This isn't going away,' he warned. 'It's evolving — and fast.'

Teens are now using AI chatbots to create and spread nude images of classmates, alarming education experts

Yahoo • 21-04-2025

A troubling trend has emerged in schools across the United States, with young students falling victim to the increasing use of artificial intelligence (AI)-powered "nudify" apps that have the power to create fake pornography of classmates.

"Nudify" is an umbrella term for a plethora of widely available apps and websites that allow users to alter photos of fully dressed individuals and virtually undress them. Some apps can create nude images from just a headshot of the victim.

Don Austin, the superintendent of the Palo Alto Unified School District, told Fox News Digital that this type of online harassment can be more relentless than traditional in-person bullying.

"It used to be that a bully had to come over and push you. Palo Alto is not a community where people are going to come push anybody into a locker. That doesn't happen. But it's not immune from online bullying," Austin said.

"The differences, I think, are worse. Now your bully can be completely anonymous. You don't even know where it's coming from," he continued.

Austin noted that conversations with mental health professionals have unearthed another troubling trend: kids who have become the victims of online bullying can become "addicted" to searching for negative content about themselves.

"They're looking, monitoring the exact place where the harm is coming from," he said.

Growing up in the 1980s, Austin recalled how a student could do something stupid on a weekend and peers would whisper and talk about that individual on Monday. Flash forward to the early days of the internet, when Austin was starting his professional career: by then, students could post pictures and comments about classmates and display them to the entire school.

"We're at a place now where you can be doing nothing and stories and pictures about you are posted online. They're fabricated. They're completely made up through AI and it can have your voice or face. That's a whole other world," he told Fox News Digital.

Last August, the office of the San Francisco City Attorney filed a lawsuit accusing 16 "nudify websites" of violating laws on nonconsensual intimate images and child abuse material. In the first half of 2023, the websites in question were visited over 200 million times.

The parent companies of the apps that create these hyper-realistic "deepfake pornography" images have largely remained unscathed by state legislation. However, at least one state, Minnesota, is considering a bill that would hold them accountable for certain image generations.

Though technology will likely always outpace policy, Austin stressed the importance of ongoing collaboration and communication among educators, parents, and students to redefine acceptable behaviors and provide support for those affected by AI and social media.

Nearly a decade ago, Austin fostered a working relationship with SmartSocial founder Josh Ochs, whose organization hosts weekly live events that teach parents how to keep their kids safe online. Ochs told Fox News Digital that in a growing number of cases, these apps are subjecting school-aged teens to humiliation, harassment and online sexual exploitation. The creation of these images can also carry legal ramifications.

"Kids these days will upload maybe a headshot of another kid at school and the app will recreate the body of the person as though they're nude. This causes extreme harm to that kid that might be in the photo, and especially their friends as well and a whole family," he told Fox News Digital.

Ochs emphasized the importance of parents having open and frequent dialogues with their children about online safety and the dangers of these apps, while also taking an interest in their personal lives. Though some parents push to give their kids greater autonomy and privacy, Ochs said parents should have access to their children's devices and social media accounts (via the passcode), just as they would keep a spare set of keys to a car.

"Before you give your kids a phone or social media, it's time to have that discussion early and often. Hey, this is a loaner for you, and I can take it back at any time because you could really hurt our family," he said.

The U.S. Senate in February unanimously approved a bill by Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., that would make it a federal crime to publish, or threaten to publish, nonconsensual intimate imagery, including "digital forgeries," also known as deepfakes, created with artificial intelligence.
