Deepfake nudes outpace Wisconsin law; children and adults at risk
The Brief
Generative AI websites are making it easy for children and adults to digitally undress friends, classmates and colleagues.
So-called 'deepfake nudes' of children are already banned in 38 states, including Wisconsin.
Pending legislation in Wisconsin would make it a felony to share or publish deepfake nudes of adults, too.
BAYSIDE, Wis. - A new online threat is putting Wisconsinites at risk of sexual exploitation. Apps and websites are using artificial intelligence to turn innocent photos into so-called deepfake nudes.
What we know
Dozens of mobile apps and websites are cashing in on the power of generative AI to turn innocent photographs of identifiable, real people into computer-generated nudity. In some cases, the sites are churning out images of graphic, simulated sexual acts.
"They're unbelievably realistic," said Michael Zimmer, professor of computer science and director of the Marquette University Center for Data, Ethics and Society. "It's kind of an arms race. A lot of the app stores have been trying to remove these platforms, but the websites still exist."
For decades, Zimmer said, photo-editing software has made it possible for the tech-savvy to manipulate images in sexually inappropriate ways. But artificial intelligence has made it faster, easier and more accessible, even to those with little to no technological skill.
"The realism, the instant ability to do this. You don't have to have your computer run all night to do this. It happens with just the click of a button," Zimmer said.
The backstory
To show just how easy it is to do without harming an actual person, FOX6 Investigators photographed a mannequin. We then uploaded the image to a so-called "AI nudify" site and asked it to undress the image. The result was so realistic that we had to use black bars to censor it for use in a news story.
We are intentionally not naming the site we used, but one AI-tracking website claims the 15 most popular sites have more than 56 million active users combined.
What they're saying
When Elliston Berry was 14, she said 2,400 classmates at her Texas high school saw images of her nude body. "It was really embarrassing and shameful," Berry said, "especially because I [was] just a freshman and everyone is seeing these intimate images of me."
Only, it wasn't really her. A classmate had taken a fully-clothed photo of Berry and removed her clothing with an AI-undressing app. "My innocence was stripped away," Berry said.
Local perspective
The same thing happened to a pair of 13-year-old girls in Milwaukee's north shore. The girls were students at Maple Dale School, a K-8 school in Fox Point.
According to a search warrant affidavit filed in October 2024 by Bayside Police, a 13-year-old male student took photos of two female classmates posted online and used AI to undress them.
The original pictures were posted to Instagram. One was a selfie taken in a restroom. The other was a photograph from the girl's bat mitzvah. In both images, the girls were properly clothed. The computer-generated images made them appear to be nude. The boy shared the photos with another boy on Snapchat.
By the numbers
While many adults are just beginning to learn about the new technology, researchers say there's a good chance your children already know.
"Yeah, there's a very good chance, unfortunately," said Melissa Stroebel, Vice President of Research and Insights for Thorn, a non-profit dedicated to online safety.
Thorn surveyed more than 1,200 teens and young adults, ages 13-20. The survey found one in eight already knew someone who had been victimized by an intimate deepfake. And one in 17 said they had personally been victims.
"That's the size of a high school classroom," Stroebel said, adding that often, children who create or share synthetic nudes may simply be curious. "It was somebody they had a crush on, and they thought that this was an acceptable way to explore that crush."
Or they may be acting out of revenge against an ex.
"Because they didn't appreciate the genuine harm and risk," Stroebel said.
While the survey aimed to quantify the problem, some experts say the cases we know about are the tip of the iceberg.
Why you should care
"I think a lot of these cases go unreported," said Erin Karshen, an Assistant District Attorney in Milwaukee County who prosecutes sensitive crimes.
While Wisconsin law already treats AI-generated sexual images of children as illegal child pornography, adults are another story.
"There just isn't a great fit for it in the law right now because it's such a new technology that we hadn't seen before," Karshen said.
According to the consumer-rights non-profit Public Citizen, 38 states, including Wisconsin, have new laws that prohibit intimate deepfakes of children. But Wisconsin is among just four of those states that do not provide similar protection for adults.
"Current law does not take into account deepfake technology," said State Representative Brent Jacobson, a Republican from Mosinee.
Jacobson is teaming up with State Senator Andre Jacque, a Republican from New Franken, on a bill that would treat intimate deepfakes of adults the same as a real nude image. The bill would make it a felony to post, publish or otherwise share a synthetic nude image of an identifiable person, without that person's consent, if the intent is to harass or intimidate the person depicted. "The message of this legislation is clear," Jacque said. "Don't do it."
The other side
The bill unanimously passed the State Senate, but in the Assembly, Representative Darrin Madison raised concerns.
"Can it be legally circulated, maybe as a joke?" Rep. Madison asked in a public hearing. "Or maybe for educational purposes."
The Milwaukee Democrat did not respond to FOX6's request for an interview, but in the hearing he worried about the impact the law could have on urban youth.
"Young people being funneled into our criminal justice system, ending up as sex offenders for life and so on," Madison said.
"There's arguments to say this is speech," said Zimmer. "That I should have the ability to create things on my computer in the privacy of my home without the government interfering."
"Artistic freedom," said FOX6 Investigator Bryan Polcyn.
"Artistic freedom," Zimmer said.
Stroebel said there's nothing funny about it.
"Creating a deepfake nude is not a joke. It is not harmless," Stroebel said.
In other words, when it comes to the psychological harm caused by computer-generated nudity, there's nothing artificial about it.
What's next
It's already a felony in Wisconsin to take a naked picture of a person without their consent. Senator Jacque's bill would expand that law to deepfakes. However, before the bill passed the Senate, lawmakers added two amendments.
One makes it a crime only if the person posting the image knows the person depicted did not give consent. The other makes the deepfake a crime only if the image is so realistic that a reasonable person would believe the conduct actually happened.
The bill, as amended, passed the Senate unanimously, 33-0. It now awaits action in the state Assembly.