04-05-2025
AI deepfake abuse: Boys at Sydney schools caught selling sexually explicit images of female students
Male students at a Sydney private school have been caught selling deepfake nude images of female students on social media.
They reportedly used artificial intelligence (AI) to superimpose the faces of their female schoolmates — and the faces of girls from two other independent schools — onto sexually explicit images.
WATCH THE VIDEO ABOVE: Explicit deepfake abuse images of girls sold online by male students.
The pictures were then sold within group chats on Instagram and Snapchat for less than $5, 7NEWS Sunrise reports.
It is unclear exactly when these incidents occurred, or whether they were reported to police for investigation. NSW Police and the Australian Federal Police (AFP) have been contacted for comment.
An eSafety spokesperson said the regulator has received 38 complaints about explicit deepfake images involving children under 18 in NSW since January 2023.
'While eSafety is aware of reports involving digitally generated nude images of teenagers allegedly being sold at schools, we have not received complaints of this nature to date,' the spokesperson said.
'When a report involves a person under the age of 18, it is child sexual abuse material, and we refer it to our colleagues at the Australian Centre to Counter Child Exploitation (ACCCE).'
The spokesperson did note that school leaders have been voicing concern about an increase in this kind of abuse.
'Deepfake image-based abuse is not only becoming more prevalent, but it is also very gendered and incredibly distressing to the victim-survivor,' eSafety Commissioner Julie Inman Grant previously said amid a 2024 inquiry into sexual deepfake material.
Cyber security expert Ross Bark said that people accessing the AI programs used to make such images do not need to be technology experts — the programs are free and easy to use.
'These are not hard-to-access pieces of code, or things that you need to set up, (you don't need to) have any knowledge of tech,' Bark said.
'This is not just about AI and technology, this is sexual abuse.'
'Your daughters have done nothing wrong'
A male year 12 student from southwest Sydney was accused of targeting students with explicit AI images earlier this year.
The NSW Department of Education sent an email to parents at the time, alerting them of the scandal.
'We want to emphasise that your daughters have done nothing wrong, there are no inappropriate real photos of them being used,' it said.
'I am sorry this has occurred.'
The incident came after about 50 girls at Melbourne's Bacchus Marsh Grammar, believed to be in Years 9 to 12, had images taken from their personal Instagram accounts and manipulated using AI to create 'obscene photographs'.
Those images were then also shared across multiple social media platforms.
'Explicit deepfakes have increased on the internet as much as 550 per cent year on year since 2019,' Inman Grant said in 2024.
'It's a bit shocking to note that pornographic videos make up 98 per cent of the deepfake material currently online and 99 per cent of that imagery is of women and girls.'
The effects of producing and distributing such material can be devastating.
Matilda 'Tilly' Rosewarne was just 15 years old when she took her own life near a cubby house at her family's Bathurst home in 2022, after becoming the victim of sexually explicit image-based abuse on Snapchat — a blow which followed years of bullying.
New laws criminalising the sharing of non-consensual sexually explicit deepfake material were introduced late last year under the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024.
The AFP later charged two men in March with child abuse material offences, for possessing and accessing deepfake child abuse material.
They were among 25 people arrested as part of a global operation targeting the alleged production and distribution of child abuse material generated by AI, AFP said.