02-05-2025
AI is fuelling an alarming surge of deepfake child sexual abuse material in Canada: What parents need to know
Canada could be facing a disturbing new frontier in digital child sexual abuse.
New data from the Canadian Centre for Child Protection shows a chilling surge in AI-generated child sexual abuse material (CSAM) in Canada. The number of sexually explicit deepfakes of children that the centre's tip line has processed more than doubled in the past year, director David Rabsch said in an interview with Metroland Media.
Deepfakes are hyper-realistic, AI-generated synthetic media built from existing photos, images or audio. They are produced by feeding AI software images or audio of real people to create fake material.
According to Rabsch, who leads the team of analysts at Cybertip — Canada's national tip line for reporting online sexual exploitation of children — the number of these types of materials processed has shot up from 4,000 in 2023 to 9,000 in 2024.
And the figures in early 2025 paint an even grimmer picture, with 4,000 sexually explicit deepfakes processed in the first quarter alone.
'We have seen cases where offenders have taken images of known children and used AI to create intimate images,' Det. Staff Sgt. Mike Krisko of the Waterloo Region Police's cybercrime unit said in a phone interview.
Offenders will sometimes find photos of people that are publicly available on social media and run them through an AI tool, he explained, adding that the cybercrime unit started seeing cases of this in Waterloo Region late last year.
In cases involving sextortion, extortionists will send the image to the victim and threaten to distribute the deepfakes to friends and family unless the victim pays or gives in to more demands, Krisko said.
With AI, predators can now skip the initial process of luring the child into sending the explicit photos.
AI 'fast tracks' the process: predators pick up typically harmless images from a child's Snapchat, TikTok or Instagram, then use a generative AI tool to create sexually explicit deepfakes they can use for sextortion, add to their own collections, or sell and trade in underground pedophile forums.
'These are minors, and adult content is being made of them nonconsensually, then also being posted on social media,' says Bellingcat senior researcher Kolina Koltai.
While many of these images are accumulated by child predators into massive personal collections saved on their computers, others are traded and shared, and many are sold for profit on the dark web.
Rabsch said some criminals share the materials and ask for 'donations' in cryptocurrency. Transactions are done in crypto, the currency of the dark web, since it's an anonymous way of trading value, he explained.
In other cases, these child predators will go on dark web forums looking for like-minded individuals — places where people discuss child abuse and advertise their massive collections.
Advertisements luring other predators to get access are posted on the dark web, but the content itself may be kept in a compressed archive on the clear web or surface web, the publicly accessible layer of the internet.
These archives have thousands of child sexual abuse photos and videos and are hosted on the clear web by file hosting services that do not have any form of proactive scanning for harmful materials, Rabsch explained.
They then set up premium file hosting services that charge pedophiles a subscription or one-time fee to download the content.
'Now you've got a profit model where you can have a ring of sites on the dark web that are all funneling this traffic and potentially money into these premium file hosters,' he said.
A Pulitzer investigation published early this month found that sites like Instagram are also being used to advertise child sexual exploitation material and to direct child predators willing to pay to the content subscription platforms Patreon and Fanvue. The two platforms are sometimes used by digital child exploitation rings to host large collections of AI-generated child sexual exploitation material.
According to the investigation, the Patreon and Fanvue accounts offered 'exclusive subscription content' and production of 'special media.'
Patreon is a content hosting and sharing site where fans can support content creators by paying a monthly subscription to access videos, images and blogs.
A BBC investigation found that some Patreon accounts hosting 'AI-generated, photorealistic obscene images of children,' including material depicting the sexual abuse of babies and toddlers, were selling the content at varying price levels depending on the type of material predators want to access.
Detective Krisko advises parents to make sure profiles are 'locked down with the tightest security settings on them.' Parents should also keep in mind that any photo they post publicly can be taken by online predators and altered.
'You only want to share images with people that you really know,' he added. This means parents should make sure all those in their contact list and who may have access to their children's photos are people they know and trust. 'You shouldn't have strangers or people you've never met on your contact list.'
For parents whose children go online, Rabsch says the best thing they can do is talk to their children. Explain the dangers lurking online and 'teach them how to spot and avoid these kinds of scenarios,' he said.
The centre has content on its websites to help parents 'at least understand what the scope of the problem is,' he shared.
Parents who discover sexually explicit material of their children being shared online can reach out to NeedHelpNow.ca or report it to Cybertip.ca.
NeedHelpNow can guide parents through the process of reporting and removing the harmful images or videos of their children online.
Parents of victims can also reach out to the Canadian Centre for Child Protection, which can help determine and navigate next steps, assist victims of online sexual abuse with impact statements for criminal proceedings, connect victims and their families to support services, and help disrupt the availability of the harmful material online.
While Rabsch acknowledges the role parents must play to protect their children, he thinks emphasis should instead be on pressuring governments, regulators and the industry to create more comprehensive standards for the development, deployment and use of AI technologies.
The bigger problem, he said, is that this new technology has been released for public use without any system of controls in place.
'It's being developed at a ridiculously rapid pace. There's no oversight, there's no regulation … And here we see the fruits of that reckless behaviour,' he said.
Rabsch says it doesn't matter how the imagery is created.
'Whether it was created the way it's always been created or whether it's produced through deepfake technology, the harm or trauma inflicted on the victim is the same,' he said.
'We have this real issue now where child sexual abuse material can be freely created by basically anyone and there's nothing that the industry, government and regulators are really doing,' Rabsch said, adding that governments should start applying pressure and regulations so the AI industry will have to conform.
'Until that is really addressed, this problem is just going to continue as it is right now.'