
Expert warns parents over AI deepfakes of children

RTÉ News

20-05-2025


Only 20 images of a child are needed to create a deepfake video of them, a leading expert in cybersecurity has warned.

The study, conducted by Perspectus Global, surveyed 2,000 UK parents of children under the age of 16 and found that parents upload an average of 63 images to social media every month. Over half of these photos (59%) are family photos, and one in five parents (21%) upload such images multiple times a week.

Speaking on RTÉ's Today with Claire Byrne, Mick Moran said that as AI gets stronger, the 20 images required to create the videos will be reduced to only one.

"The big worry is that these AI models will be used to create CSAM (Child Sexual Abuse Material) and children involved in sex acts," he said.

"We've already seen in the past, innocent images that kids themselves are posting, or their parents are posting, being used in advertising pornography sites.

"In this case, however, a certain data set of images, 20 of them, will allow you to produce an unlimited number in any scenario of that child."

Mr Moran explained that the risk of CSAM is only one aspect of the issue; the deepfake videos could also be used for fraud or scams.

"You have to be aware that your data is being used to train these models and, fundamentally, any information you share online can be used in ways you never intended."

He said that if images are shared publicly, the expectation of privacy is "gone", adding that some companies treat uploaded material as covered by "implicit consent".

"If you're an adult and you share a picture... it attracts different rules under data protection. However, if you're a parent and you share a picture of your child or another child, it is deemed to be implicit consent from the parent that transfers to the child, and therefore they can use the image."

Parents urged to limit audiences through privacy settings

Mr Moran said there is "no problem" with sharing images online, as long as the audience that can view them is limited through social media privacy settings.

He called on the Government to bring in legislation making it illegal to possess or make an engine that trains AI to produce CSAM.

"CSAM and child pornography are illegal under the Child Trafficking and Pornography Act of 1998, so it's illegal to possess it, whether it's made by AI or not," he said.

"What I'd be calling on the Government to do here would be to make it illegal to possess, make an engine, or to train an AI engine that will produce CSAM - that's not illegal.

"What you put into it might be illegal, what comes out of it might be illegal, but the act of doing it is not necessarily illegal," he added.
