Digital paedophilia is still child abuse, even if the child is virtual — Haezreena Begum Abdul Hamid
JULY 24 — Digital paedophilia refers to the sexual exploitation of children in virtual spaces, through tools like artificial intelligence, deepfakes, cartoons, and virtual reality. These images may be digitally generated, and the children portrayed may not even exist in real life. Yet the intent is deeply exploitative, the gratification it feeds is criminal, and the harm it causes to individuals and to society is undeniable.
While this form of abuse doesn't always involve contact with a child, it normalises the sexualisation of minors, perpetuates a dangerous fantasy, and fuels demand in underground networks. As a criminologist, I've seen how what begins as 'just digital content' can desensitise consumers and eventually lead to real-world offending. It systematically erodes the legal, ethical, and societal safeguards established to protect the rights and dignity of children.
This threat is amplified by the pervasive culture of online sharing. Many individuals, including parents, are drawn to posting their daily lives on social media, treating platforms like Facebook and TikTok as digital diaries. While these platforms foster connection and self-expression, they also expose users — particularly children — to serious risks. It is disturbingly easy to download images from Facebook and extract videos from TikTok, even without the content owner's knowledge or consent. Once uploaded, these digital footprints can be stolen, altered, and weaponised for exploitation.
Digital paedophilia can take many forms, for instance: AI-generated child sexual abuse material (CSAM) that mimics realistic children in sexual scenarios; deepfakes that superimpose children's faces — often taken from social media — onto adult bodies in explicit content; illustrated pornography (such as hentai or lolicon) that sexualises underage characters; and virtual reality simulations that allow users to act out child abuse fantasies in immersive settings.
What makes this even more dangerous is that it's easy to access, easy to share, and hard to trace. The perpetrators hide behind screens, usernames, and encrypted platforms. And yet, the damage is far from virtual.
Malaysia has made commendable strides with the Sexual Offences Against Children Act 2017, the Penal Code, and the Communications and Multimedia Act 1998 — all of which provide mechanisms to combat traditional forms of child sexual abuse, including live-streamed exploitation. However, these laws still fall short when it comes to digital creations. Many of these materials are not technically illegal under current definitions of CSAM, because no real child was involved in their creation. But does that make them any less harmful? I would argue not. These depictions, no matter how 'virtual', are created with the intent of sexualising minors and should be recognised as forms of child sexual exploitation.
Other countries have taken this step. The UK, Canada, and Australia have criminalised virtual child pornography, recognising that a legal definition limited only to real, identifiable victims is inadequate in the face of emerging technology. Therefore, Malaysia must consider doing the same.
The harms of digital paedophilia are not hypothetical. Every day, images of real children — often taken innocently by their parents and shared online — are stolen, manipulated, and repurposed into sexual content. This is a profound violation of dignity, privacy, and safety. These children may never know what was done to their image, but the psychological and reputational damage can be lifelong. Unlike Denmark, which recognises an individual's legal right to their own image and personal likeness through robust copyright and data protection laws, Malaysia currently lacks explicit legal provisions that grant individuals — particularly children — ownership and control over the use of their personal images. This legal vacuum leaves Malaysian children especially vulnerable to digital exploitation, with limited recourse or protection under current frameworks.
Moreover, digital abuse fuels cognitive distortions in those with paedophilic tendencies. Studies show that repeated exposure to virtual CSAM lowers inhibition, deepens desensitisation, and can serve as a gateway to contact offences. As a society, we cannot afford to wait until a physical child is harmed before acting. The damage begins long before that.
Therefore, an effective and urgent response is required. This includes amending existing laws to explicitly cover digitally created and AI-generated child sexual abuse material. We must criminalise intent and impact, not just physical involvement. Social media platforms, app developers, and AI companies must also be held accountable for detecting, reporting, and removing exploitative content — technological innovation must not come at the expense of child safety.
Digital literacy should no longer be seen as optional. Parents, children, and educators must be equipped with the knowledge to understand the risks of oversharing online and how personal images can be misused. Every policy, investigation, and reform effort must place the child at the centre — not merely as a legal category, but as a human being deserving of dignity, protection, and justice. In addition, Malaysia must strengthen its cooperation with global agencies such as Interpol, Aseanapol, and other cross-border cybercrime task forces to effectively track offenders and dismantle transnational networks.
Digital paedophilia sits at the intersection of technological progress and moral regression. As artificial intelligence and virtual reality continue to advance, so too does the capacity to simulate harm, commodify children, and conceal abuse behind layers of code. But just because the abuse is virtual does not mean it is any less real. The law must evolve to meet this challenge because every child, whether real or digitally represented, deserves to be safe from sexual exploitation.
* Dr Haezreena Begum Abdul Hamid is a Criminologist and Senior Lecturer at the Faculty of Law, University of Malaya. She can be reached at [email protected]
* This is the personal opinion of the writer or publication and does not necessarily represent the views of Malay Mail.