Digital paedophilia is still child abuse, even if the child is virtual — Haezreena Begum Abdul Hamid


Malay Mail · 4 days ago
JULY 24 — Digital paedophilia refers to the sexual exploitation of children in virtual spaces, through tools like artificial intelligence, deepfakes, cartoons, and virtual reality. These images may be digitally generated, and the children portrayed may not even exist in real life. Yet, the intent is deeply exploitative, the gratification it feeds is criminal, and the harm it causes individually and socially is undeniable.
While this form of abuse doesn't always involve contact with a child, it normalises the sexualisation of minors, perpetuates a dangerous fantasy, and fuels demand in underground networks. As a criminologist, I've seen how what begins as 'just digital content' can desensitise consumers and eventually lead to real-world offending. It systematically erodes the legal, ethical, and societal safeguards established to protect the rights and dignity of children.
This threat is amplified by the pervasive culture of online sharing. Many individuals, including parents, are drawn to posting their daily lives on social media, treating platforms like Facebook and TikTok as digital diaries. While these platforms foster connection and self-expression, they also expose users — particularly children — to serious risks. It is disturbingly easy to download images from Facebook and extract videos from TikTok, even without the content owner's knowledge or consent. Once uploaded, these digital footprints can be stolen, altered, and weaponised for exploitation.
Digital paedophilia can take many forms, for instance: AI-generated child sexual abuse material (CSAM) that mimics realistic children in sexual scenarios; deepfakes that superimpose children's faces — often taken from social media — onto adult bodies in explicit content; illustrated pornography (such as hentai or lolicon) that sexualises underage characters; and virtual reality simulations that allow users to act out child abuse fantasies in immersive settings.
What makes this even more dangerous is that it's easy to access, easy to share, and hard to trace. The perpetrators hide behind screens, usernames, and encrypted platforms. And yet, the damage is far from virtual.
Malaysia has made commendable strides with the Sexual Offences Against Children Act 2017, the Penal Code, and the Communications and Multimedia Act 1998 — all of which provide mechanisms to combat traditional forms of child sexual abuse, including live-streamed exploitation. However, these laws still fall short when it comes to digital creations. Many of these materials are not technically illegal under current definitions of CSAM, because no real child was involved in their creation. But does that make them any less harmful? I would argue not. These depictions, no matter how 'virtual', are created with the intent of sexualising minors and should be recognised as forms of child sexual exploitation.
Other countries have taken this step. The UK, Canada, and Australia have criminalised virtual child pornography, recognising that a legal definition limited only to real, identifiable victims is inadequate in the face of emerging technology. Therefore, Malaysia must consider doing the same.
The harms of digital paedophilia are not hypothetical. Every day, images of real children — often taken innocently by their parents and shared online — are stolen, manipulated, and repurposed into sexual content. This is a profound violation of dignity, privacy, and safety. These children may never know what was done to their image, but the psychological and reputational damage can be lifelong. Unlike Denmark, which recognises an individual's legal right to their own image and personal likeness through robust copyright and data protection laws, Malaysia currently lacks explicit legal provisions that grant individuals — particularly children — ownership and control over the use of their personal images. This legal vacuum leaves Malaysian children especially vulnerable to digital exploitation, with limited recourse or protection under current frameworks.
Moreover, digital abuse fuels cognitive distortions in those with paedophilic tendencies. Studies show that repeated exposure to virtual CSAM lowers inhibition, increases desensitisation, and can serve as a gateway to contact offences. As a society, we cannot afford to wait until a real child is harmed before acting. The damage begins long before that.
Therefore, an effective and urgent response is required. This includes amending existing laws to explicitly cover digitally created and AI-generated child sexual abuse material. We must criminalise intent and impact, not just physical involvement. Social media platforms, app developers, and AI companies must also be held accountable for detecting, reporting, and removing exploitative content — technological innovation must not come at the expense of child safety.
Digital literacy should no longer be seen as optional. Parents, children, and educators must be equipped with the knowledge to understand the risks of oversharing online and how personal images can be misused. Every policy, investigation, and reform effort must place the child at the centre — not merely as a legal category, but as a human being deserving of dignity, protection, and justice. In addition, Malaysia must strengthen its cooperation with global agencies such as Interpol, Aseanapol, and other cross-border cybercrime task forces to effectively track offenders and dismantle transnational networks.
Digital paedophilia sits at the intersection of technological progress and moral regression. As artificial intelligence and virtual reality continue to advance, so too does the capacity to simulate harm, commodify children, and conceal abuse behind layers of code. But just because the abuse is virtual does not mean it is any less real. The law must evolve to meet this challenge because every child, whether real or digitally represented, deserves to be safe from sexual exploitation.
* Dr Haezreena Begum Abdul Hamid is a Criminologist and Senior Lecturer at the Faculty of Law, University of Malaya. She can be reached at [email protected]
* This is the personal opinion of the writer or publication and does not necessarily represent the views of Malay Mail.



