
Latest news with #ChildlightGlobalChildSafetyInstitute

Digital wounds, lifelong scars

The Star

21-06-2025



Child Sexual Abuse Material isn't just content – it's a crime that inflicts deep, lasting trauma.

JUST one hour. That's all it takes for an online predator to groom a child – convincing them to share personal details, including their location, and ultimately trapping them in a web of sexual abuse. In just 60 minutes, a predator can build trust through social media, using flattery, attention and deceitful promises to manipulate a young mind.

Yet many Malaysians remain unaware of the gravity of what's happening behind screens. Most of us have never even heard of the term Child Sexual Abuse Material (CSAM), let alone understood its devastating implications. CSAM isn't limited to explicit photos. It covers a wide range of disturbing content – videos, drawings, manipulated images, and any material that depicts or suggests the sexual exploitation of children.

Lurking dangers

A 2024 global report by the Childlight Global Child Safety Institute estimated that a staggering 300 million children around the world fall prey to CSAM every year. Behind that number are countless children whose lives have been deeply affected by online sexual abuse – receiving suggestive questions, being pushed to share images of themselves or their body parts, or being exposed to sexually explicit content involving other minors.

According to THORN – a non-profit that develops technology to protect children from sexual abuse – children under the age of 12 are often the main targets of CSAM, especially in content shared among offenders. But teens aged 13-17 are also at risk, with predators turning to sextortion – cruelly blackmailing them with the threat of exposing their most intimate moments unless they hand over more images or even money.

Kelly Chan, a clinical psychologist at Soul Mechanics Therapy in Petaling Jaya, explains that online grooming is a calculated process in which predators earn a child's trust – often targeting children who feel isolated or emotionally neglected.
'Groomers often present themselves as a supportive adult or even as a friend; to an extent, they offer praise, gifts and attention to create emotional dependency in the children,' she shares.

Chan adds that once trust is established, groomers begin to desensitise children to sexual content – often by introducing inappropriate topics disguised as games or jokes. Over time, they escalate their demands, asking for explicit photos or acts, leaving the child feeling trapped in a cycle of fear, shame and guilt.

Lifetime of trauma

Once CSAM is shared online, it spreads like wildfire – and is almost impossible to erase. Survivors live with the constant fear that someone, somewhere, is viewing their abuse, and the trauma is repeated every time a photo or video is opened, shared or saved.

'Psychologically speaking, victims can struggle with severe anxiety, depression and symptoms of post-traumatic stress disorder (PTSD),' warns Chan. 'They may experience chronic shame and low self-worth, especially if they feel they've lost control over their identity – even more so if they know others can still access their abuse at any time,' she adds.

Even if CSAM was created in the past, its continued circulation online can keep the trauma alive, leaving victims feeling powerless and trapped in a relentless cycle of abuse. Many become hypervigilant, withdrawn or even aggressive, driven by fear and distrust. This emotional toll can affect their ability to build secure relationships and friendships.

'Some children may exhibit age-inappropriate sexual behaviours, such as engaging in sexual talk or mimicking sexual acts, which could be a result of exposure to CSAM,' Chan observes.
She adds that older children may also resort to substance use, self-harm or other high-risk behaviours as a way to regain a sense of control or escape the emotional pain.

No child's play

'The circulation of CSAM online today involves a complex and evolving ecosystem,' says CyberSecurity Malaysia chief executive officer Datuk Dr Amirudin Abdul Wahab. He notes that peer-to-peer networks, encrypted messaging apps and the dark web are often used to share CSAM because of their anonymity, making detection and enforcement difficult.

Amirudin adds that there has also been a concerning shift toward the misuse of more mainstream platforms. 'Cloud storage services, social media direct messaging, and even online gaming platforms are increasingly being exploited to share or store such material, often through covert methods,' he says.

By law, those caught possessing, producing or circulating such material face tough consequences under the Sexual Offences Against Children Act 2017, with prison terms of up to 30 years. On top of that, Section 233 of the Communications and Multimedia Act 1998 adds another layer of punishment, with fines of up to RM50,000 or up to a year behind bars for distributing obscene or offensive content.

Yet the rising numbers indicate that more than law enforcement is needed to battle this epidemic, which is silently slipping through screens and reaching into the lives of young Malaysians. In just the first quarter of 2024, Malaysian authorities reported 51,638 cases of harmful online content to social media platforms – a sharp rise from the 42,904 cases recorded throughout all of 2023.

Malaysia has long been battling CSAM through various awareness initiatives, including the latest effort by the Communications Ministry and the Malaysian Communications and Multimedia Commission (MCMC).
The campaign, called Kempen Internet Selamat (KIS), is a nationwide campaign running from 2025 to 2027, involving talks, exhibitions and training on areas including online safety guides and digital literacy. KIS will be carried out in primary and secondary schools, universities and colleges, teacher training institutes, and local community spaces such as Digital Economy Centres.

Raising awareness

In December last year, Bukit Aman's Sexual, Women and Child Investigations Division (D11) principal assistant director senior assistant commissioner Siti Kamsiah Hassan issued a stern reminder that parents have a critical duty to shield their children from all forms of abuse – including sexual exploitation. Her reminder came as the country faced a troubling surge in CSAM cases.

'While awareness of general online threats such as scams has grown among the Malaysian public, understanding of the presence and dangers of CSAM remains limited,' Amirudin observes. He notes that the deeply rooted taboo and stigma surrounding abuse often prevent open discussion – leading to under-reporting and obscuring the true scale of the issue.

Amirudin also highlights a widespread lack of awareness about how seemingly innocent, everyday actions can put children at risk. 'There is a lack of sustained, targeted education that highlights the evolving risks, including how everyday actions like 'sharenting' (parents sharing children's images online) can be misused by predators,' he explains.

Everyone's responsibility

'I make it a point to ask my teens about the apps they're using, who they're talking to, and what kind of messages they're getting,' says homemaker P. Meena Kumari, whose children are aged 13 and 16. 'And honestly, just teaching them what's not okay – like someone asking for photos, or trying to move the chat to another app. Being able to talk about these things with your children goes a long way.'

But parents, too, says Meena, have to educate themselves.
'It's so easy to fall behind with all the new stuff coming out, but if we don't know what they're on, we can't really help guide them.'

While she agrees parents should bear the biggest responsibility, she also feels strongly that it takes a collective effort. 'Schools can help by teaching online safety, and tech companies really need to do more to flag and block harmful stuff before it ever reaches our children.'

If you come across any form of child sexual abuse material, don't stay silent. Report it immediately at your nearest police station or through the Communications Ministry and the Malaysian Communications and Multimedia Commission (MCMC). Every report helps protect a child.

Paedophiles may be targeting parents on dating apps to access children

The Independent

09-04-2025



Paedophiles may be trying to gain access to children through their parents' dating apps, researchers have warned.

A report co-led by the University of Edinburgh found that men who have sexually offended against children use dating apps daily, leading to calls for stronger regulation of apps used by 381 million people, according to Statista.

Research by the Childlight Global Child Safety Institute, hosted by the University of Edinburgh, found that men who sexually abuse children are nearly four times more likely to use dating sites than non-offenders. The unit found 66% of men who have sexually offended against children use dating platforms – and more than 22% use them daily.

The report, called Swipe Wrong, is part of a broader investigation into the multibillion-dollar industry of child sexual exploitation and abuse, which financially benefits perpetrators, organised crime and, according to researchers, mainstream companies. It warned that sexual exploitation and abuse of children has become a pandemic, affecting more than 300 million children every year.

Research based on a survey of about 5,000 men in the UK, US and Australia showed single mothers are at particular risk, while 11.5% of men surveyed admitted having sexual feelings towards children and 11% confessed to sexual offences against minors. It followed a separate survey by the Australian Institute of Criminology that found 12% of dating app users received requests to facilitate child sexual exploitation and abuse – most often related to their own children.

Most dating sites do not require new users to provide evidence of their identity, and the report shared new insights into perpetrator behaviour online. It found offenders may appear trustworthy, as they are more likely to have a child in their house, work with children, and have a higher education level.
The report also found that men who have committed sexual offences against children engage more frequently in certain online activities, such as online shopping, dating and gaming, and are also more likely to own and use cryptocurrency and to buy sexual content online.

Report co-author Professor Michael Salter, director of the Childlight East Asia and Pacific Hub at the University of New South Wales, said: 'Our findings provide clear evidence that dating apps lack adequate child protection measures, and loopholes are exploited by abusers to target single parents and their children.

'There's no reason why the robust user identification methods we have in other industries, such as banking and gambling, should not also have to be adopted by dating app platforms.

'Similarly, there are a range of AI tools and systems that can flag problematic words and conversations that can and should be used.'

Professor Deborah Fry, Childlight's global director of data and professor of international child protection research at the University of Edinburgh, said: 'Child sexual exploitation and abuse is a global public health emergency that requires emergency measures – but it's preventable.

'We must mobilise globally, focusing not just on reactive law enforcement but on prevention strategies tackling underlying determinants of abuse – including financial and technological ecosystems sustaining it.'

Paedophiles 'may be targeting children through parents' dating apps'

Telegraph

09-04-2025



Paedophiles may be targeting children through their parents' dating apps, researchers have warned.

The study, based on 5,000 men, found that those who sexually offended against children were more likely to use dating apps. The researchers said that adult dating apps provided fewer safeguards against sexual exploitation than other platforms, and called on dating apps to make greater use of user identification methods and AI systems to flag suspect conversations and problematic keywords.

The research by the Childlight Global Child Safety Institute, hosted by the University of Edinburgh, found men who sexually abuse children are nearly four times more likely to use dating sites than non-offenders. It is estimated that as many as 381 million people worldwide use dating apps. The unit found 66 per cent of men who have sexually offended against children use dating platforms – and more than 22 per cent use them daily. It warned that sexual exploitation and abuse of children has become a pandemic, impacting more than 300 million every year.

Single mothers at risk

The report, called Swipe Wrong, is part of a broader investigation into the multi-billion-dollar industry of child sexual exploitation and abuse, which financially benefits perpetrators, organised crime and, according to researchers, mainstream companies.

Based on a survey of about 5,000 men in the UK, US and Australia, it showed single mothers were at particular risk, while 11.5 per cent of men surveyed admitted having sexual feelings towards children and 11 per cent confessed to sexual offences against minors. Previous research has suggested 12 per cent of dating app users received requests to facilitate child sexual exploitation and abuse, most often related to their own children.

It found offenders may appear trustworthy, as they are more likely to have a child in their house, work with children and have a higher level of education.
The report also found that men who have committed sexual offences against children engage more frequently in certain online activities, such as online shopping, dating and gaming, and are also more likely to own and use cryptocurrency and to buy sexual content online. Some dating sites do not require new users to provide evidence of their identity.

Abusers exploiting 'loopholes'

Prof Michael Salter, a co-author of the report and director of the Childlight East Asia and Pacific Hub at the University of New South Wales, said: 'Our findings provide clear evidence that dating apps lack adequate child protection measures, and loopholes are exploited by abusers to target single parents and their children.

'There's no reason why the robust user identification methods we have in other industries, such as banking and gambling, should not also have to be adopted by dating app platforms.

'Similarly, there is a range of AI tools and systems that can flag problematic words and conversations that can and should be used.'

Prof Deborah Fry, Childlight's global director of data and professor of international child protection research at the University of Edinburgh, said: 'Child sexual exploitation and abuse is a global public health emergency that requires emergency measures – but it's preventable.

'We must mobilise globally, focusing not just on reactive law enforcement but on prevention strategies tackling underlying determinants of abuse, including financial and technological ecosystems sustaining it.'

Warning that paedophiles are using dating apps to target parents

The Independent

08-04-2025



Researchers have warned that paedophiles may be using parents' dating apps to target children.

A report co-led by the University of Edinburgh revealed that men who have sexually offended against children are active daily on dating apps. This has led to increased calls for stricter regulation of these platforms, which are used by 381 million people, according to Statista.

The Childlight Global Child Safety Institute, hosted by the University of Edinburgh, found that men who sexually abuse children are nearly four times more likely to use dating sites compared to non-offenders. The study also found that 66% of men who have sexually offended against children use dating platforms, with over 22% using them daily.

The report, called Swipe Wrong, is part of a broader investigation into the multibillion-dollar industry of child sexual exploitation and abuse, which financially benefits perpetrators, organised crime and, according to researchers, mainstream companies. It warned that sexual exploitation and abuse of children has become a pandemic, impacting more than 300 million every year.

Research based on a survey of about 5,000 men in the UK, US and Australia showed single mothers are at particular risk, while 11.5% of men surveyed admitted having sexual feelings towards children and 11% confessed to sexual offences against minors. It followed a separate survey by the Australian Institute of Criminology that found 12% of dating app users received requests to facilitate child sexual exploitation and abuse – most often related to their own children.

Most dating sites do not require new users to provide evidence of their identity, and the report shared new insights into perpetrator behaviour online. It found offenders may appear trustworthy, as they are more likely to have a child in their house, work with children, and have a higher education level.
The report also found that men who have committed sexual offences against children engage more frequently in certain online activities, such as online shopping, dating and gaming, and are also more likely to own and use cryptocurrency and to buy sexual content online.

Report co-author Professor Michael Salter, director of the Childlight East Asia and Pacific Hub at the University of New South Wales, said: 'Our findings provide clear evidence that dating apps lack adequate child protection measures, and loopholes are exploited by abusers to target single parents and their children.

'There's no reason why the robust user identification methods we have in other industries, such as banking and gambling, should not also have to be adopted by dating app platforms.

'Similarly, there are a range of AI tools and systems that can flag problematic words and conversations that can and should be used.'

Professor Deborah Fry, Childlight's global director of data and professor of international child protection research at the University of Edinburgh, said: 'Child sexual exploitation and abuse is a global public health emergency that requires emergency measures – but it's preventable.

'We must mobilise globally, focusing not just on reactive law enforcement but on prevention strategies tackling underlying determinants of abuse – including financial and technological ecosystems sustaining it.'
