
Latest news with #Childlight

Children are speaking to strangers online – and grooming is on the rise. This is how to protect them

The Guardian

21-05-2025



When we look at what causes poor mental health, we often think of stress, genetics, poverty or loneliness. These are all contributing factors, but there's another, more hidden cause that isn't talked about enough: abuse, especially during childhood. I recall Chad Varah, the founder of Samaritans, reflecting that many things drove people to call the charity's suicide helpline, but abuse was a prominent reason. Abuse isn't an easy subject to raise or talk about. It brings up issues of gender dynamics – a colleague studying global sexual abuse told me: 'The vast majority of perpetrators are men; the victims are equally boys and girls.' These are hard issues to think about, harder still to discuss and difficult to address. They challenge notions of safety, trust, family and community. But if we want to make progress in addressing poor mental health, we have to start here – with the truths we'd rather avoid.

The internet has fundamentally changed our world. Where once we worried about a child walking home from school alone or sleeping over at a friend's house, now we have the entire online world to contend with. Grooming and exploitation no longer happen only in person – they happen on smartphones, in video games and through tablets handed over to keep kids entertained. And too often, the adults meant to protect children are far behind.

Recent research from Childlight, a child safety charity at the University of Edinburgh, has shown a steep rise in online grooming cases. It estimates that about 830,000 young people worldwide are at risk of sexual exploitation and abuse every day. This includes explicit photo sharing, sexual extortion, solicitation, deepfake images, pornography and grooming. Social media platforms, messaging apps and multiplayer games have become common avenues for abusers to target youngsters. They've been designed to be attractive and addictive to children, but largely without their safety in mind.
And that puts the burden of protection unfairly on parents, many of whom don't understand the risks – or aren't even aware that they exist.

Take Roblox, a platform marketed as a child-friendly virtual playground. Behind its colourful, blocky graphics and simple games lies a reality far less innocent. A recent study examining interactions within the game found that, through its open chat features, users were able to initiate contact with children as young as five and could build rapport with them over time before moving to other, less public platforms. Children could also see and hear sexual and suggestive content while playing various games. The researchers found that a test avatar registered to an adult could ask for a five-year-old's test avatar's Snapchat details on the platform. Just last month, a California man was accused of kidnapping and sexual conduct with a 10-year-old he met on Roblox. The surface looks benign. The danger lies underneath.

It should also be noted that even if physical contact never occurs, exposure to traumatic or sexually inappropriate content can still leave lasting mental scars. The internet, and social media in particular, have made it easier to access this kind of content, even if a child is never contacted by a coercive individual. Online abuse can take many forms, from exposure to sexual images and videos to inappropriate sexual and non-sexual language, extortion and solicitation.

In 2023, an estimated 19% of children aged 10 to 15 in England and Wales exchanged messages with someone online whom they had never met in real life. Nearly a third of eight- to 17-year-olds who game online say they chat to strangers while gaming. The vast majority of those interactions will be harmless, but when bad things do happen, many children feel isolated or unsupported: only half of children in a survey in England told their parents or a teacher about harmful content they had seen online.
The same shame, confusion, fear and guilt that silence victims of abuse in the real world also mute those suffering from exposure virtually. That silence can be deadly. Studies have shown that children who are groomed or coerced online often suffer from anxiety, depression, PTSD and suicidal thoughts. According to Samaritans, children and young people with histories of abuse are at far higher risk of self-harm and suicide. Varah's reflections underscore this: abuse, especially when unaddressed, can derail an entire life.

But awareness is the first step towards prevention. We need to remove the stigma around abuse so that survivors of any age can speak up. We also need to better understand how quickly the landscape of risk is evolving. This means having open conversations with children, not just once but regularly. It means teaching them that they can talk to us about anything they see or experience online, without fear or shame.

Governments need to regulate tech companies so that they are accountable for creating safer environments; leaving it to voluntary initiatives is not enough. On 25 July, the Online Safety Act will be implemented in Britain, with clear safety rules for platforms to protect young people from harmful content, online abuse and sexual material. It is an important step forward in treating this issue with the urgency it deserves – just as we would with any other public health threat. While Samaritans continues its vital work supporting those in crisis, we owe it to our children to intervene earlier, to prevent that crisis from occurring in the first place. If we want to protect the mental health of young people, we need to start where the damage begins – and that means looking directly at the hard truths, online and off.

Prof Devi Sridhar is chair of global public health at the University of Edinburgh

When dating apps turn dangerous

Associated Press

09-04-2025



EDINBURGH, United Kingdom, April 09, 2025 (GLOBE NEWSWIRE) -- Women looking for love online are being warned of a sinister side to dating apps, with new evidence that abusers may try to groom them to gain access to their children.

Research by the Childlight Global Child Safety Institute finds men who sexually offend against children are nearly four times more likely to use dating sites than non-offenders. The unit, hosted by the University of Edinburgh and the University of New South Wales, found nearly two-thirds (66%) of men who sexually offended against children used dating platforms – and more than one in five (22%) used them daily.

The report is part of a broader investigation into the multibillion-dollar industry of child sexual exploitation and abuse, which financially benefits perpetrators, organised crime and even mainstream companies. While Childlight warns that sexual exploitation and abuse of children has become a pandemic, affecting more than 300 million children every year, it says education, legislation and technological measures can help prevent it.

Its findings, based on a survey of about 5,000 men in Australia, the UK and the US, represent the latest evidence of the risk of dating site misuse by people who sexually offend against children. They follow a separate recent survey by the Australian Institute of Criminology that found 12% of dating app users received requests to facilitate child sexual exploitation and abuse – often related to their own children. Recent high-profile cases include that of Scottish lorry driver Paul Stewart, who manipulated single mothers via dating apps to gain access to their children for sexual abuse. He was jailed for more than three years last December.

Around 381 million people use dating apps such as Tinder, according to Statista.
Report co-author Professor Michael Salter, director of the Childlight East Asia and Pacific Hub at UNSW, said: 'Our findings provide clear evidence that dating apps lack adequate child protection measures, and loopholes are exploited by abusers to target single parents and their children.'

The survey of 5,000 men found 11.5% admitted having sexual feelings towards children, while 11% confessed to sexual offences against minors. Most dating sites do not require new users to provide evidence of their identity. Salter recommends user verification processes, such as mandatory ID checks, and tools to detect predatory behaviours such as grooming language or suspicious messaging patterns.

Childlight's research also reveals that mainstream companies profit from and perpetuate the global trade in technology-facilitated child sexual exploitation and abuse (CSEA). They include payment transfer firms and social media platforms where illegal child sexual abuse images are present and where abuse-related traffic can increase advertising revenues.

Debi Fry, Childlight's Global Director of Data and Professor of International Child Protection Research at the University of Edinburgh, said: 'Child sexual exploitation and abuse is a global public health emergency that requires emergency measures, but it's preventable. We must mobilise globally, focusing not just on reactive law enforcement but on prevention strategies tackling the underlying determinants of abuse – including the financial and technological ecosystems sustaining it.'

Paedophiles may be targeting parents on dating apps to access children

Yahoo

09-04-2025



Paedophiles may be trying to gain access to children through their parents' dating apps, researchers have warned. A report co-led by the University of Edinburgh found that men who have sexually offended against children use dating apps daily, leading to calls for stronger regulation of apps used by 381 million people, according to Statista.

Research by the Childlight Global Child Safety Institute, hosted by the University of Edinburgh, found men who sexually abuse children are nearly four times more likely to use dating sites than non-offenders. The unit found 66% of men who have sexually offended against children use dating platforms – and more than one in five (22%) use them daily.

The report, called Swipe Wrong, is part of a broader investigation into the multibillion-dollar industry of child sexual exploitation and abuse, which financially benefits perpetrators, organised crime and, according to researchers, mainstream companies. It warned that sexual exploitation and abuse of children has become a pandemic, impacting more than 300 million children every year.

Research based on a survey of about 5,000 men in the UK, US and Australia showed single mothers are at particular risk, while 11.5% of men surveyed admitted having sexual feelings towards children and 11% confessed to sexual offences against minors. It followed a separate survey by the Australian Institute of Criminology that found 12% of dating app users received requests to facilitate child sexual exploitation and abuse – most often related to their own children.

Most dating sites do not require new users to provide evidence of their identity, and the report shared new insights into perpetrator behaviour online.
It found offenders may appear trustworthy, as they are more likely to have a child in their house, work with children and have a higher education level. The report also found men who have committed sexual offences against children engage more frequently in certain online activities, such as online shopping, dating and gaming, and are also more likely to own and use cryptocurrency and to buy sexual content online.

Report co-author Professor Michael Salter, director of the Childlight East Asia and Pacific Hub at the University of New South Wales, said: 'Our findings provide clear evidence that dating apps lack adequate child protection measures, and loopholes are exploited by abusers to target single parents and their children.

'There's no reason why the robust user identification methods we have in other industries, such as banking and gambling, should not also have to be adopted by dating app platforms.

'Similarly, there are a range of AI tools and systems that can flag problematic words and conversations that can and should be used.'

Professor Deborah Fry, Childlight's global director of data and professor of international child protection research at the University of Edinburgh, said: 'Child sexual exploitation and abuse is a global public health emergency that requires emergency measures – but it's preventable.

'We must mobilise globally, focusing not just on reactive law enforcement but on prevention strategies tackling underlying determinants of abuse – including financial and technological ecosystems sustaining it.'

Social media firms profiting from surge in sextortion cases, experts warn

The Independent

09-04-2025



Social media giants are among a host of companies profiting from a grim new criminal trend of financial extortion in which children are tricked into sending nude images and blackmailed for money, experts have warned.

While extortion online for sexual purposes is not a new phenomenon, monitoring groups say that in early 2022 they noticed a surge in fake online profiles mostly targeting teenage boys – with organised criminals then rapidly demanding monetary payment or threatening to share the images, which can be either real or fake.

In recent weeks, the National Crime Agency has launched a new social media campaign to raise awareness of this emerging threat, warning that in each of the first five months of 2024, British police forces received an average of 117 reports of 'sextortion' from under-18s – with the actual figure likely to be far higher.

Now, experts at the University of Edinburgh's Childlight institute, which works with Interpol to track and combat child sexual exploitation and abuse, have warned that a range of companies are facilitating and profiting from such crimes. In a new report tracking how child sexual exploitation and abuse is linked to serious organised crime, Childlight's researchers warn: 'The financial gains not only accrue to the offender.

'By involving and misusing commercial enterprises, such as electronic service providers (social media, messaging apps and video call services) and online payment systems (money transfer services, cryptocurrency and online banking), these institutions become facilitators of the exploitation.

'Hence, these platforms play a role in the sexual exploitation and abuse that occurs on their services – services which have evolved to suit users – and by doing so they facilitate and enable the abuse to occur.'
As advertising is linked to the number of platform users, social media companies can make more money as more people use their platforms, including for purposes linked to the sexual exploitation and abuse of children, the report states. 'As a result, there is a financial gain for these sectors and for the offenders, which could be seen as a disincentive to regulate such activities,' the researchers warn in the report shared with The Independent.

Beyond these streams, this crime has also spawned fee-based companies that provide cybersecurity and reputation management services to victims to combat the offending extorters, the researchers warn. These fees are often paid up front and can amount to thousands of dollars – which only further commodifies child sexual abuse content by forcing victims to pay for a solution to the crime of exploitation committed against them, Childlight warns.

Childlight is now calling for companies making money from the illegal trade to become better at spotting red flags that point to illicit activity – and to face financial penalties if they fail to do so.

Rhiannon-Faye McDonald, head of advocacy at the sexual abuse support charity the Marie Collins Foundation, was targeted online at the age of 13 by a sexual predator posing as a teenage girl. Soon afterwards he turned up at her home and abused her in person. She told The Independent: 'The impacts cannot be understated. We have seen many cases reported in the media of victims who have lost their lives as a result of this abuse.

'For most victims and survivors, even with the right support, the impacts are significant and long-lasting. We live with deep feelings of shame, misplaced self-blame, and the fear of being recognised by those who have seen the images or videos of our abuse.'

Ms McDonald added: 'This is not a new issue – technology-assisted child sexual abuse has been happening for decades already, and yet we are still struggling to address it.
'Every year we see higher and higher numbers of children abused and child sexual abuse material being shared. As a society it is our responsibility to protect children, and so it really is time that we get a handle on this.

'For too long platforms have favoured profit over safety, and the rising numbers of children being abused is a direct result.'

Experts at the US National Center for Missing & Exploited Children (NCMEC) – which tracks such cases globally, mainly through mandatory reporting by electronic service providers such as social media firms – say they first began to notice cases of mostly teenage boys being financially extorted in early 2022. In the year to June 2024, the agency said it had received an average of 556 reports of financial sextortion per week, amounting to more than 28,000 global cases per year. In recent testimony to the US Congress, the NCMEC said it was aware of more than three dozen teenage boys who have taken their lives since 2021 as a result of being victimised by financial 'sextortion'.

Childlight has identified tell-tale digital clues that it says should act as red flags to financial institutions and law enforcement, making it easier for them to detect and disrupt abusers. These include spending patterns involving cryptocurrency, online gaming, online shopping and the purchasing of sexual content online.

In their new report, the researchers call for a system of fines internationally to penalise companies for the role they play in facilitating such abuse. They are also urging greater regulation of sectors such as app stores and internet service providers, requiring them to use all available safety tools – including age assurance, content moderators and technology that can match images being traded against known child sexual abuse material.
'The exploitation of children is not just an atrocity — it is an industry, generating billions of dollars in profits,' said Paul Stanfield, a former director of Interpol and now chief executive of Childlight, which estimates that over 300 million children are targeted annually by at least one form of online abuse. 'This is a market, structured and profitable, designed to generate revenue off the backs of vulnerable children. But markets can be disrupted, and that is where change must begin.'
