Six child sexual image offences a day in Wales, new figures show

BBC News · 18 February 2025
Welsh police forces logged six offences a day relating to child sexual abuse images last year, new figures show. One campaigner has criticised the use of disappearing messages on apps such as Snapchat, after the app was noted in half of the cases where a social media platform was recorded. Children's charity the NSPCC has called for tougher action from politicians and social media companies, saying there can be "no excuse for inaction or delay". The UK government said it was "committed to the robust implementation of the Online Safety Act" and had already passed laws to crack down on child sexual abuse online.
Home Office figures showed 2,194 child sexual abuse image crimes were recorded in Wales last year. South Wales Police recorded the most with 964, followed by North Wales Police, which had 535. There were 503 child sexual abuse image crimes reported to Gwent Police and 192 to Dyfed-Powys Police. The category covers a wide range of offences, from possessing, making, distributing or publishing child abuse material to sharing indecent images or coercing someone under 18 into sending them. In a separate Freedom of Information request, the NSPCC found that, of the offences in England and Wales where the social media platform used by the perpetrator was recorded, half had taken place on Snapchat. Other platforms included Instagram at 11%, Facebook at 7% and WhatsApp at 6%. Last year Childline delivered 903 counselling sessions to children and young people relating to blackmail or threats to expose or share sexual images online - a 7% increase compared with 2022/23.
Snapchat offences 'a wake-up call'
Mared Parry from Wales, who now lives in London, was 14 when she was groomed online by men who manipulated her into sending sexual images of herself. The presenter and journalist is also an online safety campaigner who has worked with the NSPCC, and described the scale of the problem as "horrifying". "Evolving technology seems to have made it easier for groomers to get away with their crimes when it should be the opposite," she said. "Online abuse has very real consequences."
Snapchat being the platform used in half the recorded offences should also be a "wake-up call", said Ms Parry. "Its disappearing messages, lack of accountability, and emphasis on privacy create the perfect conditions for abuse to go undetected. "It's already difficult enough for victims to prove abuse, and features like this just make it even easier for offenders to cover their tracks. "Yet, tech companies continue to prioritise user engagement over safeguarding, and the consequences are devastating."
What should you do if you are threatened online?
Being threatened online over sexual images can be a frightening experience. The Internet Watch Foundation (IWF) has the following advice: Remember that you are not at fault if approached or threatened online. The person trying to blackmail or sexually extort you is the one in the wrong, and many other young people have been in a similar situation. Stop all contact with anyone who is trying to threaten you, and do not share any more images or videos or pay any money of any sort. If you have been communicating on an app, there should be in-built tools to block and report the user. You will not be in trouble with the police - report what has happened to your local police on 101 or by making a report to the National Crime Agency's CEOP Safety Centre, where a child protection advisor will make sure you get the help you need. You can also use an online tool called Report Remove, and the IWF will then try to have the sexual images, videos or links removed from the internet. You can also talk to Childline, which has provided support to others in the same situation. For parents, it is also vital to have open and honest conversations with your children about the risks online and to listen to their concerns.
Tech bosses 'let off the hook'
The NSPCC has issued a joint call with other charities, including Barnardo's and the Marie Collins Foundation, for the UK government to give the regulator Ofcom greater powers. Currently, user-to-user services are only required to remove illegal content where it is "technically feasible", according to Ofcom, something the charities have criticised as an "unacceptable loophole". The charities said children will not be protected from the worst forms of abuse on private messaging services under Ofcom's current plans. But with most of the offences taking place on private messaging sites, the NSPCC also says companies need to introduce "robust safeguards" so their sites are not "a safe haven for perpetrators of child sexual abuse".
"These offences cause tremendous harm and distress to children, with much of this illegal material being repeatedly shared and viewed online," said NSPCC chief executive Chris Sherwood. "It is an outrage that in 2025 we are still seeing a blatant disregard from tech companies to prevent this illegal content from proliferating on their sites," he added. "Having separate rules for private messaging services lets tech bosses off the hook from putting robust protections for children in place."This enables crimes to continue to flourish on their platforms even though we now have the Online Safety Act."
In a statement, a Home Office spokesperson described child sexual exploitation and abuse as despicable, and said tech company design choices cannot be used as an excuse not to root out "heinous crimes". "UK law is clear: child sexual abuse is illegal and social media is no exception, so companies must ensure criminal activity cannot proliferate on their sites. "We have already introduced four new laws to crack down on child sexual abuse online and we will not hesitate to go further to protect children from vile online predators."
A Snapchat spokesperson condemned any sexual exploitation on the platform and said it works with law enforcement agencies to identify information and content if necessary. "Whether that's through our proactive detection efforts or confidential in-app reporting, we remove it, lock the violating account, and report to authorities. "Snapchat is designed to make it difficult for predators to find and interact with young people and has extra safeguards in place to help prevent strangers from connecting with teens. "Our Family Centre also allows parents to see who their teens are friends with and talking to on Snapchat. "We work with expert NGOs and industry peers to jointly attack these problems and don't believe the methodology used in this report reflects the seriousness of our collective commitment and efforts."

Related Articles

'I'll make 9/11 look like The Teletubbies' said accused man
The Herald Scotland · 12 hours ago

The 24-year-old had been arrested after posting online a clip of him blowing up two gas canisters near the River Leven in Methil, Fife. Ross's home was raided and police discovered a drawing marked "Project Payback". A phone and a tablet were also examined, which contained the voice message about the 2001 Twin Towers atrocity and him discussing with others "murdering all the people who wronged you".

Ross appeared in the dock this week at the High Court in Glasgow. He pled guilty to a charge of behaving in a threatening and abusive manner, which included sending the concerning messages and voice notes on Snapchat and Facebook, causing an explosion, filming it and putting the footage on social media between June 25 and July 31 2024. His not guilty plea to a charge under the Terrorism Act was accepted. Ross was remanded in custody and will be sentenced at a later date.

A Facebook friend of Ross had spotted the explosion video in late July 2024. He showed it to a young woman who was so "alarmed" she contacted police. Ross was held that day after being spotted in Methil. Detectives - along with counter-terrorism officers - searched his home in the town. They found the "Project Payback" drawing along with a sketch of what was described as a "homemade explosive device". Inside a desk were various items including a roll of wire, mobile phone batteries, nails, screws and a watch. There were also initial fears about a package in the property, which led to homes in the area being evacuated, but it did not contain an explosive.

Prosecutor Greg Farrell said Ross "laughed" when first quizzed about what he had filmed, claiming it was an "attempt at satire comedy". But he confirmed that he had blown up two butane gas canisters and had posted the footage on his Facebook page under the name of a Batman comic villain. Mr Farrell said: "He made reference to social media corrupting his decisions." Ross was asked about his interest in Kaczynski - captured in 1996 - and said he was "apparently some kind of mail bomber" whom he had learned more about by going down a "rabbit hole" online. Ross went on to insist that he himself was "not a terrorist" as he had "made peace with everything in his life".

But police found a series of concerning messages during checks of his phone and black tablet. In late June 2024, he wrote to 15 users on Facebook Messenger: "Here guys, I am just here to inform you that the only thing stopping you from murdering all of the people who wronged you is just a box. That is only if you cannot do it correctly and make sure enough evidence is gone so that the charges do not stick." He referred to "instructions on how to make a pipe bomb", adding: "Hope this comes in handy for you one day x".

In messages on the day of the River Leven explosion, a social media contact called Ross "a human". Mr Farrell then told the court of a Snapchat conversation with a friend shortly before. Ross stated at one stage: "I am going to make 9/11 look like an episode of the Teletubbies, f*** sake." He then backtracked again, claiming it was "satire comedy" and that he was "only joking". The friend replied: "F*** Islam, f*** them all."

During further rants, Ross said he had been let down by the "justice system", moaning he had been treated differently because he is a man. In messages to another contact, he said: "I have realised that I simply cannot allow what is happening to humanity and our world to continue. I can and will have an impact on preventing the worst from happening xx."
The court heard there were also photos, videos and sketches of the "Unabomber" on the devices as well as images of a pipe bomb and firearms together with "various other clips which suggest violence". Lord Colbeck deferred sentencing for reports.

Age verification UK explained: How is it impacting the UK?
The Herald Scotland · 12 hours ago

Age verification tools are now being used on sites where children could access harmful content. Here's all you need to know about the new rules and how they are being implemented.

"Well done to everyone who campaigned to ensure age verification for pornography was in the Online Safety Act! Today it comes into force and while no doubt there will be some who get around it, it means young kids in particular won't be stumbling on violent and harmful porn." - Jess Asato MP (@Jess4Lowestoft), July 25, 2025

What is the Online Safety Act?
The Online Safety Act is a piece of legislation that received Royal Assent on October 26, 2023, with the aim of protecting children and adults online. The Government website adds: "It puts a range of new duties on social media companies and search services, giving them legal duties to protect their users from illegal content and content harmful to children. The Act gives providers new duties to implement systems and processes to reduce risks their services are used for illegal activity, and to take down illegal content when it does appear."

Why is age verification being used on the internet?
As of July 25, internet platforms have a legal duty to protect children from harmful content. Companies within the scope of the act must introduce safety measures as part of this, which include age verification. The Guardian reports: "This means all pornography sites must have in place rigorous age-checking procedures." It continued: "Social media platforms and large search engines must also prevent children from accessing pornography and material that promotes or encourages suicide, self-harm and eating disorders." Platforms will also have to suppress other material that could be potentially harmful to children. This could include "the promotion of dangerous stunts, encouraging the use of harmful substances and enabling bullying".

How is age verification utilised by platforms?
Ofcom, the media regulator, has set out a number of ways websites can verify the age of users. This can be done through credit card checks, photo ID matching and estimating age using a selfie. Whatever format platforms choose, they must be "technically accurate, robust, reliable and fair", BBC News reports.

Which sites will require age verification?
Pornhub and a number of other major adult websites have confirmed they will introduce enhanced age checks, BBC News reports. Reddit has already introduced checks to stop people aged under 18 from looking at "certain mature content", while X and Grindr have committed to this as well. Discord gives UK users a choice of face or ID scanning as a way to verify their age, after testing these methods, and Bluesky says it will give UK users a range of different verification options. BBC News adds: "Many more services which allow sexually explicit material may need to bring in measures to comply with the new rules."

Migrant hotel protests spread across the country with more planned today as cops clamp down on weekend of stand-offs
The Sun · 15 hours ago

MIGRANT hotel protests have spread across the country as furious citizens take to the streets to challenge illegal immigration. Yesterday protests were held across the country, with demonstrations outside migrant hotels in Norwich, Leeds, Southampton and Nottinghamshire. Further demonstrations are planned today in Epping, Wolverhampton and Cheshire as public anger over the Government's continued use of migrant hotels rises. The protests have so far remained peaceful, but some minor confrontations with counter-protesters were seen. A group of counter-protesters wearing masks reportedly broke away from the main group at the Nottinghamshire demonstration and walked into the middle of the crowd. Some were said to be carrying 'Stand Up to Racism' placards and were escorted away by police. Police have so far arrested 18 people and charged seven in connection with the continuing protests in Epping. The migrant hotel demonstrations began after an asylum seeker was charged with sexual assault. The man is alleged to have attempted to kiss a 14-year-old girl. Protests have spread across the country, with demonstrations held earlier in the week outside the four-star Britannia Hotel in Canary Wharf. According to the latest Home Office data, 32,000 asylum seekers are being housed in around 210 hotels across the country. A record 24,000 migrants have crossed the Channel so far in 2025.
