
Ofcom announces 'game changing' new rules to keep children safe online

ITV News, 24 April 2025


Ofcom has announced what it calls 'game-changing' new rules designed to keep children safe online.

The communications regulator, which now also oversees online safety, published more than 40 measures which make up the Protection of Children Codes and Guidance on Thursday. Tech firms must follow them under the Online Safety Act, and any platform likely to be accessed by children in the UK will need to abide by them - or face fines which could run into millions of pounds.

The measures include requiring online platforms to have robust age checks to stop children from accessing harmful content, to ensure that algorithms which recommend content do not operate in a way that harms children, and to implement more effective moderation systems so that quick action is taken on harmful content.

Ofcom says its priority is to protect children so they can enjoy the benefits of being online, without experiencing the potentially serious harms that exist in the online world.

On Thursday, the regulator's chief executive, Dame Melanie Dawes, called the changes a "reset" for children online. She continued: "They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content.

"Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement."

Whether they intend to or not, children are currently able at times to access harmful content, including violence, hate, pornography and misogyny, and algorithms can also lead youngsters towards content promoting suicide, eating disorders and self-harm. They can also be exposed to cyberbullying, sextortion and dangerous online challenges.

Research by Ofcom last year found that children aged eight to 17 spend between two and five hours online per day, and that the amount of time increases with age. Nearly every child over 12 has a mobile phone, and almost all of them watch videos on platforms such as YouTube or TikTok. Children aged five to seven are increasingly present online, with a third found to use social media.

There is concern, however, that the new Protection of Children Codes don't go far enough.

Daisy Greenwell, who co-founded the grassroots movement Smartphone Free Childhood, told ITV News: "This is a positive step forward in building a digital world that doesn't expose children to the sorts of harm we'd never allow in the real world.

"But while this new Code is welcome, it also highlights how slowly the wheels of regulation turn compared to the pace of technological change… In real life, no product, toy or device used by kids hits the market without extensive testing and proof of safety.

"And yet our children are spending an average of 35 hours a week on devices and platforms that have gone through no safety testing, and where overwhelming evidence points to serious harm.

"We need to build on the momentum of the Children's Code and demand bolder, faster and more radical action so that every child can grow up free from addictive design, toxic content and exploitative algorithms."

Platforms have three months from Thursday to complete children's risk assessments, and by July 25, 2025, they must start implementing appropriate safety measures to protect children.
Tech safety expert Lina Ghazal, who used to work at Meta and Ofcom and is now Head of Regulatory and Public Affairs at online safety provider Verifymy, said: "With the regulator pledging to start enforcement in July, the era of tick-box access for content sites is over, while platforms that allow young users but have 18+ features will have to prove these are effectively walled off.

"Robust age checks will play a leading role in ensuring the success of the codes, and with greater scrutiny on content and the role of algorithm 'recommendations', providers must also not shirk on their moderation policies. Innovative AI-powered tools, designed to work in tandem with human judgment, can now sound the alarm on harmful content before it is even published.

"In 2025, the industry's top priority should be creating safer, more supportive online environments for children. The regulatory framework and the technology to back it up already exist, so platforms have no excuse not to take immediate action."

Some commentators, such as the American social psychologist Jonathan Haidt, author of the bestseller 'The Anxious Generation', have drawn a direct line between increased use of social media and smartphones and a rapid decline in the mental health of young people. In the UK, the Department for Science, Innovation and Technology has commissioned a study, led by the University of Cambridge, to determine whether evidence exists to establish such a direct link.

Companies that run social media platforms all currently have various measures in place. Meta introduced Teen Accounts on Instagram last September, and earlier this year expanded them to Facebook and Messenger. X and TikTok have pledged to comply with UK law, and Snapchat says it is supportive of the Online Safety Act. YouTube says it invests heavily in the technology and teams that help to provide children and families with the best protection possible.
