
Latest news with #Children'sCodes

Major plan to switch off kids' social media while they're at school being looked at

Daily Mirror

24-04-2025

  • Politics
  • Daily Mirror

Major plan to switch off kids' social media while they're at school being looked at

Technology Secretary Peter Kyle told The Mirror he is looking 'very carefully' at ways to stop children wasting hours scrolling on their phones after youngsters begged for help.

Children could have their access to social media switched off while they're at school under plans being considered by ministers. Technology Secretary Peter Kyle is looking for ways to help kids manage their online lives and stop young people wasting hours doom-scrolling on the internet. The Mirror understands these could include a social media curfew or limiting kids' access to the internet during school hours.

Mr Kyle has said he will be examining the results of TikTok's recently announced 10pm curfew for under-16s. 'I'm looking at all the measures that would positively contribute towards a positive, enthusiastic, supportive environment online,' Mr Kyle told the Mirror.

After years of bureaucratic consultations, media regulator Ofcom published its Children's Codes under the Online Safety Act (OSA), which set out rules tech firms must follow by July. Under the codes, online sites must introduce robust age verification tools to make sure underage kids aren't accessing things they shouldn't. They have also been ordered to tame toxic algorithms and take faster action on removing harmful content.

Mr Kyle celebrated the 'first step' in the journey to improving kids' safety online but admitted the OSA is 'lopsided' and more action is needed. He said he was taking a step back to think about how the addictive nature of phones and social media is also 'disrupting the childhood experience', as well as online harms.

'Sometimes it's interfering with young people's sleep or ability to concentrate when they're doing school work, sometimes out of hours, as well as focusing on the school day itself, even though 97% of schools do exclude smartphones from school itself,' he said.

Mr Kyle has confirmed he would not support an Australia-style blanket ban on under-16s using social media, and the Government has also ruled out a statutory ban on phones in school, arguing that the majority already enforce one. But he said he was 'looking very carefully at what comes next' to help the kids who beg him for solutions after suffering 'resentment' at accidentally wasting hours scrolling on their phones.

Elsewhere, Mr Kyle insisted he is confident tech billionaires Elon Musk, owner of X/Twitter, and Meta chief Mark Zuckerberg will follow Ofcom's new rules despite the pair having recently rolled back content moderation on their platforms. The Cabinet minister said anyone who breaks the codes will 'face the full consequences of the British law', which can include fines of up to 10% of global turnover from Ofcom or, in extreme cases, their platforms being switched off in the UK.

He also said Britain's online safety laws were not up for negotiation amid fears the Donald Trump administration is pushing for them to be eased in UK-US trade deal talks.

Concerns have been raised about the leeway social media giants will have to work around Ofcom's codes. For instance, the media regulator is only telling tech firms they have the 'option of excluding' content showing dangerous online challenges, material that incites hatred or misogynistic content. In an interview with the Mirror, Ofcom's child protection policy lead Almudena Lara said tech companies will be able to choose whether to exclude such content from being pushed to kids through their algorithms, to give them the freedom to show different content to older teens and younger children.
'The ball is in their courts to understand their user base and to understand the content that they have and how best to serve their users,' she said.

Ian Russell, the dad of Molly Russell, who took her own life at 14 after being bombarded with harmful material online, said he was 'dismayed' by Ofcom's codes and that he has lost trust in Mr Kyle. 'I am dismayed by the lack of ambition in today's codes. Instead of moving fast to fix things, the painful reality is that Ofcom's measures will fail to prevent more young deaths like my daughter Molly's,' he said. 'Ofcom's risk averse approach is a bitter pill for bereaved parents to swallow. Their overly cautious codes put the bottom line of reckless tech companies ahead of tackling preventable harm.'

Ofcom announces new rules for tech firms to protect children online

Yahoo

24-04-2025

  • Politics
  • Yahoo

Ofcom announces new rules for tech firms to protect children online

Social media platforms and websites will be legally required to protect children from accessing harmful content online or risk facing fines, the communications watchdog has said.

Ofcom has published new regulations, known as the Children's Codes, that will require tech firms to introduce age verification checks and change algorithm recommendations to continue operating in the UK. Sites must adhere to the standards by 25 July.

Any site which hosts pornography, or content which encourages self-harm, suicide or eating disorders, must have robust age checks in place to protect children from accessing that content. Ofcom boss Dame Melanie Dawes says the codes will create "safer social media feeds".

Some critics, however, say the restrictions don't go far enough, calling them a "bitter pill for bereaved parents to swallow". Ian Russell, chair of the Molly Rose Foundation, which was set up in honour of his daughter who took her own life aged 14, said he was "dismayed by the lack of ambition" in the codes.

But Prof Victoria Baines, a former safety officer at Facebook, told the BBC it is "a step in the right direction". Talking to BBC Radio 4's Today Programme on Thursday, she said: "Big tech companies are really getting to grips with it, so they are putting money behind it, and more importantly they're putting people behind it."

Under the codes, algorithms must also be configured to filter out harmful content from children's feeds and recommendations. As well as the age checks, there will also be more streamlined reporting and complaints systems, and platforms will be required to take faster action in assessing and tackling harmful content when they are made aware of it. All platforms must also have a "named person accountable for children's safety", and the management of risk to children should be reviewed annually by a senior body.

If companies fail to abide by the regulations by 25 July, Ofcom said it has "the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK."

Ofcom sets out new rules to force tech firms to protect children online

STV News

24-04-2025

  • Business
  • STV News

Ofcom sets out new rules to force tech firms to protect children online

Social media and other internet platforms will be legally required to block children's access to harmful content from July or face massive fines, Ofcom has said. The regulator has published the final version of its Children's Codes under the Online Safety Act, setting out what sites must do to follow the law and protect children online.

Under the codes, any site which hosts pornography, or content which encourages self-harm, suicide or eating disorders, must have robust age verification tools in place in order to protect children from accessing that content. Those tools could be the use of facial age estimation technology, photo ID matching, or credit card checks to verify age more reliably.

In addition, platforms will be required to configure their algorithms to filter out harmful content from children's feeds and recommendations, ensuring they are not sent down a rabbit hole of harmful content. Platforms will also be required to give children more control over their online experience, including the ability to indicate what content they don't like, as well as robust controls to block connection requests and comments.

In total, the codes set out 40 practical measures firms must meet by July in order to fulfil their duties under the Online Safety Act. As well as fines, which can be up to £18 million or 10% of qualifying global revenue – which could reach billions of pounds for the largest firms – Ofcom will also have the power to seek a court order banning access to a site in the UK in the most extreme cases.

Ofcom chief executive Dame Melanie Dawes said: 'These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content.

'Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.'

Technology Secretary Peter Kyle said the publication of the codes was a 'watershed moment' after years of 'lawless, poisonous environments' online.

'Growing up in the digital age should mean children can reap the immense benefits of the online world safely but in recent years too many young people have been exposed to lawless, poisonous environments online which we know can lead to real and sometimes fatal consequences. This cannot continue,' he said.

'The Children's Safety codes should be a watershed moment – turning the tide on toxic experiences on these platforms – with the largest social media companies now having to prioritise children's safety by law.

'This means age checks to stop children being exposed to the most extreme harmful content, as well as changes to platform design including algorithms to stop young users being served up harmful content they often aren't even seeking.

'Like parents across the country I expect to see these laws help create a safer online world, so we set every child up for the best start in life.

'But we won't hesitate to go further to protect our children; they are the foundation, not the limit, when it comes to children's safety online.'

However, some online safety campaigners have warned that the Online Safety Act in its current form is not adequate in its protection of internet users, and in particular children, and fails to cover areas of concern.
Ian Russell, now chairman of the charity the Molly Rose Foundation, set up in his daughter's name after she took her own life aged 14 in 2017, having viewed harmful content on social media, said Ofcom's codes would not protect young people. Mr Russell urged the Prime Minister to step in and strengthen the Online Safety Act.

'I am dismayed by the lack of ambition in today's codes. Instead of moving fast to fix things, the painful reality is that Ofcom's measures will fail to prevent more young deaths like my daughter Molly's,' he said.

'Ofcom's risk averse approach is a bitter pill for bereaved parents to swallow. Their overly cautious codes put the bottom line of reckless tech companies ahead of tackling preventable harm.

'We lose at least one young life to tech-related suicide every single week in the UK, which is why today's sticking plaster approach cannot be allowed to stand.

'A speedy remedy is within reach if the Prime Minister personally intervenes to fix this broken system. Less than one in 10 parents think Ofcom is doing enough and Sir Keir Starmer must commit without delay to strengthen online safety legislation.'

Ofcom announces new rules for tech firms to protect children online

BBC News

24-04-2025

  • Business
  • BBC News

Ofcom announces new rules for tech firms to protect children online

Social media platforms and websites will be legally required to protect children from accessing harmful content online or risk facing fines, the communications watchdog has said.

Ofcom has published new regulations, known as the Children's Codes, that will require tech firms to introduce age verification checks and change algorithm recommendations to continue operating in the UK. Sites must adhere to the standards by 25 July.

Any site which hosts pornography, or content which encourages self-harm, suicide or eating disorders, must have robust age checks in place to protect children from accessing that content. Ofcom boss Dame Melanie Dawes says the codes will create "safer social media feeds".

Some critics, however, say the restrictions don't go far enough, calling them a "bitter pill for bereaved parents to swallow". Ian Russell, chair of the Molly Rose Foundation, which was set up in honour of his daughter who took her own life aged 14, said he was "dismayed by the lack of ambition" in the codes.

But Prof Victoria Baines, a former safety officer at Facebook, told the BBC it is "a step in the right direction". Talking to BBC Radio 4's Today Programme on Thursday, she said: "Big tech companies are really getting to grips with it, so they are putting money behind it, and more importantly they're putting people behind it."

Under the codes, algorithms must also be configured to filter out harmful content from children's feeds and recommendations. As well as the age checks, there will also be more streamlined reporting and complaints systems, and platforms will be required to take faster action in assessing and tackling harmful content when they are made aware of it. All platforms must also have a "named person accountable for children's safety", and the management of risk to children should be reviewed annually by a senior body.

If companies fail to abide by the regulations by 25 July, Ofcom said it has "the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK."
