Latest news with #childprotection


BBC News
3 hours ago
- Business
What the Online Safety Act is - and how to keep children safe online
The way people in the UK navigate the internet is changing under the Online Safety Act. Platforms must take action - such as carrying out age checks - to stop children seeing illegal and harmful material, or face large fines if they fail to comply with the UK's sweeping online safety rules. But what do the rules mean for children? Here's what you need to know.

What is the Online Safety Act and how will it protect children?
The Online Safety Act's central aim is to make the internet safer for people in the UK, especially children. It is a set of laws and duties that online platforms must follow, implemented and enforced by Ofcom, the media regulator. Under its Children's Codes, platforms must prevent young people from encountering harmful content relating to suicide, self-harm, eating disorders and pornography. From 25 July, some services, notably porn sites, will start checking the age of UK users. The rules are also designed to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges. Platforms which wish to continue operating in the UK must adopt measures including:
- changing the algorithms which determine what is shown in children's feeds to filter out harmful content
- implementing age verification methods to check whether a user is under 18
- removing identified harmful material quickly and supporting children who have been exposed to it
- identifying a named person who is "accountable for children's safety", and annually reviewing how they are managing risk to children on their platforms
Failure to comply could result in businesses being fined £18m or 10% of their global revenues - whichever is higher - or their executives being jailed. In very serious cases, Ofcom says it can apply for a court order to prevent the site or app from being available in the UK.

What else is in the Online Safety Act?
The Act also requires firms to show they are committed to removing illegal content, including:
- child sexual abuse
- controlling or coercive behaviour
- extreme sexual violence
- promoting suicide or self-harm
- selling illegal drugs or weapons
- terrorism
The Act has also created new offences, such as:
- cyber-flashing - sending unsolicited sexual imagery online
- sharing "deepfake" pornography, where artificial intelligence is used to insert someone's likeness into pornographic content

Why has it been criticised?
A number of campaigners want to see even stricter rules for tech firms, and some want under-16s banned from social media altogether. Ian Russell, chairman of the Molly Rose Foundation - which was set up in memory of his daughter, who took her own life aged 14 - said he was "dismayed by the lack of ambition" in Ofcom's codes. The Duke and Duchess of Sussex have also called for stronger protection from the dangers of social media, saying "enough is not being done". They unveiled a temporary memorial in New York City dedicated to children who have died due to the harms of the internet. "We want to make sure that things are changed so that... no more kids are lost to social media," Prince Harry told the BBC. The NSPCC children's charity argues that the law still doesn't provide enough protection around private messaging apps. It says that the end-to-end encrypted services which they offer "continue to pose an unacceptable, major risk to children". On the other side, privacy campaigners complain the new rules threaten users' privacy. They also argue age verification methods are invasive without being effective: age checks can lead to "security breaches, privacy intrusion, errors, digital exclusion and censorship," according to Silkie Carlo, director of Big Brother Watch.

How much time do UK children spend online?
Children aged eight to 17 spend between two and five hours online per day, according to Ofcom. It found that nearly every child over 12 has a mobile phone, and almost all of them watch videos on platforms such as YouTube. About half of children over 12 think being online is good for their mental health, according to the Children's Commissioner. She said that half of the 13-year-olds her team surveyed reported seeing "hardcore, misogynistic" pornographic material on social media sites. Children also said material about suicide, self-harm and eating disorders was "prolific", and that violent content was "unavoidable".

What online parental controls are available?
The NSPCC says it's vital that parents talk to their children about internet safety and take an interest in what they do online. Many parents say they use controls to limit what their children see online, according to Internet Matters, a safety organisation set up by some of the big UK-based internet companies. It has a list of available parental controls and step-by-step guides on how to use them. These include advice on how to manage teen or child accounts on social media, video platforms such as YouTube, and gaming platforms such as Roblox. However, Ofcom data suggests that about one in five children are able to disable parental controls. Instagram has already introduced "teen accounts", which turn on many privacy settings by default - although some researchers have claimed they were able to circumvent the promised protections.

What controls are there on mobile phones and gaming consoles?
Phone and broadband networks may block some explicit websites until a user has demonstrated they are over 18. Phones also have parental controls that can limit the websites children can visit. Android and Apple devices also offer options for parents to block or limit access to specific apps, restrict explicit content, prevent purchases and monitor use. Games console controls also let parents ensure age-appropriate gaming and control in-game purchases.


The Guardian
4 hours ago
- Politics
UK should act to stop children getting hooked on social media ‘dopamine loops'
The UK government has been urged to 'detoxify' the 'dopamine loops' of addictive social media platforms by a leading online safety campaigner, as tech companies prepare to implement significant child protection measures. Beeban Kidron, a crossbench peer, urged the technology secretary, Peter Kyle, to use the Online Safety Act to bring forward new codes of conduct on disinformation and on tech features that can lead to children becoming addicted to online content. 'The secretary of state has a power under the Online Safety Act to bring forward new codes of conduct,' said Kidron. 'We have urgently asked him to do so, but so far we have been rebuffed.' Kidron added it was not 'nanny state' to prevent companies that invest billions of pounds in making their platforms as addictive as possible from targeting under-18s. 'It is up to ministers to use their powers to detoxify those dopamine loops – they have the power – so why not do so right now?' 'Dopamine-like' measures identified by 5Rights, a campaign group founded by Kidron, include displaying the number of times a user's post has been liked or shared, mobile phone notifications, and showing content with expiry dates, such as Instagram's stories feature. Kidron spoke to the Guardian before the 25 July deadline for online platforms – including Facebook, Instagram, TikTok, YouTube, X and Google – to introduce child safety measures, and for pornography sites to bring in stringent age-checking. Age-checking measures could also be required for social media sites that allow harmful content, such as X, which is the most popular source of pornography for young people, according to research published by the children's commissioner for England, Dame Rachel de Souza. X announced on Thursday that if it was unable to determine whether a user was 18 or over, they would be defaulted into sensitive content settings and would not be able to view adult material. Dame Melanie Dawes, Ofcom's chief executive, said: 'Prioritising clicks and engagement over children's online safety will no longer be tolerated in the UK. Our message to tech firms is clear – comply with age-checks and other protection measures set out in our codes, or face the consequences of enforcement action from Ofcom.' The changes mean that social media companies must, as a priority, prevent children from seeing pornography and harmful content that encourages suicide, self-harm or eating disorders. They must also suppress the spread of harmful content, such as violent, hateful or abusive material and online bullying. Companies that breach the act face fines of up to 10% of global turnover, which in the case of Instagram's parent company, Meta, would amount to $16.5bn. In extreme cases, sites or apps could be blocked in the UK. Tech executives could also be prosecuted if they ignored Ofcom demands to comply with child safety duties. Ofcom has outlined a series of measures that comply with the child safety requirements. Those include: sites and apps having procedures for taking down dangerous content quickly; children having a 'straightforward' way to report harmful content; and algorithms, which recommend content to users, filtering out harmful material. X gave details of its age-checking measures on Thursday, including signals such as whether a user has previously indicated that they are under 18 and whether the account was created in 2012 or earlier. Bluesky, Discord, Grindr and Reddit have also committed to age-gating measures. Ofcom will assess whether these approaches comply with the act.
Meta, the owner of Instagram and Facebook, says it has a multilayered approach in place that complies with age-checking requirements, including its teenager account feature – a default setting for users under 18 – which it says already provides an 'age appropriate' experience for young users. TikTok, which argues it already blocks the vast majority of content that is prohibited for children, is introducing new age-checking measures for certain restricted material from Friday. Pornography providers such as Pornhub have committed to introducing stringent age checks from Friday. Measures recommended by Ofcom include: facial age estimation, where technology assesses a person's likely age through a live photo or video; checking a person's age via their credit card provider, bank or mobile phone network operator; photo ID matching, where a passport or similar ID is checked against a selfie; or a 'digital identity wallet' that contains proof of age.
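As an illustration of how these checks might fit together, here is a minimal sketch of a default-deny age gate in the spirit of X's stated approach, where users whose age cannot be determined stay in restricted settings. Every name in it, from Method to may_view_adult_content, is a hypothetical assumption for this sketch, not any platform's or Ofcom's actual interface; real services integrate third-party verification providers rather than coding checks themselves.

```python
# Hypothetical sketch: combining the Ofcom-listed age-assurance methods into
# a default-deny gate. All names here are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum, auto

class Method(Enum):
    FACIAL_AGE_ESTIMATION = auto()  # likely age assessed from a live photo/video
    CREDIT_CARD_CHECK = auto()      # card issuer, bank or mobile operator confirms 18+
    PHOTO_ID_MATCH = auto()         # passport or similar ID checked against a selfie
    DIGITAL_ID_WALLET = auto()      # wallet presents a verified proof of age

@dataclass
class CheckResult:
    method: Method
    confirmed_over_18: bool

def may_view_adult_content(results: list[CheckResult]) -> bool:
    """Allow access only if at least one completed check confirms the user is
    18 or over; with no completed checks, keep the restricted experience."""
    return any(r.confirmed_over_18 for r in results)

# A user who passed facial age estimation is admitted; an unverified user is not.
assert may_view_adult_content([CheckResult(Method.FACIAL_AGE_ESTIMATION, True)])
assert not may_view_adult_content([])
```

The design point worth noticing is the default: with no completed checks, access is refused rather than granted, which is what "defaulted into sensitive content settings" implies.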
Yahoo
5 hours ago
- Business
Comply with child age checks or face consequences, Ofcom tells tech firms
Tech firms have been warned to act now or face the consequences, as new online safety protections for children come into force. From Friday, so-called 'risky' sites and apps will be expected to use what the regulator has described as 'highly effective' age checks to identify which users are children and subsequently prevent them from accessing pornography, as well as other harmful content relating to self-harm, suicide, eating disorders and extreme violence. But some online safety campaigners said while the new measures should have been a 'watershed moment for young people', regulator Ofcom has instead 'let down' parents, accusing it of choosing to 'prioritise the business needs of big tech over children's safety'. The Molly Rose Foundation, founded by bereaved father Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media, said the changes lack ambition and accountability, and warned that big tech will have taken note. In the face of campaigners' criticism, Ofcom chief executive Dame Melanie Dawes has previously defended the reforms, insisting that tech firms are not being given much power over the new measures, which are coming into effect as part of the Online Safety Act. The changes include age checks on pornography websites, as well as others such as dating app Grindr, which Ofcom said will ensure it is more difficult for children in the UK to access online porn than in many other countries. The regulator said sites such as X, formerly Twitter, and others including Bluesky and Reddit have also committed to age assurances. Ofcom said its safety codes also demand that algorithms 'must be tamed and configured for children so that the most harmful material is blocked'. It said it has launched a monitoring and impact programme focused on some of the platforms where children spend most time, including social media sites Facebook, Instagram and TikTok, gaming site Roblox and video clip website YouTube. The sites are among those which have been asked to submit, by August 7, a review of their efforts to assess risks to children and, by September 30, scrutiny of the practical actions they are taking to keep children safe. Actions which could be taken against firms which fail to comply with the new codes include fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, and court orders potentially blocking access in the UK. Dame Melanie said: 'Prioritising clicks and engagement over children's online safety will no longer be tolerated in the UK. 'Our message to tech firms is clear – comply with age checks and other protection measures set out in our codes, or face the consequences of enforcement action from Ofcom.' But Andy Burrows, chief executive of the Molly Rose Foundation, said: 'This should be a watershed moment for young people but instead we've been let down by a regulator that has chosen to prioritise the business needs of big tech over children's safety.' He said the 'lack of ambition and accountability will have been heard loud and clear in Silicon Valley'. He added: 'We now need a clear reset and leadership from the Prime Minister. That means nothing less than a new Online Safety Act that fixes this broken regime and firmly puts the balance back in favour of children.'
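To make the penalty cap concrete, here is a tiny worked sketch of the 'greater of £18 million or 10% of qualifying worldwide revenue' formula mentioned above; the revenue figures in it are made-up assumptions, not reported numbers. Note that the 10% term dominates for any firm with qualifying revenue above £180 million.

```python
# Illustrative sketch of the Online Safety Act penalty cap described above:
# the greater of £18m and 10% of qualifying worldwide revenue.
def max_fine_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue_gbp)

# Hypothetical revenues (assumptions, not reported figures):
print(max_fine_gbp(100_000_000))     # 18000000.0: the £18m floor applies (10% would be £10m)
print(max_fine_gbp(50_000_000_000))  # 5000000000.0: 10% of a £50bn revenue dominates
```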


BBC News
21 hours ago
- Health
'Wrong decision' over care plan before mum and daughter found dead
A child protection plan for a vulnerable teenager was wrongfully removed months before she and her mum were found dead in their home, an inquest has heard. The bodies of Alphonsine Djiako Leuga, 47, and her 18-year-old daughter Loraine Choulla were discovered in a house in Radford, Nottingham on 21 May. Concerns had been raised about Loraine, who had Down's syndrome and learning disabilities, and in 2023 a child protection plan was implemented by social care teams at Nottingham City Council. On Wednesday, an inquest into their deaths heard the removal of that plan in January 2024 was a "wrong decision". It is thought Alphonsine and Loraine may have been dead for weeks or even months before they were found in their council house in Hartley Road, where they had been living since 2019. The inquest, which will investigate how the mother and daughter died, will also examine whether there were any missed opportunities to save Loraine - if it is accepted her mother died before her. Loraine was "entirely dependent" on Alphonsine to eat and drink and was "primarily non-verbal", the court heard. In 2021, Alphonsine began to stop engaging with housing, education and social care services. By 2022, Loraine had stopped attending her special educational needs school. Giving evidence at Nottingham Coroner's Court, Nichola Goode, a service manager for the council's whole life disability team, said that due to her mother's lack of engagement, Loraine was considered a "vulnerable hidden child" and it was known her case was "complex". She added that despite worries for Loraine, there were no concerns raised about her "presentation", and the relationship between her and her mother was observed as being "warm". The court heard that there "continued to be concerns about Loraine's health", her lack of education and her "social isolation" at the point her plan was terminated. Loraine's child protection plan was closed on 31 January 2024, before she had turned 18. Ms Goode told the inquest: "I think in hindsight, now that we've looked at that, it was a wrong decision made by child social care. We could have followed more thorough inquiries." She added: "We accept that we shouldn't have closed the plan." It was known to the social care team that Alphonsine had been admitted to hospital critically unwell - days before her daughter's plan ended - and required "life-saving treatment". After her discharge, social care attempted a home visit but left when it "appeared no-one was home". Ms Goode said: "Had we believed Alphonsine and Loraine were inside, we would have called the police." The inquest has heard that days later, on 3 February, Alphonsine called 999 pleading for an ambulance, but the call was mistakenly considered abandoned and was subsequently closed, with no-one attending. Alphonsine's proposed medical cause of death was recorded as pneumonia, while Loraine's was not established. The inquest continues. If you have been affected by any of the issues raised in this story, support is available via the BBC Action Line.


The Verge
a day ago
Instagram addresses its creep problem.
Posted Jul 23, 2025 at 11:00 AM UTC. Adult-managed accounts that primarily post pictures of children will no longer be recommended to adult users 'who have shown potentially suspicious behavior,' according to Meta, and vice versa, making them harder to find in Search. This was announced today alongside new features for teen accounts that make it easier to report and block unwanted contact in DMs.