Latest news with #onlineharms


Telegraph
2 hours ago
- Politics
- Telegraph
Labour ‘playing gesture politics' with online safety, says Molly Russell's father
Labour is 'playing gesture politics' over protecting children from online harms, the father of Molly Russell has said.

In an exclusive article for The Telegraph, Ian Russell warned that eight years after his 14-year-old daughter's death, social media platforms were still bombarding children with the same kind of suicide and self-harm content that led her to take her life.

He claimed he had been met with 'radio silence' from Sir Keir Starmer and Peter Kyle, the Technology Secretary, over the six months since they personally assured him they would look again at toughening up the Online Safety Act.

'Sticking plaster ideas'

And he accused the Government of being 'more interested in playing performative gesture politics with sticking plaster ideas' – such as two-hour caps on children's app use that would do little to tackle their exposure to online harms.

Mr Russell is demanding the Government strengthen the Act by replacing Ofcom's 'timid' codes of practice with clear outcomes and targets for the social media companies to wipe their sites clean of harmful content such as suicide material.

Research published on Tuesday by the Molly Rose Foundation, the charity set up in his daughter's memory, found TikTok and Instagram were still deluging teenagers with 'industrial levels' of dangerous suicide and self-harm content. The study claimed more than 90 per cent of the videos recommended to potentially vulnerable teenagers were promoting or glorifying suicide or self-harm.

Molly took her own life after being bombarded with 16,000 'destructive' posts – including 2,100 on Instagram – encouraging self-harm, anxiety and even suicide in her final six months. The coroner at her inquest concluded she died from an act of self-harm while suffering from depression and 'the negative effects of online content', which had 'more than minimally contributed' to her death.
Mr Russell, who chairs the foundation, said: 'It is staggering that eight years after Molly's death, incredibly harmful suicide, self-harm and depression content like she saw is still pervasive across social media.

'Ofcom's recent child safety codes do not match the sheer scale of harm being suggested to vulnerable users, and ultimately do little to prevent more deaths like Molly's.

'For over a year, this entirely preventable harm has been happening on the Prime Minister's watch and where Ofcom have been timid, it is time for him to be strong and bring forward strengthened, life-saving legislation without delay.'

The researchers used accounts opened with a registered age and identity of a 15-year-old girl who had previously engaged with suicide, self-harm and depression material. The study was conducted in the weeks leading up to the implementation of the Online Safety Act, which requires companies to prevent and remove such content. However, the research suggested the platforms' algorithms were still driving content deemed harmful towards teenagers.

Videos were classified as harmful if they either promoted or glorified suicide or self-harm, referred to suicide or self-harm ideation, or otherwise featured highly intense themes of hopelessness, misery and despair.

Almost all (96 per cent) of the algorithmically recommended videos watched on TikTok's For You Page contained content that was likely to be harmful, particularly when viewed cumulatively or in large amounts. Some 97 per cent of Instagram short-form videos (known as Reels) contained themes likely to be harmful, particularly when recommended and consumed in large amounts.

'The findings suggest safeguards were still not in place on either TikTok or Instagram, and that in the immediate period before regulation took effect, children could still be exposed to a substantial risk of reasonably foreseeable but preventable harm,' said the charity's report.
More than half (55 per cent) of recommended harmful posts on TikTok's For You Page included references to suicide and self-harm ideation, and 16 per cent referred to suicide methods.

The charity said the harmful content was achieving 'disturbing' levels of interest. One in 10 of the videos deemed harmful by researchers on TikTok's For You Page had been liked at least one million times. On Instagram Reels, one in five harmful recommended videos had been liked more than 250,000 times.

The Technology Secretary said: 'These figures show a brutal reality – for far too long, tech companies have stood by as the internet fed vile content to children, devastating young lives and even tearing some families to pieces.

'But companies can no longer pretend not to see. The Online Safety Act, which came into effect earlier this year, requires platforms to protect all users from illegal content and children from the most harmful content, like promoting or encouraging suicide and self-harm. Forty-five sites are already under investigation.

'Ofcom is also considering how to strengthen existing measures, including by proposing that companies use proactive technology to protect children from self-harm content and that sites go further in making algorithms safe.'

Meanwhile, a study by the Children's Commissioner found that children are more likely to view porn on Elon Musk's X than on dedicated adult sites. Dame Rachel de Souza found that children as young as six are being exposed to more porn since the Online Safety Act became law than they were before. That included illegal violent porn, such as strangulation and non-consensual sex.

Social media and networking sites accounted for 80 per cent of the main sources by which children viewed porn. Dame Rachel said this easy access was influencing children's attitudes towards women, meaning nearly half of them believed girls who said no could be persuaded to have sex.
X, formerly Twitter, remained the most common source of pornography for children, outstripping dedicated porn sites. The gap between the number of children seeing pornography on X and those seeing it on dedicated porn sites has widened (45 per cent versus 35 per cent in 2025, compared with 41 per cent versus 37 per cent in 2023). Snapchat accounted for 29 per cent, Instagram 23 per cent, TikTok 22 per cent, and YouTube 15 per cent.

The research, based on 1,020 young people aged 16 to 21, found 70 per cent of children had seen porn before the age of 18, an increase from 64 per cent in 2023, when the Online Safety Act received royal assent.

In her report, published on Tuesday, Dame Rachel said: 'Violent pornography is easily accessible to children, exposure is often accidental and often via the most common social media sites, and it is impacting children's behaviours and beliefs in deeply concerning ways.

'This report must be a line in the sand. It must be a snapshot of what was – not what will be.'

A TikTok spokesman said: 'Teen accounts on TikTok have 50+ features and settings designed to help them safely express themselves, discover and learn, and parents can further customise 20+ content and privacy settings through family pairing.

'With over 99 per cent of violative content proactively removed by TikTok, the findings don't reflect the real experience of people on our platform, which the report admits.'

A Meta spokesman said: 'We disagree with the assertions of this report and the limited methodology behind it. Tens of millions of teens are now in Instagram teen accounts, which offer built-in protections that limit who can contact them, the content they see, and the time they spend on Instagram.

'We continue to use automated technology to remove content encouraging suicide and self-injury, with 99 per cent proactively actioned before being reported to us.
We developed teen accounts to help protect teens online and continue to work tirelessly to do just that.'

Is Starmer ready to take decisive measures to save our children?

By Ian Russell

Eight years on from my daughter Molly's death, we continue to lose the battle against the untold harm being inflicted by tech giants. Every week in the UK, we lose at least another teenager to suicide where technology plays a role. However, the unfathomable reality is that I'm less convinced than ever that our politicians will do what's necessary to stop this preventable harm in its tracks.

This week, the Molly Rose Foundation released deeply disturbing new research that showed that in the weeks before the Online Safety Act took effect, Instagram and TikTok's algorithms continued to recommend the type of toxic material that cost my daughter her life. Vulnerable teenagers continue to be bombarded with suicide, self-harm and intense depression material on a near industrial scale.

We should be in no doubt why this is still happening. The harms on social media are the direct result of business models that actively prioritise user engagement and a race for market share. Children's safety continues to be seen as an optional extra, and Ofcom's desperately unambitious implementation of the Online Safety Act will do little to change the commercial incentives that continue to cost children's lives.

This preventable harm is happening on this Government's watch. Six months ago, I met with the Prime Minister and told him that further urgent action was necessary.

Ofcom 'timid and unambitious'

I told him that parents were heartened by Labour's commitment in opposition to strengthen the Online Safety Act. That they were encouraged by the Technology Secretary's recognition that the Act was 'uneven and unsatisfactory'.
Crucially, I explained that swift and decisive action was necessary to fix structural issues with the Online Safety Act – still the most effective and quickest way to protect children from widespread harm while also enabling them to enjoy the benefits of life online – and to arrest the sticking plaster approach that Ofcom has adopted to implementation.

I don't have confidence in the regulator's approach. Ofcom has proven to be desperately timid and unambitious, and seems determined to take decisions that are stacked in favour of tech companies rather than victims.

For all the regulator's breathless claims to be 'taming toxic algorithms', buried in the detail of its plans is an expectation that the likes of TikTok and Instagram will only need to spend £80,000 fixing the algorithms that helped kill Molly, and that our research shows are continuing to cause widespread and pervasive harm today. This is pocket money to platforms making tens of billions every year. It sends the clearest of signals to the tech giants that the current regime expects them to pay lip service to online safety but doesn't really expect them to implement the achievable changes to prioritise safety over profit.

Six months after I met the Prime Minister and Technology Secretary Peter Kyle, and received a personal assurance from them that they would look again at this issue, all I have received from Number 10 is radio silence. Meanwhile, the Technology Secretary appears to be more interested in playing performative gesture politics with sticking plaster ideas like two-hour app caps that those who work in online safety immediately recognise will do little to meaningfully move the dial.

Public support for Act is strong

Despite the recent predictable howls of protest from free speech activists and tech libertarians, public support for the Online Safety Act remains strong.
Our polling suggests that 80 per cent of adults want the Act to be strengthened, with growing despair from parents and the public that our politicians seem unable or unwilling to protect our children, families and wider society from preventable harm.

With all this in mind, my message to Sir Keir Starmer is clear. Is he prepared to take the decisive measures necessary to strengthen regulation and take on the tech companies which are a threat to children's safety? Will he listen to the public, bereaved families and civil society to deliver a comprehensive strengthening of the Online Safety Act, knowing that the majority of people in this country will be cheering him on?

The alternative is that he leaves children and families at risk from largely unchecked but inherently preventable harm. Regulation that is well-meaning but isn't up to the job will not save the lives that it must.


National Post
29-06-2025
- Politics
- National Post
Liberals revamping online harms bill with focus on deepfakes, exploitation and AI: justice minister
Justice Minister Sean Fraser says the federal government plans to take a 'fresh' look at its online harms legislation over the summer, but it's not clear yet exactly what the bill will look like when it is reintroduced. It would be the Liberals' third attempt to pass legislation to address harmful behaviour online.

Fraser told The Canadian Press in an interview that the government hasn't decided whether to rewrite or simply reintroduce the Online Harms Act, which was introduced in 2024 but did not pass. He said Canadians can expect measures addressing deepfakes and child exploitation 'to be included in legislative reforms coming up in the near future.'

In their election platform, the Liberals promised to make the distribution of non-consensual sexual deepfakes a criminal offence. They also pledged to introduce a bill to protect children from online sexploitation and extortion, and to give law enforcement and prosecutors additional tools to pursue those crimes.

Fraser said the growth of artificial intelligence is influencing the discussions. The spread of generative AI has changed both the online space and everyday life since the federal government first introduced the legislation.

'We will have that in mind as we revisit the specifics of online harms legislation,' he added. 'The world changes and governments would be remiss if they didn't recognize that policy needs to shift.'

Online harms legislation was first proposed by then-heritage minister Steven Guilbeault in 2021, but after widespread criticism, the government pivoted and shifted the file to the justice minister. Guilbeault is now back in his old ministry, which has been renamed Canadian identity and culture.
Prime Minister Mark Carney has also created an artificial intelligence ministry, headed up by rookie MP Evan Solomon. Fraser said he expects 'significant engagement' with Guilbeault and Solomon, but it will be determined later which minister will take the lead on the file.

The first version of the bill alarmed critics who warned that the provision requiring platforms to take down offending content within 24 hours would undermine free expression. When Fraser's predecessor, Arif Virani, introduced the Online Harms Act in 2024, the bill restricted that 24-hour takedown provision to content that sexually victimizes a child or revictimizes a survivor, or intimate content shared without consent, including deepfakes.

CBC
29-06-2025
- Politics
- CBC
Liberals taking 'fresh' look at online harms bill, says Justice Minister Sean Fraser
Justice Minister Sean Fraser says the federal government plans to take a "fresh" look at its online harms legislation over the summer, but it's not clear yet exactly what the bill will look like when it is reintroduced. It would be the Liberals' third attempt to pass legislation to address harmful behaviour online.

Fraser told The Canadian Press in an interview that the government hasn't decided whether to rewrite or simply reintroduce the Online Harms Act, which was introduced in 2024 but did not pass. He said Canadians can expect measures addressing deepfakes and child exploitation "to be included in legislative reforms coming up in the near future."

In their election platform, the Liberals promised to make the distribution of non-consensual sexual deepfakes a criminal offence. They also pledged to introduce a bill to protect children from online sexploitation and extortion, and to give law enforcement and prosecutors additional tools to pursue those crimes.

Fraser said the growth of artificial intelligence is influencing the discussions. The spread of generative AI has changed both the online space and everyday life since the federal government first introduced the legislation.

"We will have that in mind as we revisit the specifics of online harms legislation," he added. "The world changes and governments would be remiss if they didn't recognize that policy needs to shift."

Fraser expects to work with other ministers

Online harms legislation was first proposed by then-heritage minister Steven Guilbeault in 2021.
After widespread criticism, the government pivoted and shifted the file to the justice minister. Guilbeault is now back in his old ministry, which has been renamed Canadian identity and culture. Prime Minister Mark Carney has also created an artificial intelligence ministry, headed up by rookie MP Evan Solomon.

Fraser said he expects "significant engagement" with Guilbeault and Solomon, but it will be determined later which minister will take the lead on it.

The first version of the bill alarmed critics who warned that the provision requiring platforms to take down offending content within 24 hours would undermine free expression. When Fraser's predecessor, Arif Virani, introduced the Online Harms Act in 2024, the bill restricted that 24-hour takedown provision to content that sexually victimizes a child or revictimizes a survivor, or intimate content shared without consent, including deepfakes. It also required social media companies to explain how they plan to reduce the risks their platforms pose to users, and imposed on them a duty to protect children.

But the government also included Criminal Code and Canadian Human Rights Act amendments targeting hate in the same legislation — which some said risked chilling free speech. In late 2024, Virani said he would split those controversial provisions off into a separate bill, but that didn't happen before this spring's federal election was called and the bill died on the order paper.

Fraser said no decision has been made yet on whether to bring back online harms legislation in one bill or two. "That is precisely the kind of thing that I want to have an opportunity to discuss with stakeholders, to ensure we're moving forward in a way that will create a broad base of public support," he said.

Fraser said the government could "modify existing versions that we may have on the shelf from the previous Parliament as may be needed, or to accept the form in which we had the legislation."
He added he wants to have a "fresh consideration of the path forward, where I personally can benefit from the advice of those closest to the file who know best how to keep kids safe online." While the government hasn't set a date to introduce legislation, it could include some online harms measures in a crime bill Fraser plans to table in the fall. Fraser said online harms provisions that "touch more specifically on criminal activity" could be "included in one piece of legislation, with a broader set of reforms on online harms at a different time."


CTV News
29-06-2025
- Politics
- CTV News
Liberals taking ‘fresh' look at online harms bill, justice minister says
OTTAWA — Justice Minister Sean Fraser says the federal government plans to take a 'fresh' look at its online harms legislation over the summer, but it's not clear yet exactly what the bill will look like when it is reintroduced. It would be the Liberals' third attempt to pass legislation to address harmful behaviour online.

Fraser told The Canadian Press in an interview that the government hasn't decided whether to rewrite or simply reintroduce the Online Harms Act, which was introduced in 2024 but did not pass. He said Canadians can expect measures addressing deepfakes and child exploitation 'to be included in legislative reforms coming up in the near future.'

In their election platform, the Liberals promised to make the distribution of non-consensual sexual deepfakes a criminal offence. They also pledged to introduce a bill to protect children from online sexploitation and extortion, and to give law enforcement and prosecutors additional tools to pursue those crimes.

Fraser said the growth of artificial intelligence is influencing the discussions. The spread of generative AI has changed both the online space and everyday life since the federal government first introduced the legislation.

'We will have that in mind as we revisit the specifics of online harms legislation,' he added. 'The world changes and governments would be remiss if they didn't recognize that policy needs to shift.'

Online harms legislation was first proposed by then-heritage minister Steven Guilbeault in 2021, but after widespread criticism, the government pivoted and shifted the file to the justice minister. Guilbeault is now back in his old ministry, which has been renamed Canadian identity and culture. Prime Minister Mark Carney has also created an artificial intelligence ministry, headed up by rookie MP Evan Solomon.
Fraser said he expects 'significant engagement' with Guilbeault and Solomon, but it will be determined later which minister will take the lead on it.

The first version of the bill alarmed critics who warned that the provision requiring platforms to take down offending content within 24 hours would undermine free expression. When Fraser's predecessor, Arif Virani, introduced the Online Harms Act in 2024, the bill restricted that 24-hour takedown provision to content that sexually victimizes a child or revictimizes a survivor, or intimate content shared without consent, including deepfakes. It also required social media companies to explain how they plan to reduce the risks their platforms pose to users, and imposed on them a duty to protect children.

But the government also included Criminal Code and Canadian Human Rights Act amendments targeting hate in the same legislation — which some said risked chilling free speech. In late 2024, Virani said he would split those controversial provisions off into a separate bill, but that didn't happen before this spring's federal election was called and the bill died on the order paper.

Fraser said no decision has been made yet on whether to bring back online harms legislation in one bill or two. 'That is precisely the kind of thing that I want to have an opportunity to discuss with stakeholders, to ensure we're moving forward in a way that will create a broad base of public support,' he said.

Fraser said the government could 'modify existing versions that we may have on the shelf from the previous Parliament as may be needed, or to accept the form in which we had the legislation.' He added he wants to have a 'fresh consideration of the path forward, where I personally can benefit from the advice of those closest to the file who know best how to keep kids safe online.'

While the government hasn't set a date to introduce legislation, it could include some online harms measures in a crime bill Fraser plans to table in the fall.
Fraser said online harms provisions that 'touch more specifically on criminal activity' could be 'included in one piece of legislation, with a broader set of reforms on online harms at a different time.'

This report by The Canadian Press was first published June 29, 2025.

Anja Karadeglija, The Canadian Press

Globe and Mail
24-06-2025
- Politics
- Globe and Mail
Ottawa pressed to split online harms bill to fast-track its passage
Child-safety advocates and technology experts are urging the federal government to swiftly bring back the online harms bill, but to split it in two to speed passage of measures that protect children from abuse.

Bill C-63, which died when the last Parliament was prorogued in January, included initiatives to combat online child abuse and hate. But it faced sharp criticism from opposition MPs and civil liberty advocates for also proposing new criminal offences for hate propaganda and hate crimes – including life in prison for inciting genocide.

Advocacy group OpenMedia says hundreds of messages have been sent to MPs since the election calling for the government to reintroduce the online harms bill. They want it to focus on measures to improve online safety for children and youth, and to create an independent regulator to tackle predatory behaviour, bullying and abuse online, while protecting online privacy and expression.

The bill drew criticism from civil liberties groups for proposing a 'peace bond' to deter people feared to be planning to carry out hate crimes and hate propaganda offences, with penalties such as house arrest.

Government ministers have indicated they plan to bring back the online harms bill but have not yet confirmed who would be shepherding it through Parliament. Earlier this month, Government House Leader Steven MacKinnon said he expected it would be steered through by Canadian Identity Minister Steven Guilbeault.

Among those calling for a swift reintroduction of the bill is Carol Todd, the mother of Amanda Todd, a teenager who died by suicide after falling victim to cyberbullying. She warned that Canada is lagging far behind countries such as the U.S. and Britain, which have already passed laws to protect people in the digital sphere.

Ms. Todd said the government should take feedback it received on Bill C-63 before the election, including criticism of increased penalties for hate crimes, and put the Criminal Code measures on a separate track. 'They need to do two bills. If they put the same bill through, the same things will happen again and it will get held up,' she said.

Bill C-63 would have forced online platforms to swiftly remove child sexual abuse material, intimate content shared without consent, and posts encouraging a child to self-harm. It would have created a digital safety commission and ombudsperson to combat online hate.

'The previous government's attempt to combine a platform accountability bill with a criminal justice bill was unwise,' said John Matheson, who leads the Canadian arm of Reset Tech, a global non-profit that fights digital threats to democracy. 'The Carney government would miss the mark if they do not create a new public regulator to hold platforms accountable in keeping our kids safe,' he said.

The advocacy group OpenMedia wants the government to bring back the bill soon after MPs return from their summer break. 'Canada's next Online Harms Act should be about addressing the worst online harms, and not package in broader measures that aren't about the consequences of digital technologies,' said Matt Hatfield, the group's executive director. He said the controversy over new criminal penalties for hate speech and hate crimes 'completely overshadowed discussion of part one, the real core of the Online Harms Act.'

'There's still critical amendments to make to part one's text to strike the right balance between safety and online privacy and expression, but these changes are at a scale a parliamentary committee given adequate time can accomplish.'
Lianna McDonald, executive director of the Canadian Centre for Child Protection, said it 'would not be opposed to the approach of addressing Criminal Code and human rights amendments through its own bill or bills, and addressing online harms to children in its own bill.'

'It was clear in the last session that there was consensus amongst our elected officials that legislative action to protect children from online harms is urgently needed, so it seems more likely that a bill focused on the protection of children will be able to move forward,' she said.

Charlotte Moore Hepburn, medical director of the division of pediatrics at the Hospital for Sick Children in Toronto, said 'a new bill – one that prioritizes online safety for children and youth – is essential.'