DANIEL HANNAN: The Online Safety Bill will make us less happy, wealthy and free - and this fit of moral panic won't stop tech-savvy teens

Daily Mail · 2 days ago
The first many people knew of the Online Safety Act was when it was used to prevent users on X from viewing images from an anti-immigration protest in Yorkshire at the weekend.
After a demonstration outside the Britannia Hotel in Leeds on Friday, users claimed the website blocked footage of police detaining activists. They were instead shown the message: 'Due to local laws, we are temporarily restricting access to this content until X estimates your age.'

Related Articles

We must fight the deepfake future

New Statesman · 2 hours ago

Penny Mordaunt broadsworded her way into Britain's collective imagination when she became the unexpected breakout star of King Charles III's May 2023 coronation. We had lost one stoic queen; here was another. Mordaunt bore the Sword of State, the heaviest in the royal collection, for 50 minutes. With the world watching, she kept her face composed: the image of ceremonial gravitas, strength, tradition and honour.

But imagine that face smeared across violent pornography. Speaking to BBC Newsnight recently, the former Conservative MP and cabinet minister revealed that she had been a victim of deepfake pornography while serving in parliament. Her face, along with those of other female MPs including Priti Patel and Angela Rayner, had been digitally placed onto explicit videos. 'It was deliberately humiliating and violent,' she said.

Deepfakes are the latest grotesque frontier in the battle for digital dignity, where artificial intelligence is weaponised to humiliate, disempower and violate women's bodies. And the harm inflicted is not virtual – it can be as real as any other form of sexual violence. Headlines in 2013 may have asserted otherwise: 'No harm in simulated rape videos (as long as they are well made), say ministers' ran in the Telegraph. Though this predates the inception of deepfakes by a few years, it is grim that, even today, some still think this basic principle of female autonomy is up for debate. Digital violence is violence, as Mordaunt understands. 'The people behind this,' she said, 'don't realise the consequences in the real world when they do something like that.'

Since the first deepfake was created in 2017, AI-generated, sexually explicit videos have proliferated across the internet. One study estimated that half a million deepfakes were shared in 2023; this year's total is expected to be eight million. Of all deepfakes, 98 per cent are sexually explicit, and 99 per cent of those depict women. This technology is both misogynistic and, as it stands, unregulated. Worse, it is now so sophisticated that viewers no longer realise they are consuming fakes. We stand on the precipice, looking at potentially an entire generation of young males whose sexual understanding of consent is being warped by digital hallucinations.

Keir Starmer's government has shown some willingness to take on issues related to deepfakes. Amendments to the Online Safety Act, which require pornography websites to implement age-verification measures, came into force on 25 July. The aim is to prevent children from accessing explicit material, and thereby protect them. But we might note the unnecessary delays before the legislation was introduced. (While it is illegal to distribute deepfakes, it is legal to create one. Rishi Sunak pledged to legislate against the production of deepfakes in April 2024, though that legislation never materialised; Keir Starmer pledged the same in January 2025, yet production remains legal.) We may also note that a lot of porn lives outside traditional porn sites, instead circulating in the murky backwaters of Telegram groups, Reddit threads and 4chan.

Whatever the measures, we need more of them. AI-driven deepfake porn is a disturbing new theatre of abuse advancing, like all AI developments, at an alarming pace. But technology is made by humans. The scaffolding of our digital lives is designed, curated and upheld by other people. The sword Mordaunt held at King Charles' coronation was historic and symbolic. Today, her sword is rhetorical: a call to action against the degradation of female autonomy, identity and safety in a world that increasingly treats women's faces and bodies as public property.

Mordaunt has exposed a frightening fault line in British society. Children are given unfettered access to pornography. Women are transformed into digitally altered chimeras without consent and without recourse. Allowing this to continue is not just a regulatory failure but a cultural one. Technological change is relentless; violence against women is perennial. The internet is hard to contain and full of malicious actors. But we must summon the will to protect basic privacies and dignities. However heavy, we should pick up and carry that sword.

Under CTRL, the Epping migrant protests & why is 'romantasy' so popular?

Spectator · 3 hours ago

First: the new era of censorship. A year ago, John Power notes, the UK was consumed by race riots precipitated by online rumours about the perpetrator of the Southport atrocity. This summer there have been protests, but 'something is different'. With the introduction of the Online Safety Act, 'the government is exerting far greater control over what can and can't be viewed online'. While the act 'promises to protect minors from harmful material', he argues that it is 'the most sweeping attempt by any liberal democracy to bring the online world under the control of the state'. Implemented and defended by the current Labour government, it is actually the result of legislation passed by the Conservatives in 2023 – which Labour did not support at the time, arguing it didn't go far enough. So how much of a danger is the Act to free speech in Britain? John joined the podcast to discuss, alongside former Conservative minister Steve Baker, an MP from 2010 to 2024 and one of the biggest critics of the bill within the Conservative Party at the time.

Next: should we be worried about protests against migrants? This week, outside a hotel in Epping, groups amassed to protest against the migrants housed there, with counter-protesters appearing in turn. Tommy Robinson might not have appeared in the end, but the Spectator's Max Jeffery did, concluding that the protests were ultimately 'anticlimactic'. Nevertheless, the protests have sparked debate about the motivations of those speaking out against the migrants – are there legitimate concerns voiced by locals, or are the protests being manipulated by figures on the political fringes? And what do the protests tell us about community tensions in the UK? Max joined the podcast to discuss, alongside the editor of Spiked, Tom Slater.

And finally: why are 'romantasy' novels so popular? Lara Brown writes in the magazine this week about the phenomenon of 'romantasy', a genre that mixes romance with fantasy. While 'chick-lit' is nothing new, Lara argues that this is 'literature taken to its lowest form', emblematic of the terminally online young people who consume it. Nevertheless, it is incredibly popular and is credited by publishers with boosting the British fiction industry to over £1 billion. To unpack the genre's popularity, Lara joined the podcast alongside Sarah Maxwell, founder of Saucy Books, London's first romance-only bookshop, based in Notting Hill.

Hosted by William Moore and Lara Prendergast. Produced by Patrick Gibbons and Megan McElroy.

The more governments try to restrict social media use, the more young people will find ways to get around it

The Guardian · 8 hours ago

It's not entirely surprising the Australian government is now including YouTube accounts in its under-16s social media ban – but the decision to stop a 15-year-old from subscribing to their favourite channels only adds to an endless list of problems with the policy. This ban already had a number of broad issues, including the possibility of every Australian being required to hand over personal identification in order to use social media websites. We've already seen the UK's Online Safety Act make global headlines over the past week, where it is proving to be an enforcement nightmare.

Let's be clear: the harmful content the federal government keeps referencing isn't going anywhere. This content (excluding videos already restricted to those over 18) can still be viewed by anyone in a logged-out state or by teenagers using a parent's account. Once you turn 16, there is nothing stopping you from accessing the content, and it could just as easily have a negative impact then.

What this ban does do, however, is effectively punish teenagers, even those who have always had overwhelmingly positive experiences on YouTube. Parents who are more than happy for their teenager to use YouTube with an account are not given any exemptions, despite the government's repeated line that the social media ban gives power back to parents.

The reason for YouTube's initial exemption was education, something I can personally attest to as a current year 12 student. After remote learning ended, YouTube continued to be used as a key tool for learning, both during and outside school hours. This ranges from simple things, such as a homework task involving taking notes from a video, to a teacher uploading their own set of videos for a particular subject unit for students to view anytime and anywhere. There have even been more than a few instances of teachers recommending subscribing to an educational channel, which I know has been a help to me and many of my peers.

That's not to say YouTube isn't also a source of harmful content. But while there's no perfect solution for keeping young people safe online, there are clear steps that could and should be taken by the federal and state governments. And it starts in the classroom. From the late stages of primary school into secondary school, repeated lessons on ways to report content and dangers to look out for (among many other things to teach) would be a welcome addition. Crucially, this prevents the entire burden falling on parents, many of whom are not tech-savvy by their own admission.

Simply saying 'don't do this' has never worked for any generation of teenagers, and it doesn't work for keeping them offline in this day and age. The more parents – or for that matter, governments – try to force a restriction on social media use, the more young people will be motivated to get around it.

Additionally, measures that actually target the platforms, rather than the teens who use them, would make a lot more sense. Exactly what those measures would be is a further question, but we already know the government has plenty of tools at its disposal, such as the eSafety commissioner, if it's looking for a direct fight with tech giants. With a high court challenge from Google seemingly looming, it's worth remembering that – like it or not – Australia is far from the largest market for social media companies. This means it's not beyond the realm of possibility that these platforms could abandon Australia altogether rather than follow this legislation, just as Facebook did with news for a short period in 2021. I'm personally not convinced that will end up happening, but it also doesn't require much imagination.

Banning teens from accessing YouTube through their own account isn't going to stop harmful content in the slightest. Genuine problems need genuine solutions, and the social media ban isn't one of them.

Leo Puglisi is chief anchor and managing director at 6 News Australia
