Commissioner calls for ban on apps that make deepfake nude images of children

The Guardian, 28 April 2025

Artificial intelligence 'nudification' apps that create deepfake sexual images of children should be immediately banned, amid growing fears among teenage girls that they could fall victim, the children's commissioner for England is warning.
Girls said they were no longer posting images of themselves on social media for fear that generative AI tools could be used to digitally remove their clothes or sexualise them, according to the commissioner's report on the tools, which draws on children's experiences. Although it is illegal to create or share a sexually explicit image of a child, the technology that enables such images to be made remains legal, the report noted.
'Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone – a stranger, a classmate, or even a friend – could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps,' the commissioner, Dame Rachel de Souza, said.
'The online world is revolutionary and quickly evolving, but there is no positive reason for these particular apps to exist. They have no place in our society. Tools using deepfake technology to create naked images of children should not be legal and I'm calling on the government to take decisive action to ban them, instead of allowing them to go unchecked with extreme real-world consequences.'
De Souza urged the government to introduce an AI bill that would require developers of GenAI tools to address the risks their products pose, and to roll out effective systems to remove sexually explicit deepfake images of children. This should be underpinned by policymaking that recognises deepfake sexual abuse as a form of violence against women and girls, she suggested.
In the meantime, the report urges Ofcom to ensure that age verification on nudification apps is properly enforced and that social media platforms prevent sexually explicit deepfake tools being promoted to children, in line with the Online Safety Act.
The report cited a 2025 survey by Girlguiding, which found that 26% of respondents aged 13 to 18 had seen a sexually explicit deepfake image of a celebrity, a friend, a teacher, or themselves.
Many of these AI tools appear to work only on female bodies, which the report warned is fuelling a growing culture of misogyny.
One 18-year-old girl told the commissioner: 'The narrative of Andrew Tate and influencers like that … backed by a quite violent and becoming more influential porn industry is making it seem that AI is something that you can use so that you can always pressure people into going out with you or doing sexual acts with you.'
The report noted a link between deepfake abuse and suicidal ideation and PTSD, citing the case of Mia Janin, who died by suicide in March 2021.
De Souza wrote in the report that the new technology 'confronts children with concepts they cannot yet understand', and is changing 'at such scale and speed that it can be overwhelming to try and get a grip on the danger they present'.
Lawyers told the Guardian that they were seeing this reflected in an increase in cases of teenage boys getting arrested for sexual offences because they did not understand the consequences of what they were doing, for example experimenting with deepfakes, being in a WhatsApp chat where explicit images are circulating, or looking up porn featuring children their own age.
Danielle Reece-Greenhalgh, a partner at the law firm Corker Binning who specialises in sexual offences and possession of indecent images, said the law was 'trying to keep up with the explosion in accessible deepfake technology', which was already posing 'a huge problem for law enforcement trying to identify and protect victims of abuse'.
She noted that app bans were 'likely to stir up debate around internet freedoms', and could have a 'disproportionate impact on young men' who were playing around with AI software unaware of the consequences.
Reece-Greenhalgh said that although the criminal justice system tried to take a 'commonsense view and avoid criminalising young people for crimes that resemble normal teenage behaviour … that might previously have happened behind a bike shed', arrests could be traumatic experiences and have consequences at school or in the community, as well as longer-term repercussions such as needing to be declared on an ESTA form to enter the US or showing up on an enhanced DBS check.
Matt Hardcastle, a partner at Kingsley Napley, said there was a 'minefield for young people online' around accessing unlawful sexual and violent content. He said many parents were unaware how easy it was for children to 'access things that take them into a dark place quickly', for example nudification apps.
'They're looking at it through the eyes of a child. They're not able to see that what they're doing is potentially illegal, as well as quite harmful to you and other people as well,' he said. 'Children's brains are still developing. They have a completely different approach to risk-taking.'
Marcus Johnstone, a criminal solicitor specialising in sexual offences, said he was working with an 'ever-increasing number of young people' who were drawn into these crimes. 'Often parents had no idea what was going on. They're usually young men, very rarely young females, locked away in their bedrooms and their parents think they're gaming,' he said. 'These offences didn't exist before the internet, now most sex crimes are committed online. It's created a forum for children to become criminals.'
A government spokesperson said: 'Creating, possessing or distributing child sexual abuse material, including AI-generated images, is abhorrent and illegal. Under the Online Safety Act platforms of all sizes now have to remove this kind of content, or they could face significant fines.
'The UK is the first country in the world to introduce further AI child sexual abuse offences, making it illegal to possess, create or distribute AI tools designed to generate heinous child sexual abuse material.'
In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.
