Children's Ombudsman hugely concerned over use of AI 'nudify' apps on images of underage girls

The Journal | 03-05-2025

THE CHILDREN'S OMBUDSMAN has said he is 'hugely concerned' about the potential for AI apps to be used by anyone to create sexually explicit images of children.
Dr Niall Muldoon has warned that stronger laws are needed to tackle the scourge of so-called 'nudification' apps, which allow real photos of women and girls to be edited by artificial intelligence to produce deepfake images that make them appear naked.
Nudification apps can be downloaded via online app stores, though some have been removed by Apple and Google; others can be accessed via a web browser by anyone who has a URL to the relevant app.
Although sharing non-consensual sexual images is a crime in Ireland under the Harassment, Harmful Communications and Related Offences Act (also known as Coco's Law), legal experts have said the legislation does not cover the creation of deepfakes.
Tens of thousands of ads for these apps have appeared on Facebook and Instagram in recent months and continue to be pushed to Irish users, despite Meta's repeated attempts to remove them for breaching the company's advertising rules.
'The ease of access by children to this type of technology is a huge concern to the Ombudsman for Children's Office (OCO),' Muldoon told The Journal.
'It is difficult to comprehend any possible need for these apps when the risk of abuse and sexual exploitation of children is so high.'
He called for Coimisiún na Meán and the European Commission to strengthen the oversight of major technology companies under the Digital Services Act, to ensure that the apps were not being recommended to children and young people online.
A spokesperson for Coimisiún na Meán said that the Online Safety Framework makes big tech platforms accountable for how they protect people, especially children, from harm online.
The European Commission's spokesperson for tech sovereignty, Thomas Regnier, said the commission is aware that ads for services that create pornographic deepfakes of women have appeared on Facebook and Instagram.
He also said large tech companies have an obligation to ensure measures are in place that mitigate risks to users.
A spokesperson for Meta said the company prohibits the display of nudity or sexual activity in its ads and that the company removes ads that violate its policies, but that bad actors are continually evolving their tactics to avoid enforcement.
Ombudsman for Children Dr Niall Muldoon has expressed concern. Image: RollingNews.ie
Nudification apps have already attained notoriety in other countries, including the United States, where dozens of teenage girls have been targeted in schools in California, New Jersey and Washington.
Earlier this week, the children's commissioner for England called for the apps to be banned after publishing a report which found that deepfake nudification apps disproportionately target women and girls.
The report contained interviews with a number of teenage girls, some of whom said they had already changed their online behaviour as a result of nudification technology.
'This chilling effect is causing them to take steps to keep themselves safe, which often requires them to limit their behaviour in some way,' the report said.
'This pattern of behaviour is similar to girls avoiding walking home alone at night, or not going to certain public places alone.'
The Dublin Rape Crisis Centre previously said it was 'deeply concerned' about the capacity of deepfake images to 'amplify harm to women' and said they should not be available to download.
What are nudification apps and how do they work?
Nudification apps can be downloaded via app stores (if they have not already been removed), or accessed via a web browser using a URL; certain bots on the messaging app Telegram also offer nudification services.
The apps encourage users to upload a photo of any woman, and offer to produce a new, deepfake version of the same image in which the person appears without clothes.
The apps are thought to have been trained using open-source artificial intelligence models in which the underlying code is freely available for anyone to copy, tweak and use for whatever purpose they want if they have the skills to do so.
In the case of nudification apps, the artificial intelligence generates new images by attempting to replicate the existing images it was trained on.
The apps are thought to have been trained on vast quantities of explicit images of women, which is why they tend to only work on women and teenage girls.
The artificial intelligence is unable to tell when a person is underage or that such images are illegal.
Graphika, a US company that tracks online disinformation, has said that open-source AI models are 'the primary driver' behind a surge in the creation and dissemination of non-consensual images of adults, including through the use of nudification apps.
The UK-based Internet Watch Foundation has also said that creators of child sexual abuse material have used legally available open-source AI models to create explicit deepfake images of children.
An ad for a nudification app seen on Facebook. Image: Meta Ad Library
Deepfake economy
Graphika has also warned that nudification services and the creation of sexually explicit deepfake images have become a 'fully-fledged online industry', which some have dubbed the 'deepfake economy'.
Nudification apps often charge users to create deepfake images, and they can also be used as part of targeted harassment campaigns and for sextortion.
In many cases, links to nudification services can be found through Google searches.
The Journal has also uncovered thousands of targeted ads for nudification apps, which claim that the apps can 'erase' or 'see through' the clothes of any woman, being pushed to Irish social media users on Facebook and Instagram on an ongoing basis.
Advertisements entice users by claiming 'one click to undress', 'upload image, you can see anything about her' and 'your friends in transformed photos'.
The ads link to app stores, where AI editing apps can be downloaded, and to third-party websites that can be accessed by anyone with the relevant URL.
They often feature side-by-side images of a woman with clothes on and the same image of the woman naked or partly naked; other ads feature videos of women dancing or talking, which briefly flash to show the woman without clothes.
Some versions of the ads use AI-generated images of women, but others use images of real women that appear to be taken from social media.
The ads tend to feature on fake profiles that have small numbers of followers, but which appear to be somewhat co-ordinated: different pages will use the same names and images, or claim that they are based in similar locations.
Many share different links that re-direct to the same website in an apparent attempt to avoid falling foul of Meta's advertising rules.
Since the beginning of April, The Journal has found dozens of pages that have advertised nudification services via more than 20 unique links, all of which re-direct users to a single web-based app.
Meta has removed the majority of ads for these services, though some remain active; in some cases, ads were only removed once they were flagged by The Journal, while links to ads that were not shared with Meta remained online.
If you have been affected by any of the issues mentioned in this article, you can reach out for support through the following helplines:
Dublin Rape Crisis Centre - 1800 77 8888 (free, 24-hour helpline)
Samaritans - 116 123 or email jo@samaritans.org (suicide, crisis support)
Pieta - 1800 247 247 or text HELP to 51444 (suicide, self-harm)
Teenline - 1800 833 634 (for ages 13 to 19)
Childline - 1800 66 66 66 (for under 18s)
