Latest news with #HarmfulCommunicationsandRelatedOffencesAct

The Journal
5 days ago
Public urged not to share images after four people fall victim to 'sextortion' last weekend
THE PSNI HAS issued a major warning after four people were targeted by online sexual blackmail in one day last weekend.

Commonly known as 'sextortion', the blackmail involves threats made by an anonymous individual online to share a person's sexual images, clips or information.

The nature of the sextortion can be predatory, where the victim is coerced into sending more intimate pictures under the threat of sharing existing ones, or financial, where the perpetrator threatens to share the images of the victim unless a ransom is paid.

The PSNI says it received 70 reports of sextortion per month at its peak two years ago, before falling to an average of 45 between last year and now.

The warning comes after the force said it received four reports last Saturday, 24 May, alone from four men in the Belfast area who claimed to have been targeted.

Detective Inspector Karen Hamill explained how to identify the hallmarks of a sextortion attempt: 'Typically, a person uses a false identity to befriend a victim via social media.

'The exchange may start with flirting or flattery, but ends with the victim coaxed into sending intimate images or performing sexual acts online, unwittingly in front of a camera.

'Behind the fake and attractive guise, there's a criminal. These people are often part of sophisticated and organised crime groups, mostly based overseas. They extort their victims by threatening to share those images or recordings unless demands for money are met.'

Detective Inspector Hamill said that the majority of victims are young men, aged between 18 and 23. She urged people to be on their guard and to be cautious about sharing intimate images online.

She offered this advice: 'Don't panic; don't respond to demands; and don't enter into further communication. If you can, confide in a trusted friend or family member, and please contact officers immediately on 101.'

In the South, sextortion is illegal under the Harassment, Harmful Communications and Related Offences Act, also known as 'Coco's Law', which came into effect in 2021. To date, An Garda Síochána has commenced 72 prosecutions related to 49 investigations under Coco's Law, with 82% of victims being male.

The Journal
03-05-2025
- Politics
Children's Ombudsman hugely concerned over use of AI 'nudify' apps on images of underage girls
THE CHILDREN'S OMBUDSMAN has said he is 'hugely concerned' about the potential of AI apps that can be used by anyone to create sexually explicit images of children.

Dr Niall Muldoon has warned that stronger laws are needed to tackle the scourge of so-called 'nudification' apps, which allow real photos of women and girls to be edited by artificial intelligence to produce deepfake images that make them appear naked.

Nudification apps can be downloaded via online app stores, though some have been removed by Apple and Google; others can be accessed via a web browser by anyone who has a URL to the relevant app.

Although sharing non-consensual sexual images is a crime in Ireland under the Harassment, Harmful Communications and Related Offences Act (also known as Coco's Law), legal experts have said the legislation does not cover the creation of deepfakes.

Tens of thousands of ads for these apps have appeared on Facebook and Instagram in recent months, and they continue to push the apps to Irish users despite Meta's repeated attempts to remove them for breaching the company's advertising rules.

'The ease of access by children to this type of technology is a huge concern to the Ombudsman for Children's Office (OCO),' Muldoon told The Journal. 'It is difficult to comprehend any possible need for these apps when the risk of abuse and sexual exploitation of children is so high.'

He called for Coimisiún na Meán and the European Commission to strengthen the oversight of major technology companies under the Digital Services Act, to ensure that the apps were not being recommended to children and young people online.

A spokesperson for Coimisiún na Meán said that the Online Safety Framework makes big tech platforms accountable for how they protect people, especially children, from harm online.

The European Commission's spokesperson for tech sovereignty, Thomas Regnier, said that the commission is aware that ads for services to create pornographic deepfakes of women were present on Facebook and Instagram. He also said large tech companies have an obligation to ensure measures are in place that mitigate risks to users.

A spokesperson for Meta said the company prohibits the display of nudity or sexual activity in its ads and that it removes ads that violate its policies, but that bad actors are continually evolving their tactics to avoid enforcement.

Nudification apps have already attained notoriety in other countries, including the United States, where dozens of teenage girls have been targeted in schools in California, New Jersey and Washington.

Earlier this week, the children's commissioner for England called for the apps to be banned after publishing a report which found that deepfake nudification apps disproportionately target women and girls.

The report contained interviews with a number of teenage girls, some of whom said they had already changed their online behaviour as a result of nudification technology. 'This chilling effect is causing them to take steps to keep themselves safe, which often requires them to limit their behaviour in some way,' the report said. 'This pattern of behaviour is similar to girls avoiding walking home alone at night, or not going to certain public places alone.'

The Dublin Rape Crisis Centre previously said it was 'deeply concerned' about the capacity of deepfake images to 'amplify harm to women' and said they should not be available to download.

What are nudification apps and how do they work?
Nudification apps can be downloaded via app stores (if they have not already been removed), or accessed via a web browser using a URL; certain bots on the messaging app Telegram also offer nudification services.

The apps encourage users to upload a photo of any woman, and offer to produce a new, deepfake version of the same image in which the person appears without clothes.

The apps are thought to have been trained using open-source artificial intelligence models, in which the underlying code is freely available for anyone to copy, tweak and use for whatever purpose they want, if they have the skills to do so.

In the case of nudification apps, the artificial intelligence creates new images by attempting to replicate the existing images it has been trained on. The apps are specifically thought to have been trained on vast amounts of explicit images of women, which is why they tend to only work on women and teenage girls. The artificial intelligence is unable to tell when a person is underage or that such images are illegal.

Graphika, a US company that tracks online disinformation, has said that open-source AI models are 'the primary driver' behind a surge in the creation and dissemination of non-consensual images of adults, including through the use of nudification apps.

The UK-based Internet Watch Foundation has also said that creators of child sexual abuse material have used legally available open-source AI models to create explicit deepfake images of children.

An ad for a nudification app seen on Facebook. Meta Ad Library

Deepfake economy

Graphika has also warned that nudification services and the creation of sexually explicit deepfake images have become a 'fully-fledged online industry', which some have dubbed the 'deepfake economy'.

Nudification apps often seek payment to create deepfake images, while they can also be used as part of targeted harassment campaigns and for sextortion. In many cases, links to nudification services can be found through Google searches.

The Journal has also uncovered thousands of targeted ads for nudification apps, pushed to Irish social media users on Facebook and Instagram on an ongoing basis, which claim the apps can 'erase' or 'see through' the clothes of any woman.

Advertisements entice users by claiming 'one click to undress', 'upload image, you can see anything about her' and 'your friends in transformed photos'.

The ads link to app stores, where AI editing apps can be downloaded, and to third-party websites that can be accessed by anyone with the relevant URL.

They often feature side-by-side images of a woman with clothes on and the same image of the woman naked or partly naked; other ads feature videos of women dancing or talking, which occasionally flash so that the woman appears with no clothes. Some versions of the ads use AI-generated images of women, but others use images of real women that appear to be taken from social media.

The ads tend to feature on fake profiles that have small numbers of followers, but which appear to be somewhat co-ordinated: different pages will use the same names and images, or claim that they are based in similar locations. Many share different links that re-direct to the same website in an apparent attempt to avoid falling foul of Meta's advertising rules.
Since the beginning of April, The Journal has found dozens of pages that have advertised nudification services via more than 20 unique links, which re-direct users to a single web-based app.

Meta has removed the majority of ads for these services, though some remain active; in some cases, ads were only removed once they were flagged by The Journal, while links that were not shared with Meta remained online.

If you have been affected by any of the issues mentioned in this article, you can reach out for support through the following helplines:

Dublin Rape Crisis Centre - 1800 77 8888 (free, 24-hour helpline)
Samaritans - 116 123 or email jo@ (suicide, crisis support)
Pieta - 1800 247 247 or text HELP to 51444 (suicide, self-harm)
Teenline - 1800 833 634 (for ages 13 to 19)
Childline - 1800 66 66 66 (for under 18s)