
Latest news with #RevengePornHelpline

Revenge porn victims will feel 'abandoned,' critics warn as less than half of reported crimes end up in court

Daily Mail

10-05-2025

Fewer than half of revenge porn cases reported to the police have ended up in Scotland's courts. And of those that are prosecuted, only a tiny proportion have resulted in a custodial sentence, figures obtained by The Scottish Mail on Sunday show.

In the last full year, just two people were imprisoned for the crime of intimate image abuse, despite there being nearly 800 reports to police. In 2023-24, there were 781 reports made to Police Scotland regarding suspects threatening to disclose, or disclosing, intimate images. In the previous year, the figure was 805 and in the year before that it was 912. However, the number of charges reported to the Crown Office and Procurator Fiscal Service for those same years was 215, 228 and 312, respectively.

Of the 215 charges reported in 2023-24 under the Abusive Behaviour and Sexual Harm (Scotland) Act 2016, 51 people were convicted, 57 cases are ongoing, and 20 were marked 'no further action'. In the same period, four people were admonished – effectively let off with a warning – 34 were given a community payback order, three had to pay compensation, five were fined, one was put under a non-harassment order and two were given a restriction of liberty order. Only one was imprisoned, to add to the single criminal jailed in 2022-23.

The revelation comes as the Revenge Porn Helpline provided support in more than 22,000 cases in 2024, compared with 18,000 the previous year, following a 20 per cent rise in demand from victims across the UK.

Helpline manager Sophie Mortimer said: 'Our data continuously highlights that intimate image abuse is still one of the most significant and concerning digital harms affecting adults right now. The rise in cases, the low levels of conviction and the persistent difficulties and gaps within legislation demonstrate that existing measures are insufficient.'

Scottish Conservative spokeswoman for victims Sharon Dowey said: 'Revenge porn is a horrendous crime. It is deeply alarming that such a small number of cases are either being brought to court or ending up with offenders being convicted. Victims will rightly feel abandoned and question whether it was worth reporting these crimes. It is vital that SNP Ministers heed these damning figures.'

Scotland's procurator fiscal for domestic abuse, Emma Forbes, said: 'Each year between 2021 and 2024, we took action in at least 86 per cent of charges received. We take reports of alleged abuse seriously and will take the appropriate prosecutorial action where there is sufficient evidence, and it is in the public interest to do so.'

Detective Inspector Alasdair Penny said: 'Police are experiencing an increase in the number of revenge porn-related incidents being reported and are encouraged that people are more confident in coming forward. Every report is investigated thoroughly, and we will do all we can to gather available evidence. Where criminality is established, we will find those responsible and bring them to justice.'

A Scottish Government spokesman said: 'The investigation of crime is an independent operational matter for Police Scotland and prosecutions are an independent matter for the Crown Office and Procurator Fiscal Service.'

I've seen British men making deepfake porn of their own mothers and sisters in secret online forums – they trade them like Pokemon cards

The Sun

08-05-2025

AT 15 years old, Jess Davies felt like her world had ended when she discovered a picture of her in her underwear was being shared around by boys at her school. But, horrifyingly, it wouldn't be the only time she was a victim of picture-based abuse - many years later, her boyfriend would also betray her in the same cruel way.

Jess, from Aberystwyth, was working as a part-time model when she found naked photos - which had been taken of her when she was asleep - in a group chat on her boyfriend's phone. She says she quickly deleted the images and accepted it, not realising until years later that it was deeply disturbing and a criminal offence.

Now 32, the horrific experiences Jess faced led her to become a women's rights campaigner, raising awareness of online misogyny and images being spread without consent. She told The Sun: "I was so young and it [sharing pics] was something that had been normalised. This is just what happens. It wasn't until I got a little bit older that I realised 'that wasn't right'."

Jess, now single, also says there is "so much shame and stigma" towards female victims, but that several recent shocking high-profile cases are finally beginning to shine a spotlight on the online abuse women face. Gisele Pelicot, whose husband recruited 72 men online to rape her as she lay drugged, said during his court case that "shame needs to change sides". And Jess could not agree more.

'Traded like Pokémon cards'

The Revenge Porn Helpline said it received 22,275 reports of image-based abuse last year, the highest it has ever seen. Abuse of women online has been highlighted time and time again as it becomes disturbingly common.

Jess said most women don't know that pictures of them are even circulating online on Reddit, Discord, Telegram and sick forums like 4chan - famous for its extreme content. While taking a deep dive into the harrowing corners of the internet for her new book, No One Wants to See Your D*ck, she found that some sick individuals had so many nude images of women that they were divided into folders.

She says: "People's sons, brothers and friends are trading these photos like Pokémon cards and the women in the images have no idea that someone they trust is doing this. I saw teachers in there, people making deepfakes of their mothers, their aunts, their sisters. It was crazy. It is happening on such a big scale. You only need one photo to be able to create an explicit deepfake."

And no one is safe, even those who have never taken a nude picture, as recent AI developments make it easy for these sick people to "nudify" a woman. With just a profile picture and the click of a button, AI can remove clothes from an innocent photo, make it more seductive or even swap a person's head onto a naked body - creating incredibly realistic deepfakes.

"These nudify bots post on their sites that they're getting millions of people using them a day. Millions of women don't know that they've been turned into explicit deepfakes. And then it's like, 'How do you keep track of that? How do you report that?' Of course, it's not all men. I have so many great men in my life and my family. But it's not just a select few either."

In 2024, nearly 4,000 celebrities were found to be victims of deepfake porn, including actresses Scarlett Johansson and Emma Watson.
Speaking about deepfakes, Scarlett said: "Nothing can stop someone from cutting and pasting my image or anyone else's onto a different body and making it look as eerily realistic as desired. The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause."

And with AI technology advancing so rapidly, most police forces are struggling to find a way to deal with the influx of online abuse cases.

Jess says: "I think there's just such a feeling of entitlement over women's bodies. Most women don't know this has happened to them. And yet, if you go into the forums, there are hundreds and hundreds and hundreds of women - every day you'll find new images being posted of women who are being turned into explicit deepfakes, mostly by men and boys that they know."

Describing what she saw while looking into these vile forums, Jess said the photos are divided into locations for men to search for specific women. "So it'd be like North Wales girls and Cumbria, Edinburgh," Jess adds. "It's like, 'Anyone got Jess Davies from Aberystwyth' and someone will be like, 'Yeah, I do. I've got so and so, I'll trade you'."

Others play sick games between them, such as 'Risk'. One will post an image online, and if another 'catches' them by responding within five minutes, they then have to reveal the woman's full name and socials. Jess has even seen her own modelling images used online for scams, porn sites, escort services and sex chats.

'You know these men'

While this might seem like anonymous men in shadowy corners of the internet, these are people women know and likely trust. "These are men that know women personally, because they're men from your hometown. When you walk down the street or pop to the shops, or you're at the school gates, that could be someone who's actively trading your images without consent that you don't know of.

"These are men that we know who are all doing this. And then we have to exist alongside them. It's crazy that this is happening. It's like Pokémon cards, right? It's like, 'Oh, who have you got? I've got this. I'll send that'. It's like you've got a sticker book that you're all trading photos for. And yet it's the women still that you're angry at."

Jess said whenever a survivor of this abuse speaks out, she's immediately slammed as being "irrational". "Why are you more angry at women speaking up about their lived experiences than the men who are giving you a bad name? It's not just some man's behaviour living out online. It's showing what they would do if they could be anonymous in real life."

Jess pointed to the harrowing case of Gavin Plumb, who was jailed for plotting to kidnap, rape and murder celebrity Holly Willoughby. His sick scheme was organised on forums and not one person reported him until an undercover police officer came across the twisted plot. His desires were fuelled by deepfake porn shared online with others who shared his vile perversions.

Jess also pointed to the French case of Gisele Pelicot, a rape victim who waived her anonymity to stand against her own abusive husband. She says: "Her husband found 50 plus men to rape his unconscious wife. He found them on forums."

'It's never in the past'

Jess refuses to go on dating apps and finds dating difficult because of all she has suffered.
She has seen the worst of men when venturing into shadowy corners of the internet, seeing content that would leave anyone shaken.

When Jess was a teenager, a photo she shared with a boy she trusted made its way through the entire school and the football team. Classmates texted her saying, "nice pictures, didn't think you were that type of girl", mocking her as she sat in art class.

Later, when she was a student and modelling part-time, her boyfriend took a photo of her while she was sleeping naked. Jess saw it on his phone when he was in the shower - he had posted it in a group chat with his friends. She deleted the image that was taken without her knowledge or permission but never confronted him, thinking "that is just what happens".

The Welsh activist said the trauma of being a victim of image-based abuse is lifelong and she still feels the impact today. "A lot of women are suicidal when it comes to this. It's something that they carry around with them every single day. One victim said to me, 'I can't say I've got PTSD, because it's never in the past', because you're always thinking, 'Where was it shared before that, or who might have it, and who might upload it again?'"

Jess said most people don't even realise what they're doing is illegal.

What are deepfakes? Here's what you need to know...

  • Deepfakes use artificial intelligence and machine learning to produce face-swapped videos with barely any effort
  • They can be used to create realistic videos that make celebrities appear as though they're saying something they didn't
  • Deepfakes have also been used by sickos to make fake porn videos that feature the faces of celebrities or ex-lovers
  • To create the videos, users first track down an XXX clip featuring a porn star that looks like an actress
  • They then feed an app with hundreds – and sometimes thousands – of photos of the victim's face
  • A machine learning algorithm swaps out the faces frame-by-frame until it spits out a realistic, but fake, video
  • To help other users create these videos, pervs upload "facesets", which are huge computer folders filled with a celebrity's face that can be easily fed through the "deepfakes" app

While in recent years there has been a better grasp on what consent means in the physical world, the same can't be said for the digital world, she explains. Jess says: "If you're a woman online, then you're probably going to be sexually harassed and receive threats. You might have your images stolen. It's like, 'Oh, well, that's just what happens'. There seems to be a free-for-all when it comes to women's bodies, specifically that they can do anything they want."

Schoolboys are creating deepfakes of their classmates, and Jess has spoken to girls who talk about being bullied into sending nudes. She thinks misogynistic ideology being picked up by teenagers stems from online content and masculinity influencers like Andrew Tate.

"Obviously Andrew Tate's the loudest one out there. But there are so many of them. A lot of teenage boys think that's funny and are reading it every single day. We might see a one-minute TikTok, but they're doing hours and hours of live streams every single day into the bedrooms of these teenage boys. These live streams are unchecked, unregulated. They can say anything they want."

Teenage boys being radicalised through online content was recently highlighted by Netflix's hit drama Adolescence. In the fictional series, a teen boy murders a girl who rejected him.
He had asked her out when she was vulnerable after having her private photos shared at school - and he was outraged that she would dare refuse him. And while Jess praised the production, she noted how frustrating it is that it took a fictional TV show about a man, written by men, to draw attention to the problem.

"Women have been shouting about this for many years about what's happening and trying to draw attention. Every single day there are real-life stories in the news of women losing their lives at the hands of male violence, or being followed and experiencing sexual harassment and sexual assault. And that wasn't what spurred people's empathy.

"It's a sad reflection of how we need men as part of the conversation. Because a lot of men don't want to listen to women when we're talking about this. So that's why we need men to join the conversation."

No One Wants to See Your D*ck is available to buy from today.

You're Not Alone

EVERY 90 minutes in the UK a life is lost to suicide. It doesn't discriminate, touching the lives of people in every corner of society – from the homeless and unemployed to builders and doctors, reality stars and footballers. It's the biggest killer of people under the age of 35, more deadly than cancer and car crashes. And men are three times more likely to take their own life than women.

Yet it's rarely spoken of, a taboo that threatens to continue its deadly rampage unless we all stop and take notice, now. That is why The Sun launched the You're Not Alone campaign. The aim is that by sharing practical advice, raising awareness and breaking down the barriers people face when talking about their mental health, we can all do our bit to help save lives. Let's all vow to ask for help when we need it, and listen out for others… You're Not Alone.

If you, or anyone you know, needs help dealing with mental health problems, the following organisations provide support:

  • CALM, 0800 585 858
  • Heads Together
  • HUMEN
  • Mind, 0300 123 3393
  • Papyrus, 0800 068 41 41
  • Samaritans, 116 123

Calls for Ofcom to investigate hacked nude photos posted online

BBC News

20-03-2025

When Jane was told by someone she knew that there were nude photos of her on an image-sharing website, she was in complete shock. Users of the site had left crude and misogynistic comments under the photos. Some were urging the posters to upload more images of her.

"I couldn't believe it," Jane told the BBC. "Men were getting sexual gratification from pictures of me that I hadn't consented to being shared."

Jane, whose name has been changed for this article, is one of more than a dozen women in the same area of England who had their social media accounts hacked two years ago. The BBC understands all of them have had intimate images - which were originally sent in private direct messages on social media - posted on the same website. Some of the women's names have been posted alongside their photos. Two of the women were under 18 when the images were taken, meaning legally, they would also be classed as indecent images of children. Jane later discovered some of the photos of the other women had also been posted on different sites too.

"It makes me feel very angry that somebody's taking ownership over content that I thought was mine and was sent privately," Jane said. "I feel really exposed... It's disgusting."

It is a criminal offence to post intimate pictures of someone online without their consent, punishable with up to two years in prison. Some people colloquially refer to crimes of this nature as "revenge porn", even when the perpetrator is not a former partner and there's no revenge motive - but those affected prefer the term "non-consensual image abuse".

'Swift and decisive action'

Jane is now calling on media regulator Ofcom to investigate using its powers under the Online Safety Act. The law lists 130 "priority offences" that companies should focus on preventing, including the non-consensual posting of intimate images.

Earlier this week, Ofcom was given new powers to crack down on illegal content. Tech companies will now have to ensure staff are prioritising taking down the material when alerted to it, and have systems in place that help them do that. Companies that break the new rules could be fined up to £18m or 10% of their qualifying worldwide revenue.

Ms Gregory, a partner at the law firm Leigh Day, is representing Jane in her appeal to Ofcom. She told the BBC that the Revenge Porn Helpline, which supports adults who experience intimate image abuse, helped Jane track down the images by carrying out reverse image searches online and contacting pornographic sites on her behalf.

Ms Gregory said she now wants Ofcom to "take swift and decisive action" against the sites. Ofcom, she said, is supposed to make an announcement online when it begins an investigation. "We're asking if they're already investigating this issue, because it seems so prevalent we would expect them to be, but we have found no evidence that they are."

Ms Gregory added that she and Jane want Ofcom to take action not just against the sites hosting the images, but also against the search engines directing people to those sites.

Ofcom told the BBC it was aware of Jane's case and that it was "considering any appropriate next steps". It added that it had "a broad range of enforcement powers to hold tech firms accountable" for carrying out their legal responsibilities under the Online Safety Act, and that it "won't hesitate to use them where necessary". Insiders at Ofcom told the BBC the regulator was likely to concentrate on larger sites to begin with, because of their reach.
'We thought it was private'

The Internet Watch Foundation (IWF), which aims to eliminate child sexual abuse images online, launched a campaign last year called Think Before You Share. It educates young people about the potential pitfalls of sharing nude pictures, and how they can be posted elsewhere without their consent.

Ms Hardy, from the IWF, told the BBC that often, by the time they found an image or video, it was "already out of control". "It's already spread further than that trusted partnership, and it's being potentially sold online," she said. "It's being harvested and collated in places where it's being made available for people with a sexual interest in that age or sex of person."

Jane worries that if action isn't taken quickly, her photos will continue to spread. "I know somebody has those pictures saved on their computer, so I don't have any control of those images," she said. "I think revenge porn and non-consensual image abuse grows arms and legs. You think they're gone, and then they can be posted in the future."

But she believes it's unrealistic to expect people not to privately share intimate photos of themselves - and that the onus should be on websites not to host them if they get stolen or leaked. "I think it's a mistake to think people will never send nudes. Certain websites have a duty to protect your privacy. There needs to be a shift in mindset, and not blame the victim for sending the image in the first place. We thought it was private."

If you, or someone you know, has been affected by intimate image abuse, these organisations may be able to help.

Online forums being used to request and trade explicit images of local women, charity warns

The Guardian

14-02-2025

An underground network of men are using peer-to-peer internet message boards to order, share and trade explicit images of local women, according to the UK's leading 'revenge porn' charity.

In a development that has echoes of the forums used to organise the drugging and rape of Gisèle Pelicot in France, the Revenge Porn Helpline said that 'systematic deep-seated misogyny' was behind a dark trade in images in the UK.

'We are seeing images posted by strangers, collected and re-shared peer to peer,' said Sophie Mortimer, a helpline manager. 'Men state they are looking for images of named individual women from York or from Huddersfield or anywhere in the country. Pictures are then being shared with derogatory comments about the women and what they would do to them.'

She added: 'I think that's the most frightening area for me, because we're not talking about the sharing of images after a relationship breaks down. We're talking about a systematic deep-seated misogyny.'

Mortimer said the helpline had experienced an increase in reports about images shared on networks such as Discourse – an online communities app – and the messaging app Telegram, which has been criticised for an alleged lack of control on extreme content.

The helpline, which is marking its 10-year anniversary this week, has seen a 57% average yearly increase in reports. It has introduced a chatbot to meet rising demand for its services; the chatbot helps about 50 people a day, and the helpline receives between 350 and 400 in-person calls a month.

'Revenge porn' – the sharing of private or sexual images or videos of a person without their consent – became an offence in England and Wales in April 2015, but the number of convictions remains low: 277 people were convicted in the year ending June 2024, according to Crown Prosecution Service (CPS) figures. Scotland and Northern Ireland introduced similar laws in 2016. But in some cases, the law is unable to help.

Alice* was grieving after the death of her partner of 10 years when she received Instagram messages revealing that he had shared nude pictures alongside her name online, and that both were on a pornography website. The charity helped remove more than 4,000 intimate images of her – more than 90% of the total – from across the internet.

Images are now less likely to be found on mainstream sites such as Facebook or Pornhub, which have made improvements at managing and taking down content, said Mortimer. But this could result in content being pushed to the 'less compliant' margins, she said. 'The landscape is a lot more complicated. These images can be much harder for us to access because they're not visible. People have so much more content and it moves much faster because there are many more websites hosting and sharing it.'

Some women can battle for years to get such images erased. One woman has been working with the charity for more than eight years to remove around 150 different images created in a consensual relationship but then shared under her full name by her ex-partner after they split up. 'It has had the most devastating impact on her entire life, and understandably she has really struggled,' said Mortimer.

Another woman, who had images taken of her while she was being abused by a former partner, was told by strangers that they had seen intimate pictures online alongside her name and the town where she lives. 'We are much better at removing content but if it is still there, you can't put it behind you, you can't move on,' said Mortimer. 'It's just endlessly debilitating.'
The charity argues that the law needs to be widened to make adult non-consensual intimate images themselves illegal, rather than just the sharing of them – and says this would make them easier to remove from the internet, as is already the case for child abuse images.

Minister for victims and violence against women and girls, Alex Davies-Jones, said the government was strengthening the law, including through the Online Safety Act, which forces platforms to remove intimate images. 'Sharing intimate images online without consent is an abhorrent violation that can inflict profound and lasting harm on victims, particularly women and girls,' she said. 'Women have the right to feel safe wherever they are, in both the online and offline world. This government is determined to make that happen.'

But a widening of legislation – and multinational cooperation – must be combined with work to create a societal shift, said Mortimer. 'I think this is actually a really dangerous time for women,' she said. 'What I've learned from doing this work is that the men who are doing this are not monsters living in their mothers' basements … This is all around us, under our noses. The men that we think are allies, in many cases, are not as benign as they appear.'
