Latest news with #sextortion


Daily Mail
5 days ago
- Entertainment
- Daily Mail
Loose Women sex tape panic as Coleen Nolan and Nadia Sawalha targeted by scammers who threaten to leak clips of them 'enjoying some alone time'
Loose Women panellists Coleen Nolan and Nadia Sawalha have opened up about being targeted by scammers who threatened to leak intimate videos of them 'enjoying some alone time.'
Speaking on the ITV panel show, the pair candidly revealed the terrifying ordeals they had been subjected to by fraudsters trying to extort money from them.
The topic of sex extortion has been circulating in recent weeks, as evidence emerges that sex scammers drive teenagers to take their own lives out of fear of what would happen if their intimate photos were leaked.
Actress and TV personality Nadia Sawalha shared that she received a 'sextortion' email just days earlier, beginning crudely: 'Hello pervert.'
The 60-year-old said: 'The email read "We have all the videos of you having a happy time - giving yourself some alone time."
'And I thought, "I don't think I have any videos of that." But then I felt fear again because, with deep fakes, could there be?'
'Deep fakes' are videos of a person in which their face or body has been digitally altered so they appear to be someone else. For example, someone's face could be digitally applied to the body of someone else performing a sex act - although the innocent party had no involvement at all.
'I felt anxious,' Nadia continued. 'I'm a 60-year-old woman who's listened to countless podcasts about this.
'I know the information, I've read stuff about it, I've read terrifying stories about children killing themselves.'
She added: 'Because I had all that information, by the time I got to the end of the email, I simply deleted the email because I knew that this was a phishing expedition.'
In the modern dating era, the panellists said, it is not uncommon for youngsters to share sexual photos or videos with their partners - meaning they are much more susceptible to such scams.
Singer Coleen Nolan, also 60, revealed she had 'the exact same' experience around 18 months ago. Coleen explained how panic set in when she realised someone could have hacked her phone.
'Reading down, it said, "We've got all your videos and what you've done." I was just like, "Please can you send them to me, I'd be interested in seeing them as I've never made one in my life."
'But it was horrendous,' she added, while Nadia said: 'You do feel panicked, don't you?'
Coleen continued: 'I knew I hadn't made the videos but had they hacked into my phone? Were they going to lie on that?
'They said they were going to send it to my family and my work and kids. I did click in and then I deleted it.'
The discussion follows a powerful Channel 4 documentary titled Hunting My Sextortion Scammer, which sees Rizzle Kicks musician Jordan Stephens fly thousands of miles to confront a fraudster.
The 33-year-old, who is one half of the pop duo alongside Harley Alexander-Sule, also 33, 'gets himself sextorted' to expose the tactics criminals use to sexually blackmail young British men and boys. He tracks down his blackmailer and attempts to confront them.
In an exclusive clip obtained by MailOnline, the music star tries to keep his cool as he has a calm conversation with his scammer on the phone via Instagram.
Jordan tells viewers: 'It sounds like my sextorter is getting more and more wound up,' as the camera zooms in on their conversation on the social media app.
The scammer messages him: 'Show me your account now,' to which Jordan replies: 'No, I've already paid.' The sextorter tells Jordan: 'No, is not working bro' alongside three angry face emojis.
Moments later the singer gets a call and answers with: 'Hello.' The scammer replies: 'Let tell you something now. I swear down with my life if you don't even try to understand me, I swear down, I'll destroy your life.
'It's better you get me $200 now.'
Jordan replies: 'I gave it to you! I gave it to you!' He then tells viewers: 'I need my sextorter to believe he'll get his money if he clicks on that share location link on our fake gift card site.'
The scammer says: 'I don't know how to redeem it. Do you want me to destroy your life?' Jordan tells him in a panic: 'I don't have any more money. You put the code in.'
'It's not working, I swear down,' the fraudster says. 'I've done it several times. It's not working.
'You should redeem the card at your place and then send me $200 back. Then I free you!'


Daily Mail
22-07-2025
- Daily Mail
Distraught mom says son, 14, shot himself dead 10 days after he began talking to a stranger on TikTok
A distraught mother has spoken out after her 14-year-old son took his own life just days after becoming the target of a cruel 'sextortion' scam on TikTok.
Morgan Moore said her son Caleb, the oldest of five children from Mississippi, was just weeks away from starting the 8th grade when he died from a self-inflicted gunshot wound in June. 'He died terrified, scared and ashamed,' his mother told Fox8.
Caleb had been messaging with someone he believed was another teenager on TikTok, his mom explained. The exchange quickly moved to text messages and, without his parents knowing, he was soon allegedly pressured into sending explicit images. Then the threats began.
Police say the scammer used the compromising material to blackmail Caleb, allegedly threatening to expose the photos unless the boy paid or sent more content - a growing form of cybercrime known as 'sextortion.'
On June 10, Moore's four younger children discovered their brother's body in their home. 'I just hit the floor in the kitchen,' Moore told the outlet. 'The police didn't want me to see it. Well, all four of my kids saw it.'
'I believe that my son was murdered and that he was manipulated into doing something he did not want to do,' she added.
Moore described her son as a joyful, mild-mannered boy who loved sports. 'I couldn't have asked for a better son,' she told the outlet. 'He was very funny and mild-mannered, so he got along with everybody.'
The FBI has issued repeated warnings about the rise in sextortion scams targeting boys between 14 and 17. The scams are often carried out by predators in Nigeria, the Ivory Coast and the Philippines. Federal investigators counted at least 12,600 victims between October 2021 and March 2023, and at least 20 victims tragically took their own lives.
Earlier this year, a Kentucky teen took his own life after being targeted in a cruel 'sextortion' scam that used AI-generated nude images to try to blackmail him. Elijah Heacock, 16, of Glasgow, was getting ready for bed on February 27 when he received a chilling text demanding $3,000 to keep an AI-generated nude photo of him from being shared with friends and family, KFDA News reported. Only hours later, his family found him inside their home's laundry room, gravely injured by a self-inflicted gunshot wound.
Yahoo
18-07-2025
- Yahoo
Green Bay Police & School District team up with Bellin to raise awareness of online sextortion scams
GREEN BAY, Wis. (WFRV) – Local police and the Green Bay Area School District joined Emplify Health by Bellin to raise awareness of an online scam known as 'sextortion.'
According to the release, these scams often involve people who pose as members of the opposite sex, targeting teenagers and young adults. The scammers start conversations with victims before persuading them to exchange sexual content.
'Do not send any images or videos to someone you do not know,' Police Commander Rick Belanger said. 'Sextortion scammers prey on trust and will threaten/intimidate to make private content public unless specific demands are met.'
Superintendent Vicki Bayer added in a release that children who are victims of these crimes are encouraged to reach out to a trusted adult, no matter who it may be.
Some cases do not make headlines, which makes the full impact of these scams hard to gauge; however, a recent case reportedly led to a teenager's death in Wisconsin.
'Sextortion scams rely on embarrassment and isolation. Predators trick victims into thinking they are alone and cannot escape,' Pediatric Psychologist Tiffany Born said. 'But you are not alone. If you or someone you know has become a victim of sextortion, please reach out to a loved one or professional.'
Anybody in need of mental health support is urged to call the 988 crisis lifeline.


Malay Mail
17-07-2025
- Malay Mail
Why AI-generated nudes are pushing minors to suicide; and tech firms can't stop it
WASHINGTON, July 17 — After a Kentucky teenager died by suicide this year, his parents discovered he had received threatening texts demanding US$3,000 to suppress an AI-generated nude image of him.
The tragedy underscores how so-called sextortion scams targeting children are growing around the world, particularly with the rapid proliferation of 'nudify' apps – AI tools that digitally strip off clothing or generate sexualized imagery.
Elijah Heacock, 16, was just one of thousands of American minors targeted by such digital blackmail, which has spurred calls for more action from tech platforms and regulators. His parents told US media that the text messages ordered him to pay up or an apparently AI-generated nude photo would be sent to his family and friends.
'The people that are after our children are well organized,' John Burnett, the boy's father, said in a CBS News interview. 'They are well financed, and they are relentless. They don't need the photos to be real, they can generate whatever they want, and then they use it to blackmail the child.'
US investigators were looking into the case, which comes as nudify apps – which rose to prominence targeting celebrities – are being increasingly weaponized against children.
The FBI has reported a 'horrific increase' in sextortion cases targeting US minors, with victims typically males between the ages of 14 and 17. The threat has led to an 'alarming number of suicides,' the agency warned.
Instruments of abuse
In a recent survey, Thorn, a non-profit focused on preventing online child exploitation, found that six percent of American teens have been a direct victim of deepfake nudes.
'Reports of fakes and deepfakes – many of which are generated using these "nudifying" services – seem to be closely linked with reports of financial sextortion, or blackmail with sexually explicit images,' the British watchdog Internet Watch Foundation (IWF) said in a report last year.
'Perpetrators no longer need to source intimate images from children because images that are convincing enough to be harmful – maybe even as harmful as real images in some cases – can be produced using generative AI.'
The IWF identified one 'pedophile guide' developed by predators that explicitly encouraged perpetrators to use nudifying tools to generate material to blackmail children. The author of the guide claimed to have successfully blackmailed some 13-year-old girls.
The tools are a lucrative business. A new analysis of 85 websites selling nudify services found they may be collectively worth up to US$36 million a year. The analysis from Indicator, a US publication investigating digital deception, estimates that 18 of the sites made between US$2.6 million and US$18.4 million over the six months to May.
Most of the sites rely on tech infrastructure from Google, Amazon, and Cloudflare to operate, and remain profitable despite crackdowns by platforms and regulators, Indicator said.
'Whack-a-mole'
The proliferation of AI tools has led to new forms of abuse impacting children, including pornography scandals at universities and schools worldwide, where teenagers created sexualized images of their own classmates.
A recent Save the Children survey found that one in five young people in Spain have been victims of deepfake nudes, with those images shared online without their consent.
Earlier this year, Spanish prosecutors said they were investigating three minors in the town of Puertollano for allegedly targeting their classmates and teachers with AI-generated pornographic content and distributing it in their school.
In the United Kingdom, the government this year made creating sexually explicit deepfakes a criminal offense, with perpetrators facing up to two years in jail. And in May, US President Donald Trump signed the bipartisan 'Take It Down Act,' which criminalizes the non-consensual publication of intimate images, while also mandating their removal from online platforms.
Meta also recently announced it was filing a lawsuit against a Hong Kong company behind a nudify app called Crush AI, which it said repeatedly circumvented the tech giant's rules to post ads on its platforms.
But despite such measures, researchers say AI nudifying sites remain resilient. 'To date, the fight against AI nudifiers has been a game of whack-a-mole,' Indicator said, calling the apps and sites 'persistent and malicious adversaries.' — AFP
