
Essex man who used AI to create deepfake pornography is jailed
An online pervert who used artificial intelligence (AI) to create deepfake pornography of women he knew has been jailed for five years.

Brandon Tyler, 26, manipulated images from social media pages and posted them in a forum that glorified "rape culture", Chelmsford Crown Court heard.

The bar worker, from Braintree, Essex, was described by the judge as showing "the worst kind of toxic masculinity".

He admitted 18 counts of causing harassment without violence and 15 counts of sharing an intimate photo or film of a person for sexual gratification.
The court heard 20 women were targeted in 173 online posts by Tyler, of Railway Street, between March 2023 and May 2024.

Posting a picture of one victim, he asked users: "Which one deserves to be gang raped?"

Judge Alexander Mills said: "Toxic masculinity is a term that is often heard. There is perhaps no greater example of it than your humiliating and degrading conduct to the victims in this case.

"These people had a right to post their images on social media platforms without fear of those images being warped for sexual purposes."

As Tyler was sentenced, one victim said: "I am grateful he has to face the reality that we are all aware of his secret."
The creation of sexually explicit deepfake images was made a criminal offence in England and Wales last April.

Tyler, who has a daughter, used the online alias "Cumhoney88" to upload his graphic cache of images online, the court heard.

His posts were accompanied by "sexualised and explicit" comments by others, prosecutor Emily Farrelly said, describing the forum as one that championed a "rape culture and sexual violence to women".

She revealed the victims were "completely naked" in some of the images, with others having "semen edited onto them".
'Distress and humiliation'
Tyler also downloaded a bikini image posted by one complainant on Instagram and used AI to remove the clothing, the court heard.

Another explicit edit was made on a photograph of a 16-year-old girl on her prom night.

"The defendant clearly planned his behaviour, trawling across social media accounts and editing the images as he wished," Ms Farrelly said.

Tyler also posted his victims' names and social media handles online, as well as their phone numbers, the court heard.

"The defendant clearly intended to maximise distress and humiliation for the victims," the prosecutor added.
Tyler was caught when he accidentally included his own Instagram handle in a screenshot taken of one of the women's accounts.

Two of the victims wept in the witness box as they told the judge of the impact Tyler's offending had on them.

One said she continued to receive "anonymous sexual comments" from people calling her phone, claiming it led to her long-term relationship ending.

"I have and still do feel completely violated by what Brandon did to me," another added. "I feel mortified and disgusted."

Judge Mills said Tyler lived in a "dark world of fantasy" and "clearly intended to degrade and humiliate" them with his actions. He added: "The posts read like an advertising poster for the worst kind of toxic masculinity."

In mitigation, Michael Edmonds said Tyler had "struggled with a porn addiction from a young age", insisting he was not an "incel" or a "misogynist".

"This is not a man who has offended in any way, shape or form before. It is just bizarre that he did it," Mr Edmonds added.

Related Articles


Telegraph
Why a blank cheque won't solve Britain's policing woes
In 2023, a productivity review led by two former chief constables identified 26 ways of freeing up 38 million hours of police time. That would equate to 20,000 extra police officers. The recommendations included cutting red tape, reducing sickness absence and using computer technology for clerical tasks. A second report from the productivity panel, in 2024, said a further 23 million hours could be saved – including through the expansion of AI.

'Modern technology is the golden key to police efficiency and effectiveness,' says Winsor. Yet progress on technology has been painfully slow – and not helped by a failure to manage large-scale projects, such as ESN (Emergency Services Network), an upgrade to the ageing emergency services communications network Airwave, which is a decade behind schedule and £3.1 billion over budget.

'You have to lay much of it at the door of the Home Office,' says Trotter. 'The replacement of Airwave has gone on for years – it's an area that has not been a success, it's wasted a lot of money and is still not resolved. It needs an inquiry,' he adds.

Failing to see 'beyond force boundaries'

There are glaring inefficiencies in other areas, too. Across England and Wales, each of the 43 forces, no matter how large or small, has its own leadership team, civilian support set-up and administrative functions, such as payroll, legal affairs and human resources. Pooling some of that work would make financial sense, says Winsor. 'The back office stuff could and should be done either regionally or nationally, in the way it's done in the NHS or the military,' he says.

In 2022, a report from the independent think-tank the Police Foundation estimated that forces in England and Wales could save 'hundreds of millions' of pounds annually by combining support teams – as well as purchasing police uniform, equipment, vehicles, forensic services and computers centrally, rather than negotiating individual contracts with suppliers, as many constabularies do.
But it seems the introduction of police and crime commissioners, a decade earlier, cemented a 'localist' approach, hindering prospects for developing a more cohesive and less fragmented system of policing, with the economies of scale that would result.

'The police and crime commissioner model has some strengths but it can hold things back, because in my time there were far too many who could not see beyond their force boundaries – and crime doesn't stop at force boundaries,' says Winsor, who left the watchdog three years ago.

The author of the Police Foundation report, its former director Rick Muir, is now working as a Home Office adviser, developing plans for a white paper based around the establishment of a new National Centre of Policing. It is long overdue.

Rowley and other police leaders support the case for a reorganisation. Although their immediate concern is whether they'll have enough resources over the next three years, they are aware that it is not just about the money – radical structural reform is needed to put forces on a long-term sustainable financial footing and ensure the public get the police service they deserve. As Peter Kyle, the Science and Technology Secretary, put it at the weekend, the police must 'do their bit' and 'embrace change'.


Daily Mail
'Furious' pensioner falls victim to Dr Chris Brown love scam as fraudsters use AI to trick her out of $780
A New Zealand grandmother is the latest victim to be targeted by a love scam using AI-generated videos and images of Dr Chris Brown.

Daana Tomlin, 73, was conned out of $786 when she was tricked into believing she had struck up a five-year-long online romance with the 46-year-old Bondi Vet star.

The fraudsters used emotionally manipulative messages to convince the pensioner she was in a romantic relationship with the star, reported The Daily Telegraph. They first made contact with a 'cheeky message' on Facebook while posing as Chris, before moving Daana to Telegram and WhatsApp to continue the manipulation.

Daana, a semi-retired naturopath from Dunedin, said she was conned into sending the scammers almost $800 through gift cards, Apple cards and PayPal.

The hoaxers used sophisticated AI-generated videos of Chris to speak to Daana and convince her to hand over her cash.

'Whenever I became suspicious and tried to end contact, he'd get angry and send me a video saying he was real. It certainly sounded and looked like him,' she said. 'I'm furious that it's not the real Chris Brown, I paid a few hundred dollars for a meet and greet at New Zealand airport through Apple Cards and gift cards.'

However, when Chris was a no-show at the airport, Daana asked security for help and they said 'they had not seen him'.

Daana said she 'feels silly and embarrassed' for believing it was real, adding the fake Chris would call her 'his wife' and rang 'for five years at 5.30 every morning'.

'What the scammer did was evil, it was devious, exploitative and invasive,' said Daana, who was married for 19 years and has a grown son. She added she 'thought a need was being met' in her and didn't discover she was being scammed until her support worker discovered the PayPal payments.
The fraudster had claimed they bought Daana a car as a present, but demanded she send several hundred dollars through PayPal to pay for the registration.

Daana isn't the first to fall victim to the sophisticated and long-running love scam. Last month it was reported UK pensioner Lisa Nock was also conned out of her life savings after falling for the same AI scam.

The 44-year-old from Staffordshire was browsing Instagram in 2022 when a fake account posing as Chris bombarded her with direct messages. Lisa admitted she was lonely and vulnerable at the time after losing her partner in a car crash, and had been left disabled in another traffic accident.

But the avid animal lover said her life changed and she was delighted when the TV star said he wanted to meet her in England. 'I was chuffed that Chris Brown had messaged me, I'm a huge fan and hoped this might be our chance to meet,' Lisa told the Daily Telegraph at the time.

The surprise messages began a chain of correspondence that spanned two-and-a-half years, but the scammers soon told her Chris needed money to visit her. They moved the conversation to WhatsApp and continued to groom her, using artificial intelligence to convince her she was in a romantic relationship with him.

'After a few months, I admit I was enamoured. He told me he loved me and wanted us to marry – of course I said no, and asked if it was a scam,' Lisa said. She tried calling the WhatsApp number, but her attempts were blocked.

Scammers then used a sophisticated AI program to call Lisa via the encrypted messaging app Telegram. An AI-generated version of Chris said he hoped the call had cleared her doubts. The conmen also used AI image generators to create 'photos' of the TV doctor and shared them with Lisa, a volunteer English and drama teacher.

She lives off just $1,246 each month from her UK pension, two thirds of which she pays to her parents in rent.
The remaining $400, however, ended up being sent to the cruel scammers each month for almost three years. Lisa sent the money through gift cards, Bitcoin and other cryptocurrency transfers.

'I was vulnerable and wanted to believe we could be friends, we both love animals, I had lost my partner in a car crash a few years ago,' she said.

Lisa finally realised it was all fake when the conmen posed as Chris' 'management' team and told her he had been kidnapped, before demanding $40 million. Lisa has now reported the scam to British police.


Scottish Sun
I was 15 when my nude pics were leaked – grown men sent them around at the football club & everyone blamed ME
LIKE many young girls, Jess Davies wanted to impress her school crush and decided to send him an explicit photo of herself. Little did the 15-year-old know that he would send it around the school and she would become a victim of image abuse.

"That image got bluetoothed around my school, and then it got shared around my hometown, which was a small hometown in Wales, everyone knows everyone," she explained on the Should I Delete That podcast.

Image-based sexual abuse is a criminal offence. It is when someone takes, shares, or threatens to share sexually explicit images or videos of a person without their knowledge or consent, with the aim of causing them distress or harm. This can include digitally altered images, also known as 'deepfakes' - something Jess has gone on to lobby the government to include in the Online Harms Safety Bill.

Now 32, Jess has opened up about the trauma it caused and, more shockingly, how she was blamed for the abuse.

She revealed that once the photo had circulated in her hometown, it was then shared among grown adult men on the local football team. Instead of seeing Jess as a victim whose private photo was shared without her consent, people blamed her.

"Everyone knew my age because it was a small town, and yet the whole narrative was around how it was my fault," Jess added. "That I shouldn't have sent it, what kind of girl are you?

"There was never any conversation around why are men in their twenties and thirties passing around a child's image?"

Jess was left as a teenage girl worrying about how to navigate the situation, and she decided she had to laugh it off.
She revealed that boys in year 7 would run up and ask for a hug as they had seen the image as well. "I was laughing but secretly, this was humiliating," she said.

In the end, her parents also found out about the image, as her nan was told about it by one of the men on the football team where the image was being circulated.

Now that Jess is older, she realises that the way people treated her over the image was not okay, and that she was held more accountable than the grown men sharing it.

It has led Jess to become an advocate for women's rights and victims of sexual abuse. Her BBC documentary 'Deepfake Porn: Could You Be Next' was used to lobby the UK government to criminalise deepfake porn.

Jess also has a new book, No One Wants To See Your Dick, a guide to surviving the digital age that aims to help us understand and tackle online misogyny and question society's understanding of consent.