Latest news with #Evie


7NEWS
2 hours ago
- Entertainment
- 7NEWS
Sam Stosur announces birth of second child, daughter Emmeline Grace
Australian tennis legend Sam Stosur has announced the birth of her second child, a daughter named Emmeline Grace.

The former US Open winner took to Instagram to announce the news on Wednesday night. 'And beautiful chaos reigns once again,' she wrote. 'Welcome Emmeline Grace. Evie is beyond happy to have a little sister and we are over the moon. We love you so much little Emmy.'

Stosur has been in a relationship with Liz Astling, who gave birth to Evie five years ago, since 2016. This time around the former world No.4 carried the child.

The Queenslander retired from tennis after the 2023 Australian Open, following a mixed doubles loss alongside countryman Matt Ebden. She had retired from singles the previous year after a career which netted more than US$21 million in prizemoney. Stosur's straight-sets US Open final win over Serena Williams came the year after her maiden grand slam decider at the French Open, which she lost to Francesca Schiavone.


USA Today
12 hours ago
- Entertainment
- USA Today
Their selfies are being turned into sexually explicit content with AI. They want the world to know.
Evie, 21, was on her lunch break at her day job last month when she got a text from a friend, alerting her to the latest explicit content circulating online without her consent. This time, it was a graphic fan-fiction-style story about her created by 'Grok,' X's AI-powered chatbot. Weeks earlier, she'd been the subject of another attack, when a user shared her selfie and asked Grok to turn it into explicit sexual imagery.

'It felt humiliating,' says Evie, a 21-year-old Twitch streamer who asked that we withhold her last name to conceal her identity from her online trolls, who have become increasingly aggressive.

In June, Evie was among a group of women who had their images non-consensually sexualized on the social media platform X. After she posted a selfie to her page, an anonymous user asked Grok to edit the image in a highly sexualized way, using language that got around the filters the bot had in place. Grok then replied to the post with the generated image attached.

Evie says she is vocal on X about feminist issues and was already subject to attacks from critics. Those accounts had made edits of her before, but they had been choppy Photoshop jobs — nothing as real-looking as Grok's. 'It was just a shock seeing that a bot built into a platform like X is able to do stuff like that,' she says over video chat, a month after the initial incident.

X has since blocked certain words and phrases used to doctor women's images, but on June 25, an X user prompted Grok to write a story in which the user 'aggressively rapes, beats and murders' her, making it 'as graphic as you can' with an '18+ warning at the bottom.' 'It just generated it all,' she says. '(The user) didn't use any words to try to cover it up, like they did with the pictures.'

X did not return USA TODAY's multiple requests for comment. Evie says she saw at least 20 other women on her own X feed who had their photos sexualized without their consent.
It also happened to Sophie Rain, an OnlyFans creator with over 20 million followers across social media platforms, who posts sensual content but never full nudity. 'It's honestly disgusting and gross,' she says. 'I take my religion very seriously. I am a virgin, and I don't condone this type of behavior in any way.'

This trend is part of a growing problem experts call image-based sexual abuse, in which 'revenge porn' and deepfakes are used to degrade and exploit another person. While anyone can be victimized, 90% of the victims of image-based sexual abuse are women. 'This is not only about sexualized images of girls and women, it's broader than that,' says Leora Tanenbaum, author of 'Sexy Selfie Nation.' 'This is all about taking control and power away from girls and women.'

The 'Take It Down Act' aims to combat non-consensual sexual imagery. Is it working?

In May 2025, the Take It Down Act was signed into law to combat non-consensual intimate imagery, including deepfakes and revenge porn. While most states have laws protecting people from non-consensual intimate images and sexual deepfakes, victims have struggled to have images removed from websites, increasing the likelihood that the images will continue to spread and retraumatize them. The law requires websites and online platforms to take down non-consensual intimate imagery within 48 hours of a verified request from the victim. However, as of July 21, the altered photo of Evie is still publicly accessible on Grok's verified X account. Evie mobilized her nearly 50,000 followers to mass-report Grok's post, but she says X Support told her it was not a violation of the platform's content guidelines.

AI's ability to flag inappropriate prompts can falter

In a conversation with Grok, USA TODAY asked the chatbot to play out a scenario in which a user asked it to generate explicit content, with clear instructions not to actually produce any during the conversation.
One of the examples of "coded language" Grok is programmed to flag, it says, is "subtle requests for exposure" that aim to make photos of women more revealing. Phrases that could be flagged in that area include "adjust her outfit," "show more skin," or "fix her top." "Even if worded politely, I flag these if the intent appears inappropriate," Grok said via AI-generated response on July 15.

The key word is intent. Grok's ability to turn down potentially inappropriate prompts "relies on my ability to detect the intent, and public images remain accessible for prompts unless protected," the chatbot says. You can block or disable Grok, but doing so doesn't always prevent modifications to your content. Another user could tag Grok in a reply and request an edit to your photo, and you wouldn't know, because you have Grok blocked. "You may not see the edited results, but the edit could still occur," Grok clarified during our conversation. The better protection is to make your profile private, but not all users want to take that step.

It's not just about sex — it's about power

After experiencing image-based sexual abuse, Evie considered making her X account private. She was embarrassed and thought her family might see the edits. However, she did not want to give in and be silenced. "I know that those pictures are out now, there's nothing I can do about getting rid of it," she says. "So why don't I just keep talking about it and keep bringing awareness to how bad this is?"

When it comes to generating deepfakes or sharing revenge porn, the end goal isn't always sexual gratification. Users may target women who use their platforms to speak about feminist issues as a degradation tactic. Evie says what hurt the most was that rather than engage in a discussion or debate about the issues she was raising, her critics opted to abuse her.
In her research, Tanenbaum has seen varied responses from victims of image-based sexual abuse, ranging from engaging in excessive sexual behavior to "a total shutdown of sexuality, including wearing baggy clothes and intentionally developing unhealthy patterns of eating to make oneself large, to be not sexually attractive in one's own mind." The individuals she spoke to, who had been victimized in this way, called it "digital rape" and "experienced it as a violation of the body." Even if someone logically understands that a sexually explicit image is synthetic, once their brain sees and processes the image, it is embedded in their memory bank, Tanenbaum says. The human brain processes images far faster than text, and much of the information transmitted to the brain is visual. "Those images never truly get scrubbed away. They trick us because they look so real," Tanenbaum explains.

Evie wants to believe the abuse "didn't really get to her," but she notices she's more thoughtful about the photos she posts, wondering, for instance, whether she's showing too much skin to the point where an AI bot can more easily undress her. "I always think, 'Is there a way that someone could do something to these pictures?'"


BBC News
16-07-2025
- BBC News
Suffolk officer who delivered baby takes her to prom years later
A police officer who stepped in to deliver a baby was able to deliver her to her school prom 17 years later.

PC Jonathan Burke was working as a Suffolk Police community support officer when he was suddenly approached in Ixworth High Street, near Bury St Edmunds, by a man asking for help. He came to learn the man's wife was in labour, and he successfully helped deliver their baby girl, called Evie. The family and officer stayed in touch, with PC Burke recently escorting Evie to her King Edward IV Upper prom, which she said was "quite an entrance".

"I was thinking this must be a wind-up and I thought he wasn't looking at me, but he was," PC Burke said of the moment Evie's father, David Fitt, came rushing over asking for help. "I went into survival mode. I was only 21 at the time. It was too late to get to hospital and it all happened in the back of Sarah and David's car in the middle of Ixworth High Street."

Last year, he was approached by Mr Fitt to see if he would escort Evie to her prom, held at Blackthorpe Barn on the Rougham Estate. The PC thought it was "a lovely and special idea" and it was later approved by his managers in the force.

"It was a bit nerve-wracking to say the least," Evie said. "I think it's quite an entrance, so it was fun, and a lot of my friends were very surprised because I'm not usually a loud or outgoing person, so it was very strange to see me roll up in a police car with all the sirens going off."

The teenager is now considering a career in the police and will study criminology, psychology and biology at college.


Daily Mirror
16-07-2025
- General
- Daily Mirror
'Snakes are on the loose' as RSPCA issues new warning across the UK
The RSPCA has warned that people should be extra vigilant in the warm summer months for escaped snakes slithering around the country, with a warning that 'snakes are on the loose'.

The UK's leading animal welfare charity has issued a warning for the public to be on high alert during the balmy summer months for escaped snakes roaming around the country. The nationwide caution comes in the wake of an incident last week (July 7) in which the RSPCA was called to deal with a stray snake that had slithered away from a flat above a London chippy.

Fresh data, released ahead of National Snake Day (July 16), indicates that reports of snake incidents tend to spike during the hotter months, specifically July, August and September. Last year saw a total of 383 cases reported during this period alone, an 18% increase on 2023 (323 incidents). With heatwaves and warmer weather arriving, the RSPCA is concerned that this summer could see another surge in numbers. In 2024, the charity received nearly 270 calls about loose or stray snakes in areas such as Essex (21), Norfolk (20), Greater Manchester (17), West Yorkshire (17), West Midlands (16) and Hampshire (14).

Snake owners are being urged by experts to exercise extra caution and ensure their pets' enclosures are securely fastened. RSPCA senior scientific officer Evie Button warned that these slithering "escape artists" will "take any opportunity of a gap in an enclosure door, or a loose-fitting lid to make a break for it".

Being ectothermic creatures, snakes depend on external heat sources to regulate their body temperature. If their enclosure fails to maintain an appropriate temperature, snakes may attempt to escape to warmer surroundings, reports the Express.

Evie commented: "As the UK continues to swelter this summer, we're braced for another influx of calls.
"The RSPCA urges all pet snake owners to be extra vigilant at this time of year, invest in an enclosure suitable for the particular species and make sure that enclosure is kept secure - and locked if necessary - when unattended.

"Our frontline officers are flat out trying to rescue animals that may be in life-threatening situations, so a few extra minutes checking that your snake is secure could help save our officers' time and allow them to save an animal that's in danger."

The RSPCA has noticed that during the warmer months some snake owners take their pets outside to bask in natural sunlight. While beneficial for reptiles, it's crucial that owners keep their snakes secure outdoors, as they can become quite active and fast-moving in the heat.

Evie added: "Sadly, we also deal with a lot of abandoned pet snakes. We find that many people are unaware of how much of a commitment these animals are when they take them on, which we believe contributes to the concerning number of animals every year who have sadly been abandoned when their owners can no longer meet their needs.

"Exotic pets such as snakes often end up in the RSPCA's care after people realise they're not easy to care for, or the novelty wears off.

"Others are rescued after they have been abandoned or released on purpose, which could then pose a risk to our native wildlife.

"The RSPCA urges prospective owners of reptiles such as snakes to thoroughly research the needs of the particular species and what is required in the care of the animal, using expert sources.

"People should only consider keeping a snake if they can ensure they are fully able to provide for these needs."

Should you stumble upon a snake that appears to be non-native, the RSPCA recommends keeping a safe distance, observing the reptile, and consulting its website for guidance on the appropriate course of action. For further details on what to consider before adopting a snake, head over to the RSPCA's website.