Latest news with #SexySelfieNation


USA Today
22-07-2025
- Entertainment
- USA Today
Their selfies are being turned into sexually explicit content with AI. They want the world to know.
Evie, 21, was on her lunch break at her day job last month when she got a text from a friend alerting her to the latest explicit content circulating online without her consent. This time, it was a graphic, fan-fiction-style story about her created by 'Grok,' X's AI-powered chatbot. Weeks earlier, she'd been the subject of another attack, when a user shared her selfie and asked Grok to turn it into explicit sexual imagery. 'It felt humiliating,' says Evie, a Twitch streamer who asked that we withhold her last name to conceal her identity from her online trolls, who have become increasingly aggressive.

In June, Evie was among a group of women who had their images non-consensually sexualized on the social media platform X. After she posted a selfie to her page, an anonymous user asked Grok to edit the image in a highly sexualized way, using language that got around the filters the bot had in place. Grok then replied to the post with the generated image attached.

Evie says she is vocal on X about feminist issues and was already subject to attacks from critics. Those accounts had made edits of her before, but they had been choppy Photoshop jobs — nothing as real-looking as Grok's. 'It was just a shock seeing that a bot built into a platform like X is able to do stuff like that,' she says over video chat, a month after the initial incident.

X has since blocked certain words and phrases used to doctor women's images, but on June 25, an X user prompted Grok to write a story in which the user 'aggressively rapes, beats and murders' her, making it 'as graphic as you can' with an '18+ warning at the bottom.' 'It just generated it all,' she says. '(The user) didn't use any words to try to cover it up, like they did with the pictures.' X did not return USA TODAY's multiple requests for comment.

Evie says she saw at least 20 other women on her own X feed who had their photos sexualized without their consent. It also happened to Sophie Rain, an OnlyFans creator with over 20 million followers across social media platforms, who posts sensual content but never full nudity. 'It's honestly disgusting and gross,' she says. 'I take my religion very seriously. I am a virgin, and I don't condone this type of behavior in any way.'

This trend is part of a growing problem experts call image-based sexual abuse, in which 'revenge porn' and deepfakes are used to degrade and exploit another person. While anyone can be victimized, 90% of the victims of image-based sexual abuse are women. 'This is not only about sexualized images of girls and women, it's broader than that,' says Leora Tanenbaum, author of 'Sexy Selfie Nation.' 'This is all about taking control and power away from girls and women.'

The 'Take It Down Act' aims to combat non-consensual sexual imagery. Is it working?

In May 2025, the Take It Down Act was signed into law to combat non-consensual intimate imagery, including deepfakes and revenge porn. While most states have laws protecting people from non-consensual intimate images and sexual deepfakes, victims have struggled to get images removed from websites, increasing the likelihood that the images will continue to spread and retraumatize them. The law requires websites and online platforms to take down non-consensual intimate imagery within 48 hours of a verified request from the victim. However, as of July 21, the altered photo of Evie remained publicly accessible on Grok's verified X account.
Evie mobilized her nearly 50,000 followers to mass-report Grok's post, but she says X Support told her it was not a violation of the platform's content guidelines.

AI's ability to flag inappropriate prompts can falter

In a conversation with Grok, USA TODAY asked the chatbot to play out a scenario in which a user asked it to generate explicit content, with clear instructions not to actually produce any during the conversation. One example of "coded language" Grok says it is programmed to flag is "subtle requests for exposure" intended to make photos of women more revealing. Phrases that could be flagged in that area include "adjust her outfit," "show more skin," or "fix her top." "Even if worded politely, I flag these if the intent appears inappropriate," Grok said via AI-generated response on July 15.

The key word is intent. Grok's ability to turn down potentially inappropriate prompts "relies on my ability to detect the intent, and public images remain accessible for prompts unless protected," the chatbot says.

You can block or disable Grok, but doing so doesn't always prevent modifications to your content. Another user could tag Grok in a reply and request an edit to your photo, and you wouldn't know it because you have Grok blocked. "You may not see the edited results, but the edit could still occur," Grok clarified during our conversation. The better solution is to make your profile private, but not all users want to take that step.

It's not just about sex — it's about power

After experiencing image-based sexual abuse, Evie considered making her X account private. She was embarrassed and thought her family might see the edits. However, she did not want to give in and be silenced. "I know that those pictures are out now, there's nothing I can do about getting rid of it," she says. "So why don't I just keep talking about it and keep bringing awareness to how bad this is?"

When it comes to generating deepfakes or sharing revenge porn, the end goal isn't always sexual gratification. Users may target women who use their platforms to speak about feminist issues as a degradation tactic. Evie says what hurt the most was that rather than engage in a discussion or debate about the issues she was raising, her critics opted to abuse her.

In her research, Tanenbaum has seen varied responses from victims of image-based sexual abuse, ranging from engaging in excessive sexual behavior to "a total shutdown of sexuality, including wearing baggy clothes and intentionally developing unhealthy patterns of eating to make oneself large, to be not sexually attractive in one's own mind." The individuals she spoke to who had been victimized in this way called it "digital rape" and "experienced it as a violation of the body."

Even if someone logically understands that a sexually explicit image is synthetic, once their brain sees and processes the image, it's embedded in their memory bank, Tanenbaum says. The human brain processes images 60,000 times faster than text, and 90% of the information transmitted to the brain is visual. "Those images never truly get scrubbed away. They trick us because they look so real," Tanenbaum explains.

Evie wants to believe that it "didn't really get to her," but she notices she's more thoughtful about the photos she posts, wondering, for instance, whether she's showing too much skin to the point where an AI bot could more easily undress her. "I always think, 'Is there a way that someone could do something to these pictures?'"


Yahoo
16-06-2025
- Entertainment
- Yahoo
It's 'tankini summer' – and it's already sparking online debate
The tankini is igniting social media debate over modesty at the beach.

For some, tankinis are reminiscent of being a preteen: a bikini bottom or shorts paired with what is essentially a waterproof crop top was often the first graduation from the one-pieces of childhood to swimming in a two-piece. Or perhaps they were the performance swimsuit you remember your mom or guardian wearing, the front swath of fabric adding extra coverage for people with post-birth or pregnant bellies.

Either way, the tankini has gotten a revamp in 2025, with popular retailers from ASOS to Cupshe marketing these tummy-covering two-pieces to shoppers who likely haven't worn one since the summer before seventh grade. These upgraded tankinis offer refreshed color palettes and not-so-boxy designs, but don't totally skimp on style, with unexpected cut-outs and cute patterns that pair seamlessly with a skirt or shorts.

But the tankini takeover has prompted some social media users to wonder whether the resurgence is as innocuous as pure 2000s nostalgia, or whether it's a totem of a cultural shift that wants women to cover up. "I don't know if the modesty propaganda is working on me or if they're actually just making cute tankinis this year," one TikToker mused. "Every ad I've seen for a tankini is the cutest tankini I've ever seen and I must have it. Am I going to be wearing tankinis all summer? Is it working?"

Tankini summer reflects America's zeitgeist, said Lorynn Divita, associate professor of apparel design and merchandising at Baylor University. There's been a lot of attention to nudity (or the lack thereof) this year: Some attendees at this year's Met Gala caused a firestorm with a slew of naked looks. Then, the Cannes 2025 Film Festival banned nude gowns on its red carpet. Seemingly ubiquitous cut-out booty shorts and strappy crop tops have sparked debate over modesty at the gym. And let's not forget Miley Cyrus turning heads in a completely sheer dress just this week.

"I'm not a fan of whatever direction we're going in," another TikToker said, arguing tankinis have a nefarious subtext young women are falling for. The same creator later posted that the "tankini police" came for her when she tried to call out this so-called "random radical shift to more conservative clothes." Meanwhile, others said a little more coverage offers inclusive, cute options for a range of body types and lifestyles. One fan posted that she loved how a tankini was "modest and spicy at the same time." Another said it shouldn't even be a debate: "Just stop the discourse. It's so unnecessary ... It just puts women back!"

On one hand, the internet is valid in sensing a "significant shift" toward traditional gender roles in fashion, according to Leora Tanenbaum, author of "Sexy Selfie Nation." Clothes associated with stereotypical femininity, even if they are a revealing bikini, are popular right now, she said. Look no further than the dominance of "low-cut milkmaid dress" garments that accentuate "womanly" aspects like breast cleavage, Tanenbaum said. "There is a very narrow aesthetic ideal of femininity," she said. But at the same time, the internet's battle misses the point that women will face "relentless" sexualization no matter what swimsuit they choose this summer, Tanenbaum said.

"I see the tankini debate not as much as pressure to look modest, but more as young people saying, 'I am just so sick of being objectified,'" Tanenbaum added. "There's no way to win this. On the one hand, if you wear a bikini, you open yourself up to slut-shaming because of the presupposition that someone who dresses in a revealing way is 'asking for it.' But if you wear a tankini, you're seen as deficient as a woman because you're rejecting being sexy according to a binary way."

If the aim of wearing a tankini is to avoid sexualization, "that is a losing battle," according to Tanenbaum. But for people who wear tankinis to feel empowered mentally and physically, she said to "own it." "We all have the right to feel a sense of autonomy and ownership over our own bodies, and if that little extra piece of fabric gives us that self-empowerment, we should wear it."

People might also just be tired of what's in their closet and want to try something new. "We've reached fatigue level of teeny teeny bikinis, what's going to look fresh?" Divita said.

This article originally appeared on USA TODAY: 'Tankini summer': The online debate over covering up at the beach
Yahoo
04-06-2025
- Health
- Yahoo
OnlyFans, AI and 'sexy selfies' are impacting girls. This author wrote a book about it.
On TikTok, women are complaining that bikinis have gotten so small they may as well be a piece of thread. OnlyFans creators like Lily Phillips, who challenged herself to have sex with 100 men in a day, are going viral. And parents are increasingly concerned over the threat of deepfake porn and the oversexualization of young girls, whether through the outfits they wear or the images they post online.

In her latest book, "Sexy Selfie Nation," slut-shaming expert and author Leora Tanenbaum argues that the problem lies not with the young women sharing "sexy selfies" and wearing revealing outfits, but with the toxic, sexist conditions they are responding to. Young women, she says, are taking a stand against the three pillars of "nonconsensual sexualization" that shape their daily lives: gendered dress codes, the sharing of intimate images (including "revenge porn" and "deepfakes," which she refers to as "image-based sexual abuse"), and the victim-blaming that ensues after sexual harassment and assault.

In "Sexy Selfie Nation," Tanenbaum speaks with young women affected by today's culture and presents a roadmap for parents to better understand their children's behaviors, as well as for women and girls to shape and share their image on their own terms and reclaim control over their bodies. This interview has been edited and condensed for clarity.

Question: In "Sexy Selfie Nation," you tell the story of Grace, a 17-year-old whose boyfriend shared photos of the two of them attending a homecoming dance. She was wearing a short black dress and received many disapproving comments, particularly from women. Why does it hurt worse when another woman turns on you?

Answer: Grace's story is appalling, and it also illuminates this intergenerational miscommunication that's going on. So many young women experience what Grace experienced, where they get censured by people, mostly women, a generation or two older. I started this project six years ago because so many parents were coming to me, saying, 'I would never actually say (my daughter) looks like a slut, but between you and me, she's dressing like a slut. I'm a feminist, and I don't want to slut-shame her, but I need advice.' I wanted to be able to provide useful guidance to these parents, mostly moms, so that they wouldn't slut-shame their daughters. What I found is that the young women are not trying to sexualize themselves, and yet their parents think that they are. What these parents and teachers are missing is that the young people are responding to something toxic and sexist, and that is a culture of nonconsensual sexualization.

Question: Why does this culture of slut-shaming also exist between young women?

Answer: There is a dominant mindset that if a girl is perceived to be too sexual, whatever that means, then she deserves to be negatively judged, mocked and treated as a pariah. Even those who should know better are growing up in this environment where that is the dominant way of thinking about girls and women, whether it's conscious or unconscious. We all do it regardless of gender. But when girls do it to each other, it hurts more and we pay more attention to it because we have higher standards for them. One reason why girls and women slut-shame each other is to deflect attention from themselves. I'm not saying it's malicious, but it is a coping mechanism within a culture of slut-shaming.

Question: You wrote about how even when they're alone, women may feel as if they're being evaluated. Why?

Answer: I was called a slut when I was in high school, but I knew that when I went home and closed my bedroom door, I had actual privacy and I could escape the slut-shaming. Now, that level of privacy doesn't truly exist. There are cameras around us all the time, and with the proliferation of sexually explicit deepfakes and AI nudify apps, we're all at risk of being objectified against our will and having our image publicly distributed. So there is no sense of privacy, and this gaze on us is inescapable, ubiquitous and frightening. I think we've all internalized it, but particularly people who are growing up and have never known any other environment. One of the ways it influences us is just feeling like we're always 'on,' and so we always need to think about how we appear to others.

Question: Many of the women in your book talked about reclaiming control after a traumatic experience, such as experiencing sexual assault or image-based sexual abuse. But does this line of thinking feed into the stereotypes that anyone who posts "sexy selfies" has low self-esteem or "daddy issues"?

Answer: I have been tracking slut-shaming since the mid-1990s, and I have found that victims of slut-shaming can go in opposite directions: either being excessively sexually active for nonhealthy reasons, like to prove a point, or going through a total shutdown of sexuality, including wearing baggy clothes and intentionally developing unhealthy patterns of eating to make oneself large, to be not sexually attractive in one's own mind. So I have seen both extreme behavioral responses. I'm looking at it more holistically and not so much as (someone saying), 'I was the victim of revenge porn. Now I have an OnlyFans,' although that may be the case. But that person who was the victim of revenge porn was also the victim of so many other things, and I think that's what is missing in the analysis right now.

Question: You also write about the self-monetization of women's bodies on platforms like OnlyFans through a lens of empowerment, but you don't shy away from the lived experiences of women who view their labor as just that – work. What about the women who don't see this as empowering? How do all women find common ground?

Answer: I don't want to suggest that it's empowering for everybody. Even for the people for whom it is, there are so many risks that they are taking. So it's a question of making informed decisions. When somebody is reclaiming ownership over their sexuality, if they're doing it to prove a point and they are not deriving any pleasure in any way, whether physical, emotional or psychological, then that's not helpful to them or to anybody. So I would want to intervene and make sure that they're making decisions that ultimately improve their lives and not make their lives worse.

Question: But can our acts of self-empowerment collectively make things worse for women as a class of people?

Answer: That is a huge risk. One of the things that I very much want women to think about is not only furthering themselves, but also furthering other people.

This article originally appeared on USA TODAY: OnlyFans, 'sexy selfies' and how our culture is shifting