10-07-2025
AI is Creating a New Gender Divide
The AI revolution isn't ahead of us; it's here. But for a technology heralded as the future, it risks bringing with it problems from the past.
Women are adopting generative AI technology at a slower rate than men: data from the Survey of Consumer Expectations found that 50 percent of men are using generative AI tools, compared to 37 percent of women. Further research from Harvard Business School Associate Professor Rembrand Koning found that women are adopting AI tools at a 25 percent lower rate than men.
So, what's behind women's hesitation to adopt AI?
Whether it's deepfake pornography, discrimination from AI hiring technology, or forms of digital violence online, research and data suggest that women have a fundamentally different relationship to AI than men do. The result? An AI gender gap, where women are being left behind in the technological revolution.
Newsweek spoke to experts to find out more about how AI's perpetuation of misogyny is creating a new gender divide.
What Is the AI Gender Gap?
A 2025 survey from the National Organization for Women (NOW) and Incogni found that 25 percent of women had experienced harassment enabled by technology, including AI-generated deepfake pornography. A study from the Berkeley Haas Center for Equity, Gender, and Leadership, meanwhile, analyzed 133 AI systems from different industries. It found that 44 percent showed gender bias.
Beyond the studies and the data, what is the actual impact of this gender disparity on women?
Enter: the AI gender gap.
Professor Ganna Pogrebna, Lead for Behavioral Data Science at the Alan Turing Institute and Executive Director at the AI and Cyber Futures Institute, told Newsweek over email, "There is mounting evidence that early negative experiences with AI systems—particularly those involving misogyny, sexualization, or coercion—can have profound psychological, behavioral, and societal consequences for women and girls."
"These harms are not abstract; they are embodied in concrete experiences, amplified through algorithmic systems," Pogrebna said.
And AI-inflicted harms begin at a young age. A 2024 report from the Center for Democracy & Technology found that generative AI technologies are worsening the sharing of non-consensual intimate imagery in schools and that female students are most often depicted in this deepfake imagery.
So, what might be the long-term impacts on women and girls if they are having negative or traumatic experiences with AI?
Laura Bates, activist and author of The New Age of Sexism: How AI and Emerging Technologies Are Reinventing Misogyny, told Newsweek, "I think we will see a widening gap in terms of women's access to and uptake of new technologies."
Bates said that this will include AI and that this will have "a devastating impact on everything from women's job prospects and careers to their involvement in further developments in the sector, which will, in turn, continue to intensify the problem because it will mean that new tools are tailored towards men as the majority of users."
Asked if there is a risk that these negative experiences could lead to disengagement with future technologies, putting women on the back foot, Bates said, "Absolutely."
"We already see how differently men and women use and experience existing forms of technology," Bates said. Both men and women experience forms of online harassment, according to the Pew Research Centre, which found in 2021 that 41 percent of Americans had experienced some kind of harassment online; harassment takes different forms. The Pew Research Centre found that 33 percent of women under 35 report experiencing sexual harassment online, compared to 11 percent of men, a figure which doubled from 2017 to 2021.
"Women's use of tech is mediated by an entirely different online experience than men's, marked by abuse, harassment, doxing, threats, stalking and other forms of tech facilitated gender-based violence," Bates said, adding, "It is inevitable that the barrage of abuse women and girls face online, combined with the gender bias inherently baked into many emerging tools, are going to have a chilling effect in terms of women's uptake and participation in new forms of tech."
Pogrebna echoed this: "These traumatic experiences can embed deep mistrust in AI systems and digital institutions."
Woman photographs a Humanoid Robot from AI Life with Bio-Inspired communicative AI, on display at the Consumer Electronics Show (CES) in Las Vegas, Nevada on January 10, 2024.
FREDERIC J. BROWN/AFP via Getty Images
Newsweek also spoke with Dr. Sarah Myers West, co-executive director at the AI Now Institute. In a phone call with Newsweek, she said, "There are disproportionate patterns of reinforcing inequality in ways that lead to harm for women and girls and people of other minorities."
West pointed to "the way AI is intermediating access to our resources or our life chances," and noted, "the AI that gets used, say, in a hiring process and reinforces is historical employment-based discrimination." West said that this is affecting people in ways that are "profoundly consequential."
In 2018, Reuters reported that Amazon had scrapped an AI recruiting tool that was showing bias against women. In 2024, UNESCO's research highlighted that gender bias in AI hiring tools may penalize women through the reproduction of regressive stereotypes.
Asked if negative experiences with AI in hiring scenarios could lead to a sense of mistrust and disengagement, West said, "I think rightly so, if it's being used in that way."
A Problem from the Past, Reinvented for the Future
AI might be increasingly prevalent, but the discourse over it is increasingly polarized. A 2025 survey from YouGov found that one-third of Americans are concerned about the possibility that AI will cause the end of the human race. Additionally, the survey found that Americans are more likely to say that AI will have a negative effect on society than on their own life and that most Americans don't trust AI to make ethical decisions.
But as these apocalyptic alarms sound, concerns over how AI is further encoding misogyny into the fabric of society fall through the cracks. In 2024, a report from the UN said that AI is mirroring gendered bias in society, and gender disparity is already pronounced in the tech industry: the World Economic Forum reported in 2023 that women account for only 29 percent of science, technology, engineering and math (STEM) workers.
"There is a growing body of evidence showing that AI systems reflect and amplify biases present in the datasets on which they are trained. This includes gender biases, sexualization of women, and reinforcement of harmful stereotypes," Pogrebna said. She added that large language models trained on "internet corpora" are risking "encoding toxic gender stereotypes and normalizing misogynistic narratives."
A 2024 report from UNESCO found that "AI-based systems often perpetuate (and even scale and amplify) human, structural and social biases," producing gender bias, as well as homophobia and racial stereotyping.
Newsweek spoke with Sandra Wachter, a professor of technology and regulation at the Oxford Internet Institute at the University of Oxford in the United Kingdom, about this.
"If AI is somewhat a mirror of society," Wachter said, "It kind of indirectly shows you where your place in the world is."
Wachter then pointed to examples of gender bias in AI, including bias in image generators and text prediction, where AI is more likely to assume a male gender for professions like doctors, and a female gender for professions like nurses. A 2024 study in JAMA Network Open found that when generating images of physicians, AI text-to-image generators are more likely to depict people who are white and male.
"It's a tacit kind of reminder that certain spots are reserved for you and others are not," Wachter said. "We have to think about what it does to young women and girls."
"How can we praise the technology to be so perfect when it is so problematic for a large portion of our society, right? And just ask the question, who is this technology actually good for? And who does it actually benefit?" Wachter said. She added, "It gives people a very early idea of what your role is supposed to look like in society."
Pointing to the issues with AI, Wachter said, "We would never do this with a car, right? We would never just say, you go and drive. I know it's failing all the time."
"What does it say about the value of being a woman?" she said. "If it's okay that this injury will happen, we know it will happen, but we're going to bring it on the market anyway, and we're going to fix it later."
Newsweek also spoke with Dr. Kanta Dihal, a lecturer in science communication at Imperial College London, who shares some of Wachter's concerns. "There is so much that regularly goes wrong around the topics of women and technology in the broader sense," Dihal said.
In terms of the relationship women have with AI, Dihal said there is a feeling of "Is this for me, or is this meant to keep me in my place? Or make things worse for me? Am I the kind of person that the creators of this technology had in mind when they designed it?"
"So many different career paths and our schools as well are indeed introducing AI related technologies that if you don't want to use them, you're already sometimes on the back foot," Dihal said, adding, "It's going to be both a matter of being disadvantaged in school and career progression."
A woman walks past the neon art installation "imAGIne AGI" during the press preview of the XXIV Triennale di Milano at Triennale Design Museum on May 12, 2025, in Milan, Italy.
So, what would inclusion in AI look like?
Bates told Newsweek that we need to see government regulation of AI technology "at the point they are rolled out to public or corporate use" in order to ensure that safety and ethics standards are met before implementation, "not after women and marginalized communities have already faced significant discrimination."
She added, "With AI technologies poised to become inextricably intertwined with almost every aspect of our personal and professional lives, that must change in order to ensure that women, girls, and marginalized groups are able to reap the same benefits from these technologies as everybody else, without suffering negative consequences."
Meanwhile, Pogrebna told Newsweek, "The marginalisation of women in AI is not an inevitable by-product of technological advancement—it is the result of design choices, governance gaps, and historical inequities embedded in data and institutions. A multi-pronged approach that includes technical, procedural, legal, and cultural reforms is not only possible but has already demonstrated early success in multiple domains."
She added that technical fixes are necessary but insufficient without regulatory frameworks to enforce accountability.
As AI technology continues to develop and become more prevalent, the fabric of society is changing at a rapid pace, and the dream of a tech revolution that leads to a fairer society is still alive. What's unclear is whether AI is doomed to code a world that's bugged with the same prejudices as the one that came before it.