Latest news with #nonconsensual


The Independent
5 days ago
- Politics
- The Independent
Senior aide to New Zealand prime minister resigns over secret recordings of sex workers
A senior aide to New Zealand prime minister Christopher Luxon has resigned after being accused of secretly recording sex workers and taking non-consensual photos and videos of women. The allegations against Michael Forbes surfaced when a sex worker discovered his phone was recording audio while he was in the shower, NZ's Stuff reported earlier this week.

Mr Luxon's deputy chief press secretary has since offered his 'sincerest apologies to the women I have harmed'. Mr Luxon called the revelations a 'shock' and said that Mr Forbes' behaviour was 'unacceptably short of the standards that I expect from our people'. 'My sympathy is with the women who raised these allegations and were made to feel unsafe due to the actions of this person,' Mr Luxon said on Thursday.

According to Stuff, Mr Forbes' encounter with the sex worker took place in July 2024. After being confronted about the secret audio recording, he reportedly handed over his phone password. The woman, along with other sex workers from that brothel, discovered numerous audio recordings of similar sessions, along with photos and videos on the device.

At the time, Mr Forbes was serving as press secretary to social development minister Louise Upston. He was appointed acting deputy chief press secretary to the prime minister in February this year. Police reportedly looked into the allegations in July last year but ultimately chose not to pursue charges.

'We had no awareness, no knowledge of it. The concerns were raised with us by a journalist at four o'clock on Tuesday,' Mr Luxon said. The prime minister said that the police investigation was not disclosed to Mr Forbes' employer under the 'no surprises' convention, nor did Mr Forbes disclose the allegations himself. 'He was vetted coming into Louise Upston's office. As I understand it, the incidents happened subsequent to that,' he said. 'He has an obligation to actually declare those issues or those incidences to us. That didn't happen, which is why his employment would have been terminated.'

In a statement to RNZ on Thursday, detective inspector John Van Den Heuvel of Wellington police said that 'on examining the phones, police also found a number of photos and video of women in public spaces, and what appears to be women in private addresses, taken from a distance away'. 'Police considered the available evidence and concluded it did not meet the requirements for criminality, and therefore charges could not be filed. The individual concerned voluntarily spoke with police and admitted to taking the images and recordings. He was reminded of the inappropriateness of his behaviour and encouraged to seek help.'

Meanwhile, the prime minister said: 'When you have an incident like this, it actually creates a whole bunch of new questions.' He added: 'They need to do a deep dive on understanding how and what happened here and why. And importantly, then look at what we need to do to strengthen our processes.'

In a statement on Wednesday, Mr Forbes said: 'In the past, I was in a downward spiral due to unresolved trauma and stress, and when confronted with the impacts of my behaviour a year ago, I sought professional help, which is something I wish I had done much earlier.' He added: 'What I failed to do then was make a genuine attempt to apologise. Instead, I tried to move on without offering those I had harmed the acknowledgement, accountability, or amends they deserved. I recognise how wrong that was. I understand that my past actions may have undermined the trust people place in me. So, I have resigned from my job to focus on the work I need to do.'

The office of Mr Forbes' former boss, Ms Upston, said in a statement: 'The minister was not aware of any allegations before they were raised with PMO yesterday. Minister Upston has nothing further to add to the PM's statement on this.'


CBS News
7 days ago
- Politics
- CBS News
Deepfake porn website operator settles with San Francisco, agrees to shut down
The San Francisco City Attorney's office has settled with a company that operated websites creating "deepfake nudes," in which artificial intelligence is used to turn photos of children and adults into pornography. City Attorney David Chiu announced Monday that Briver LLC has agreed to a permanent injunction prohibiting the company and its owners from operating any websites that create nonconsensual deepfakes. Briver LLC has also agreed to pay $100,000 in civil penalties.

According to Chiu, Briver LLC operated two such websites, which offered users the opportunity to upload clothed images of real people to create realistic-looking nude images, usually for a fee. Before being shut down, the company's websites allowed users to create pornographic images of adults and children.

In August of last year, Chiu's office filed suit against 16 of the most visited deepfake nude websites. The websites targeted in the lawsuit had reportedly been visited more than 200 million times in the first six months of 2024. As a result of the investigation, 10 of the sites are now offline or no longer accessible in California, according to the city attorney. Meanwhile, the lawsuit will proceed against the remaining defendants.

"While our lawsuit has so far led to an initial settlement as well as shut down 10 websites that exploit women and children, we won't stop until all owners are held accountable and blocked from opening similar sites," Chiu said in a statement. "While generative AI holds enormous promise, these website operators are engaged in blatant sexual abuse and must be stopped."

Chiu's office said the images are often used to extort, bully and humiliate women and girls. One such incident involved students at a middle school in Southern California last year. Celebrities, including Taylor Swift, have also been victimized by AI-generated explicit images.

Anyone who may have been the victim of nonconsensual deepfake pornography or has relevant information in the case is asked to contact the San Francisco City Attorney's Office through the agency's consumer complaint web portal or by calling 415-554-3977.


Forbes
22-05-2025
- Politics
- Forbes
Victims Of Explicit Deepfakes Could Sue Under Proposed Law
Rep. Alexandria Ocasio-Cortez is one of the primary sponsors of the DEFIANCE Act.

Earlier this week, President Trump signed the Take It Down Act to force platforms to take down deepfake nudes, revenge porn and other types of non-consensual intimate imagery. Building on that momentum, the DEFIANCE Act was reintroduced today to allow victims to sue those who created or shared these explicit images.

The Take It Down Act, already signed into law, requires platforms to remove nonconsensual explicit images within 48 hours of a valid request. Offenders can face fines or up to three years in prison, and platforms that don't comply may be penalized by the FTC. Today, the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act was reintroduced to fill a gap in the Take It Down Act, allowing the subjects of these images to sue for damages.

'We are reintroducing the DEFIANCE Act to grant survivors and victims of nonconsensual deepfake pornography the legal right to pursue justice,' said Representative Alexandria Ocasio-Cortez in a statement. Like the Take It Down Act, the DEFIANCE Act has bipartisan support, and it passed the Senate unanimously last summer. 'I am proud to lead this legislation with Representative Lee, and Senators Durbin and Graham,' Ocasio-Cortez added. If signed, the law would allow victims to take civil action against perpetrators who created, distributed, solicited or published deepfake explicit images.

AI apps now make it easy to take any photo and create realistic nude images of someone without their consent. In addition to these deepfakes, authentic explicit photos can also be shared without permission. A 2019 report by cybersecurity firm DeepTrace found that 96% of online deepfake videos were nonconsensual and pornographic. By 2023, research from another cybersecurity company, Security Hero, showed that deepfake pornography accounted for 98% of all deepfake videos online, 99% of which targeted women.
The new legislation would allow victims to collect compensation for the damage caused by the explicit image. 'Civil recourse is essential because it puts power directly in the hands of victims. Unlike criminal cases, which depend on a prosecutor's decision and are harder to win, civil cases are victim-led and offer a more accessible path to justice,' Omny Miranda Martone, founder and CEO of the Sexual Violence Prevention Association, explained via email. Under the new bill, victims can seek financial restitution for harms like job loss, therapy or personal security costs.

In addition, the Take It Down Act only applies to explicit content that's shared publicly, but online posting is just one way these images are distributed. When professionals are targeted, perpetrators often use other tactics, such as emailing explicit photos directly to a boss or circulating the images among coworkers. When Martone was personally attacked with deepfake pornography, the content was posted on social media and emailed to their employer with demands to fire Martone, the sender unaware that Martone was the company CEO. While the Take It Down Act could have helped remove the content posted on social media, the law wouldn't have protected Martone from any fallout of the email. 'The DEFIANCE Act fills this gap,' Martone explains. 'It empowers victims to seek justice against those who create, distribute or send explicit images—whether it's posted online, emailed to a boss or shared among coworkers,' they add.

It's worth noting that the sharing of explicit deepfakes is more about exerting control than about anything sexual. In the workplace, it's used to shame, silence and undermine women in positions of influence. It's also important to consider the broader consequences. Explicit deepfake image sharing is a type of objectification that reduces people (typically women) to their physical appearance and can strip them of their power and identity.
In addition to the psychological consequences for the victim, there can also be career repercussions. Research shows that objectified women are seen as less competent, less relatable and less human. Researchers have also found that voters were less likely to support political candidates who had been objectified, and that objectified women were perceived as lacking qualities like helpfulness or emotional depth. In other words, exposure to explicit deepfakes may alter one's views about a woman's skills and capabilities.

Whether the harm comes from emotional distress, reputational damage, being seen as less competent, or even a job loss, the DEFIANCE Act offers hope that victims may finally receive some financial restitution. It might even serve as a deterrent, stopping some people from generating the images.


CNET
20-05-2025
- Politics
- CNET
Trump Signs Bill Banning Deepfakes, Nonconsensual Images: What to Know
President Donald Trump signed the bipartisan Take It Down Act into law on Monday, a significant step in regulating the nonconsensual sharing of intimate images, including AI-generated deepfakes. The legislation aims to protect individuals from the harmful effects of such content, which has become increasingly prevalent with the expansion of artificial intelligence.

"This legislation is a powerful step forward in our efforts to ensure that every American, especially young people, can feel better protected from their image or identity being abused through nonconsensual, intimate imagery," First Lady Melania Trump, who has been an outspoken advocate of the act, said at the signing ceremony.

What are deepfakes?

Deepfakes are realistic but fake images, videos or audio created using artificial intelligence to mimic someone's appearance, voice or actions. One widely reported example was a 2022 viral video of Ukraine's President Volodymyr Zelenskyy. In the altered clip, Zelenskyy appeared to urge Ukrainian soldiers to surrender to Russian forces, a message he never actually delivered. The video was quickly debunked, but it raised serious concerns about the use of deepfakes in disinformation campaigns, especially during wartime. Deepfakes have also been used to spread sexually explicit content or revenge porn.

"No organization is immune to impersonation attacks," Matt Moynahan, CEO of GetReal Security, told CNET in an email. "Right now, we are bearing witness to an unprecedented number of AI deepfakes. With the rise of synthetic AI and malicious deepfakes, we can no longer trust our own known experiences. Instead, we must implement tools that integrate greater security at every level and entry point to protect against these looming threats."

What are the key aspects of the Take It Down Act?
The Take It Down Act prohibits knowingly sharing or threatening to share intimate images of someone without their permission, including digitally altered or AI-generated deepfakes. Here is a breakdown of the law and what it targets:
- Criminalization of nonconsensual sharing: The act makes it a federal offense to distribute intimate images without the subject's consent. The legislation applies to both real and AI-generated content.
- Mandatory removal: Online platforms, such as tech and social media sites, are required to remove flagged content, including any copies of the material, within 48 hours of notification by the victim.
- Mandatory restitution: Violators will face mandatory restitution and criminal penalties such as prison time, fines or both.
- Protection of minors: The legislation imposes stricter penalties for offenses involving minors, aiming to provide enhanced safeguards for vulnerable individuals.
- Enforcement by the Federal Trade Commission: The FTC is designated as the primary agency responsible for enforcing the provisions of the act.

Who supports the Take It Down Act?

The First Lady has been a vocal advocate for the legislation over the last several months, emphasizing the need to protect children and teenagers from the damaging effects of online exploitation. Her efforts included public appearances and discussions with lawmakers to garner support for the bill.
"It's heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content like deepfakes," Melania Trump said during an event on Capitol Hill in March advocating for the legislation. "This toxic environment can be severely damaging. We must prioritize their well-being by equipping them with support and tools necessary to navigate this hostile digital landscape. Every young person deserves a safe online space to express themselves free without the looming threat of exploitation or harm."

The bill, introduced by Sen. Ted Cruz, a Republican of Texas, gained bipartisan backing, with cosponsors including Democratic senators Amy Klobuchar of Minnesota and Cory Booker of New Jersey. It passed the Senate unanimously in February, followed by House approval in April with a 409-2 vote.

What are the criticisms and concerns about the bill?

While the act has been praised for addressing a growing issue, it has also faced criticism from various groups. Some digital rights organizations express concerns that the law could infringe on privacy and free speech, particularly regarding the potential for false reports and the impact on encrypted communications. There are also apprehensions about the enforcement of the law and its potential misuse for political purposes. For instance, representatives of the Cyber Civil Rights Initiative, a nonprofit that supports victims of online abuse, voiced strong concerns about the bill, according to PBS News. The group criticized the takedown provision as overly broad, vaguely written and lacking clear protections to prevent misuse.

Why does this matter?

Trump's signing of the Take It Down Act on Monday marks only the sixth bill signed into law so far in his second term. By his 100th day back in office, he had enacted only five, the lowest number of new laws signed by a president in the first 100 days of a term since the Eisenhower administration in the 1950s, based on an analysis of congressional records by NBC News.

The signing of the Take It Down Act also represents a significant move toward regulating nonconsensual intimate imagery in the digital realm. While it aims to provide greater protection for individuals, ongoing discussions will be essential to address the concerns and ensure the law's effective and fair implementation.


News24
20-05-2025
- Politics
- News24
Trump signs ‘Take it Down' revenge porn bill into law
The 'Take It Down Act' was signed into US federal law. The law makes it a crime to post 'revenge porn'. Websites will be obliged to remove the content within 48 hours.

US President Donald Trump signed a bill on Monday making it a federal crime to post 'revenge porn' - whether it is real or generated by artificial intelligence. The 'Take It Down Act', passed with overwhelming bipartisan congressional support, criminalises non-consensual publication of intimate images, while also mandating their removal from online platforms.

'With the rise of AI image generation, countless women have been harassed with deepfakes and other explicit images distributed against their will,' Trump said at a signing ceremony in the Rose Garden of the White House. 'And today we're making it totally illegal,' the president said. 'Anyone who intentionally distributes explicit images without the subject's consent will face up to three years in prison.' Websites that fail to remove the images promptly, within 48 hours, will face civil liabilities, Trump said.

First Lady Melania Trump endorsed the bill in early March and attended the signing ceremony in a rare public White House appearance. The First Lady has largely been an elusive figure at the White House since her husband took the oath of office on 20 January, spending only limited time in Washington. In remarks at the signing ceremony, she described the bill as a 'national victory that will help parents and families protect children from online exploitation'. 'This legislation is a powerful step forward in our efforts to ensure that every American, especially young people, can feel better protected from their image or identity being abused,' Melania Trump said.

Deepfakes often rely on artificial intelligence and other tools to create realistic-looking fake videos. They can be used to create falsified pornographic images of real women, which are then published without their consent and proliferate.
Some US states, including California and Florida, have laws criminalising the publication of sexually explicit deepfakes, but critics have voiced concerns that the 'Take It Down Act' grants the authorities increased censorship power. The Electronic Frontier Foundation, a nonprofit focused on free expression, has said the bill gives 'the powerful a dangerous new route to manipulate platforms into removing lawful speech that they simply don't like'. The bill would require social media platforms and websites to have procedures in place to swiftly remove non-consensual intimate imagery upon notification from a victim.

An online boom in non-consensual deepfakes is currently outpacing efforts to regulate the technology around the world due to a proliferation of AI tools, including photo apps that digitally undress women. While high-profile politicians and celebrities, including singer Taylor Swift and Democratic congresswoman Alexandria Ocasio-Cortez, have been victims of deepfake porn, experts say women not in the public eye are equally vulnerable. A wave of AI porn scandals has been reported at schools across US states, with hundreds of teenagers targeted by their own classmates. Such non-consensual imagery can lead to harassment, bullying or blackmail, sometimes causing devastating mental health consequences, experts warn.

Renee Cummings, an AI and data ethicist and criminologist at the University of Virginia, said the bill is a 'significant step' in addressing the exploitation of AI-generated deepfakes and non-consensual imagery. 'Its effectiveness will depend on swift and sure enforcement, severe punishment for perpetrators and real-time adaptability to emerging digital threats,' Cummings told AFP.

At least one mother hailed the new legislation as a step in the right direction. 'It's a very important first step,' Dorota Mani told AFP on Monday, calling it a 'very powerful bill'. As the mother of a young victim, Mani said she felt empowered because 'now I have a legal weapon in my hand, which nobody can say no to'.