Hingham family files Title IX complaint after student creates deepfake image of their daughter

Yahoo · a day ago

Megan Mancini filed a Title IX complaint with Hingham Public Schools after she says her daughter was the victim of sexual harassment.
Mancini says another student used artificial intelligence to create a deepfake pornographic image of her daughter.
'She was devastated, I mean she definitely felt violated, she wanted something to be done about it, and at that point we had notified the school, the police,' said Mancini.
After Mancini filed a complaint about the incident in January, Hingham schools launched an investigation.
After about four and a half months, the district sent a letter to Mancini, saying that while the student's conduct was 'inappropriate and hurtful, there is insufficient evidence to conclude it occurred in the District's schools.'
'The image was shared in the school hallways, amongst other students during school hours, and it was also shared via text,' said Mancini.
Mancini was disappointed to learn the student responsible for creating that nude photo of her daughter would not be disciplined at Hingham Middle School.
'It makes me feel like the school failed,' said Mancini.
Legal expert Peter Elikann says families could pursue criminal charges in cases like this under the state's new revenge porn and sexting law.
'The word needs to go out among young people that you can be criminally prosecuted in juvenile court for sending nude images of someone else without their consent,' said Elikann.
He says that includes deepfakes and other AI-generated photos.
'The fact that people can create all kinds of fake pornography online, and young people seem to know how to do it, it's really hit a huge crisis point,' said Elikann.
'I think it's important to have swift action, and I think we missed that critical window,' said Mancini.
Mancini hopes school leaders can start to take more action on these cases to prevent them from happening again, even if districts claim not to have jurisdiction.
'There was not one communication sent out from the school department or the school administration about this issue, and for you know, a heads up, awareness to parents that this is going on, this is going on in middle school, and it's going to get nothing but worse,' said Mancini.
The conduct has become such a problem that the state now offers a youth diversion program to teach minors prosecuted in cases like this about the dangers of sharing nude photos.
Boston 25 News has reached out to Hingham Public Schools multiple times about this issue but has not received a response.
This is a developing story. Check back for updates as more information becomes available.


Related Articles

Guatemalan national indicted after federal agent seriously injured in Rhode Island, US Attorney says

Yahoo · 4 hours ago

A federal grand jury has indicted a Guatemalan national after a Homeland Security agent suffered a serious injury while trying to arrest him on a warrant in April, the U.S. Attorney said.

Miguel Tamup-Tamup, a/k/a Miguel US Tamup, 28, who was previously deported and is in the U.S. illegally, is charged with two counts of assaulting, resisting, opposing, impeding, or interfering with federal officers engaged in official duties, Acting U.S. Attorney Sara Miron Bloom said in a statement on Thursday.

Federal authorities arrested Tamup-Tamup on May 16 at a home in Providence. He has been detained since making an initial appearance in federal court that day.

On April 30, a Homeland Security Investigations agent suffered a serious leg injury during an encounter with Tamup-Tamup, who fled authorities that day and had previously been arrested after a DUI crash, Miron Bloom said.

On April 19, police arrested Tamup-Tamup for driving under the influence after his car allegedly collided with another vehicle, prosecutors said. He was subsequently arraigned and released. His fingerprints matched ICE fingerprint records associated with a person flagged as being in the United States illegally.

On April 30, an ICE deportation officer and Homeland Security Investigations agents stopped a car that Tamup-Tamup was driving and tried to apprehend him on an arrest warrant, Miron Bloom said. After he refused to get out of his car, Tamup-Tamup was guided out of the vehicle, prosecutors said. While the agents tried to place him in handcuffs, he 'allegedly resisted, threw his upper body and shoulders against the agents, flailed his arms, and broke an agent's hold,' Miron Bloom said. One of the agents fell to the ground and suffered a serious leg injury. Tamup-Tamup ran from law enforcement as others helped the injured agent. The agent's condition was not known on Thursday.

This is a developing story. Check back for updates as more information becomes available.

Two children seriously injured after being struck by car in NH, police say

Yahoo · 5 hours ago

Police blocked off several roadways on Thursday afternoon after two children were struck by a car. Authorities say the collision happened on Belmont Street. The two juvenile victims were taken to an area hospital with serious injuries.

It is unclear whether the driver remained at the scene or will face charges. Belmont Street is closed at Hanover Street and from Lake Avenue to Spruce Street, and all motorists are asked to avoid the area. The incident remains under investigation.

This is a developing story. Check back for updates as more information becomes available.

Meta sues maker of explicit deepfake app for dodging its rules to advertise AI 'nudifying' tech

CNN · 14 hours ago

Meta is suing the Hong Kong-based maker of the app CrushAI, a platform capable of creating sexually explicit deepfakes, claiming that it repeatedly circumvented the social media company's rules to purchase ads.

The suit is part of what Meta (META) described as a wider effort to crack down on so-called 'nudifying' apps, which allow users to create nude or sexualized images from a photo of someone's face, even without their consent. It follows claims that the social media giant was failing to adequately address ads for those services on its platforms.

As of February, the maker of CrushAI, also known as Crushmate and by several other names, had run more than 87,000 ads on Meta platforms that violated its rules, according to the complaint Meta filed in Hong Kong district court Thursday. Meta alleges the app maker, Joy Timeline HK Limited, violated its rules by creating a network of at least 170 business accounts on Facebook or Instagram to buy the ads. The app maker also allegedly had more than 55 active users managing over 135 Facebook pages where the ads were displayed. The ads primarily targeted users in the United States, Canada, Australia, Germany and the United Kingdom.

'Everyone who creates an account on Facebook or uses Facebook must agree to the Meta Terms of Service,' the complaint states. Some of those ads included sexualized or nude images generated by artificial intelligence and were captioned with phrases like 'upload a photo to strip for a minute' and 'erase any clothes on girls,' according to the lawsuit. CNN has reached out to Joy Timeline HK Limited for comment on the lawsuit.

Tech platforms face growing pressure to do more to address non-consensual, explicit deepfakes as AI makes it easier than ever to create such images. Targets of such deepfakes have included prominent figures such as Taylor Swift and Rep. Alexandria Ocasio-Cortez, as well as high school girls across the United States. The Take It Down Act, which makes it illegal for individuals to share non-consensual, explicit deepfakes online and requires tech platforms to quickly remove them, was signed into law last month.

But a series of media reports in recent months suggest that these nudifying AI services have found an audience by advertising on Meta's platforms. In January, reports from tech newsletter Faked Up and outlet 404 Media found that CrushAI had published thousands of ads on Instagram and Facebook and that 90% of the app's traffic was coming from Meta's platforms. That is despite the fact that Meta prohibits ads that contain adult nudity and sexual activity and forbids sharing non-consensual intimate images and content that promotes sexual exploitation, bullying and harassment.

Following those reports, Sen. Dick Durbin, Democrat and ranking member of the Senate Judiciary Committee, wrote to Meta CEO Mark Zuckerberg asking 'how Meta allowed this to happen and what Meta is doing to address this dangerous trend.' Earlier this month, CBS News reported that it had identified hundreds of advertisements promoting nudifying apps across Meta's platforms, including ads that featured sexualized images of celebrities. Other ads on the platforms pointed to websites claiming to animate deepfake images of real people to make them appear to perform sex acts, the report stated. In response to that report, Meta said it had 'removed these ads, deleted the Pages responsible for running them and permanently blocked the URLs associated with these apps.'

Meta says it reviews ads before they run on its platforms, but its complaint indicates that it has struggled to enforce its rules. According to the complaint, some of the CrushAI ads blatantly advertised the app's nudifying capabilities with captions such as 'Ever wish you could erase someone's clothes? Introducing our revolutionary technology' and 'Amazing! This software can erase any clothes.'

Now, Meta says its lawsuit against the CrushAI maker aims to prevent it from further circumventing its rules to place ads on its platforms. Meta alleges it has lost $289,000 because of the costs of the investigation, responding to regulators and enforcing its rules against the app maker.

When it announced the lawsuit Thursday, the company also said it had developed new technology to identify these types of ads, even if the ads themselves don't contain nudity. Meta's 'specialist teams' partnered with external experts to train its automated content moderation systems to detect the terms, phrases and emojis often present in such ads. 'This is an adversarial space in which the people behind it — who are primarily financially motivated — continue to evolve their tactics to avoid detection,' the company said in a statement. 'Some use benign imagery in their ads to avoid being caught by our nudity detection technology, while others quickly create new domain names to replace the websites we block.'

Meta said it had begun sharing information about nudifying apps attempting to advertise on its sites with other tech platforms through a program called Lantern, run by industry group the Tech Coalition. Tech giants created Lantern in 2023 to share data that could help them fight child sexual exploitation online.

The push to crack down on deepfake apps comes after Meta dialed back some of its automated content removal systems, prompting backlash from some online safety experts. Zuckerberg announced earlier this year that those systems would be focused on checking only for illegal and 'high-severity' violations such as those related to terrorism, child sexual exploitation, drugs, fraud and scams. Other concerns must be reported by users before the company evaluates them.
