
Latest news with #CrushAI

New Facebook lawsuit addresses huge privacy issue

Miami Herald

14 hours ago

  • Business
  • Miami Herald


Consumers have long been concerned about privacy issues with Facebook, with controversies dating back as far as 2006, when many users were displeased about their content being rebroadcast on their friends' pages as part of the company's newly introduced news feed. Since then, a series of issues has emerged related to the unauthorized sharing of information with advertisers, apps, or on Facebook feeds; unauthorized experimentation on Facebook users; and concerns about the company tracking users across the Internet.

Meta, Facebook's owner and operator, has been sued over some of these issues, and the Federal Trade Commission has also taken action against the company. But this time, Meta is on the other side of a lawsuit. Specifically, it has filed a suit in Hong Kong against a company called Joy Timeline HK Limited. The lawsuit has high stakes because the software programs Joy Timeline HK Limited creates and promotes may infringe on people's privacy in very serious ways.

Meta is suing Joy Timeline HK Limited because the Hong Kong company won't stop trying to get around Facebook's rules. Joy Timeline created an app called CrushAI, a "nudify" app: an AI tool that lets users upload pictures and "see anyone naked" by generating nude images with artificial intelligence. Meta bans "non-consensual intimate imagery" on its platform and has removed ads for nudify technology in the past. It has also permanently blocked websites associated with apps that provide this functionality and has deleted pages on its platform that ran advertising for Joy Timeline HK Limited and the makers of similar apps.

Despite banning the app maker from advertising on Facebook, Meta alleges that Joy Timeline has made "multiple attempts" to circumvent the company's ad review process and get the advertisements back up, even though they clearly violate Meta's policies. According to Meta, Joy Timeline has employed a number of techniques to avoid detection when placing content on Facebook, including using inoffensive imagery so that the ads can bypass the technology that identifies and blocks them, in violation of Facebook's terms of service.

Meta has been working to develop better tools to remove ads from prohibited companies like this one even when those ads appear benign, stating, "We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect with these ads." Still, since Joy Timeline has managed to evade efforts to keep its ads off Facebook's platform, Meta now wants a court order to stop the company from continuing to try to reach Facebook users.

Meta is also doing more than suing. The company says it plans to share information about nudify apps, including ad URLs, with other technology companies through Lantern, a cross-platform signal-sharing program aimed at promoting child safety.

Meta believes the lawsuit is crucial to protecting Facebook users and has made clear that it is pursuing legal action because these issues are a key priority. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," a Meta statement reads. "We'll continue to take necessary steps - which could include legal action - against those who abuse our platforms like this."

Meta sues deepfake 'nudify' app which uses AI to remove clothing from photos

The Independent

18 hours ago

  • Entertainment
  • The Independent


Meta is suing a Chinese app maker that uses artificial intelligence to take images of clothed people and turn them into nudes. CrushAI, the app used to make the deepfake nudes, is operated by Joy Timeline HK Limited. Meta filed a lawsuit against the company in Hong Kong to ban it from advertising its services on Meta platforms, CBS News reports.

"This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a statement. "We'll continue to take necessary steps — which could include legal action — against those who abuse our platforms like this."

According to the lawsuit, Joy Timeline made "multiple attempts" to get around Meta's ad review process. Joy Timeline's app isn't the first of its kind: previous apps that promise to turn clothed photos into nudes have managed to bypass ad filters on major social media platforms, including Meta's, in order to hawk their software. The company said that the "nudify" apps have devised various ways of skirting past the ad filter, including using inoffensive imagery to try to fly under the radar. "We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect with these ads," Meta said in a statement.

Alexios Mantzarlis, the author of the Faked Up blog, told the BBC there had been "at least 10,000 ads" promoting nudify apps on Meta's Facebook and Instagram platforms. "Even as [Meta] was making this announcement, I was able to find a dozen ads by CrushAI live on the platform and a hundred more from other 'nudifiers'," he told the broadcaster. "This abuse vector requires continued monitoring from researchers and the media to keep platforms accountable and curtail the reach of these noxious tools."

The threat of the software is that anyone could feasibly take a photo and, without the subject's consent, turn it into a fake nude. Meta said that it bans "non-consensual intimate imagery" on its platforms, and previously told CBS News that it removes any ads on its platforms for "nudify" apps. On Thursday, Meta said it would work with the Tech Coalition's Lantern Program — aimed at tracking sites that break child safety rules — to share information with other tech companies about apps, sites, or companies that violate its policies.

Meta sues company for using Facebook ads to promote AI app that creates fake nude images

India Today

a day ago

  • Business
  • India Today


  • Meta sues Hong Kong firm Joy Timeline over AI nude image ads
  • The company has been promoting the CrushAI app, which creates non-consensual explicit AI images
  • The lawsuit aims to block such ads on Meta platforms in Hong Kong

Earlier this year, news reports pointed out that Meta's Instagram was hit by a bug that surfaced explicit content in the Reels feed, including violent sexual attacks, grievous injuries, and other inappropriate videos. While Meta called it a mistake at the time, another issue involving explicit content has now come to light. But this time, Meta is on the other side: in a recent blog post, the company called out a Hong Kong-based company for advertising non-consensual intimate imagery using Facebook ads, and it has decided to take legal action against the firm.

The blog post stated, "We're suing Joy Timeline HK Limited, the entity behind CrushAI apps, which allow people to create AI-generated nude or sexually explicit images of individuals without their consent. We've filed a lawsuit in Hong Kong, where Joy Timeline HK Limited is based, to prevent them from advertising CrushAI apps on Meta platforms."

Meta stated that the app alters photos of individuals, frequently targeting women, to produce non-consensual nude images. The company said Joy Timeline repeatedly broke advertising policies by trying to evade its ad review system, leading Meta to file a lawsuit in Hong Kong to block the ads from resurfacing. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a statement. "We'll continue to take the necessary steps - which could include legal action - against those who abuse our platforms like this."

The lawsuit follows rising concern from researchers and lawmakers over the spread of so-called "nudify" apps, which have been found online, in app stores, and in Meta's own ad system. In February, US Senator Dick Durbin wrote to Meta CEO Mark Zuckerberg, urging the company to address its role in allowing Joy Timeline to advertise CrushAI. The app is accused of breaching Meta's own policies on adult content, sexual activity and harassment. Durbin's concerns were backed by investigations from tech outlet 404 Media and research conducted by Cornell Tech's Alexios Mantzarlis, which found that over 8,000 CrushAI advertisements appeared on Meta's platforms within just the first fortnight of the year.

In response, Meta has said it is stepping up enforcement by deploying enhanced detection systems capable of identifying such adverts even in the absence of explicit nudity. The company is also implementing content-matching tools to swiftly detect and remove duplicate or copycat ads. Furthermore, Meta stated it has been collaborating with both external experts and its own specialised internal teams to monitor how developers of "nudify" apps attempt to circumvent content moderation efforts. "We have applied the same strategies used to dismantle coordinated inauthentic behaviour networks in order to locate and eliminate groups of accounts responsible for these advertisements," Meta explained. The company reported that it has taken down four such networks since the beginning of the year. Meta added that it intends to share intelligence on these apps with other technology companies to assist them in addressing similar threats on their own platforms.

Meta sues Hong Kong firm over AI app making non-consensual explicit images

South China Morning Post

a day ago

  • Business
  • South China Morning Post


Meta Platforms is taking a Hong Kong company to court for allegedly using its social media accounts to promote an app that uses artificial intelligence to generate sexually explicit images of people without their consent. In a statement released on its website on Thursday, the American multinational technology company said it had filed a lawsuit in Hong Kong against Joy Timeline HK Limited to prevent the latter from advertising CrushAI apps on Meta's platforms. The app in question allows people to use AI software to create nude or sexually explicit images of people without their consent, the company said. Meta alleged that the Hong Kong company had repeatedly tried to circumvent the tech giant's ad review processes and continued to show content promoting the app after it was removed for breaking Meta's rules. The techniques allegedly used in attempts to get past the review procedures included disguising the adverts' content or their landing page, according to Meta. The lawsuit is part of Meta's efforts to crack down on 'nudify' apps.

Facebook-parent Meta sues company behind app that uses AI to generate nude images

Time of India

a day ago

  • Business
  • Time of India


Facebook parent company Meta has filed a lawsuit against Joy Timeline HK Limited, the entity behind CrushAI, an app that uses artificial intelligence (AI) to generate nude images without consent. The lawsuit, filed in Hong Kong where the company is based, aims to prevent CrushAI from advertising on Meta's platforms, including Instagram and Facebook. The legal action follows a January report from 404 Media, which revealed that CrushAI—also known as Crushmate—had placed over 5,000 ads on Meta's platform. Data showed that 90% of CrushAI's traffic originated from Meta, indicating that the app's advertisements were highly effective in directing users to nonconsensual image-generation tools.

What Meta said about the measures it has taken

Meta acknowledged the challenge of enforcing its policies against such advertisers. "This is a highly adversarial space and bad actors are constantly evolving their tactics to avoid enforcement, which is why we continue to invest in the best tools and technology to help identify and remove violating content," the company said previously.

Commenting on the latest action, Meta said, "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it." "We'll continue to take the necessary steps—which could include legal action—against those who abuse our platforms like this," Meta added.

Meta announced that information about removed nudify-app ads will now be shared with other tech companies via the Tech Coalition's Lantern programme, enabling major platforms such as Google, Discord, Roblox, Snap, and Twitch to take similar action against violators. Additionally, Meta is enhancing its detection technology to better identify deceptive ads, even when they do not explicitly feature nudity.
