Latest news with #JoyTimelineHKLimited

Miami Herald
15 hours ago
- Business
- Miami Herald
New Facebook lawsuit addresses huge privacy issue
Consumers have long been concerned about privacy issues with Facebook, with controversies dating back as far as 2006, when many site users were initially displeased about their content being rebroadcast on their friends' pages as part of the company's newly introduced news feed. Since then, a series of issues has emerged related to the unauthorized sharing of information with advertisers, apps, or on Facebook feeds; unauthorized experimentation on Facebook users; and concerns about the company tracking Facebook users across the Internet. Meta (Facebook's owner and operator) has been sued over some of these issues, and the Federal Trade Commission has also taken action against the company. But this time, Meta is on the other side of a lawsuit. Specifically, it has filed suit in Hong Kong against a company called Joy Timeline HK Limited. The stakes are high because the software programs that Joy Timeline HK Limited creates and promotes may infringe on people's privacy in serious ways. Meta is suing Joy Timeline HK Limited because the Hong Kong company won't stop trying to get around Facebook's rules. Joy Timeline created an app called CrushAI, which is a "nudify" app. Essentially, the app includes an AI tool that allows users to upload pictures and "see anyone naked" by using artificial intelligence to generate nude images. Meta bans "non-consensual intimate imagery" on its platform and has removed ads for nudify technology in the past. It has also permanently blocked websites associated with apps that provide this nudify functionality, and has deleted pages on its platform that run advertising for Joy Timeline HK Limited and the makers of similar apps. 
However, despite banning the app-maker from advertising on Facebook, Meta alleges that Joy Timeline has made "multiple attempts" to circumvent the company's ad review process and get the advertisements back up, even though they clearly violate Meta's policies. According to Meta, Joy Timeline has employed a number of different techniques to try to avoid detection when placing content on Facebook, including using inoffensive imagery so that the ads can bypass technology that identifies and blocks them, in violation of Facebook's terms of service. Meta has been working to develop better tools to delete ads from prohibited companies like this one when those ads appear benign, stating, "We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect with these ads." Still, since Joy Timeline has managed to evade efforts to keep its ads off Facebook's platform, Meta now wants a court order to stop the company from continuing to try to reach Facebook users. Meta is also doing more than just suing. The company indicated it is planning to share information about the nudify apps, including ad URLs, with other technology companies through Lantern, a cross-platform signal-sharing program aimed at promoting child safety. Meta believes the lawsuit is crucial to helping protect Facebook users and has made clear that it is pursuing legal action because these issues are a key priority. 
"This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," a Meta statement reads. "We'll continue to take necessary steps - which could include legal action - against those who abuse our platforms like this."


India Today
a day ago
- Business
- India Today
Meta sues company for using Facebook ads to promote AI app that creates fake nude images
Meta (Credit: REUTERS/Gonzalo Fuentes)
Meta sues Hong Kong firm Joy Timeline over AI nude image ads
- The company has been promoting the CrushAI app, which creates non-consensual explicit AI images
- The lawsuit aims to block such ads on Meta platforms in Hong Kong

Earlier this year, news reports pointed out that a bug on Meta's Instagram was surfacing explicit content in the Reels feed, including violent sexual attacks, grievous injuries, and other inappropriate videos. While Meta called it a mistake at the time, another similar issue has now come to light. But this time, Meta is on the other side. In a recent blog post, the company called out a Hong Kong-based company for advertising non-consensual intimate imagery using Facebook ads, and said it has decided to take legal action against it. The blog post stated, "We're suing Joy Timeline HK Limited, the entity behind CrushAI apps, which allow people to create AI-generated nude or sexually explicit images of individuals without their consent. We've filed a lawsuit in Hong Kong, where Joy Timeline HK Limited is based, to prevent them from advertising CrushAI apps on Meta platforms." Meta stated that the app alters photos of individuals, frequently targeting women, to produce non-consensual nude images. The company said Joy Timeline repeatedly broke advertising policies by trying to evade its ad review system, leading Meta to file a lawsuit in Hong Kong to block the ads from resurfacing. 'This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,' Meta said in a statement. 'We'll continue to take the necessary steps — which could include legal action — against those who abuse our platforms like this.' The lawsuit follows rising concern from researchers and lawmakers over the spread of so-called 'nudify' apps. These apps have been found online, in app stores, and in Meta's own ad system. 
In February, US Senator Dick Durbin wrote to Meta CEO Mark Zuckerberg, urging the company to address its involvement in allowing Joy Timeline to advertise CrushAI. The app is accused of breaching Meta's own policies on adult content, sexual activity and harassment. Durbin's concerns were backed by investigations from tech outlet 404 Media and research conducted by Cornell Tech's Alexios Mantzarlis, which uncovered that over 8,000 CrushAI advertisements appeared on Meta's platforms within just the first fortnight of the year. In response, Meta has said it is stepping up enforcement measures by deploying enhanced detection systems capable of identifying such adverts even in the absence of explicit nudity. The company is also implementing content matching tools to swiftly detect and remove duplicate or copycat ads. Furthermore, Meta stated it has been collaborating with both external experts and its own specialised internal teams to monitor how developers of 'nudify' apps attempt to circumvent content moderation efforts. 'We have applied the same strategies used to dismantle coordinated inauthentic behaviour networks in order to locate and eliminate groups of accounts responsible for these advertisements,' Meta explained. The company reported that it has taken down four such networks since the beginning of the year. Meta added that it intends to share intelligence on these apps with other technology companies to assist them in addressing similar threats on their own platforms. 


The Hill
2 days ago
- Business
- The Hill
Meta sues developer of ‘nudify' app CrushAI
Meta filed a lawsuit against a developer for allegedly running advertisements to promote its 'nudify' apps, which use artificial intelligence to create non-consensual nude or sexually explicit images. The suit accuses Joy Timeline HK Limited, the developer behind CrushAI apps, of violating Meta's rules against non-consensual intimate imagery. Meta noted its policies were updated more than a year ago to further clarify that promoting nudify apps or related products is not permitted on its platforms. Meta claimed the Hong Kong-based company attempted to 'circumvent' Meta's ad review process and continued to run the ads even after the social media firm removed them. The Hill reached out to Joy Timeline HK Limited for comment. 'This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,' Meta wrote in a release Thursday. The Facebook and Instagram parent company touted how it removes these types of ads once its teams are made aware. Meta also blocks links to websites and restricts search terms like 'nudify,' 'delete clothing,' or 'undress.' The lawsuit is part of Meta's broader fight against nudify apps. In addition to the work on its own platforms, the technology firm said it has started sharing links for violating apps with other tech companies, providing more than 3,800 links since the end of March. Meta also is developing new technology designed to more easily identify these ads, even if they do not include nudity, and has expert teams tracking down account networks accused of running these ads. Social media companies have faced increased pressure from both lawmakers and tech safety groups to limit this type of content on their platforms. This comes just weeks after President Trump signed the Take It Down Act, making it a crime to knowingly publish sexually explicit 'deepfake' images and videos online. 
Meta said it 'welcomes legislation that helps fight intimate image abuse across the internet' and applauded the Take It Down Act.
Yahoo
2 days ago
- Business
- Yahoo
Meta sues maker of explicit deepfake app for dodging its rules to advertise AI ‘nudifying' tech
Meta is suing the Hong Kong-based maker of the app CrushAI, a platform capable of creating sexually explicit deepfakes, claiming that it repeatedly circumvented the social media company's rules to purchase ads. The suit is part of what Meta (META) described as a wider effort to crack down on so-called 'nudifying' apps — which allow users to create nude or sexualized images from a photo of someone's face, even without their consent — following claims that the social media giant was failing to adequately address ads for those services on its platforms. As of February, the maker of CrushAI, also known as Crushmate and by several other names, had run more than 87,000 ads on Meta platforms that violated its rules, according to the complaint Meta filed in Hong Kong district court Thursday. Meta alleges the app maker, Joy Timeline HK Limited, violated its rules by creating a network of at least 170 business accounts on Facebook or Instagram to buy the ads. The app maker also allegedly had more than 55 active users managing over 135 Facebook pages where the ads were displayed. The ads primarily targeted users in the United States, Canada, Australia, Germany and the United Kingdom. 'Everyone who creates an account on Facebook or uses Facebook must agree to the Meta Terms of Service,' the complaint states. Some of those ads included sexualized or nude images generated by artificial intelligence and were captioned with phrases like 'upload a photo to strip for a minute' and 'erase any clothes on girls,' according to the lawsuit. CNN has reached out to Joy Timeline HK Limited for comment on the lawsuit. Tech platforms face growing pressure to do more to address non-consensual, explicit deepfakes, as AI makes it easier than ever to create such images. Targets of such deepfakes have included prominent figures such as Taylor Swift and Rep. Alexandria Ocasio-Cortez, as well as high school girls across the United States. 
The Take It Down Act, which makes it illegal for individuals to share non-consensual, explicit deepfakes online and requires tech platforms to quickly remove them, was signed into law last month. But a series of media reports in recent months suggest that these nudifying AI services have found an audience by advertising on Meta's platforms. In January, reports from tech newsletter Faked Up and outlet 404 Media found that CrushAI had published thousands of ads on Instagram and Facebook and that 90% of the app's traffic was coming from Meta's platforms. That's despite the fact that Meta prohibits ads that contain adult nudity and sexual activity, and forbids sharing non-consensual intimate images and content that promotes sexual exploitation, bullying and harassment. Following those reports, Sen. Dick Durbin, Democrat and ranking member of the Senate Judiciary Committee, wrote to Meta CEO Mark Zuckerberg asking 'how Meta allowed this to happen and what Meta is doing to address this dangerous trend.' Earlier this month, CBS News reported that it had identified hundreds of advertisements promoting nudifying apps across Meta's platforms, including ads that featured sexualized images of celebrities. Other ads on the platforms pointed to websites claiming to animate deepfake images of real people to make them appear to perform sex acts, the report stated. In response to that report, Meta said it had 'removed these ads, deleted the Pages responsible for running them and permanently blocked the URLs associated with these apps.' Meta says it reviews ads before they run on its platforms, but its complaint indicates that it has struggled to enforce its rules. According to the complaint, some of the CrushAI ads blatantly advertised its nudifying capabilities with captions such as 'Ever wish you could erase someone's clothes? Introducing our revolutionary technology' and 'Amazing! This software can erase any clothes.' 
Now, Meta said its lawsuit against the CrushAI maker aims to prevent it from further circumventing its rules to place ads on its platforms. Meta alleges it has lost $289,000 because of the costs of investigating, responding to regulators, and enforcing its rules against the app maker. When it announced the lawsuit Thursday, the company also said it had developed new technology to identify these types of ads, even if the ads themselves didn't contain nudity. Meta's 'specialist teams' partnered with external experts to train its automated content moderation systems to detect the terms, phrases and emojis often present in such ads. 'This is an adversarial space in which the people behind it — who are primarily financially motivated — continue to evolve their tactics to avoid detection,' the company said in a statement. 'Some use benign imagery in their ads to avoid being caught by our nudity detection technology, while others quickly create new domain names to replace the websites we block.' Meta said it had begun sharing information about nudifying apps attempting to advertise on its sites with other tech platforms through a program called Lantern, run by industry group the Tech Coalition. Tech giants created Lantern in 2023 to share data that could help them fight child sexual exploitation online. The push to crack down on deepfake apps comes after Meta dialed back some of its automated content removal systems — prompting some backlash from online safety experts. Zuckerberg announced earlier this year that those systems would be focused on checking only for illegal and 'high-severity' violations such as those related to terrorism, child sexual exploitation, drugs, fraud and scams. Other concerns must be reported by users before the company evaluates them.
Yahoo
2 days ago
- Business
- Yahoo
Meta files lawsuit against maker of "nudify" app technology
Meta said Thursday that it's suing an app maker whose technology uses artificial intelligence to simulate nude images of real people who appear clothed in pictures. Meta said it filed a lawsuit in Hong Kong against Joy Timeline HK Limited, the entity behind the CrushAI app, to prevent it from advertising CrushAI apps on Meta platforms. "This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it," Meta said in a statement. "We'll continue to take necessary steps — which could include legal action — against those who abuse our platforms like this." The legal action comes after Joy Timeline made "multiple attempts" to circumvent Meta's ad review process, Meta alleges. Ads for so-called nudify apps have appeared on Meta's Facebook and Instagram platforms despite violating the social media sites' advertising policies. CrushAI promotes AI tools that it says let users upload photos and "see anyone naked," a CBS News investigation found. Meta has said the company bans "non-consensual intimate imagery" on its platforms. The company previously told CBS News that it has removed ads for nudify technology, deleted pages on its platforms that run the spots and permanently blocked websites associated with the apps. Meta on Thursday said it will share information, including ad URLs, about entities that violate its policies with other tech companies through the Tech Coalition's Lantern Program, which tracks behaviors that violate their child safety rules. Since March, Meta has provided the program with information on more than 3,800 sites, which is shared with other tech companies, according to the company. Meta said advertisers of nudify apps use various means to avoid detection on its platforms, including using inoffensive imagery to try to circumvent technology used to identify such ads on its sites. 
As a result, it has developed better technology to detect ads from nudify apps that are presented as benign, Meta said Thursday. "We've worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect with these ads," Meta said.