Latest news with #CommunityStandards


San Francisco Chronicle
2 days ago
- Politics
- San Francisco Chronicle
Posting about the L.A. protests? Apparently that can get you banned from Facebook
Apparently, acknowledging the existence of violence can get you kicked off Facebook. Rebecca Solnit's account on Meta's social media network has been suspended, the San Francisco author and activist posted to Bluesky on Monday, June 10. 'Facebook decided to suspend my account because of a piece (below) I wrote Monday about violence which in no way advocates for it (but does point out who is violent in the current ruckus),' Solnit wrote. She included a screenshot of Facebook's explanation of its decision, which reads, 'Your account, or activity on it, doesn't follow our Community Standards on account integrity.'

Solnit did not explain how, beyond timing, she believed that the essay in question, 'Some Notes on the City of Angels and the Nature of Violence,' written on her independent site Meditations in an Emergency, was the reason for her ouster. Meta did not immediately respond to the Chronicle's request for comment.

'I think maybe it's begun, the bigger fiercer backlash against the Trump Administration,' her piece begins, referring to the clashes in Los Angeles between protesters of President Donald Trump's immigration policies and the California National Guard deployed by Trump against city and state officials' wishes. 'All they can do is punish and incite, and I hope that some of the protesters are telling them they're violating their mission and maybe the law,' the essay continues. 'We are escalating because they are escalating.'

The 'Men Explain Things to Me' author goes on to question longtime right-wing and media narratives that stereotype protesters as violent while giving law enforcement a pass for much more harm to people and property. 'One thing to remember is that they'll claim we're violent no matter what; the justification for this ongoing attack on immigrants and people who resemble immigrants in being brown is the idea that America is suffering an invasion and in essence only a certain kind of white person belongs here,' she writes.
The piece never advocates meeting fire with fire. Instead, it argues for a defiant yet nonviolent response. 'I believe ardently that nonviolent resistance is in the big picture and the long term the most effective strategy, but that doesn't mean it must be polite, placid, or please our opponents,' she writes. Solnit concludes by enumerating the kinds of violence the Trump administration has perpetrated — against the environment, against the First Amendment, against women, against his personal enemies, against the very notion of truth. 'It is up to us to defeat that agenda,' she writes.

Solnit said she appealed the suspension. On Wednesday, June 11, she shared a screenshot of Facebook's response saying it decided to disable her account: 'It still doesn't follow our Community Standards on account integrity. You cannot request another review of this decision.'

Solnit noted that she doesn't think a Meta higher-up has it in for her, despite the popularity of her account. She cited 'inane algorithms that often delete posts' as the likeliest explanation. (In April, the Chronicle reported on Meta's rejection of an ad promoting a Northern California Pride festival.) Even so, Meta CEO Mark Zuckerberg has cozied up to the Trump administration, dining with the president at Mar-a-Lago and appointing Trump ally Dana White to his company's board. Meta also donated $1 million to Trump's inauguration fund.

Meta's Community Standards on its account integrity page state that the company reserves the right to restrict or disable accounts that risk 'imminent harm to individual or public safety.' Solnit is the author of more than 30 books, including 'Infinite City: A San Francisco Atlas' and the children's book 'Cinderella Liberator.'
Yahoo
11-04-2025
- Business
- Yahoo
META Introduces New Teen Safety Tools: How Safe is the Stock for You?
Meta Platforms META introduced new updates to its Teen Accounts on Instagram on Tuesday, strengthening its parental control features and enhancing safety for users under 16. The update includes stricter restrictions on who can contact teens and the content they see, plus new safety features for Instagram Live and direct messages. Teens will need parental consent to go Live or to disable the feature that blurs unwanted images in DMs. With nearly 97% of teens aged 13-15 staying within this protection, META is also expanding Teen Accounts to Facebook and Messenger, aiming to create a more consistent and secure experience across its platforms. More than 54 million teen accounts are already active worldwide, highlighting META's continued commitment to giving parents peace of mind while helping teens engage more safely.

But does expanding safety features across Instagram, Facebook and Messenger make META a safe stock for investors amid increasing macroeconomic uncertainty?

Meta Platforms, Inc. price-consensus-chart | Meta Platforms, Inc. Quote

Year to date, META shares lost 6.7%, outperforming the Zacks Computer & Technology sector's decline of 12.6%. The company has also outperformed its peers Amazon AMZN, Alphabet GOOGL and Pinterest PINS, which have plunged 17.4%, 19.3% and 9.6%, respectively, over the same time frame.

Meta has been focusing on enhancing safety across its platforms, including Facebook, Instagram, WhatsApp and Messenger. The company has implemented various initiatives to ensure that users have a safer and more secure online experience. In April 2025, Meta added new WhatsApp features to improve user control and business interactions. Users can opt in for business messages, share content feedback, and block or report businesses easily.
Businesses now have tools like paid broadcasts and strict message limits to ensure relevant, high-quality communication and reduce inbox clutter. Meta focuses on safety and integrity by investing in tools, resources, and initiatives like Community Standards and digital literacy programs to protect users, especially young people, from harmful content and scams.

Meta is not the only technology company focusing on privacy controls for children and younger adults. Alphabet has introduced advanced digital safety features across its platforms. It offers features like SafeSearch, supervised accounts for Gmail and YouTube, and safety features for YouTube Kids. Pinterest offers content filtering, the ability to disable comments, spam detection and age-based control features to ensure digital safety. Among OTT platforms, Amazon has implemented parental control features on its devices like Amazon Kindle and Live TV.

Within this landscape of growing emphasis on privacy and user protection, Meta's efforts to enhance privacy controls will likely drive user trust and engagement, enabling the platform to continue expanding its user base. In the fourth quarter of 2024, Family Daily Active People, or DAP, defined as a registered and logged-in user who visited at least one of the Family products (Facebook, Instagram, Messenger and/or WhatsApp) on a given day, was 3.35 billion, up 5% year over year.

Meta expects total revenues between $39.5 billion and $41.8 billion for the first quarter of 2025, assuming 8-15% year-over-year growth, or 11-18% at constant currency. The company is expecting a 3% headwind to year-over-year total revenue growth in the first quarter of 2025 from foreign currency. The lack of monetization of new platforms like Threads also remains a concern. META plans to introduce ads on Threads gradually and does not expect it to be a meaningful driver of overall impression or revenue growth in 2025. The Zacks Consensus Estimate for 2025 earnings is pegged at $24.98 per share, which declined 2.4% in the past 30 days.
The figure calls for a year-over-year increase of 4.69%. The Zacks Consensus Estimate for 2025 revenues is currently pegged at $188.33 billion, indicating 13.27% year-over-year growth. Find the latest EPS estimates and surprises on Zacks Earnings Calendar. META currently has a Zacks Rank #3 (Hold). You can see the complete list of today's Zacks #1 Rank (Strong Buy) stocks here.

Want the latest recommendations from Zacks Investment Research? Today, you can download 7 Best Stocks for the Next 30 Days. Click to get this free report: Amazon.com, Inc. (AMZN): Free Stock Analysis Report; Alphabet Inc. (GOOGL): Free Stock Analysis Report; Pinterest, Inc. (PINS): Free Stock Analysis Report; Meta Platforms, Inc. (META): Free Stock Analysis Report. This article originally published on Zacks Investment Research.


Roya News
10-04-2025
- Business
- Roya News
Whistleblower reveals Meta's ties to Chinese authorities
A whistleblower from Meta testified before US senators on Wednesday, alleging that the company compromised national security to establish a USD 18 billion business in China. Sarah Wynn-Williams, a former global public policy director at Facebook, claimed during the congressional hearing that executives at the social media giant decided to grant the Chinese Communist Party access to user data, including information from American users.

Meta quickly rejected Wynn-Williams's claims, with spokesman Ryan Daniels stating, 'Sarah Wynn-Williams' testimony is divorced from reality and riddled with false claims.' He acknowledged that CEO Mark Zuckerberg has expressed interest in offering services in China but emphasized, 'The fact is this: we do not operate our services in China today.' However, he noted that Meta does generate advertising revenue from Chinese advertisers.

During her testimony before a Senate judiciary subcommittee, Wynn-Williams alleged that Meta collaborated closely with Beijing to develop censorship tools designed to silence critics of the Chinese Communist Party. She pointed out that Meta complied with Chinese demands to remove the Facebook account of Guo Wengui, a Chinese dissident living in the US. In response, Meta stated that Guo's page was unpublished and his profile suspended for violating the company's Community Standards.

'One thing the Chinese Communist Party and Mark Zuckerberg share is that they want to silence their critics. I can say that from personal experience,' Wynn-Williams remarked during the hearing. In March, she published a memoir titled "Careless People," detailing her experiences at the company. Following its release, Meta obtained an emergency ruling to temporarily block her from promoting the book, which includes critical claims about her tenure at Facebook. The company described the book as "false and defamatory."
Senator Josh Hawley, a Republican from Missouri, led the Senate hearing and criticized Meta for allegedly attempting to prevent Wynn-Williams from testifying. 'Why is it that Facebook is so desperate to prevent this witness from telling what she knows?' he asked. In a previous January 2024 congressional hearing, Hawley demanded that Zuckerberg apologize to families who alleged that social media had harmed their children. At that hearing, families of individuals who had self-harmed or died by suicide due to social media content were present, prompting Zuckerberg to express sympathy, stating that 'no one should go through' what they had.

During Wednesday's hearing, Hawley claimed that Meta had threatened Wynn-Williams with USD 50,000 in punitive damages for each public mention of Facebook, regardless of the truthfulness of her statements. 'Even as we sit here today, Facebook is attempting her total and complete financial ruin,' he asserted. Meta clarified to the BBC that the USD 50,000 damages pertain to each material violation of the separation agreement she signed upon leaving the company in 2017.

Wynn-Williams contended that Meta indicated that creating exceptions to the non-disparagement agreement would 'eat the rule,' a statement that Meta later clarified was made by an arbitrator, not the company itself. While Meta stated that Wynn-Williams was not restricted from testifying before Congress, it did not respond directly to inquiries regarding potential financial penalties related to her statements during the hearing. Wynn-Williams expressed the personal toll of the situation, stating, 'The last four weeks have been very difficult. Even the choice to come and speak to Congress is incredibly difficult.'
Yahoo
20-02-2025
- Business
- Yahoo
Meta starts accepting sign-ups for Community Notes on Facebook, Instagram, and Threads
Meta announced on Thursday that it's now accepting sign-ups for its Community Notes program on Facebook, Instagram, and Threads. The announcement follows Meta's news last month that it's ending its third-party fact-checking program and instead moving to a Community Notes model similar to the one at X.

In a blog post, Meta explains that Community Notes will be a way for users across its platforms to decide when posts are misleading and to add more context to those posts. Starting today, people can sign up to be among the first contributors to the program. To sign up, users must be based in the United States and be over 18 years of age. Plus, users must have an account that's more than six months old and in good standing, along with a verified phone number or enrollment in two-factor authentication.

Meta says contributors will be able to write and submit a Community Note to posts that they think are misleading or confusing. Just like on X, Notes can include things like background information, a tip, or other details that users might find useful. Notes will have a 500-character limit and are required to include a link. "For a Community Note to be published on a post, users who normally disagree, based on how they've rated Notes in the past, will have to agree that a Note is helpful," Meta explains. "Notes will not be added to content when there is no agreement or when people agree a Note is not helpful."

Meta says Community Notes will be written and rated by contributors, not by the tech giant itself. All Notes must adhere to Meta's Community Standards. "We intend to be transparent about how different viewpoints inform the Notes displayed in our apps, and are working on the right way to share this information," Meta says. The company plans to introduce Community Notes in the United States over the next couple of months. Meta hasn't shared when it plans to bring the feature to additional countries.
Meta's decision to drop fact-checking for Community Notes has been seen as the company repositioning itself for the Trump presidency, as it takes an approach that's in favor of unrestricted speech online. When Meta announced the change, Mark Zuckerberg said in a video that fact-checkers were "too politically biased" and had destroyed "more trust than they've created."


The Independent
28-01-2025
- Business
- The Independent
Meta's 'bonfire' of safety policies a danger to children, charity says
Meta's recent 'bonfire of safety measures' risks taking Facebook and Instagram back to where they were when Molly Russell died, the charity set up in her name has warned.

The Molly Rose Foundation said new online safety regulator Ofcom must strengthen incoming regulation in order to ensure teenagers are protected from harmful content online. The charity was set up by Molly's family after her death in 2017; Molly was 14 when she ended her life after viewing harmful content on social media sites, including Meta-owned Instagram.

Earlier this month, Meta boss Mark Zuckerberg announced sweeping changes to the company's policies in the name of 'free expression', including plans to scale back content moderation: the firm will end the automated scanning of some types of posts and instead rely on user reports to remove certain sorts of content. Campaigners called the move 'chilling' and said they were 'dismayed' by the decision, which has been attributed to Mr Zuckerberg's desire to forge a positive relationship with new US President Donald Trump.

Andy Burrows, chief executive of the Molly Rose Foundation, said: 'Meta's bonfire of safety measures is hugely concerning and Mark Zuckerberg's increasingly cavalier choices are taking us back to what social media looked like at the time that Molly died.

'Ofcom must send a clear signal it is willing to act in the interests of children and urgently strengthen its requirements on tech platforms.

'If Ofcom fails to keep pace with the irresponsible actions of tech companies the Prime Minister must intervene.

'Amid a strategic rollback of their safety commitments, preventable harm is being driven by Silicon Valley but the decision to stop it in its tracks now sits with the regulator and Government.'
In a letter sent to Ofcom, the foundation urged the regulator to strengthen the Online Safety Act by bolstering requirements around content moderation, including requiring firms to proactively scan for all types of intense depression, suicide and self-harm content.

It also urged the regulator to ensure that Meta's newly loosened policies around hate speech are not allowed to apply to children, and to seek clarification on whether Meta can change its rules without going through traditional internal processes, after reports suggested Mr Zuckerberg made the policy changes himself, leaving internal teams 'blindsided' – something Ofcom should ensure cannot happen again, the foundation said.

In a statement, a Meta spokesperson said: 'There is no change to how we define and treat content that encourages suicide, self-injury and eating disorders.

'We don't allow it and we'll continue to use our automated systems to proactively identify and remove it.

'We continue to have Community Standards, around 40,000 people working on safety and security to help enforce them, and Teen Accounts in the UK, which automatically limit who can contact teens and the types of content they see.'

Earlier this month, Molly's father Ian, the chairman of the Molly Rose Foundation, told the Prime Minister that the UK was 'going backwards' on online safety. Mr Russell said in a letter to Sir Keir Starmer that Ofcom's approach to implementing the Online Safety Act had 'fundamentally failed to grasp the urgency and scale of its mission', and that changes were needed to bolster the legislation.

The Molly Rose Foundation has also previously warned that Meta's approach to tackling suicide and self-harm content is not fit for purpose, after research found the social media giant was responsible for just 2% of industry-wide takedowns of such content.
An Ofcom spokesperson said: 'All platforms operating in the UK – including Meta – must comply with the UK's online safety laws, once in force.

'Under the Online Safety Act, tech firms must assess the risks they pose, including to children, and take significant steps to protect them.

'That involves acting swiftly to take down illegal content – including illegal suicide and self-harm material – and ensuring harmful posts and videos are filtered out from children's feeds.

'We'll soon put forward additional measures for consultation on the use of automated content moderation systems to proactively detect this kind of egregious content.

'We are in contact with social media companies, including Meta, about the safety measures they have in place now, and what more they will have to do to comply once the duties are fully in force.

'No one should be in any doubt about Ofcom's resolve to hold tech firms to account, using the full force of our enforcement powers where necessary.'