Latest news with #PaytonGendron


The Hindu
4 days ago
- The Hindu
Social media companies not liable for 2022 Buffalo mass shooting, New York court rules
Several social media companies should not be held liable for helping an avowed white supremacist who killed 10 Black people in 2022 at a Buffalo, New York grocery store, a divided New York state appeals court ruled on Friday.

Reversing a lower court ruling, the state Appellate Division in Rochester said defendants including Meta Platforms' Facebook and Instagram, Google's YouTube, and Reddit were entitled to immunity under a federal law that protects online platforms from liability over user content.

The case arose from Payton Gendron's racially motivated mass shooting at Tops Friendly Markets on May 14, 2022. Relatives and representatives of victims, as well as store employees and customers who witnessed the attack, claimed the defendants' platforms were defective because they were designed to addict and radicalise users like Gendron.

Lawyers for the plaintiffs did not immediately respond to requests for comment. Other defendants included Alphabet, Discord, 4chan, Snap and Twitch, all of which Gendron used, the mid-level state appeals court said.

Writing for a 3-2 majority, Justice Stephen Lindley said holding social media companies liable would undermine the intent behind Section 230 of the federal Communications Decency Act: to promote development of and competition on the internet while keeping government interference to a minimum.

While condemning Gendron's conduct and "the vile content that motivated him to assassinate Black people simply because of the color of their skin," Lindley said a liability finding would "result in the end of the Internet as we know it."

"Because social media companies that sort and display content would be subject to liability for every untruthful statement made on their platforms, the Internet would over time devolve into mere message boards," he wrote.

Justices Tracey Bannister and Henry Nowak dissented, saying the defendants force-fed targeted content to keep users engaged, be it videos about cooking or puppies, or white nationalist vitriol. "Such conduct does not maintain the robust nature of Internet communication or preserve the vibrant and competitive free market that presently exists for the Internet contemplated by the protections of immunity," the judges wrote.

Gendron pleaded guilty to state charges including murder and terrorism motivated by hate, and was sentenced in February 2023 to life in prison without parole. He faces related federal charges that could lead to the death penalty. Questioning of potential jurors in that case is scheduled to begin in August 2026, court records show.

Epoch Times
5 days ago
- Epoch Times
Appeals Court Tosses Lawsuit Against Social Media Companies Over 2022 Buffalo Shooting
A New York appellate court on July 25 dismissed a lawsuit against Meta, Google, and several other social media and Internet-based companies in connection with the 2022 mass shooting at a grocery store in Buffalo, New York. The attack at Tops Friendly Markets on May 14, 2022, left 10 people dead and three injured. Police arrested Payton Gendron, an 18-year-old white man, who pleaded guilty in November 2022 to murder and hate-motivated terrorism charges. Gendron apologized for the attack.


DW
23-07-2025
- Politics
- DW
World White Hate
Racist and right-wing extremist networks are coalescing worldwide. They carry out terrorist attacks on minorities and democratic institutions. Authorities in the USA and Europe consider this movement to be more dangerous than Islamist terrorism.

Right-wing extremist groups are networked worldwide. Driven by the ideology of white supremacy, they spread their propaganda via digital platforms. Social media and encrypted messaging services such as Telegram make it possible to disseminate content in real time and recruit new followers. Eighteen-year-old Payton Gendron killed 10 people, most of them African-Americans, with an assault rifle in a supermarket in Buffalo in the US state of New York. Before committing this crime, he was influenced and then radicalized by right-wing extremist videos posted by British teenager Daniel Harris. Harris has written entire books about his white supremacist beliefs and published them online.

It's a problem with global dimensions: armed with a machete and Molotov cocktails, a 17-year-old wearing a swastika armband attempted to storm a school in the Brazilian state of São Paulo. These are just a few of the cases documented in the film, which clearly show how dire the threat of right-wing terror has become.

Germany is also a flashpoint for right-wing terror, with attacks in Hanau, Halle and Munich. Many perpetrators are inspired by Brenton Tarrant, who murdered 51 people in two mosques in Christchurch, New Zealand, and by the Norwegian assassin Anders Breivik, who shot dead 69 young participants at a Labour Party youth camp on the island of Utøya near Oslo. Previously, Breivik had detonated a bomb in the government district of Oslo, murdering eight people. He justified his actions in a video and a 1,500-page manifesto that went viral. Breivik's message, like that of the Australian Tarrant, whose own manifesto was entitled 'The Great Replacement', is about the superiority of the white race, which is supposedly being targeted and replaced by migrants. It's a view that is also shared by an increasing number of people outside extremist circles. As a result, hatred and racism are spreading worldwide like a virus.

In a major raid in Germany in December 2022, 25 right-wing extremists were arrested, including members of the so-called 'Reichsbürger' or 'Citizens of the Reich' movement, conspiracy theorists, retired military officers and a former member of the Bundestag. According to the German Federal Public Prosecutor's Office, the group had been plotting to overthrow the democratic system. In this context, UN Secretary-General António Guterres spoke of the greatest threat to our democracy and its institutions.

Filmmaker Dirk Laabs' research shows that soldiers and veterans pose a particularly great danger in the USA, France, Germany, Spain and Russia. Former and active soldiers network globally and are potential assassins. Right-wing extremist mercenaries also represent a danger, and are potentially even more threatening due to their combat experience, access to weapons and professional networks.

WORLD WHITE HATE unveils the parallels and overlaps between these very different right-wing extremist groups. But how can this hatred be countered? How can right-wing terror be stopped? What can be done to protect democratic society, people and state institutions from right-wing terror?
Filmed in the USA, western and eastern Europe, the UK, Scandinavia and Brazil, WORLD WHITE HATE charts the development of the threat posed by right-wing terror, a danger that has been underestimated for far too long. It is exacerbated by populist politicians such as Donald Trump and by radical right-wing parties. The documentary by Dirk Laabs analyzes the mechanisms of radicalization and discusses possible counter-strategies for democratic societies. The central question remains: "How can we win the digital and real battle against increasing violence from the right?"

Broadcast times on DW English:
SAT 09.08.2025 – 10:30 UTC
SAT 09.08.2025 – 21:30 UTC
SUN 10.08.2025 – 04:30 UTC

Lagos UTC +1 | Cape Town UTC +2 | Nairobi UTC +3
Delhi UTC +5.5 | Bangkok UTC +7 | Hong Kong UTC +8
London UTC +1 | Berlin UTC +2 | Moscow UTC +3
San Francisco UTC -7 | Edmonton UTC -6 | New York UTC -4


The Verge
27-05-2025
- Politics
- The Verge
If algorithms radicalize a mass shooter, are companies to blame?
In New York court on May 20th, lawyers for nonprofit Everytown for Gun Safety argued that Meta, Amazon, Discord, Snap, 4chan, and other social media companies all bear responsibility for radicalizing a mass shooter. The companies defended themselves against claims that their respective design features — including recommendation algorithms — promoted racist content to a man who killed 10 people in 2022, then facilitated his deadly plan. It's a particularly grim test of a popular legal theory: that social networks are products that can be found legally defective when something goes wrong. Whether this works may rely on how courts interpret Section 230, a foundational piece of internet law.

In 2022, Payton Gendron drove several hours to the Tops supermarket in Buffalo, New York, where he opened fire on shoppers, killing 10 people and injuring three others. Gendron claimed to have been inspired by previous racially motivated attacks. He livestreamed the attack on Twitch and, in a lengthy manifesto and a private diary he kept on Discord, said he had been radicalized in part by racist memes and intentionally targeted a majority-Black community.

Everytown for Gun Safety brought multiple lawsuits over the shooting in 2023, filing claims against gun sellers, Gendron's parents, and a long list of web platforms. The accusations against different companies vary, but all place some responsibility for Gendron's radicalization at the heart of the dispute.

The platforms are relying on Section 230 of the Communications Decency Act to defend themselves against a somewhat complicated argument. In the US, posting white supremacist content is typically protected by the First Amendment. But these lawsuits argue that if a platform feeds it nonstop to users in an attempt to keep them hooked, it becomes a sign of a defective product — and, by extension, breaks product liability laws if that leads to harm. That strategy requires arguing that companies are shaping user content in ways that shouldn't receive protection under Section 230, which prevents interactive computer services from being held liable for what users post, and that their services are products that fit under the liability law.

'This is not a lawsuit against publishers,' John Elmore, an attorney for the plaintiffs, told the judges. 'Publishers copyright their material. Companies that manufacture products patent their materials, and every single one of these defendants has a patent.' These patented products, Elmore continued, are 'dangerous and unsafe' and are therefore 'defective' under New York's product liability law, which lets consumers seek compensation for injuries.

Some of the tech defendants — including Discord and 4chan — don't have proprietary recommendation algorithms tailored to individual users, but the claims against them allege that their designs still aim to hook users in a way that predictably encouraged harm.

'This community was traumatized by a juvenile white supremacist who was fueled with hate — radicalized by social media platforms on the internet,' Elmore said. 'He obtained his hatred for people who he never met, people who never did anything to his family or anything against him, based upon algorithm-driven videos, writings, and groups that he associated with and was introduced to on these platforms that we're suing.' These platforms, Elmore continued, own 'patented products' that 'forced' Gendron to commit a mass shooting.
A meme-fueled shooting

In his manifesto, Gendron called himself an 'eco-fascist national socialist' and said he had been inspired by previous mass shootings in Christchurch, New Zealand, and El Paso, Texas. Like his predecessors, Gendron wrote that he was concerned about 'white genocide' and the great replacement: a conspiracy theory alleging that there is a global plot to replace white Americans and Europeans with people of color, typically through mass immigration. Gendron pleaded guilty to state murder and terrorism charges in 2022 and is currently serving life in prison.

According to a report by the New York attorney general's office, which was cited by the plaintiffs' lawyers, Gendron 'peppered his manifesto with memes, in-jokes, and slang common on extremist websites and message boards,' a pattern found in some other mass shootings. Gendron encouraged readers to follow in his footsteps, and urged extremists to spread their message online, writing that memes 'have done more for the ethno-nationalist movement than any manifesto.'

Citing Gendron's manifesto, Elmore told judges that before Gendron was 'force-fed online white supremacist materials,' he never had any problems with or animosity toward Black people. 'He was encouraged by the notoriety that the algorithms brought to other mass shooters that were streamed online, and then he went down a rabbit hole.'

Everytown for Gun Safety sued nearly a dozen companies — including Meta, Reddit, Amazon, Google, YouTube, Discord, and 4chan — over their alleged role in the shooting in 2023. Last year, a federal judge allowed the suits to proceed.

Racism, addiction, and 'defective' design

The racist memes Gendron was seeing online are undoubtedly a major part of the complaint, but the plaintiffs aren't arguing that it's illegal to show someone racist, white supremacist, or violent content. In fact, the September 2023 complaint explicitly notes that the plaintiffs aren't seeking to hold YouTube 'liable as the publisher or speaker of content posted by third parties,' partly because that would give YouTube ammunition to get the suit dismissed on Section 230 grounds. Instead, they're suing YouTube as the 'designers and marketers of a social media product … that was not reasonably safe and that was reasonably dangerous for its intended use.' Their argument is that the addictive nature of YouTube's and other social media sites' algorithms, coupled with their willingness to host white supremacist content, makes them unsafe. 'A safer design exists,' the complaint states, but YouTube and other social media platforms 'have failed to modify their product to make it less dangerous because they seek to maximize user engagement and profits.'

The plaintiffs made similar complaints about other platforms. Twitch, which doesn't rely on algorithmic recommendations, could alter its product so the videos are on a time delay, Amy Keller, an attorney for the plaintiffs, told judges. Reddit's upvoting and karma features create a 'feedback loop' that encourages use. 4chan doesn't require users to register accounts, allowing them to post extremist content anonymously. 'There are specific types of defective designs that we talk about with each of these defendants,' Keller said, adding that platforms that have algorithmic recommendation systems are 'probably at the top of the heap when it comes to liability.'

During the hearing, the judges asked the plaintiffs' attorneys if these algorithms are always harmful.
'I like cat videos, and I watch cat videos; they keep sending me cat videos,' one of the judges said. 'There's a beneficial purpose, is there not? There's some thought that without algorithms, some of these platforms can't work. There's just too much information.'

After agreeing that he loves cat videos, Glenn Chappell, another attorney for the plaintiffs, said the issue lies with algorithms 'designed to foster addiction and the harms resulting from that type of addictive mechanism are known.' In those instances, Chappell said, 'Section 230 does not apply.' The issue was 'the fact that the algorithm itself made the content addictive,' Keller said.

Third-party content and 'defective' products

The platforms' lawyers, meanwhile, argued that sorting content in a particular way shouldn't strip them of protections against liability for user-posted content. While the complaint may argue it's not saying web services are publishers or speakers, the platforms' defense counters that this is still a case about speech where Section 230 applies.

'Case after case has recognized that there's no algorithms exception to the application of Section 230,' Eric Shumsky, an attorney for Meta, told judges. The Supreme Court considered whether Section 230 protections applied to algorithmically recommended content in Gonzalez v. Google, but in 2023, it dismissed the case without reaching a conclusion or redefining the currently expansive protections.

Shumsky contended that algorithms' personalized nature prevents them from being 'products' under the law. 'Services are not products because they are not standardized,' Shumsky said. Unlike cars or lawnmowers, 'these services are used and experienced differently by every user,' since platforms 'tailor the experiences based on the user's actions.' In other words, algorithms may have influenced Gendron, but Gendron's beliefs also influenced the algorithms.

Section 230 is a common counter to claims that social media companies should be liable for how they run their apps and websites, and one that's sometimes succeeded. A 2023 court ruling found that Instagram, for instance, wasn't liable for designing its service in a way that allowed users to transmit harmful speech. The accusations 'inescapably return to the ultimate conclusion that Instagram, by some flaw of design, allows users to post content that can be harmful to others,' the ruling said.

Last year, however, a federal appeals court ruled that TikTok had to face a lawsuit over a viral 'blackout challenge' that some parents claimed led to their children's deaths. In that case, Anderson v. TikTok, the Third Circuit Court of Appeals determined that TikTok couldn't claim Section 230 immunity, since its algorithms fed users the viral challenge. The court ruled that the content TikTok recommends to its users isn't third-party speech generated by other users; it's first-party speech, because users see it as a result of TikTok's proprietary algorithm.

The Third Circuit's ruling is anomalous, so much so that Section 230 expert Eric Goldman called it 'bonkers.' But there's a concerted push to limit the law's protections. Conservative legislators want to repeal Section 230, and a growing number of courts will need to decide whether users of social networks are being sold a dangerous bill of goods — not simply a conduit for their speech.