
Latest news with #PaytonGendron

If algorithms radicalize a mass shooter, are companies to blame?

The Verge

May 27, 2025

In New York court on May 20th, lawyers for nonprofit Everytown for Gun Safety argued that Meta, Amazon, Discord, Snap, 4chan, and other social media companies all bear responsibility for radicalizing a mass shooter. The companies defended themselves against claims that their respective design features — including recommendation algorithms — promoted racist content to a man who killed 10 people in 2022, then facilitated his deadly plan. It's a particularly grim test of a popular legal theory: that social networks are products that can be found legally defective when something goes wrong. Whether this works may rely on how courts interpret Section 230, a foundational piece of internet law.

In 2022, Payton Gendron drove several hours to the Tops supermarket in Buffalo, New York, where he opened fire on shoppers, killing 10 people and injuring three others. Gendron claimed to have been inspired by previous racially motivated attacks. He livestreamed the attack on Twitch and, in a lengthy manifesto and a private diary he kept on Discord, said he had been radicalized in part by racist memes and intentionally targeted a majority-Black community.

Everytown for Gun Safety brought multiple lawsuits over the shooting in 2023, filing claims against gun sellers, Gendron's parents, and a long list of web platforms. The accusations against different companies vary, but all place some responsibility for Gendron's radicalization at the heart of the dispute.

The platforms are relying on Section 230 of the Communications Decency Act to defend themselves against a somewhat complicated argument. In the US, posting white supremacist content is typically protected by the First Amendment. But these lawsuits argue that if a platform feeds it nonstop to users in an attempt to keep them hooked, it becomes a sign of a defective product — and, by extension, breaks product liability laws if that leads to harm.
That strategy requires arguing that companies are shaping user content in ways that shouldn't receive protection under Section 230, which prevents interactive computer services from being held liable for what users post, and that their services are products that fit under the liability law.

'This is not a lawsuit against publishers,' John Elmore, an attorney for the plaintiffs, told the judges. 'Publishers copyright their material. Companies that manufacture products patent their materials, and every single one of these defendants has a patent.' These patented products, Elmore continued, are 'dangerous and unsafe' and are therefore 'defective' under New York's product liability law, which lets consumers seek compensation for injuries.

Some of the tech defendants — including Discord and 4chan — don't have proprietary recommendation algorithms tailored to individual users, but the claims against them allege that their designs still aim to hook users in a way that predictably encouraged harm.

'This community was traumatized by a juvenile white supremacist who was fueled with hate — radicalized by social media platforms on the internet,' Elmore said. 'He obtained his hatred for people who he never met, people who never did anything to his family or anything against him, based upon algorithm-driven videos, writings, and groups that he associated with and was introduced to on these platforms that we're suing.' These platforms, Elmore continued, own 'patented products' that 'forced' Gendron to commit a mass shooting.

A meme-fueled shooting

In his manifesto, Gendron called himself an 'eco-fascist national socialist' and said he had been inspired by previous mass shootings in Christchurch, New Zealand, and El Paso, Texas.
Like his predecessors, Gendron wrote that he was concerned about 'white genocide' and the great replacement: a conspiracy theory alleging that there is a global plot to replace white Americans and Europeans with people of color, typically through mass immigration. Gendron pleaded guilty to state murder and terrorism charges in 2022 and is currently serving life in prison.

According to a report by the New York attorney general's office, which was cited by the plaintiffs' lawyers, Gendron 'peppered his manifesto with memes, in-jokes, and slang common on extremist websites and message boards,' a pattern found in some other mass shootings. Gendron encouraged readers to follow in his footsteps and urged extremists to spread their message online, writing that memes 'have done more for the ethno-nationalist movement than any manifesto.'

Citing Gendron's manifesto, Elmore told judges that before Gendron was 'force-fed online white supremacist materials,' he never had any problems with or animosity toward Black people. 'He was encouraged by the notoriety that the algorithms brought to other mass shooters that were streamed online, and then he went down a rabbit hole.'

Everytown for Gun Safety sued nearly a dozen companies — including Meta, Reddit, Amazon, Google, YouTube, Discord, and 4chan — over their alleged role in the shooting in 2023. Last year, a federal judge allowed the suits to proceed.

Racism, addiction, and 'defective' design

The racist memes Gendron was seeing online are undoubtedly a major part of the complaint, but the plaintiffs aren't arguing that it's illegal to show someone racist, white supremacist, or violent content. In fact, the September 2023 complaint explicitly notes that the plaintiffs aren't seeking to hold YouTube 'liable as the publisher or speaker of content posted by third parties,' partly because that would give YouTube ammunition to get the suit dismissed on Section 230 grounds.
Instead, they're suing YouTube as the 'designers and marketers of a social media product … that was not reasonably safe and that was reasonably dangerous for its intended use.' Their argument is that the addictive nature of YouTube's and other social media sites' algorithms, coupled with their willingness to host white supremacist content, makes them unsafe. 'A safer design exists,' the complaint states, but YouTube and other social media platforms 'have failed to modify their product to make it less dangerous because they seek to maximize user engagement and profits.'

The plaintiffs made similar complaints about other platforms. Twitch, which doesn't rely on algorithmic recommendations, could alter its product so that videos are on a time delay, Amy Keller, an attorney for the plaintiffs, told judges. Reddit's upvoting and karma features create a 'feedback loop' that encourages use. 4chan doesn't require users to register accounts, allowing them to post extremist content anonymously. 'There are specific types of defective designs that we talk about with each of these defendants,' Keller said, adding that platforms with algorithmic recommendation systems are 'probably at the top of the heap when it comes to liability.'

During the hearing, the judges asked the plaintiffs' attorneys whether these algorithms are always harmful. 'I like cat videos, and I watch cat videos; they keep sending me cat videos,' one of the judges said. 'There's a beneficial purpose, is there not? There's some thought that without algorithms, some of these platforms can't work. There's just too much information.'

After agreeing that he loves cat videos, Glenn Chappell, another attorney for the plaintiffs, said the issue lies with algorithms 'designed to foster addiction and the harms resulting from that type of addictive mechanism are known.' In those instances, Chappell said, 'Section 230 does not apply.' The issue was 'the fact that the algorithm itself made the content addictive,' Keller said.
Third-party content and 'defective' products

The platforms' lawyers, meanwhile, argued that sorting content in a particular way shouldn't strip them of protections against liability for user-posted content. While the complaint may argue it's not saying web services are publishers or speakers, the platforms' defense counters that this is still a case about speech where Section 230 applies. 'Case after case has recognized that there's no algorithms exception to the application of Section 230,' Eric Shumsky, an attorney for Meta, told judges. The Supreme Court considered whether Section 230 protections applied to algorithmically recommended content in Gonzalez v. Google, but in 2023, it dismissed the case without reaching a conclusion or redefining the currently expansive protections.

Shumsky contended that algorithms' personalized nature prevents them from being 'products' under the law. 'Services are not products because they are not standardized,' Shumsky said. Unlike cars or lawnmowers, 'these services are used and experienced differently by every user,' since platforms 'tailor the experiences based on the user's actions.' In other words, algorithms may have influenced Gendron, but Gendron's beliefs also influenced the algorithms.

Section 230 is a common counter to claims that social media companies should be liable for how they run their apps and websites, and one that's sometimes succeeded. A 2023 court ruling found that Instagram, for instance, wasn't liable for designing its service in a way that allowed users to transmit harmful speech. The accusations 'inescapably return to the ultimate conclusion that Instagram, by some flaw of design, allows users to post content that can be harmful to others,' the ruling said.

Last year, however, a federal appeals court ruled that TikTok had to face a lawsuit over a viral 'blackout challenge' that some parents claimed led to their children's deaths. In that case, Anderson v. TikTok, the Third Circuit Court of Appeals determined that TikTok couldn't claim Section 230 immunity, since its algorithms fed users the viral challenge. The court ruled that the content TikTok recommends to its users isn't third-party speech generated by other users; it's first-party speech, because users see it as a result of TikTok's proprietary algorithm.

The Third Circuit's ruling is anomalous, so much so that Section 230 expert Eric Goldman called it 'bonkers.' But there's a concerted push to limit the law's protections. Conservative legislators want to repeal Section 230, and a growing number of courts will need to decide whether users of social networks are being sold a dangerous bill of goods — not simply a conduit for their speech.

Attorneys argue social media not legally responsible for Tops shooting

Yahoo

May 21, 2025

BUFFALO, N.Y. (WIVB) – Attorneys representing social media companies argued in a Rochester courtroom Tuesday that the sites should not be held legally responsible for the racist mass shooting at Tops that killed 10 Black people on May 14, 2022. The argument was made to the New York Court of Appeals as the companies seek to dismiss the wrongful death lawsuit filed by some of the shooting victims' families, who say social media is partly to blame for the deaths of their loved ones.

The lawsuit argues that convicted gunman Payton Gendron was radicalized by white supremacist theories he found on social media and that addictive algorithms kept leading the shooter back to racist, antisemitic and violent information. The court also heard from attorneys representing the manufacturer of a gun magazine lock, which Gendron removed during his attack to use high-capacity magazines.

John Elmore is one of the attorneys bringing the lawsuit against the companies. 'They are all, we believe, to be addictive products and they were dangerous products that could have been made in a safer way,' Elmore said. 'As a result of the way they were manufactured, they were dangerous, and in society when corporations make a dangerous product and it's foreseeable people are going to be injured, then they're liable under the products liability theory under New York State law, so we're hoping the judge will see our argument and continue.'

Attorneys for the social media companies argued they are not liable, with one of them stating a 'premeditated, murderous rampage is not a foreseeable risk of having a social media service.'

Arguments took more than three hours and wrapped up late Tuesday. The judges did not make a decision. If the judges deny the appeal, the case will proceed to the discovery phase.
Attorneys representing the victims' families said the judges were very knowledgeable and that they're hopeful the ruling will be in their favor.

Marlee Tuskes is an award-winning anchor and reporter who has been part of the News 4 team since 2019. Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

Judge denies motion to dismiss hate crime count against Tops mass shooter

Yahoo

May 14, 2025

BUFFALO, N.Y. (WIVB) — A federal judge on Monday denied a motion to dismiss one of the hate crime counts the Tops mass shooter is facing.

Payton Gendron's attorneys previously filed a motion to dismiss the charge tied to the people who were at the Tops on Jefferson Avenue during the racist attack on May 14, 2022, but not killed or injured. They argued that it was not directed at a specific, identifiable victim and therefore should be dismissed. Judge Lawrence Vilardo ruled that the count does refer to a particular group of people. 'The government responds that count 27 "clearly delineate[s] a discrete class of victims: the Black people who were present in and around Tops during [Gendron]'s mass shooting," so there is no ambiguity about the intended victims' identities,' the decision said.

The case is expected to be back in a federal courtroom next week for a status conference.

Judge rejects motion to dismiss death penalty in Tops mass shooting case

Yahoo

April 22, 2025

BUFFALO, N.Y. (WIVB) — A federal judge rejected a motion to dismiss the government's notice of intent to seek the death penalty in the Buffalo mass shooting case, according to a district court filing Tuesday. The rejection comes after the defense team for admitted gunman Payton Gendron argued last November that it would be unconstitutional for him to receive the death penalty because he was 18 years old at the time of the racially targeted attack, in which he killed 10 Black people and injured three others at the Tops on Jefferson Avenue.

Gendron's defense based its motion on the 'alleged abuse of the grand jury,' the decision said. The allegations of abuse include the government using the grand jury process to 'compel irrelevant evidence' through questions 'having no relationship to the charges under consideration,' according to the filing. Other reasons cited by the defense for the attempted dismissal of the death penalty were 'improper' questioning, including whether Gendron had different disabilities or whether he 'seemed to be racist,' which the defense said was 'impermissibly obtained information.' The government responded, in part, that whether Gendron 'exhibited racial animus' was 'plainly relevant' to the case's hate crime charges.

In 2023, Gendron received 11 life sentences on state charges. Only the federal case carries the possibility of the death penalty.

According to Tuesday's decision from U.S. District Court Judge Lawrence Vilardo, Gendron would have had to show that the alleged abuse influenced the attorney general's decision to authorize the death penalty, and the judge found no such showing. The defense's request to stop the government from using 'any of this information at trial for any purpose' was also denied. Gendron can still challenge the admissibility of the grand jury testimony by filing a pretrial motion, the decision said.
In the past month, Gendron's defense has also filed motions to delay the federal trial and to change the trial's venue.

Katie Skoog joined the News 4 team in April 2024. She is a graduate of the University at Buffalo.

The Buffalo Supermarket Shooter Really Tried It With This Latest Request

Yahoo

April 4, 2025

Updated as of 4/5/2025 at 3:00 p.m. ET

The white man who gunned down 10 Black people inside a Buffalo supermarket in a racially motivated massacre is facing the ultimate penalty from the Department of Justice. However, he's come up with a slick idea that may or may not change the odds of him meeting the chair.

Attorneys for Payton Gendron, 20, filed a request in U.S. District Court earlier this week arguing that their client has no chance of receiving an unbiased jury if the pool is picked in the area impacted by the shooting, per USA TODAY. 'Due to the overwhelming amount of pretrial publicity, combined with the impact of this case on Buffalo's segregated communities of color, it is impossible for Payton Gendron to select a fair and impartial jury in the Western District of New York,' wrote Gendron's lawyers in the filing. The attorneys argued that a 'diverse group of citizens' should be the ones to deliver a verdict in the case.

Gendron is already serving a life sentence after pleading guilty to state charges of murder and hate-motivated domestic terrorism in connection with the fatal mass shooting of 10 Black people at Tops Friendly Market back in 2022, per The New York Times. His racist motives were discovered after investigators found a manifesto he posted online, filled with white supremacist ideologies. They also found he targeted the neighborhood because it had a majority-Black population.

Gendron still faces sentencing for federal hate crimes, use of firearms to commit murder and gun charges. Though New York does not have the death penalty, court records show the Department of Justice sent a notice of intent to seek the death penalty Friday. '...the United States believes the circumstances in counts 11 to 20 of the Indictment are such that, in the event of a conviction, a sentence of death is justified...' the notice reads.
Read more from CBS News:

New York does not have capital punishment, but the Justice Department had the option of seeking the death penalty in a separate federal hate crimes case. The gunman had promised to plead guilty in that case if prosecutors agreed not to seek the death penalty. The Justice Department has made federal death penalty cases a rarity since the election of President Biden, who opposes capital punishment. This is the first time Attorney General Merrick Garland has authorized a new pursuit of the death penalty. Under his leadership, the Justice Department has permitted the continuation of two capital prosecutions and withdrawn from pursuing death in more than two dozen cases.

The families of the victims weren't necessarily jovial at the news of the filing. Mark Talley, grandson of Geraldine Talley, who was killed in the attack, said he would have preferred Gendron serve prison time. 'It would have satisfied me more knowing he would have spent the rest of his life in prison being surrounded by the population of people he tried to kill. I would prefer he spend the rest of his life in prison suffering every day,' he told CBS.

In a statement on behalf of the other relatives of the victims, attorney Terrence Connors said the families were relieved by the decision but that 'no decision could eliminate the pain and suffering they continue to experience.' Meanwhile, Jamila Hodge, CEO of Equal Justice USA, an organization fighting against the expansion of death penalty legislation, said execution will do nothing to solve the real problem.

'The government's decision to pursue a death sentence will do nothing to address the racism and hatred that fueled the mass murder. Ultimately, this pursuit will inflict more pain and renewed trauma on the victims' families and the larger Black community already shattered by loss and desperately in need of healing and solutions that truly build community safety,' Hodge said in a statement.
