Latest news with #NealBarber


Gizmodo
28-07-2025
Former Chaturbate Moderator Sues Site Over 'Psychological Trauma'
A former content moderator for the porn site Chaturbate has sued the platform and its affiliates, claiming that he was psychologically harmed by his ongoing exposure to the sexual material on the site. Neal Barber, who was hired as a moderator for the porn site in 2020, claims in a class action lawsuit that his employers knowingly and intentionally failed to 'provide their content moderators with industry-standard mental health protections, such as content filters, wellness breaks, trauma-informed counseling, or peer support systems.' 404 Media first reported on the litigation.

The suit, which names as defendants Chaturbate, its parent company, Multi Media LLC, and a customer support contractor, Bayside Support Services, was filed earlier this month in California. The lawsuit claims that Barber 'developed post-traumatic stress disorder (PTSD) and other severe emotional injuries' from his work, which required him to view and interact with 'sexually explicit, violent, obscene and psychologically disturbing live-streamed content for extended periods of time.' Barber now claims to suffer from 'vivid nightmares, emotional detachment, panic attacks, and other symptoms consistent with PTSD.' This alleged emotional trauma requires 'ongoing medical treatment and therapy,' the suit says.

'These injuries were not only foreseeable, but preventable,' the litigation continues. 'Had Defendants taken even the minimal precautions adopted by companies in Defendants' industry, Plaintiff would not have suffered these injuries.'

The lawsuit also notes the importance of moderators to the porn industry's business model. 'Because platforms like Chaturbate host vast amounts of live, unfiltered, and sexually explicit content, content moderators are essential to maintain compliance with legal standards, enforce platform rules, and prevent the dissemination of illegal or abusive material,' the lawsuit says.
'They serve as the first line of defense against child exploitation, non-consensual content, violent content, obscene content, self-harm, and other violations.'

Gizmodo reached out to Chaturbate, as well as to Bayside Support Services and Multi Media LLC, for comment.

The plight of the content moderator has become one of the most confounding dilemmas of the modern age. The internet is overflowing with repellent material, and it's almost always somebody's job to try to clean it up (even Elon Musk's 'free speech' platform X has a moderation staff). Usually, the job falls to precarious low-wage workers, many of whom end up claiming that the sites that employ them do next to nothing to ease the psychological pain of having to watch awful stuff all day.

As an example, Meta has been sued multiple times over the company's alleged treatment of African contractors who were tasked with moderating the deluge of disturbing and illegal content on the company's websites. Last year, it was reported that 140 moderators who had previously done work for Facebook had been diagnosed with PTSD from having viewed social media material involving murders, suicides, and child sexual abuse material.

As legal troubles involving moderators have become more common, some companies are increasingly turning to automated, AI-driven systems to do the work of cleaning up their sites. However, it's often the case that human observers are still necessary to provide oversight for the automated systems.

Chaturbate has had a difficult few years, as it and other porn sites continue to adjust to the wave of age-verification regulations that have taken root in mostly conservative states. Last year, the platform was fined over half a million dollars by the state of Texas for failing to institute age-verification mechanisms for the users of its site. A conservative political movement has also increasingly lobbied to make the entire porn industry illegal.

Business Insider
28-07-2025
Content moderator at live-streaming porn site Chaturbate sues, saying he suffered PTSD from his work
A content moderator for the popular porn site Chaturbate has sued the adult live-streaming platform, alleging in court papers that he has developed PTSD as a result of his daily exposure to "extreme, violent, graphic, and sexually explicit" material.

In the lawsuit filed last week in the US District Court for the Central District of California, plaintiff Neal Barber accused Chaturbate and its operator of negligence for "knowingly and intentionally" failing to provide their content moderators with "industry-standard mental health protections" like content filters, wellness breaks, trauma-informed counseling, or peer support systems.

Barber, the lawsuit said, has suffered "psychological trauma" and other severe emotional injuries since his November 2020 hiring. He is currently on medical leave "due to PTSD" from his content moderation work, the court papers added.

A spokesperson for Multi Media, LLC — the owner of the pornographic website, which is also named as a defendant — told Business Insider in a statement on Monday: "The company has not been served nor has it reviewed the complaint and therefore cannot comment on the matter at this time." "With that said, it takes content moderation very seriously, deeply values the work of its moderators, and remains committed to supporting the team responsible for this critical work," the statement continued.

Attorneys for Barber did not immediately respond to a request for comment by BI.

Lawyers for Barber allege in the lawsuit that his injuries "were not only foreseeable, but preventable." "Had Defendants taken even the minimal precautions adopted by companies in Defendants' industry, Plaintiff would not have suffered these injuries," the lawsuit said. "Plaintiff is informed and believes that numerous other members of the proposed class have also suffered emotional harm from engaging in the content moderator duties required of them by Defendants."
Barber's lawsuit said his role was known as "customer service risk supervisor," and that his job was to act as a content moderator for the Chaturbate website, where hosts can broadcast sexual live video streams and viewers can interact with them in real time.

"Because platforms like Chaturbate host vast amounts of live, unfiltered, and sexually explicit content, content moderators are essential to maintain compliance with legal standards, enforce platform rules, and prevent the dissemination of illegal or abusive material," the lawsuit said. The content moderators, the court papers said, serve as the "first line of defense against child exploitation, non-consensual content, violent content, obscene content, self-harm, and other violations." Without them, the porn site "would become unmanageable, unsafe, and legally vulnerable," the lawsuit said.

Barber and the proposed class, the lawsuit said, "have been and continue to be routinely exposed to some of the most graphic, disturbing, obscene and psychologically damaging content found anywhere online." "Their jobs require them to monitor live-streamed material which too often involves child sexual abuse imagery, self-harm and suicide threats, extreme violence, and highly obscene, degrading, or dehumanizing sexual acts," the lawsuit said. "Much of this content is created to be intentionally shocking, often non-consensual, and designed to provoke trauma."

The lawsuit alleged that Chaturbate's lack of mental health protections for its employees "was not a routine workplace oversight but a conscious disregard of nondelegable duties imposed by law and public policy, including the obligation to provide a safe and healthy work environment."

Chaturbate and Multi Media previously faced another lawsuit last year brought by Texas Attorney General Ken Paxton, who accused the entities of violating the state's age-verification law. The parties ultimately reached a settlement that called for Multi Media to pay a $675,000 penalty.