Latest news with #SMVLC


South China Morning Post
25-02-2025
Chatbots that cause deaths? Youth advocacy groups pushing for stricter regulation
As artificial intelligence chatbots gain popularity among users seeking companionship online, youth advocacy groups are ramping up protective legal efforts over fears that children can form unhealthy, dangerous relationships with the humanlike creations.

Chatbot apps such as Replika belong to the fast-growing generative AI companion market, where users can customise their virtual partners with nuanced personalities that communicate and simulate close relationships. Developers say AI companions can combat loneliness and improve users' social experiences in a safe space. But several advocacy groups in the United States have sued developers and are lobbying for stricter regulation, claiming chatbots have pushed children to hurt themselves and others.

Matthew Bergman, founder of the Social Media Victims Law Centre (SMVLC), is representing families in two lawsuits against a chatbot start-up. One of SMVLC's clients, Megan Garcia, says her 14-year-old son took his own life due in part to his unhealthy romantic relationship with a chatbot.


The Independent
21-02-2025
Chatbots: How can we ensure young users stay safe?
AI chatbots are becoming more popular as online companions, especially among young people. This increase has sparked concern among youth advocacy groups, who are escalating legal action to protect children from potentially harmful relationships with these humanlike creations.

Apps like Replika, part of the rapidly expanding market, allow users to personalise virtual partners with distinct personalities capable of simulating close relationships. While developers argue these chatbots combat loneliness and enhance social skills in a safe environment, advocacy groups are pushing back. Several lawsuits have been filed against developers, alongside lobbying efforts for stricter regulations, citing instances where children have allegedly been influenced by chatbots to engage in self-harm or harm others. The clash highlights the growing tension between technological innovation and the need to safeguard vulnerable users in the digital age.

Matthew Bergman, founder of the Social Media Victims Law Center (SMVLC), is representing families in two lawsuits against a chatbot startup. One of SMVLC's clients, Megan Garcia, says her 14-year-old son took his own life due in part to his unhealthy romantic relationship with a chatbot. Her lawsuit was filed in October in Florida. In a separate case, SMVLC is representing two Texas families who sued in December, claiming the startup's chatbots encouraged an autistic 17-year-old boy to kill his parents and exposed an 11-year-old girl to hypersexualized content.

Bergman said he hopes the threat of legal damages will financially pressure companies to design safer chatbots. "The costs of these dangerous apps are not borne by the companies," Bergman told Context/the Thomson Reuters Foundation. "They're borne by the consumers who are injured by them, by the parents who have to bury their children," he said.
A products liability lawyer with experience representing asbestos victims, Bergman argues these chatbots are defective products designed to exploit immature kids. The company declined to discuss the case, but in a written response a spokesperson said it has implemented safety measures like "improvements to our detection and intervention systems for human behavior and model responses, and additional features that empower teens and their parents."

In another legal action, the nonprofit Young People's Alliance filed a Federal Trade Commission complaint against the AI chatbot company Replika in January. Replika is popular for its subscription chatbots that act as virtual boyfriends and girlfriends who never argue or cheat. The complaint alleges that Replika deceives lonely people. "Replika exploits human vulnerability through deceptive advertising and manipulative design," said Ava Smithing, advocacy and operations director at the Young People's Alliance. It uses "AI-generated intimacy to make users emotionally dependent for profit," she said. Replika did not respond to a request for comment.

'Pulled back in'

As AI companions have only become popular in recent years, there is little data to inform legislation and little evidence on whether chatbots generally encourage violence or self-harm. But according to the American Psychological Association, studies on post-pandemic youth loneliness suggest chatbots are primed to entice a large population of vulnerable minors. In a December letter to the Federal Trade Commission, the association wrote: "(It) is not surprising that many Americans, including our youngest and most vulnerable, are seeking social connection with some turning to AI chatbots to fill that need." Youth advocacy groups also say chatbots take advantage of lonely children looking for friendship.
"A lot of the harm comes from the immersive experience where users keep getting pulled back in," said Amina Fazlullah, head of tech policy advocacy at Common Sense Media, which provides entertainment and tech recommendations for families. "That's particularly difficult for a child who might forget that they're speaking to technology."

Bipartisan support

Youth advocacy groups hope to capitalize on bipartisan support to lobby for chatbot regulations. In July, the U.S. Senate in a rare bipartisan 91-3 vote passed a federal social media bill known as the Kids Online Safety Act (KOSA). The bill would in part disable addictive platform features for minors, ban targeted advertising to minors and data collection without their consent, and give parents and children an option to delete their information from social media platforms. The bill failed in the House of Representatives, where members raised privacy and free speech concerns, although Sen. Richard Blumenthal, a Connecticut Democrat, has said he plans to reintroduce it. On Feb. 5, the Senate Commerce Committee approved the Kids Off Social Media Act, which would ban users under 13 from many online platforms.

Despite Silicon Valley's anti-regulatory influence on the Trump administration, experts say they see an appetite for stronger laws that protect children online. "There was quite a bit of bipartisan support for KOSA or other social media addiction regulation, and it seems like this could go down that same path," said Fazlullah.

To regulate AI companions, youth advocacy group Fairplay has proposed expanding the KOSA legislation, as the original bill only covered chatbots operated by major platforms and was unlikely to apply to smaller services. "We know that kids get addicted to these chatbots, and KOSA has a duty of care to prevent compulsive usage," said Josh Golin, executive director of Fairplay. The Young People's Alliance is also pushing for the U.S.
Food and Drug Administration to classify chatbots offering therapy services as Class II medical devices, which would subject them to safety and effectiveness standards.

However, some lawmakers have expressed concern that cracking down on AI could stifle innovation. California Gov. Gavin Newsom recently vetoed a bill that would have broadly regulated how AI is developed and deployed. Conversely, New York Gov. Kathy Hochul announced plans in January for legislation requiring AI companies to remind users that they are talking to chatbots. In the U.S. Congress, the House Artificial Intelligence Task Force published a report in December recommending modest regulations to address issues like deceptive AI-generated images but warning against government overreach. The report did not specifically address companion chatbots or mental health.

The principle of free speech may frustrate regulation efforts, experts note. In the Florida lawsuit, the defendant is arguing that the First Amendment protects speech generated by chatbots. "Everything is going to run into roadblocks because of our absolutist view of free speech," said Smithing. "We see this as an opportunity to reframe how we utilize the First Amendment to protect tech companies," she added.
Yahoo
07-02-2025
British families sue TikTok in U.S. over Blackout Challenge children deaths
Feb. 7 (UPI) -- Four British families on Thursday filed a wrongful death suit against TikTok and its owner ByteDance over the self-strangulation deaths of four children who were participating in a TikTok Blackout Challenge. The U.S.-based Social Media Victims Law Center filed the suit in Delaware Superior Court on behalf of the families of Isaac Kenevan, 13; Archie Battersbee, 12; Julian "Jools" Sweeney, 14; and Maia Walsh, 13.

"TikTok's algorithm purposely targeted these children with dangerous content to increase their engagement time on the platform and drive revenue. It was a clear and deliberate business decision by TikTok that cost these four children their lives," the SMVLC said in a statement. The suit alleges the deaths were "the foreseeable result of ByteDance's engineered addiction-by-design and programming decisions" that pushed children to maximize their TikTok engagement "by any means necessary."

Sweeney's mother, Ellen Roome, told the BBC that a law debated in Parliament, dubbed "Jools' Law," should be passed so parents are allowed to access their children's social media accounts if they die. She had to sue in the United States before obtaining her son's TikTok data. She said she believed that was morally wrong and that TikTok could simply have handed over the data.

"It's no coincidence that three of the four children who died from self-suffocation after being exposed to the dangerous and deadly TikTok Blackout Challenge lived in the same city and that all fit a similar demographic," SMVLC attorney Matthew P. Bergman said in a statement. He added that TikTok's algorithm "purposely targeted these children with dangerous content to increase their engagement time on the platform and drive revenue." According to the lawsuit, Maia Walsh's father, Liam, was able to access her TikTok data after months of trying and said he found TikTok had targeted the 13-year-old with "dangerous challenges and self-harm videos."


The Independent
07-02-2025
Archie Battersbee's family join bereaved parents to sue TikTok after children's deaths
Archie Battersbee's mother has said she remains trapped in a 'living hell' as she and three other families sue TikTok over videos they claim are linked to their children's deaths. The lawsuit claims that the 12-year-old, alongside Isaac Kenevan, 13; Julian 'Jools' Sweeney, 14; and Maia Walsh, 13, all died from injuries suffered after taking part in online challenges in 2022, and demands access to their children's social media accounts.

The Social Media Victims Law Centre (SMVLC), a US-based legal resource for parents of children harmed by social media use, said it had filed a wrongful death lawsuit against the video-sharing platform and its parent firm, ByteDance, in the US state of Delaware. It accuses TikTok of pushing dangerous prank and challenge videos to children to boost engagement time on the platform.

Speaking to The Independent, Archie's mother, Hollie Dance, said: 'I always had this naive view that I'd wrapped my kids in cotton wool and no one could harm them. I didn't realise that by giving them a phone, I was letting the danger into my own home.'

Archie was found unconscious at the family home in Southend-on-Sea on 7 April 2022 and died four months later in hospital after a lengthy legal battle over the removal of his life support care. An inquest found that he had died accidentally following a 'prank or experiment' that went wrong, and that he 'hadn't intended to harm himself'. It was discovered that he had been watching TikTok before the incident, but despite requests to the company from both his family and Essex Police, access to the data from his accounts has been refused. Ms Dance had no knowledge her son had created a TikTok account, and later discovered his usernames through Archie's school friends. Her son gave no indication he was watching dangerous content aside from one comment in which he said he could make himself unconscious, which she now says was a 'huge red flag'.
'As time goes on, the anger starts to kick in and you think, "what the hell has my child watched and why won't you give me his data?"' she said. 'How are you supposed to start your grieving when you have unanswered questions? I am in constant fight mode. I have my days when I'm very low, but generally I'm turning the pain into fight. I had no idea this kind of thing happened, and if I can prevent any other parent going through what I'm suffering, I will do everything in my power to do so.'

Reflecting on the past two years, she said: 'It's a living hell, every day is heartbreaking. The heart of the home has been ripped out. In bed every night I feel like my little boy should be here and I just think what the hell has happened.'

Jools's mother, Ellen Roome, 48, a businesswoman from Cheltenham, said the families hope to force a response from tech firms on the issue, and that she 'just wants answers' about her son's death. She added that social media firms had so far refused to give her access to her son's accounts, saying a court order was required to do so. Ms Roome said she had been 'shocked' to learn that she 'wasn't entitled' to Jools's data, and said it was 'the only piece that we haven't looked at, to look at why he took his own life'. She said it had been 'horrendously difficult' to 'not understand why' her son had died.

'One day, all of the four children, none of them had mental health issues, it was completely out of the blue, they all decided to take their life – more importantly, I don't think they intended to take their life,' she said. 'This is our opportunity to get answers. It's incredibly hard and emotional to lose a child, and this has given us a possibility of understanding exactly what happened that night.'
Ms Roome has also been campaigning for 'Jools' Law' to give parents the right to access their children's online activity after they die – and the issue was debated in Parliament last month after an online petition for the campaign gained more than 126,000 signatures.

'Without social media companies releasing it, I still don't know what he was looking at, was there somebody weird messaging him? I just don't know,' she told PA. 'So my fight has just been, "all we want is our children's data".' She added: 'I know that I was a loving mum to my son, and I know my son loved me, so I don't really care what everybody else says. I want to know the truth. We shouldn't have had to go this far to get our children's data. Why didn't they just say, "here's the data, I hope you get some closure", or give us some sort of answer from it? They could have handed this over and said, "let me help you" – and nobody has ever tried to help us.'

Matthew P Bergman, founding lawyer of the SMVLC, who represents the families, said: 'TikTok's algorithm purposely targeted these children with dangerous content to increase their engagement time on the platform and drive revenue. It was a clear and deliberate business decision by TikTok that cost these four children their lives.'

Asked how she responded after being told the lawsuit had been filed, Ms Roome said: 'Oh my God, we've got a chance to get answers. There's an overwhelming excitement of a possible chance of answers, but there's also that underlying grief of "we're doing this because our children are dead". I'm not stopping. There's nothing in this world which would stop me. I want answers. This isn't about money or anything like that. I want the answers.'
Yahoo
07-02-2025
Bereaved families file US lawsuit against TikTok over access to children's data
Bereaved families have filed a lawsuit against TikTok in the United States over efforts to access their children's social media accounts because they 'want answers' about their deaths. The Social Media Victims Law Centre (SMVLC), a US-based legal resource for parents of children harmed by social media use, said it had filed a wrongful death lawsuit against the video-sharing platform and its parent firm, ByteDance, on behalf of four families in the UK.

The lawsuit, filed in the US state of Delaware, claims Isaac Kenevan, 13; Archie Battersbee, 12; Julian 'Jools' Sweeney, 14; and Maia Walsh, 13, all died from injuries suffered while taking part in online challenges in 2022. It accuses TikTok of pushing dangerous prank and challenge videos to children to boost engagement time on the platform.

Jools's mother, Ellen Roome, 48, a businesswoman from Cheltenham, said the families hope to force a response from tech firms on the issue, and that she 'just wants answers' about her son's death. She added that social media firms had so far refused to give her access to her son's accounts, saying a court order was required to do so. Ms Roome said she had been 'shocked' to learn that she 'wasn't entitled' to Jools's data, and said it was 'the only piece that we haven't looked at, to look at why he took his own life'. She told the PA news agency that it had been 'horrendously difficult' to 'not understand why' her son had died.

'One day, all of the four children, none of them had mental health issues, it was completely out of the blue, they all decided to take their life – more importantly, I don't think they intended to take their life,' she said. 'This is our opportunity to get answers. It's incredibly hard and emotional to lose a child, and this has given us a possibility of understanding exactly what happened that night.'
Ms Roome has also been campaigning for 'Jools' Law' to give parents the right to access their children's online activity after they die – and the issue was debated in Parliament last month after an online petition for the campaign gained more than 126,000 signatures.

'Without social media companies releasing it, I still don't know what he was looking at, was there somebody weird messaging him? I just don't know,' she told PA. 'So my fight has just been, "all we want is our children's data".' She added: 'I know that I was a loving mum to my son, and I know my son loved me, so I don't really care what everybody else says. I want to know the truth. We shouldn't have had to go this far to get our children's data. Why didn't they just say, "here's the data, I hope you get some closure", or give us some sort of answer from it? They could have handed this over and said, "let me help you" – and nobody has ever tried to help us.'

Matthew P Bergman, founding lawyer of the SMVLC, who represents the families, said: 'TikTok's algorithm purposely targeted these children with dangerous content to increase their engagement time on the platform and drive revenue. It was a clear and deliberate business decision by TikTok that cost these four children their lives.'

Asked how she responded after being told the lawsuit had been filed, Ms Roome said: 'Oh my God, we've got a chance to get answers. There's an overwhelming excitement of a possible chance of answers, but there's also that underlying grief of "we're doing this because our children are dead". It's very weird emotions. On the one hand you're pleased, but I don't know how to put it into words … it's difficult, it's been a difficult journey. I just feel I want answers. It's my son and I think anybody – those with children and without – can hopefully understand that it's a mother's love. I'm not stopping. There's nothing in this world which would stop me. I want answers. This isn't about money or anything like that.
I want the answers.' TikTok has been approached for comment.