
Latest news with #Can'tLookAway

Social media aimed at kids is driven by profit, not safety

Irish Examiner

2 days ago

  • Health
  • Irish Examiner

Representatives from big tech companies consistently describe their products and services as being "safe by design" for children. I'm not buying it.

TikTok says: 'We've designed our app with safety in mind, implementing a 'safety by design' approach that ensures we are building protection for our users, including teens and their parents'. Roblox claims its platform 'was developed from the beginning as a safe, protective space for children to create and learn'. Unsurprisingly, all of the popular services that children use make similar claims about the importance of safety. These claims are, at best, lofty ideals and, at worst, deliberately misleading.

There is plenty of evidence that these claims are far from the reality of what children and young people are experiencing on platforms such as YouTube and TikTok, both of which use incredibly powerful AI-driven recommender systems that surface whatever maximises attention, rather than what is age-appropriate, content from friends, or even the kinds of things users are actually interested in seeing.

A hard-hitting 2024 report by DCU's Anti-Bullying Centre showed how the recommender algorithms used by YouTube Shorts and TikTok were actively fuelling the spread of misogyny and 'male supremacy' content. There were two other notable findings in this research. Firstly, the speed with which the content started to appear was staggering: it took only 23 minutes after the account became active. Secondly, once the user showed any interest at all in the recommendations, the content increased dramatically in both volume and toxicity.

Girls are not immune to being served harmful content either, though it tends to fall into different categories, such as eating disorders and body dysmorphia. In a report published last year, the Centre for Countering Digital Hate found that YouTube's recommender algorithm is pushing 13-year-old girls down rabbit holes of harmful content.
An analysis of 1,000 recommended videos found one in three were related to eating disorders, two in three focused on eating disorders or weight loss, and one in 20 involved content about self-harm or suicide.

Snapchat, which, like YouTube and TikTok, is incredibly popular with children under the platform's own minimum age requirement of 13 (36% of eight- to 12-year-olds use it, according to our latest research), promises it is 'deeply committed to helping teens on Snapchat have a healthy and safe experience'. Yet Snap Inc, its parent company, is currently facing multiple lawsuits in the US alleging that its harmful design fosters addictive behaviours and exposes children to risky content, such as cyberbullying, substance abuse, and self-harm material. The horrifying impact of this content is well documented in a soon-to-be-released Bloomberg documentary called Can't Look Away.

There can be no doubt that these online products and services widely used by children are far from 'safe by design'. They may not be intentionally harmful to children, but harm is a consequence of their design, which deliberately maximises profit, not safety; safety has largely been an afterthought.

A more recent market offering is Instagram's teen accounts from Meta, which promises parents that 'teens are having safe experiences with built-in protections on automatically [sic]'. Certainly, at surface level, this new design sounded promising, with greater efforts made to ensure far stronger protections against harmful content and harmful contact for those under 16. But two recent reports suggest it is not as safe by design as its widely circulated ad campaign would have us believe.

Accountable Tech released a report this month which found that, despite Meta's controls, all accounts had been recommended sensitive, sexual, and harmful content, with minimal educational recommendations and distressing experiences reported by most users.
Another report, published in April by the 5Rights Foundation and titled Is Instagram Now Safe for Teens?, had very similar findings.

How is it that other industries are expected to adhere to far more stringent regulations? The toy industry, for example, is subject to a strict range of regulations to sell into the European market, including the CE standard, which shows a toy meets European safety standards for children. This ensures the toy has passed checks for things like dangerous chemicals, durability, and age-appropriate design, all before it reaches the consumer. It is simply inconceivable that an industry as powerful as this one, and as widely used by children, is not subject to anything even close to that level of scrutiny and testing at the design stage. This is despite the passing of the EU Digital Services Act, which does attempt to put some minimum standards in place and impose risk assessments on some of the larger platforms.

It's hard to fathom how WhatsApp was able to unleash its AI buddy, a virtual assistant powered by artificial intelligence, onto all of its subscribers' feeds, with the small caveat that 'some messages may be inaccurate or inappropriate'. Forty per cent of children aged eight to 12 in Ireland have a WhatsApp account, according to our research. For any parents who may not want such a powerful resource in their children's pockets, it is unfortunately not a feature you can unsubscribe from, despite it being described as 'optional'.

As we continue to navigate the online landscape, it's clear the promise of 'safety by design' from major tech companies is almost always falling short of its stated goal.
While big tech must be held accountable for these shortcomings, it's crucial that we demand stricter regulation and oversight from Government and regulators to ensure these platforms are truly safe and appropriate for the young audiences they serve. Until these changes are made, children will continue to face a digital environment that prioritises profit over their wellbeing and exposes them to very real harms.

Alex Cooney is chief executive of Ireland's online safety charity, CyberSafeKids. Find resources to help protect children on its website.

Jolt's Latest Doc 'Can't Look Away' Examines the Dark Side of Social Media and Its Impact On Adolescents

Yahoo

6 days ago

  • Health
  • Yahoo

In the documentary 'Can't Look Away,' directors Matthew O'Neill and Perri Peltz expose the dark side of social media and the tragic impact Big Tech algorithms can have on children and teens. Based on extensive investigative reporting by Bloomberg News reporter Olivia Carville, the doc follows a team of lawyers at Seattle's Social Media Victims Law Center who are battling several tech companies on behalf of families who have lost children to suicide, drug overdose, or exploitation linked to social media use.

O'Neill and Peltz ('Axios,' 'Surveilled') capture the lawyers' fight against Section 230 of the Communications Decency Act. Created in 1996, before the birth of social media, Section 230 states that internet service providers cannot be held responsible for what third parties post on their platforms.

'The fact that this group of really incredible lawyers came together with this mission in mind to get around Section 230 through product liability, we just thought it was such a fascinating approach,' says Peltz.

'Can't Look Away' is currently streaming on Jolt, an AI-driven streaming platform that connects independent films with audiences. Recent Jolt titles include 'Hollywoodgate,' 'Zurawski v Texas,' and 'The Bibi Files,' a documentary from Oscar winners Alex Gibney and Alexis Bloom that investigates corruption in Israeli politics. O'Neill says that he and Peltz decided to put 'Can't Look Away' on Jolt, in part, because the company could 'move quickly and decisively reach an audience now, with a message that audiences are hungry for.'
'What was also appealing to us is this sense of Jolt as a technology company,' he says. 'They are using these tools to identify and draw in new audiences that might not be the quote unquote documentary audience. We are documentary filmmakers, and we want our films to speak to everyone.'

Jolt uses AI to power its Interest Delivery Networks, enabling films to connect with their target audiences. The platform's chief executive officer, Tara Hein-Phillip, would not disclose Jolt viewership numbers for 'Can't Look Away,' making it difficult to determine how well the new distribution service is performing. However, Hein-Phillip did reveal that since the platform's launch in March 2024, the company's most-viewed film has been the documentary 'Your Fat Friend,' which charts the rise of writer, activist, and influencer Aubrey Gordon. Hein-Phillip attributed part of the film's success on Jolt to Gordon's niche but significant online following.

'We are still learning along the way what builds audience and where to find them and how long it takes to build them,' Hein-Phillip says. 'It's slightly different for every film. We really focus on trying to find unique audiences for each individual film. In a way, that is problematic because it's not a reliable audience to say, "Oh, we have built however many for this particular film, now we can turn them onto (this other) film and they'll all go there." They won't.'

The company utilizes advanced data analytics and machine learning to develop performance marketing plans that target specific audiences for each film and increase awareness. All collected data is shared with each respective Jolt filmmaker, who receives 70% of their Jolt earnings and retains complete ownership of their work and all future rights.

'Initially, we thought Jolt would just be an opportunity to put a film up there,' says Hein-Phillip.
'We would put some marketing against it, and we would push the film out into the world and give it our best push, and we definitely still do that, but now we realize that to build an audience, you actually have to do a handful of things. Some films come to us and they have already done that work, and some films come to us and they haven't. If they haven't, it's in our best interest and their best interest for us to help facilitate that.'

That 'work' can include a theatrical release, an impact campaign, or a festival run. In addition to it being a 'great, impactful film,' Hein-Phillip says that Jolt partnered with O'Neill and Peltz on 'Can't Look Away' because of the doc's audience potential. 'There are so many audiences for this film – parents, teenagers, lawyers, educators, etc,' says Hein-Phillip.

To attract those audiences, Jolt and the 'Can't Look Away' directors have, ironically, relied on social media to help get the word out about the film. 'We aren't anti-social media,' says Peltz. 'What we are trying to say in the film is – put the responsibility where it rightly belongs.'

'Can't Look Away' will be released on Bloomberg Media platforms in July.

Social Media's 'Big Tobacco Moment' Is Coming

Bloomberg

11-04-2025

  • Business
  • Bloomberg

The new Bloomberg Originals documentary Can't Look Away, which follows parents suing tech companies after the deaths of their children, is difficult to watch. It should be. The film lays bare what many parents already know: social media is rewiring their children's brains, creating a generation of short attention spans and social anxiety. What becomes clear while viewing the film is that tech platforms aren't doing nearly enough to stop it, and probably never will.

It's apparent simply in Meta Platforms Inc. chief executive officer Mark Zuckerberg's shift in tone. In January 2024, he stood before some of these parents at a US Senate Judiciary Committee hearing and said, 'I'm sorry for everything you've gone through.' Before the year was out, the Facebook creator's rhetoric had changed. Donning a gold chain and longer hair, he told an audience of technologists, 'I don't apologize anymore.'

How to Keep Your Kids Safe Online

Bloomberg

04-04-2025

  • Bloomberg

The 'stranger danger' fears of the 20th century can seem quaint compared with the horror stories kids may come across in the digital world. Before the internet, parents feared sexual predators or drug dealers gaining physical access to their children. Now they're just a swipe away.

Kids are growing up online, immersed in social media, obsessed with it and, in some cases, addicted to it. More than 95% of teens in the US use social media, with one-third saying they are logged on almost constantly. The fabric of their social lives has shifted from classrooms to smartphone apps, video games and chat forums — internet spaces where it can be impossible to know who you're really talking to.

And, as Bloomberg's new documentary Can't Look Away demonstrates, these online environments can be dangerous, and even deadly. The film, which is streaming on Jolt and in select theaters starting April 4, follows a group of attorneys fighting to hold social media companies accountable for causing devastating harm to kids: cases where teens were ruthlessly blackmailed by international gangs of cyber-sextortionists, or sold deadly counterfeit pills by drug dealers who deliver through their bedroom windows.

"Can't Look Away"

Bloomberg

21-03-2025

  • Bloomberg

"Can't Look Away" follows a team of lawyers battling tech giants, fighting for families whose children suffered devastating harm linked to social media. The film underscores the urgent need for industry reform and serves as both a wake-up call about the dangers of social media and a call to action to protect future generations. Directed by Matthew O'Neill and Perri Peltz, it is based on Bloomberg News' investigative reporting by Olivia Carville. (Source: Bloomberg)
