
Latest news with #Children'sOnlinePrivacyProtectionAct

Patrick Kennedy Slams ‘Addiction-for-Profit' Social Media and Gambling Companies: ‘We Are Losing the Fight'

Yahoo

25-05-2025

  • Politics
  • Yahoo


The United States is failing its children by failing to protect them from addictive products, former Rep. Patrick Kennedy told 'Meet the Press' host Kristen Welker Sunday. 'We've got to stop all of these intrusive addiction-for-profit companies from taking our kids hostage. That's what they're doing,' Kennedy charged. The solution, he added, is to fight.

Welker and Kennedy focused on the Kids Online Safety Act, which proponents argue would require social media platforms to take the safety of children under 16 more seriously. Both First Amendment advocacy groups and LGBTQ+ communities have raised concerns that the proposed bill could lead to censorship.

'Our country is falling down on its own responsibility as stewards to our children's future. We are commercializing marijuana across the country,' Kennedy told Welker. 'How in the world, with kids' anxiety rates and depression rates, does it make sense to add to the addiction crisis by having more access — you know, access to addiction products?'

Sports betting is another problem, he added. 'Our states are becoming addicted to the revenue of sports betting. And I can guarantee you, just like you're playing that story about that young woman who's getting targeted, we already know the algorithms for these betting companies are targeting people who are high risk. And we are gonna see a high correlation between people with gambling addiction and suicide.'

'And so what I'm saying, Kristen, is we can't just pass these bills,' Kennedy added. 'We've got to stop all of these intrusive addiction-for-profit companies from taking our kids hostage. That's what they're doing. This is a fight. And we are losing the fight because we're not out there fighting for our kids to protect them from these businesses that their whole profit motive is, "How am I going to capture that consumer and lock them in as a consumer?"'

As Welker pointed out, Congress last addressed the issue of online safety and children in 1998, when the Children's Online Privacy Protection Act (COPPA) was passed. Noting how much time has passed since then, Welker asked Kennedy why the issue hasn't been addressed more frequently.

'Well, the power of the social media giants and their money. There's going to be a bigger settlement by Meta and all the big social media companies than even was tobacco or Purdue combined,' he answered. 'You know, fool me once, shame on you. Fool me twice, shame on me. We, as a country, have seen these companies and industries take advantage of the addiction-for-profit. Purdue, tobacco. Social media's the next big one. And unfortunately, it's going to have to be litigated. We have to go after the devastating impact that these companies are having on our kids.'

The American Academy of Pediatrics and the American Psychological Association have called on Congress to pass the Kids Online Safety Act. The proposed bill also had the support of former President Joe Biden, who wrote in July 2024, 'There is undeniable evidence that social media and other online platforms contribute to our youth mental health crisis. Today our children are subjected to a wild west online and our current laws and regulations are insufficient to prevent this. It is past time to act.'

Speaker of the House Mike Johnson slowed the progress of the bill in December. 'Look, I'm a lifelong advocate of protection of children…and online safety is critically important…but we also have to make sure that we don't open the door for violations of free speech,' he advised Republicans at the time.
The post Patrick Kennedy Slams 'Addiction-for-Profit' Social Media and Gambling Companies: 'We Are Losing the Fight' | Video appeared first on TheWrap.

Pixalate Discovers 286 US-Based Mobile Apps in the Google & Apple App Stores Violating the Children's Online Privacy Protection Act (COPPA), Endangering the Privacy of Up to 18 Million Children

Yahoo

19-05-2025

  • Business
  • Yahoo


Pixalate releases complete list of 286 U.S.-registered and advertising-enabled apps that likely violate COPPA; research reveals 72% of the identified apps share U.S. consumers' location data with advertisers.

London, May 19, 2025 (GLOBE NEWSWIRE) -- Pixalate, the global market-leading ad fraud protection, privacy, and compliance analytics platform, today released the Q1 2025 State of Children's Privacy on Mobile Apps Report, part of its COPPA Violation Risks in Mobile Apps series. The report highlights app developers on the Google Play Store and Apple App Store that are at risk of violating the Children's Online Privacy Protection Act (COPPA) by gathering children's personal information through likely child-directed apps without a compliant privacy policy,* as assessed by Pixalate. According to Pixalate's analysis, 100% of these likely child-directed and advertising-enabled** apps shared US consumers' personal information with advertisers via the advertising bid stream.

Children's Online Privacy Protection Act (COPPA) Background

COPPA requires operators of apps (in this case, app developers) that collect, use, or disclose personal information from children to post an online privacy policy. To be COPPA compliant, the privacy policy must disclose (16 C.F.R. § 312.4(d)(1)-(3)):

  • The name, address, telephone number, and email address of all operators collecting or maintaining personal information through the app;
  • A description of what information the operator collects from children, including whether the operator enables children to make their personal information publicly available, how the operator uses such information, and the operator's disclosure practices for such information; and
  • The procedures by which a parent can review or have deleted the child's personal information and refuse to permit its further collection or use.
For Pixalate's compliance technology to classify a likely child-directed app as potentially non-compliant with the COPPA Rule, one or more of the following deficiencies must be identified:

  • No privacy policy URL was detected in the app stores.
  • A URL claiming to link to a privacy policy was detected in the app stores, but neither the page it led to nor any linked pages (in headers/footers) were identified as privacy policies by Pixalate's compliance technology.
  • A privacy policy URL was detected, but it likely did not meet the disclosure obligations of the COPPA Rule, according to Pixalate's analysis.

Key State of Children's Privacy on Mobile Apps Report Findings

  • Compliance issues: 17% (286) of the identified advertising-enabled and likely child-directed mobile apps were likely non-compliant with COPPA; 17 mobile apps did not have a detectable privacy policy.
  • Missing disclosures: 79% (225) of the identified mobile apps with ads lacked a Children's Privacy disclosure – a requirement under the COPPA Rule.
  • Sensitive data concerns: 53% (153) of the mobile apps with ads that were likely non-compliant under COPPA requested sensitive data permissions, such as location, camera access, and audio.
  • Data sharing risks: 72% (207) of the identified likely non-compliant ad-enabled mobile apps in the Google Play and Apple App Stores shared US consumers' location data with advertisers in the ad bid stream.

Top 5 US-Registered Likely Non-Compliant Mobile Apps under COPPA - Apple App Store

  Rank | App                            | Developer                  | Developer Country | Estimated US Consumers
  1    | 7 Little Words                 | Blue Ox Family Games, Inc. | UNITED STATES     | 223K
  2    | Magic Jigsaw Puzzles-Games HD  | XIMAD, Inc.                | UNITED STATES     | 217K
  3    | Word Maker - Puzzle Game       | NewPubCo, Inc              | UNITED STATES     | 122K
  4    | Just Blocks: Wood Block Puzzle | NewPubCo, Inc              | UNITED STATES     | 44K
  5    | Art of Puzzles - Jigsaw Games  | XIMAD, Inc.                | UNITED STATES     | 22K

Top 5 US-Registered Likely Non-Compliant Mobile Apps under COPPA - Google Play Store

  Rank | App                            | Developer                  | Developer Country | Estimated US Consumers
  1    | Magic Jigsaw Puzzles-Games HD  | ZiMAD                      | UNITED STATES     | 237K
  2    | 7 Little Words                 | Blue Ox Family Games, Inc. | UNITED STATES     | 139K
  3    | Art of Puzzles-Jigsaw Pictures | ZiMAD                      | UNITED STATES     | 85K
  4    | KidsFlix for TV                | Gary L Peskin              | UNITED STATES     | 47K
  5    | Magic Diamond Painting-Gem Art | ZiMAD                      | UNITED STATES     | 29K

For this report, Pixalate's legal and data science teams analyzed the privacy policies of over 24K likely child-directed apps enabled for programmatic advertising (i.e., with ad impressions targeted towards US consumers). These apps were available for download from the Google Play Store and Apple App Store in Q1 2025. For the complete list and the methodology, download the report here.

*References to 'without a privacy policy/policies' or 'no detected/detectable privacy policy/policies' imply that Pixalate's proprietary systems were unable to detect or identify a purported privacy policy/notice URL at the time of crawling the App Stores pursuant to Pixalate's proprietary privacy policy detection and classification system. Based on Pixalate's methodology and analysis, all of the identified apps generally exhibit characteristics aligned with the child-directed factors under the COPPA Rule, suggesting that these apps likely appeal to, or may predominantly be used by, children.

**References to advertising-enabled mobile apps, or apps with ads, are those that have open programmatic advertising traffic, with ad impressions targeted towards United States-based consumers at the time of crawling. To learn more, please review the report's methodology.

About Pixalate

Pixalate is a global platform specializing in privacy compliance, ad fraud prevention, and digital ad supply chain data intelligence.
Founded in 2012, Pixalate is trusted by regulators, data researchers, advertisers, publishers, ad tech platforms, and financial analysts across the Connected TV (CTV), mobile app, and website ecosystems. Pixalate is accredited by the MRC for the detection and filtration of Sophisticated Invalid Traffic (SIVT).

Disclaimer

The content of this press release, and the Q1 2025 State of Children's Privacy on Mobile Apps Report (the 'report'), including all content set forth herein, reflects Pixalate's opinions with respect to the factors that Pixalate believes can be useful to the digital media industry, inclusive of advertisers, advertising technology companies, developers of mobile applications, professional advisors, non-governmental entities, and regulators. Pixalate is sharing this report's data, and opinions relating thereto, not to impugn the standing or reputation of any entity, person, or app, but, instead, to report opinions and suggest trends pertaining to certain apps available for download via the Apple App Store & Google Play Store during the time period studied. Any data shared is grounded in Pixalate's proprietary technology and analytics, which Pixalate is continuously evaluating and updating. Any references to outside sources should not be construed as endorsements. Pixalate's opinions are just that, opinions, which means that they are neither facts nor guarantees. It is important to note that the mere fact that an app appears to be directed to children or is deemed likely child-directed (e.g., data subjects under 13 years of age, as defined by COPPA) does not mean that any such app, or its operator, is failing to comply with COPPA.

Further, with respect to apps that appear to be directed to children and have characteristics that, in Pixalate's opinion, may trigger related privacy obligations and/or risk, such assertions reflect Pixalate's opinions (i.e., they are neither facts nor guarantees); and, although Pixalate's methodologies used to render such opinions are derived from automated processing, which at times is coupled with human intervention, no assurances can be – or are – given by Pixalate with respect to the accuracy of any such opinions.

CONTACT: Nina Talcott ntalcott@
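Pixalate's actual compliance technology is proprietary and is not described in code in this release; purely as an illustrative sketch, the three-deficiency classification rule it describes — flag a likely child-directed app when any one deficiency holds — could be expressed like this (all type and function names here are hypothetical, not Pixalate's API):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AppListing:
    """Hypothetical app-store listing metadata used for illustration only."""
    privacy_policy_url: Optional[str]      # None if no URL was detected in the store
    url_resolves_to_policy: bool           # the page (or linked pages) was recognized as a policy
    policy_meets_coppa_disclosures: bool   # 16 C.F.R. § 312.4(d)(1)-(3) items are present

def likely_non_compliant(app: AppListing) -> bool:
    """Return True if any of the three deficiencies described in the report holds."""
    if app.privacy_policy_url is None:          # deficiency 1: no privacy policy URL detected
        return True
    if not app.url_resolves_to_policy:          # deficiency 2: URL does not lead to a policy
        return True
    if not app.policy_meets_coppa_disclosures:  # deficiency 3: required disclosures missing
        return True
    return False

# Example: a URL is present but never resolves to an actual privacy policy
print(likely_non_compliant(AppListing("https://example.com/privacy", False, False)))  # True
```

Note this is a one-directional screen, matching the report's own disclaimer: passing all three checks does not establish COPPA compliance, and failing one only marks the app as "likely" non-compliant.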

The Kids Online Safety Act is back, with the potential to change the internet

TechCrunch

14-05-2025

  • Politics
  • TechCrunch


The Kids Online Safety Act (KOSA) has been reintroduced in Congress. If passed into law, the bill could impose some of the most significant legislative changes the internet has seen in the U.S. since the Children's Online Privacy Protection Act (COPPA) of 1998.

As it currently stands, KOSA would hold social media platforms legally accountable if it's proven that these companies aren't doing enough to protect minors from harm. The bill includes a long list of possible harms, such as eating disorders, sexual exploitation, substance abuse, and suicide. Though it overwhelmingly passed the Senate last year, the bill was stifled in the House.

KOSA has faced much backlash since its introduction in 2022. Human rights groups like the ACLU raised concerns that the bill could be weaponized as a tool for censorship and surveillance. While amendments to KOSA have mitigated some of these concerns, groups like the Electronic Frontier Foundation and Fight for the Future remain opposed to the bill.

'The bill's authors have claimed over and over that this bill doesn't impact speech. But the Duty of Care is about speech: it's about blocking speech that the government believes is bad for kids,' Fight for the Future wrote in a statement. 'And the people who will be determining what speech is harmful? They are the same ones using every tool to silence marginalized communities and attack those they perceive as enemies.'

However, KOSA has garnered support from companies like Microsoft, Snap, and X; X CEO Linda Yaccarino even worked with Senators Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT) on the most recent draft of the bill. Google and Meta have remained opposed to the bill, but Apple announced today that it will support the legislation. 'Apple is pleased to offer our support for the Kids Online Safety Act (KOSA).
Everyone has a part to play in keeping kids safe online, and we believe [this] legislation will have a meaningful impact on children's online safety,' Timothy Powderly, Apple's Senior Director of Government Affairs, said in a statement.

Michigan AG Dana Nessel files lawsuit against Roku for allegedly violating children's data privacy laws

CBS News

01-05-2025

  • Business
  • CBS News


Michigan Attorney General Dana Nessel has filed a lawsuit against Roku, Inc., claiming the company violates the Children's Online Privacy Protection Act (COPPA) and the Michigan Consumer Protection Act.

The lawsuit, filed on April 29 in the U.S. District Court for the Eastern District of Michigan, accuses the streaming platform of collecting, and allowing third parties to collect, the personal information of children without obtaining parental consent or providing the required notice. Nessel alleges that Roku does not offer parents the ability to create children's profiles.

In her lawsuit, Nessel claims that "Roku systematically collects, processes, and discloses the personal information of children, including their locations, voice recordings, IP addresses, and persistent identifiers that track children's browsing histories on Roku and across the internet," which are categories protected under COPPA. The lawsuit alleges that Roku allows third-party channels to collect the personal information of children to increase advertising revenue and attract content providers to its streaming platform.

"Roku has blatantly violated children's privacy laws, illegally exposing kids across Michigan to invasive data collection practices," Nessel said. "We cannot allow companies to jeopardize the security of our children's personal information. My office remains committed to holding accountable companies that violate the rights of Michigan families and seek to profit at the expense of children's safety and privacy."

Roku issued the following statement in response to Nessel's lawsuit: "Roku strongly disagrees with the allegations in today's filing, which do not reflect how our services work or our efforts to protect viewer privacy. We plan to challenge these inaccurate claims and look forward to demonstrating our commitment to trust and compliance. Roku respects and values the privacy of our users. We do not use or disclose children's personal information for targeted advertising or any other purpose prohibited by law, nor do we partner with third-party web trackers or data brokers to sell children's personal information. We take the responsibility of creating a safe and trusted online environment seriously. Our viewers rely on Roku for engaging content, and we take pride in connecting our viewers to the streaming content they love every day."

Nessel is asking the court to stop Roku's alleged illegal data collection and disclosure practices, require the company to comply with state and federal law, and recover civil penalties, damages, and restitution for the company's years of alleged misconduct.

Opinion - To ensure child safety online, move age verification protections to the app store

Yahoo

01-05-2025

  • Yahoo


As parents, we do our best to teach our kids how to stay safe online. We put limits on screen time. We use the settings available on their devices to help protect their privacy and reduce their exposure to potential dangers. But those steps still leave room for risk, partly because reliable age verification systems are not currently the norm.

It's a problem I have been grappling with for years, both as a parent and as the founder and CEO of Snapchat, a tech platform that proudly serves tens of millions of young Americans every day. Snapchat is a visual communications platform for people 13 and older, and we work hard to detect and remove accounts that violate our age policy. In our efforts to prevent underage use, we have grappled with the same challenges to age verification that virtually every platform must confront. Privacy concerns are legitimate; verification systems require the collection of large amounts of personal information, create cybersecurity risks and invite the potential for misuse of sensitive data. Technical problems exist, too, from fake IDs to flawed algorithms.

Despite these issues, the demand for better online age verification is growing. After all, in the physical world, society has established age-based restrictions for certain activities, including driving, voting and watching certain films. These guardrails exist for good reason, and reflect our understanding of developmental stages and the capacity for responsible decision-making. There's no reason why the digital world should operate by entirely different rules. In fact, some argue that the digital environment warrants even more careful age-appropriate boundaries. When technology makes the entire world accessible from a teenager's pocket, the implications for safety, cognitive development and emotional well-being are significant. Young people deserve support as they navigate online spaces. Parents naturally want to protect their children online, just as they do offline.

Federal laws like the Children's Online Privacy Protection Act already require platforms to limit data collection for users under 13. Platforms have implemented age-gating for certain features and settings. But these rules only work if we can reliably tell how old users are, and the current system of self-reporting is far from perfect.

No system will be flawless. The key is maximizing benefits while reducing downsides. That's why I believe initial age verification should happen at the operating system or app store level. In addition to the safeguards already put in place by many app developers, it's the best way to address the concerns many have about age verification while also meeting the broad and growing demand to find best-practice-level approaches to enable it.

We're starting to see progress in this direction. Recently, Apple announced new features that will allow parents to set up their child's account and share the child's age range with app developers. This is a welcome step toward the kind of OS-level verification we need. However, this approach still leaves gaps. For this solution to truly work, we need comprehensive adoption across all major device-makers and app stores. Legislation that supports this concept has already passed in the State of Utah and been introduced in 16 other states. Bills by Sen. Mike Lee (R-Utah) and Rep. John James (R-Mich.) are also expected to be introduced in the U.S. Senate and House today.

Operating systems and app stores already play a crucial role in the digital ecosystem. They set the standards requiring certain security protocols and removing inappropriate and potentially harmful apps to help protect people. They sit at the gateway of the digital world. This position gives them capabilities that individual app developers simply don't have. The benefits of this approach are compelling, particularly for families:

  • It's simpler. Parents already share their teen's age when purchasing and setting up a device. Rather than forcing families to navigate repetitive verification processes across dozens of apps, the OS can serve as a secure 'one-stop shop' where verification happens once. This makes it much more likely that families will actually use these protections.
  • It's consistent. Teens use dozens of apps every week, including apps offered by the app stores. A device-level approach gives parents peace of mind knowing that age verification protections will be applied consistently across any app their teen downloads.
  • It's more private and secure. Centralizing age verification limits how often personal information must be shared, significantly reducing privacy risks, identity theft opportunities and data-breach exposure.
  • It's trustworthy. OS and app store developers already have sophisticated systems, such as digital wallets, for managing user data. They can share age information with app developers without revealing personal details. Parents can be confident that sensitive information is handled responsibly by companies they already trust with established privacy-protective frameworks.

This approach isn't about being overly restrictive or surveilling teens. It's about making sure their online experiences are age-appropriate while protecting their privacy and freedom to explore. Digital platforms offer young people incredible opportunities for creativity, learning and connection. On Snapchat, we've seen firsthand how technology can empower and uplift young voices, with the right protections and safeguards.

Age verification at the app store or OS level represents a balanced approach that preserves the benefits of the internet while helping to mitigate its risks. We need all major platforms to recognize their important role in the digital world to create a more robust and sensible age verification solution, with legislative action to ensure these solutions become universal. Only then can we better protect young users and support their parents, while ensuring digital spaces remain open, vibrant and accessible.

Evan Spiegel is the chief executive officer of Snapchat.

Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
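The op-ed does not specify how an operating system would expose an age signal to apps. Purely as an illustrative sketch of the idea it describes — verify once at the device level, then share only a coarse age range rather than a birth date — a hypothetical OS-level layer might look like this (the band boundaries, type names, and function are all assumptions for illustration, not any platform's actual API):

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical coarse bands; real systems would define their own boundaries.
AGE_BANDS = [(0, 12, "under_13"), (13, 15, "13_15"), (16, 17, "16_17"), (18, 150, "18_plus")]

@dataclass
class AgeSignal:
    """The only datum an app would receive: a coarse band, never the birth date."""
    band: str

def age_band_for(birth_date: date, today: date) -> AgeSignal:
    """Derive the band once at the OS level from data the parent already provided."""
    # Compute age in whole years, accounting for whether the birthday has passed.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    for lo, hi, band in AGE_BANDS:
        if lo <= age <= hi:
            return AgeSignal(band)
    return AgeSignal("unknown")

# An app queries only the band, e.g. to gate features for under-16 users:
signal = age_band_for(date(2011, 6, 1), date(2025, 5, 1))
print(signal.band)  # 13_15
```

The privacy benefit argued for above comes from this indirection: verification data stays with the OS, and every app sees the same minimal, consistent signal.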
