Clicking ‘I agree’ online lets data in and keeps lawyers out

The Hill, 22 July 2025
Open your browser. Browse. Click. Most of us assume that if we avoid logging in or turning on special features, our activity stays private. But a federal court in California shattered that assumption last month.
The case, which was brought on behalf of Chrome users, alleged that Google continued to collect personal data even when users specifically chose not to sync their Chrome browsers with a Google account, a step many reasonably believed would keep their digital footprints out of the company's hands.
The court didn't question whether data collection had occurred. Instead, it focused on whether users had truly consented. District Judge Yvonne Gonzalez Rogers concluded that, because users encountered different privacy terms or understood them differently, they couldn't sue together as a group.
Legally, that outcome fits with the established rule that class actions require a shared legal or factual thread. But when it comes to digital privacy, that tidy legal logic creates a troubling imbalance. The rule requiring everyone's privacy perceptions to line up acts as a clever maneuver that turns the messiness of how people encounter privacy policies into a shield against accountability.
The entire online privacy regime hinges on the legal fiction that when we click ‘I agree,’ we've meaningfully understood and accepted what comes next. But users encounter these policies distractedly, rarely read them and often can't make sense of them even if they try.
That disconnect is no accident. Privacy consent was never meant to truly inform users. It was designed to operationalize data collection and optimize for convenience, speed and scale.
The irony emerges when users try to push back. At that point, the same system that treats a mindless click as meaningful legal consent suddenly demands forensic-level detail about what each person saw, understood and agreed to.
In the Google case, the court that readily accepted the fiction of digital consent became deeply concerned with the reality of digital experiences. The very users who had been perfectly uniform when clicking ‘I agree’ were now too different to challenge that agreement together.
This is privacy law's great bait-and-switch: We're all in it together when accepting surveillance, but on our own when seeking accountability.
This leaves users in an impossible bind. When class action lawsuits fail because consent transforms back into an individualized, contextual act, users can only go it alone. But that's a dead end. Individual privacy lawsuits almost never happen. The injuries they try to address are diffuse and abstract: hyper-targeted ads that feel invasive, algorithmic decisions that quietly discriminate, the unsettling sense that our lives are being watched too closely.
These are harms that matter, but they're hard to convert into legal claims and harder still to translate into dollars.
Class actions exist to bridge this gap. They take the scattered, often invisible harms of modern digital surveillance and turn them into something legible to courts. Class actions make it economically viable for lawyers to represent people without power, and they are just threatening enough to make companies think twice before crossing the line.
This enforcement crisis reflects a deeper choice we face about how power operates in the digital age. We can continue pretending that privacy is protected by an elaborate theater of click-through agreements that nobody reads, privacy policies that nobody understands and legal fictions that fail to serve the people they claim to protect. Or we can build a privacy framework that takes context seriously, one that recognizes the structural imbalances between users and platforms, the impossibility of meaningful consent in an attention economy and the need for collective mechanisms to challenge abuses.
The Google case will likely be remembered not for what it decided, but for the asymmetry it revealed in the way our legal system treats consent. Fixing that asymmetry doesn't mean extending the consent fiction further. It means moving past it entirely. Privacy protections should not hinge on whether someone clicked a box, but should reflect the realities of power, context and social expectations.
If we won't commit to a framework that takes those realities seriously, then at the very least we should stop using context selectively to shield companies from accountability while leaving users exposed to harmful data practices.

Related Articles

Stunning Mac mini dock revives the Apple Macintosh with a tiny screen

Digital Trends

Over a year ago, product designer Scott Yu-Jan created a Mac Studio iPad dock inspired by the classic Macintosh design, integrating an iPad mini and a Mac Studio in a sleek 3D-printed package. A few months later, he gave the Mac mini a portable-computer treatment by pairing it with a full-sized keyboard and an unusually wide display. Yu-Jan, currently an interaction designer at Google, doesn't sell his designs commercially. But if you've ever dreamed of giving your tiny Apple desktop a retro Macintosh look, there's finally a product for you.

The device in question is the Wokyis M5, a Mac mini dock that features a 5-inch display and a generous set of connections, including an M.2 SSD slot (up to 8TB capacity). The Kickstarter project, which has been oversubscribed at more than 16 times its initial crowdfunding goal, will begin shipping next month. The hub offers a total of 13 ports, including USB-C (10Gbps USB 3.2), four USB-A ports (1Gbps), HDMI, an SD/microSD card reader and a 3.5mm headphone jack.

The Wokyis website lists the hub's official price at $199, but on Kickstarter the 10Gbps version is offered for $169, while the 80Gbps variant costs $339. Early-bird backers can get the base model for as little as $109 and the higher-end trim for $199.

The biggest draw is the retro Apple Macintosh design and the 5-inch display. This is no toy screen: it's a fully functional panel that supports extending or mirroring the Mac mini's screen, just like any other external display connected to your Mac. I believe it would be perfect for controlling media playback or keeping an eye on chats. There's even a power button beneath the screen, adorned with the classic Apple logo color wave. Do keep in mind that it only supports the current-generation Mac mini, with its shrunken chassis and M4-series processor. However, it also works when connected to a MacBook. You can check out more details about the Wokyis M5 on its Kickstarter page and the brand's official website.

OpenAI ends ChatGPT users' option to index chats on search engines

Miami Herald

Aug. 2 (UPI) -- OpenAI is ending the option to have Google and other search engines index user chats with ChatGPT and make the content of those chats discoverable in searches. Google accounts for more than 89% of all online searches, which made private chats on ChatGPT potentially widely accessible when indexed by that search engine and others.

"This feature introduced too many opportunities for folks to accidentally share things they didn't intend to, so we're removing the option," Dan Stuckey, OpenAI's chief information security officer, told PC Mag. Bing, DuckDuckGo and other search engines will continue to index discoverable chats, but only for a while longer. "We're also working to remove indexed content from the relevant search engines," Stuckey said. OpenAI recently enabled the indexing option for private ChatGPT discussions as an experiment, Stuckey added, but that experiment is ending.

A message informed users that their indexed chats were searchable on Google and other search engines, but many users did not read the message or did not understand the extent to which their conversations might be available to others. Once indexed, such conversations can be surfaced by adding "site:chatgpt.com/share" to a search query. News of the indexed private conversations with ChatGPT was first reported by FastCompany on Wednesday in a story detailing Google's indexing of ChatGPT conversations. The indexing does not identify the respective users, but the conversations may include personal information the users mentioned while conversing with ChatGPT. Many users also were unaware that sharing a conversation via social apps such as WhatsApp, or saving its URL for future use, could lead Google to make it potentially available to millions of people.

OpenAI officials recently announced they were appealing a court order requiring the preservation of all chats that users delete after conversing with ChatGPT, Ars Technica reported. Copyright 2025 UPI News Corporation. All Rights Reserved.
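The `site:` search filter mentioned above simply restricts a query to pages under a given host or path. As a minimal sketch of how such a query string is assembled into a search URL (the `chatgpt.com/share` path is an assumption based on the article's operator, and `search_url` is a hypothetical helper, not an OpenAI or Google API):

```python
from urllib.parse import urlencode

def search_url(terms: str, site: str = "chatgpt.com/share") -> str:
    """Build a Google search URL restricted to one site/path
    using the 'site:' operator, as described in the article.
    The site path here is an assumption for illustration."""
    query = f"site:{site} {terms}".strip()
    # urlencode percent-escapes the ':' and '/' in the operator
    return "https://www.google.com/search?" + urlencode({"q": query})

print(search_url("resume"))
```

Note that the operator only surfaces pages a search engine has already crawled and indexed; once OpenAI removes the indexing option and de-indexes existing pages, the same query returns nothing.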
