Rahul Matthan: Don't let data privacy safeguards work against us

Mint, 20-05-2025

The first country to seriously address the issue of protecting digital personal data was the United States of America. In a report titled Records, Computers and the Rights of Citizens issued in 1973, it set out a list of data protection principles called the Fair Information Practice Principles (FIPPs).
FIPPs required organizations to provide notice before collecting personal data and seek consent before processing it. Only as much personal data as was necessary to achieve the specified purpose could be collected, and it could only be used for the purpose specified. Organizations had to keep personal data accurate, complete and up to date, and give individuals the ability to access and amend it as required.
If all this sounds familiar, it is because these principles have been incorporated into all modern data protection laws—from Europe's General Data Protection Regulation to India's Digital Personal Data Protection Act. This is where concepts like notice and consent, purpose specification, use limitation, data minimization and retention restriction come from, and it is remarkable that, 50 years after they were first conceptualized, they continue to be used to protect personal privacy.
Or do they?
Also Read: Use verifiable credentials to grant us agency over our digital data
In the 1970s, our ability to process data was limited, constrained by computational power and storage capacity. As a result, very few organizations could afford to process personal information at a scale that would affect our privacy. Since companies had to be selective about what data they collected and used, it made sense to require them to constrain the uses to which they put the data and for how long they retained it.
Today, these constraints are no longer relevant. All organizations, regardless of their size or sphere of activity, use data in all aspects of their operations. Global data creation grew from about two zettabytes in 2010 to over 160 zettabytes projected in 2024. As a result, concepts like notice and consent are becoming increasingly meaningless, as it is no longer feasible to provide notice of all the different types of data processed or the many uses to which it will be put.
Advances in artificial intelligence (AI) have further complicated the issue.
If we want to benefit from all that AI has to offer, we need to give these systems access to our personal data so that they can draw inferences from it. With the ability to analyse the cumulative record of all the data that our personal fitness trackers have recorded about us, for example, AI systems may be able to use that information to infer our likelihood of contracting a disease. Those who are currently unable to access credit because they lack the traditional indicators of creditworthiness may be able to provide other indicators of their ability to repay a loan if AI systems are allowed to analyse their personal information.
Also Read: Biases aren't useless: Let's cut AI some slack on these
If we use AI systems for these purposes today, we are likely to run afoul of one or more of the data protection principles. Take, for instance, purpose specification. Since most AI use cases may not even have been conceivable when the data in question would have been collected, it is unlikely that our consent would have been obtained for it to be used in that manner. Deploying AI for these use cases would most likely require seeking fresh consent from data principals.
The other concern is retention. Since data may only be retained for as long as necessary to serve the purpose for which it was collected, organizations that comply with global data protection regulations have set up systems to delete personal data once its purpose has been served. For healthcare data, this is unfortunate: medical AI applications rely on access to health records over as long a period as possible in order to establish baseline trends against which current parameters can be evaluated. If hospitals must delete this data as soon as the immediate purpose is served, these opportunities will not be realized.
Finally, there is the principle of data minimization, which requires us to only collect as much data as is strictly required to fulfil the specified purpose. Since AI systems perform better if they have more data on which they can be trained, the minimization obligation makes less data available for training and, as a result, limits the benefits that AI can bring.
Also Read: India must forge its own AI path amid a foundational tug of war
The US FIPPs sought to minimize the risk of privacy harm by limiting the amount of personal data in the hands of the organizations that processed it. At the time, this was a reasonable approach, as there was no additional benefit to be gained by allowing corporations to store our data.
This is no longer the case. The more data that AI systems have, the better the outcomes they produce. As a result, any approach that simply limits the data these systems can use trades the benefits that could accrue from data analysis for the mitigation of privacy-related risks.
I have, in previous columns, written about how new technological approaches—data anonymization and privacy-enhancing technologies—as well as institutional measures like data trusts can offer us a way forward. If we can deploy federated learning and confidential computing systems, we should be able to use personal data without violating personal privacy.
Our current approach to data protection is now more than half a century old. It is no longer fit for purpose. We need to learn to use personal data for our benefit without causing privacy harms.
The author is a partner at Trilegal and the author of 'The Third Way: India's Revolutionary Approach to Data Governance'. His X handle is @matthan.


Related Articles

An effective, credible Data Protection Board

Hindustan Times, 5 days ago

Earlier this year, the ministry of electronics and information technology (MeitY) sought public feedback on the draft rules for the implementation of the Digital Personal Data Protection Act. At present, stakeholder feedback is being considered, and the final rules are expected imminently. It is therefore important to assess the effectiveness of the institutional framework of the Data Protection Board (DPB), established under the Act and operationalised through the rules.

DPB, an enforcement and adjudicatory body, will receive complaints from individuals and references from the government, conduct inquiries, and impose penalties on organisations failing to comply with the Act's provisions. The draft rules prescribe the operational procedures for implementing the Act, but they are limited by the design envisaged under the Act: the lean institutional design and narrow scope of powers will constrain what the rules can achieve. Even so, three key aspects that can enhance DPB's performance can be brought about through the rules.

First, the effective functioning of DPB requires designing an institution with adequate independence and functional autonomy, sufficient expertise and capacity, and necessary accountability measures. Institutional independence rests on several factors, but a great deal can be achieved through how members are selected and appointed. The proposed rules empower the Union government to establish a selection committee to recommend DPB members, including its chairperson. The committee selecting the chairperson will be led by the Cabinet Secretary and include the secretaries of legal affairs and MeitY as well as two government-appointed experts. Similarly, the selection committee for DPB members will be headed by the MeitY secretary and include the secretary of legal affairs and two government-nominated experts. The composition of such selection committees is often skewed towards serving Union government officials.
While the involvement of the executive is inevitable, a lack of diversity in the selection committee can result in partisan and biased appointments. A more diverse selection committee could include members from the legislature, judiciary, civil society, and other stakeholder groups. This would align better with principles of impartiality and ensure the much-needed diversity of stakeholder perspectives. There are ample precedents, such as the selection committees for members and chairs of the Competition Commission of India and the Central Information Commission, among others. Institutional independence is vital for public trust and effective governance, and the selection and appointment processes are foundational to this.

Second, the need for transparency in appointments, as well as in the proceedings and functioning of DPB, cannot be overstated. The selection committees will evaluate candidates based on qualifications in data governance, dispute resolution, information technology, and the digital economy, as required by the Act. They will assess the integrity and practical experience of candidates too. To ensure transparency, the committee's recommendations and a brief justification of each candidate's eligibility should be made public.

Predictability and certainty in dispute resolution will guide stakeholder decisions on approaching DPB. They will also allow observers to study trends, gauge effectiveness, and critically analyse DPB's performance. One way to ensure this is to impart maximum transparency to the resolution process, by publishing DPB's orders and findings along with their reasoning and rationale. Periodic guidance on complex issues relating to data privacy, and on how DPB may respond to concerns around emerging technologies, would be invaluable but is potentially outside the scope of the board's current design. The rules could also provide for the disclosure of minutes of meetings to promote transparency in decision-making.
Third, DPB should not suffer delays in appointments or in the resolution of complaints. Delayed appointments to statutory bodies weaken enforcement of the law and impede policy decisions. They also lead to a backlog of appeals and complaints from the public, resulting in a trust deficit and an erosion of confidence in the institutional grievance redressal mechanism. To prevent this, the rules could stipulate that appointments for any anticipated vacancy be made before the vacancy arises, so as to maintain the smooth operation of the board.

There is considerable potential for improving the institutional design of DPB through the draft rules to create a more independent, transparent, and reliable body. Institutions responsible for protecting core constitutional values, such as the fundamental right to privacy, must be independent to ensure effectiveness and credibility, and to strengthen the trust of civil society, the market, and industry.

Gangesh Varma and Yaqoob Alam work with the Technology and Policy Practice at Saraf and Partners, a law firm. The views expressed are personal.

The not-so-sweet truth about clicking 'accept all' when visiting a new website

Time of India, 5 days ago

Highlights:
- Cookies are small files saved to your device that enhance your online experience, such as remembering login details and personalising content based on browsing history.
- The General Data Protection Regulation, enacted by the European Union in 2018, mandates that users must consent to cookies that identify them and provides guidelines for the handling of personal data online.
- Global Privacy Control is a tech specification that allows users to signal their privacy preferences to websites, potentially reducing the need for repetitive cookie consent pop-ups.

It's nearly impossible to use the internet without being asked about cookies. A typical pop-up will offer to either 'accept all' or 'reject all'. Sometimes, there may be a third option, or a link to further tweak your preferences. These pop-ups and banners are distracting, and your first reaction is likely to get them out of the way as soon as possible – perhaps by hitting that 'accept all' button.

But what are cookies, exactly? Why are we constantly asked about them, and what happens when we accept or reject them? As you will see, each choice comes with implications for your online privacy.

What are cookies?

Cookies are small files that web pages save to your device. They contain info meant to enhance the user experience, especially for frequently visited websites. This can include remembering your login information and preferred news categories or text size. Or they can help shopping sites suggest items based on your browsing history. Advertisers can track your browsing behaviour through cookies to show targeted ads.

There are many types, but one way to categorise cookies is based on how long they stick around. Session cookies are only created temporarily – to track items in your shopping cart, for example. Once a browser session is inactive for a period of time or closed, these cookies are automatically deleted.
Persistent cookies are stored for longer periods and can identify you – saving your login details so you can quickly access your email, for example. They have an expiry date ranging from days to years.

What do the various cookie options mean?

Pop-ups will usually inform you the website uses 'essential cookies' necessary for it to function. You can't opt out of these – and you wouldn't want to. Otherwise, things like online shopping carts simply wouldn't work. However, somewhere in the settings you will be given the choice to opt out of 'non-essential cookies'. There are three types of these:

- functional cookies, related to personalising your browsing experience (such as language or region selection)
- analytics cookies, which provide statistical information about how visitors use the website, and
- advertising cookies, which track information to build a profile of you and help show targeted advertisements.

Advertising cookies are usually from third parties, which can then use them to track your browsing activities. A third party means the cookie can be accessed and shared across platforms and domains that are not the website you visited. Google Ads, for example, can track your online behaviour not only across multiple websites, but also multiple devices. This is because you may use Google services such as Google Search or YouTube logged in with your Google account on these devices.

Should I accept or reject cookies?

Ultimately, the choice is up to you. When you choose 'accept all', you consent to the website using and storing all types of cookies and trackers. This provides a richer experience: all features of the website will be enabled, including ones awaiting your consent. For example, any ad slots on the website may be populated with personalised ads based on a profile the third-party cookies have been building of you. By contrast, choosing 'reject all' or ignoring the banner will decline all cookies except those essential for website functionality.
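As an aside for the technically curious: the session-versus-persistent distinction described above comes down to whether a cookie carries an expiry attribute. A minimal sketch of how a server might construct the two kinds of Set-Cookie values – `buildSetCookie` is a hypothetical helper for illustration, not a real library API:

```javascript
// Build a Set-Cookie header value. Without a Max-Age (or Expires)
// attribute, the browser treats the cookie as a session cookie and
// discards it when the session ends; with one, the cookie persists
// until the given lifetime elapses.
function buildSetCookie(name, value, { maxAgeSeconds } = {}) {
  let cookie = `${encodeURIComponent(name)}=${encodeURIComponent(value)}`;
  if (maxAgeSeconds !== undefined) {
    // Persistent cookie: survives browser restarts for maxAgeSeconds.
    cookie += `; Max-Age=${maxAgeSeconds}`;
  }
  // Common hardening attributes; HttpOnly keeps the cookie away from
  // page scripts, SameSite=Lax limits cross-site sending.
  return cookie + "; Path=/; Secure; HttpOnly; SameSite=Lax";
}

// Session cookie, e.g. a shopping cart:
const sessionCookie = buildSetCookie("cart", "abc123");
// Persistent cookie, e.g. a login kept for 30 days:
const persistentCookie = buildSetCookie("auth", "tok42", {
  maxAgeSeconds: 30 * 24 * 3600,
});
```

The same distinction applies when a page sets cookies itself via `document.cookie`: omit the expiry and you get a session cookie.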
You won't lose access to basic features, but personalised features and third-party content will be missing. The choice is recorded in a consent cookie, and you may be reminded in six to 12 months. Also, you can change your mind at any time, and update your preferences in 'cookie settings', usually located at the footer of the website. Some sites may refer to it as the cookie policy or embed these options in their privacy policy.

How cookies relate to your privacy

The reason cookie consent pop-ups are seemingly everywhere is thanks to a European Union privacy law that came into effect in 2018. Known as GDPR (General Data Protection Regulation), it provides strict regulations for how people's personal data is handled online. These guidelines say that when cookies are used to identify users, they qualify as personal data and are therefore subject to the regulations. In practice, this means:

- users must consent to cookies except the essential ones
- users must be provided clear info about what data the cookie tracks
- the consent must be stored and documented
- users should still be able to use the service even if they don't want to consent to certain cookies, and
- users should be able to withdraw their consent easily.

Since a lot of website traffic is international, many sites even outside the EU choose to follow GDPR guidelines to avoid running afoul of this privacy law.

Better privacy controls

Cookie pop-ups are tiresome, leading to 'consent fatigue' – you just accept everything without considering the implications. This defeats the purpose of informed consent. There is another way to address your online privacy more robustly – Global Privacy Control (GPC). It's a tech specification developed by a broad alliance of stakeholders (from web developers to civil rights organisations) that allows the browser to signal privacy preferences to websites, rather than requiring explicit choices on every site.
GPC is not universally available, and it's not a legal requirement – a number of browsers and plugins support it, but broader adoption may still take time. Meanwhile, if you're worried you may have accidentally consented to cookies you don't want, you can find an option in your browser settings to delete cookies and get back to a clean slate (be warned, this will log you out of every site). If you want to learn even more, the non-profit Electronic Frontier Foundation has a project called Cover Your Tracks.
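For readers curious how the GPC signal actually travels: in supporting browsers a page can read it as `navigator.globalPrivacyControl`, and the same preference reaches servers as a `Sec-GPC: 1` request header (both are defined by the GPC specification). A minimal server-side check – `userOptedOut` is a hypothetical helper name for this sketch:

```javascript
// Return true when the incoming request carries the Global Privacy
// Control opt-out signal (the `Sec-GPC: 1` request header).
function userOptedOut(requestHeaders) {
  // HTTP header names are case-insensitive, so normalise before lookup.
  const headers = Object.fromEntries(
    Object.entries(requestHeaders).map(([k, v]) => [String(k).toLowerCase(), v])
  );
  return headers["sec-gpc"] === "1";
}

userOptedOut({ "Sec-GPC": "1" }); // signal present: skip non-essential cookies
userOptedOut({});                 // no signal: fall back to the consent banner
```

A site honouring GPC would consult this check before setting any non-essential cookies, sparing the visitor the pop-up altogether.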

Seen, sent, but never read – WhatsApp's new privacy story

Mint, 26-05-2025

In its boldest marketing move yet, WhatsApp has launched 'Not Even WhatsApp' – a global campaign that puts privacy front and centre. The 60-second TV spot in India, directed by Achowe and shot across Delhi's Yamuna banks and Chandni Chowk, flips the camera to the app's POV – where your most mundane messages stay unseen, even by WhatsApp itself. With Aamir Khan lending his voice in India, the film plays like a love letter to everyday messaging: from voice notes to moms, to gossip sessions and late-night confessions. It's all end-to-end encrypted, the ad reminds us, and that's the selling point, wrapped in local sights, sounds, and sentiment.

Also Read | Annapurna's Mother's Day Miss: Where's the brand in the beauty?

The timing is strategic. As India revisits data protection laws and rivals like Signal continue their quiet rise, Meta is reasserting trust. A star like Khan adds credibility without triggering political baggage. The campaign also highlights the new 'Advanced Chat Privacy' toggle, though subtly, alongside WhatsApp's other privacy tools like Privacy Checkup. It's smooth, emotional, and miles ahead of the usual tech gobbledygook. If Meta follows it up with an intuitive product experience, this could help fix a trust gap it has long been trying to bridge.

Also Read | Are advertising agencies dying? Long may the art of persuasion live

WhatsApp's latest ad campaign doesn't just sell a feature, it sells a feeling. In its biggest global marketing push yet, the messaging giant has unveiled 'Not Even WhatsApp', a bold declaration that your chats are yours alone. No snooping. No leaks. Not even from the app itself. The 60-second film, directed by Achowe of Chalk & Cheese Films and voiced in India by Aamir Khan, takes viewers into the inner world of WhatsApp – seen from the other side of your phone screen. Familiar, everyday messages float by in stylised motion, but no one is reading.
It's a neat visual metaphor for end-to-end encryption, dramatised without ever feeling technical.

Also Read | WhatsApp vs Pegasus: A well deserved win for Zuckerberg

Shot across Delhi, including along the Yamuna and through the chaos of Chandni Chowk, the film grounds a global campaign in hyper-local familiarity. Khan's presence adds quiet gravitas. There's no hard-sell, no tech babble, just the idea that your most mundane exchanges, from mom's voice notes to midnight confessions, deserve absolute privacy.

The campaign couldn't have come at a more strategic time. WhatsApp may be India's most used messaging platform, but its trust reserves have taken hits – from misinformation forwards to regulatory tussles and rising competition from privacy-first players like Signal and Telegram. With the Indian government re-examining the Digital Personal Data Protection Act, Meta is clearly looking to pre-empt the trust question with storytelling, not statements.

And the stakes are higher than ever. As messaging apps increasingly double up as transaction hubs, health info archives, and workplace tools, privacy has gone from a niche concern to a mainstream demand. WhatsApp is responding with product updates like 'Advanced Chat Privacy' – a new setting to keep content from being taken outside the app – and tools like Privacy Checkup. The ad gently nods to these, but wisely avoids turning into a product demo.

Still, the campaign's real win is tone. It doesn't panic you into caring about privacy. It normalises it. That restraint stands out in an advertising landscape obsessed with drama and data dumps. Will it be enough to shift sentiment? That depends on how easily users find and trust the new privacy tools. But in terms of narrative clarity, 'Not Even WhatsApp' sticks the landing. It's intimate without being intrusive, cinematic without losing cultural context, and local without looking like a retrofit.
For a brand often caught between global ambition and local anxiety, this is WhatsApp speaking softly, but saying something loud.
