As it turns out, my Centre for the Debunking of Privacy Myths has been rendered unnecessary. By none other than the world’s most famous whistleblower: Edward Snowden. In an Ask Me Anything session on Reddit, days before the proposed renewal of the controversial NSA phone records program, Snowden had this to say:
“Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.”
With that one sentence, Snowden quietly and masterfully achieved what lawyers, activists, scholars and policy wonks have struggled with for years: a compelling, unassailable response to a dangerous logic that mistakes privacy for secrecy. An argument that puts the burden on those under surveillance to resist it, rather than on the system to justify why it is needed and to implement the checks and balances required to make it proportional, fair, just and humane. An argument that locates privacy at an individual (some would say selfish) level and ignores the collective, societal benefits that it engenders and protects, such as the freedom of speech and association. The “nothing to hide” rhetoric reduces the rich, multifaceted concept of privacy to a crude formulation, one that equates a legitimate desire for space and dignity to something sinister and suspect.
A western idea?
Snowden’s demolition of the argument doesn’t mean our work here is done. There are many other tropes that my (now renamed) Society for the Rejection of Culturally Relativist Excuses could tackle. Those who insist that Indians do not value privacy. That privacy is a western liberal construct that has no place whatsoever in Indian culture. That acknowledging privacy interests will stall development. This climate makes it particularly hard to advance claims of privacy, autonomy and liberty in the context of large e-governance and identity projects like Aadhaar: they earn one the labels of elitist, anti-progress, Luddite, paranoid and, my personal favourite, privacy fascist.
Part of the problem is the difficulty in conceptualising privacy harms: unlike other human rights violations that, through almost cinematic depictions of gore and pain, provoke a visceral sense of wrongdoing, privacy harms are largely invisible. Their injustices and effects are hard to visualise: most users feel little sense of violation about their electronic communications being read or their e-commerce transactions being tracked to profile them. Certainly less than they would feel about civilians being bombed or prisoners being tortured. Their typical response is, “So the government knows my Netflix viewing history, so what?”, or “I can get more targeted ads for stuff I actually want, that’s great, right?”
Brave new world
This makes a good bedfellow for the modernising narrative that casts anything shiny, new and digital as progress, and any attempt to question its risks as profoundly backward. In their fascinating work Flesh Machine, the Critical Art Ensemble described the emergence of a virtual body, which "allows one to create an identity of one's own, with much less restrictions than would apply in the physical world". In a prescient section, they say:
“What did this allegedly liberated body cost? Payment was taken in the form of a loss of individual sovereignty, not just from those who use the Net, but from all people in technologically saturated societies. With the virtual body came its fascist sibling, the data body – a much more highly developed virtual form, and one that exists in complete service to the corporate and police state. The data body is the total collection of files connected to an individual. The data body has always existed in an immature form since the dawn of civilization. Authority has always kept records on its underlings. Indeed, some of the earliest records that Egyptologists have found are tax records. What brought the data body to maturity is the technological apparatus. With its immense storage capacity and its mechanisms for quickly ordering and retrieving information, no detail of social life is too insignificant to record and to scrutinize. From the moment we are born and our birth certificate goes online, until the day we die and our death certificate goes online, the trajectory of our individual lives is recorded in scrupulous detail. Education files, insurance files, tax files, communication files, consumption files, medical files, travel files, criminal files, investment files, files into infinity…
The data body has two primary functions. The first purpose serves the repressive apparatus; the second serves the marketing apparatus. The desire of authoritarian power to make the lives of its subordinates perfectly transparent achieves satisfaction through the data body. Everyone is under permanent surveillance by virtue of their necessary interaction with the marketplace.”
This begins to get at the nub of why privacy matters, especially in the context of Aadhaar. Privacy is breached at several levels: at the time of data collection (especially when biometrics are involved); at the time of its storage by multiple actors (which its federated and decentralised enrolment apparatus facilitates by design); and at the time of use (especially when Aadhaar is tagged for banal everyday activities that are low-risk from an identity theft or benefits fraud point of view, risking an allegedly secure system being devalued through ubiquity and compromised through biometric overuse). All of this is compounded by the lack of a statutory frame for the Unique Identification Authority of India and/or a dedicated privacy law.
When the Attorney General contends, as he did during the ongoing matter before the Supreme Court, and as referenced in Tuesday's order, that there is no privacy violation if the data is not shared, he fails to acknowledge the very complex network of transactions and uses that the scheme is predicated on. When the Supreme Court misses the opportunity to put the brakes on the continued collection of data, it opens the door for the government to rely on the Too Big To Fail, Too Late to Turn Back rhetoric.
Data trading
That data is the raw material of the new economy is scarcely in doubt. That data is often collected, used, traded and manipulated is no surprise; whether this is done with or without our consent and knowledge is a different matter. The Attorney General assures the Supreme Court that Aadhaar cards will only be issued on a “consensual basis after informing the public at large about the fact that the preparation of Aadhaar card involving the parting of biometric information of the individual, which shall however not be used for any purpose other than a social benefit schemes”. This statement fails to reassure us on at least three grounds:
a) Consent to preparation of the card is only part of the picture: even this consent is meaningless if holding a card is nominally voluntary but the services that require it are, if not mandatory, impossible to opt out of or resist. Signing up out of fear of exclusion is hardly voluntary when the number’s linkage to a growing list of schemes makes it mandatory by stealth.
b) Informed consent can only exist when a person is consenting to every intended use, present and future, with clear knowledge of the risks and ramifications. This is clearly not the case, and can never be. Scope creep is a very real and problematic concern.
c) Even if the authorities behave honourably and refrain from using biometric and demographic information other than for social benefit schemes (an over-broad and undefined term in itself), there are absolutely no guarantees that every actor in the Aadhaar ecosystem can be stopped from doing so. With the data sitting on the servers and systems of so many agents, registrars, third party vendors and other intermediaries within the collection and implementation ecosystem, the Attorney General’s assurance ignores ground realities.
Look who's stalking
Those who argue that the data collected is minimal should remember that even innocuous pieces of data have value in combination. It is the relational quality of data that is key. Aadhaar, if embedded in potentially every government scheme and used in concert with private sector databases, will insert a technological platform between a state and its people that could mediate and track every aspect of our ordinary lived experience. We should worry about the detailed profiles that it helps create, the complex patterns it reveals when combined with other data, however innocuous, and the social sorting that it enables. Not least, because information asymmetries result in the data subject becoming a data object, to be manipulated, misrepresented and policed at will.
And it is this asymmetry that we should care most about because socio-technical systems reflect and reproduce existing power imbalances and inequalities. Access to technology, digital literacy, socioeconomic class, and the lack of availability of alternatives – all these shape our experience of and relationship to technology. That technology is not neutral or objective, that algorithms and technical systems do not see, parse or process all people equally, is increasingly being proven. Whatever one’s views about privacy being a fundamental right or constitutional guarantee, there is no grey area when it comes to equality and equal treatment. And in an instrumental way, privacy helps secure and implement equality: anonymous exam papers are graded without bias and anonymous browsing helps secure equal prices, just as anonymity permits marginalised voices to speak freely and truly explore the marketplace of ideas without fear. So, if you’re not really the privacy fascist type, try “equality fascist” on for size. It’s a one-size-fits-all label we should all be proud to wear.