Aadhaar proponents regularly ask opponents why they have a problem sharing their biometrics with Aadhaar if they have no problem sharing them with the United States government for a visa. Proponents seem genuinely perplexed by this apparent hypocrisy. It helps to understand it in terms of the types of information involved and how they are used.

Secret information

Secrets are things that are not meant to be shared. Shared secrets are meant to remain within a well-defined group, and are not meant to be shared outside that group.

A password is a shared secret. Only you and the website where you use this password are supposed to know it. In fact, since the website never needs to log in to you (that’s absurd), they don’t even need to know your password. They only need to confirm you have the correct one. Since websites can and do get hacked, a good website operator will not risk keeping a copy of your password. They’ll instead use a cryptographic hash function, a complex mathematical operation that cannot easily be reversed, and keep only its result. The original password cannot be guessed from it. When you attempt to log in, the website applies the hash function to the password you supply and checks whether it matches the result they have stored. What you supplied is then immediately discarded.
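
As a rough sketch of what that looks like in code, here is how a site might do it in Python using the standard library’s PBKDF2 function. Real sites typically use a dedicated password-hashing library, but the principle is the same: store only the hash, compare at login, discard the plaintext.

```python
# A minimal sketch of password storage with a one-way hash, using Python's
# standard library (PBKDF2). Only the salt and the hash are kept; the
# password itself is never stored.
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a one-way hash; the site stores only (salt, digest)."""
    salt = os.urandom(16)  # random salt, unique per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored_digest: bytes) -> bool:
    """Re-hash what the user typed and compare; the plaintext is then discarded."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_digest)

# At signup: keep only the salt and digest, discard the password.
salt, digest = hash_password("correct horse battery staple")
# At login: hash the supplied password, compare, discard.
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```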

Sometimes you may not even trust the website to discard your password as promised, which is why authentication protocols like OAuth exist. They let you use a login and password at a site you trust, like Google, Facebook or Twitter, to log in to some other site via their “Login with Facebook” and similar buttons. The destination website never sees your password.
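
For the curious, the skeleton of such a flow (OAuth 2.0’s “authorization code” flow) looks roughly like this. The provider URLs and client credentials below are made up purely for illustration; the point is that the password is only ever typed at the provider’s own login page.

```python
# A rough sketch of the OAuth 2.0 "authorization code" flow. The provider
# URLs and client credentials here are hypothetical, purely for illustration.
import json
import secrets
from urllib.parse import urlencode
from urllib.request import Request, urlopen

CLIENT_ID = "demo-client-id"                          # issued to the third-party site
CLIENT_SECRET = "demo-client-secret"                  # known only to that site and the provider
REDIRECT_URI = "https://thirdparty.example/callback"  # where the provider sends the user back

# Step 1: send the user to the provider's own login page. The password is
# typed there, never on the third-party site.
state = secrets.token_urlsafe(16)  # random value tying the response to this request
login_url = "https://provider.example/oauth/authorize?" + urlencode({
    "response_type": "code",
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": "profile",
    "state": state,
})

# Step 2: after login, the provider redirects back with a one-time code.
# The site exchanges it for an access token; no password ever changes hands.
def exchange_code_for_token(code: str) -> dict:
    body = urlencode({
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }).encode()
    request = Request("https://provider.example/oauth/token", data=body)
    with urlopen(request) as response:
        return json.loads(response.read())  # contains an access token, not the password
```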

Keeping secrets secret is a big deal, so tech people work hard to get it right. Biometrics, however, are not secrets, and this is the crux of the problem. We’ll look at why in a moment.

Public information

Public information is anything that is meant to be publicly known. There is no risk to anyone’s privacy from you coming to know this. That the name of the current Prime Minister of India is Narendra Modi is public, for example.

In Public Key Cryptography (digital certificates), there are two cryptographic keys. One is called the “public key” and is meant to be public in the above manner: the more people who know it, the better. The other is called the “private key”, but this is an unfortunate choice of name because it is actually a secret. Not a shared secret but a total secret, not meant to be shared at all.
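
As a small illustration, here is a Python sketch using the third-party cryptography library’s Ed25519 signatures: the private key, kept totally secret, signs a message, and anyone holding the public key can verify it.

```python
# A small sketch of the public/private key split, using the third-party
# "cryptography" library (pip install cryptography) and Ed25519 signatures.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

private_key = ed25519.Ed25519PrivateKey.generate()  # a total secret: never shared
public_key = private_key.public_key()               # meant to be shared as widely as possible

message = b"this statement really came from me"
signature = private_key.sign(message)  # only the holder of the private key can produce this

try:
    public_key.verify(signature, message)  # anyone with the public key can check it
    print("signature valid")
except InvalidSignature:
    print("signature forged or message tampered with")
```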

Private information

Which brings us to the third type. This is a little nuanced, so pay attention.

Your name is private. It is not public information. You go through life without wearing a name badge announcing your name to every stranger. Others do not know your name unless they recognise you or you tell them.

In closed environments such as a corporate office or a conference, your name is meant to be shared within that group, which is why you have to wear a name tag.

An email or WhatsApp message you send to someone is private. Other people cannot see it, unless you or the recipient choose to share it. Email and WhatsApp forwards are commonplace. They are still private. There is no way to tell which piece of fake news is circulating around India on WhatsApp, because it’s private. You can only see what you send and receive. Unless it somehow gets published in the media or on a public website, at which point it becomes public.

Your biometrics are also private information. They are not secrets. You leave a copy of your fingerprints on almost everything you touch. Your iris biometrics can be extracted from a high-resolution picture of your face, which even a modern smartphone can take. Unless you spend your life wearing gloves and shades, there is no hope of keeping your biometrics secret. They are available to the people you encounter in daily life, just like your name is.

Unlike your name, other people have no use for your biometrics and don’t pay attention to them, so we may be fooled into thinking they are secrets. They are not.

Biometrics are not public either. There is no public database from which biometrics can be freely downloaded, as UIDAI keeps assuring us in response to news of Aadhaar numbers leaking.

Privacy is about the responsible maintenance of private information. This responsibility is hard to define, which is why laws are necessary.

Private versus secret

Your Aadhaar number? Also private, not secret or public. Unlike a password, which a website has no reason to know apart from verifying that the correct person is trying to log in, your Aadhaar number must be known to service providers so they know who they are providing service to. They are, however, not meant to share it publicly.

And why not? Because if a service provider knows your Aadhaar number without you sharing it, they can claim to have provided you a service without actually providing it. This is fraud. It happened before Aadhaar and it happens with Aadhaar, because UIDAI made the choice of designing Aadhaar numbers to be private instead of secret. This is why Aadhaar numbers leaking to the public is bad news, and why a privacy law is essential for the safe use of Aadhaar.

But what about biometrics? By nature they are private and not secret. Unlike Aadhaar numbers, they were not designed by UIDAI, and UIDAI cannot change their nature.

A good secret is disposable. If your password is compromised (you gave it away or someone guessed it), change your password. If you’ve lost your digital certificate, revoke it and get a new one. If your Aadhaar number leaks, UIDAI should give you a new one, or maybe switch to a more sensible design where it’s unusable without a disposable secret token.

Authority versus authentication

Which brings us back to foreign travel. Given that biometrics are private and not secret, how come they are okay for foreign travel but not for Aadhaar?

The answer lies in how they are used. When you arrive at a foreign destination (or a foreigner arrives in India), the immigration official at the counter decides whether to let you in. The fingerprint scanner on the desk informs this official. The official is the authority, not the scanner or some remote server.

Under Aadhaar, whenever there is a demand to replace your existing ID with Aadhaar, such as at a bank or mobile company, there is also the implicit assumption that the official at the bank or mobile company cannot be trusted to certify your identity. The fingerprint scanner and the server it is connected to are the authority. What is the problem with this? It is multi-fold:

  1. Biometric matching gives probabilistic, not deterministic answers. That means the scanner will score your match on a scale of 0% to 100%. It cannot give a straightforward ‘yes’ or ‘no’ answer.
  2. To convert a 0–100% scale into a binary yes/no, you have to pick some number on the scale as the threshold point. Let’s say 50%. Anything less is a ‘no’, anything more is a ‘yes’. (In Aadhaar, the server does this for you using an unspecified threshold.)
  3. However, you can’t simply convert a probability into a binary. Even if the scanner says it is 80% confident that the fingerprint matches, the remaining 20% leaves room for it to be someone else’s fingerprint.
  4. With a probabilistic outcome, you will always have some degree of refusing a match to a genuine person (an exclusion) or accepting a match to an impostor (identity fraud), as the sketch after this list illustrates.
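
Here is the sketch promised above, in Python. The threshold and scores are invented for illustration; UIDAI’s actual values and matching algorithm are unspecified and not part of this example.

```python
# An illustrative sketch of points 2-4: forcing a probabilistic match score
# into a yes/no answer. The threshold and scores are made up; they are not
# UIDAI's actual values (which are unspecified).
MATCH_THRESHOLD = 0.50  # hypothetical cut-off on the 0-100% scale

def accept(match_score: float) -> bool:
    """match_score is the matcher's confidence, between 0.0 and 1.0."""
    return match_score >= MATCH_THRESHOLD

# A genuine person with a worn or badly scanned finger may score low...
print(accept(0.42))  # False -> a real person is excluded
# ...while a good enough fake may score above the cut-off.
print(accept(0.63))  # True -> possible identity fraud
# Raising the threshold reduces fraud but excludes more genuine people;
# lowering it does the opposite. No setting gives zero of both.
```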

Adding to this, a biometric scanner cannot see that your friend is standing next to you to place their thumb, or that you’re holding up a picture of someone’s eyes, or that you’ve made a mask of someone’s fingerprints. There are many different ways to fool a biometric scanner.

An immigration official will have no trouble spotting you attempting fraud like this. If you are genuine and the scanner somehow can’t identify you, you can still explain yourself to the official. If your fingers are worn out, missing or bandaged, they can see that. It’s one human to another.

In Aadhaar, on the other hand, there is an implicit assumption that the human operator is complicit in fraud and cannot be trusted. Only the scanner and the remote server can be trusted. This is a terrible design decision:

  1. Biometrics are private, not secret. They are easily stolen. Private information can be used to identify, but not to authenticate. A remote server can identify whether the biometrics match, but cannot authenticate that they actually came from the owner of that information. A secret, on the other hand, is assumed to be known only to its owner.
  2. The matching is probabilistic, not deterministic, so in the absence of a human operator with authority, there will always be both exclusion and fraud. UIDAI believes their probabilistic matching algorithm is very good (see page 4) with low levels of exclusion (0.057%) or fraud (0.035%), but the current evidence is not in their favour.
  3. Cryptographic hash functions cannot be applied to probabilistic matching like biometrics (see the short illustration after this list). That means your original biometrics have to be stored and can be stolen from a central database if it is ever hacked, and all databases eventually get hacked. UIDAI may do its best to keep its data centre secure, but copies of your biometrics may also exist in the various SRDH (State Resident Data Hub) databases, which have less thorough security.
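
Here is the illustration promised in point 3. The byte strings are made-up stand-ins for real fingerprint templates; the point is that a hash only matches on exactly identical input, and two scans of the same finger never are.

```python
# Why hashing works for passwords but not for biometrics: a hash only matches
# on exactly identical input, while two scans of the same finger are never
# bit-for-bit identical. The byte strings below are made-up stand-ins for
# real fingerprint templates.
import hashlib

print(hashlib.sha256(b"hunter2").hexdigest()[:16])  # same input...
print(hashlib.sha256(b"hunter2").hexdigest()[:16])  # ...same hash, so exact comparison works

scan_monday = b"\x12\x34\x56\x78"   # two captures of the same fingerprint
scan_tuesday = b"\x12\x34\x57\x79"  # differ slightly (pressure, angle, dirt)
print(hashlib.sha256(scan_monday).hexdigest()[:16])
print(hashlib.sha256(scan_tuesday).hexdigest()[:16])  # completely different hashes

# So the matcher needs the original templates for a fuzzy, score-based
# comparison, which is why they must be stored and can therefore be stolen.
```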

Please stop comparing biometrics in Aadhaar with US visas. Since everyone uses the same technology, the US government will have the same level of reliability as the Indian government. The problem is not the biometrics. It is the assumptions made around the biometrics.

This article first appeared on Medium.