Smartphones rule our lives. Having information at our fingertips is the height of convenience. They tell us all sorts of things, but the information we see and receive on our smartphones is just a fraction of the data they generate. By tracking and monitoring our behaviour and activities, smartphones build a digital profile of shockingly intimate information about our personal lives.
These records aren’t just a log of our activities. The digital profiles they create are traded between companies and used to make inferences and decisions that affect the opportunities open to us and our lives. What’s more, this typically happens without our knowledge, consent or control.
New and sophisticated methods built into smartphones make it easy to track and monitor our behaviour. A vast amount of information can be collected from our smartphones, both while we actively use them and while apps run quietly in the background. This information can include our location, internet search history, communications, social media activity, finances and biometric data such as fingerprints or facial features. It can also include metadata – information about the data – such as the time and recipient of a text message.
Each type of data can reveal something about our interests and preferences, views, hobbies and social interactions. For example, a study conducted by MIT demonstrated how email metadata can be used to map our lives, showing the changing dynamics of our professional and personal networks. This data can be used to infer personal information including a person’s background, religion or beliefs, political views, sexual orientation and gender identity, social connections, or health. It is even possible to deduce our specific health conditions simply by connecting the dots between a series of phone calls.
Different types of data can be consolidated and linked to build a comprehensive profile of us. Companies that buy and sell data – data brokers – already do this. They collect and combine billions of data elements about people to make inferences about them. These inferences may seem innocuous but can reveal sensitive information such as ethnicity, income levels, educational attainment, marital status, and family composition.
A recent study found that seven in 10 smartphone apps share data with third-party tracking services such as Google Analytics. Data from numerous apps can be linked within a smartphone to build this more detailed picture of us, even if permissions for individual apps are granted separately. Effectively, smartphones can be converted into surveillance devices.
The result is the creation and amalgamation of digital footprints that provide in-depth knowledge about our lives. The most obvious reason for companies to collect this information is profit: delivering targeted advertising and personalised services. Some targeted advertisements, such as one for the new trainers we have been eyeing up, may feel creepy but are not necessarily a problem.
But targeted advertising based on our smartphone data can have real impacts on livelihoods and well-being, beyond influencing purchasing habits. For example, people in financial difficulty might be targeted for advertisements for payday loans. They might use these loans to pay for unexpected expenses, such as medical bills, car maintenance or court fees, but could also rely on them for recurring living costs such as rent and utility bills. People in financially vulnerable situations can then become trapped in spiralling debt as they struggle to repay loans due to the high cost of credit.
Targeted advertising can also enable companies to discriminate against people and deny them an equal chance of accessing basic human rights, such as housing and employment. Race is not explicitly included in Facebook’s basic profile information, but a user’s “ethnic affinity” can be worked out based on pages they have liked or engaged with. Investigative journalists from ProPublica found that it is possible to exclude those who match certain ethnic affinities from housing advertisements, and certain age groups from job advertisements.
This is different to traditional advertising in print and broadcast media, which, although targeted, is not exclusive. Anyone can still buy a copy of a newspaper, even if they are not the typical reader. Targeted online advertising, by contrast, can completely exclude some people from information without them ever knowing. This is a particular problem because the internet, and social media especially, is now such a common source of information.
Social media data can also be used to calculate creditworthiness, despite its dubious relevance. Indicators such as the level of sophistication in a user’s language on social media and their friends’ loan repayment histories can now be used for credit checks. This can have a direct impact on the fees and interest rates charged on loans, the ability to buy a house, and even employment prospects.
There is a similar risk with payment and shopping apps. In China, the government has announced plans to combine data about personal expenditure with official records, such as tax returns and driving offences. This initiative, which is being led by both the government and companies, is currently in the pilot stage. When fully operational, it will produce a social credit score that rates an individual citizen’s trustworthiness. These ratings can then be used to issue rewards or penalties, such as privileges in loan applications or limits on career progression.
These possibilities are not distant or hypothetical – they exist now. Smartphones are effectively surveillance devices, and everyone who uses them is exposed to these risks. What is more, it is impossible to anticipate and detect the full range of ways smartphone data is collected and used, and to demonstrate the full scale of its impact. What we know could be just the beginning.
Vivian Ng, Senior Research Officer, Human Rights Centre, University of Essex, and Catherine Kent, Project Officer, Human Rights Centre, University of Essex.
This article first appeared on The Conversation.