Apple contributed eight pages of written evidence to the consultation on the new draft, which was put on the table in November 2015. The proposed law would replace the 2000 Regulation of Investigatory Powers Act, as well as a number of other laws that had for years been covertly interpreted as allowing bulk surveillance, hacking, and the creation of population databases.
Having aggressively marketed its privacy credentials for the last two years, Apple’s contribution to the consultation is perhaps not surprising. Its criticisms concentrate on three areas, all of which closely relate to the company’s reputation.
Hacking, with help?
One of Apple’s concerns relates to provisions in the bill regarding “equipment interference”. This refers to the ability of intelligence services to hack both hardware and software – something the UK government only admitted to doing in a 2015 Intelligence and Security Committee report.
The new law would not only explicitly include this capability, but also extend it. Hacking would be allowed “in bulk” – this could, for example, be through a tampered update (perhaps one that circumvents encryption) rolled out to all devices or software installations of a certain kind. Companies could also be forced to collaborate with this interference, and then be subjected to gagging orders.
Apple clearly objects to this: it would mean the company could no longer credibly assure customers that its devices and software have not been tampered with. And the broader security argument behind Apple’s opposition is that any such tampering would likely create security holes that then become available to anyone – not just the “good guys”.
Encryption and backdoors
A legislative attack on end-to-end encryption has been on the cards in the UK since David Cameron said he did not want there to be “a means of communication … that we are not able to intercept”. This would undermine some of Apple’s products – such as iMessage – which use end-to-end encryption so that only sender and recipient can read the message.
The new bill puts an obligation on communication service providers to have the “ability to remove any encryption applied by it”. This potentially requires any end-to-end encryption to be provided only if a method to give government access to the contents is also built in. Apple argues explicitly against these kinds of backdoors.
In objection, Apple said: “A key left under the doormat would not just be there for the good guys. The bad guys would find it too.” This clearly refers to the recent technical paper by Cambridge computer scientist Ross Anderson and other grandees in cryptography and security, which argues that such backdoors should be avoided as they undermine overall security.
Apple can currently tell customers that messages sent through its iMessage service are impossible to intercept and decrypt, even if the company wanted to. But the mere existence of backdoors would undermine this.
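The “key under the doormat” argument can be made concrete with a toy sketch. The code below is not a real cipher and has nothing to do with iMessage’s actual protocol; it simply uses a hash-derived XOR keystream to illustrate one point: if a second copy of the decryption key is required to exist, whoever obtains that copy reads the traffic as easily as the intended recipient.

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudorandom bytes from the key by hashing a counter.
    # Toy construction for illustration only -- NOT a real cipher.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR stream cipher: encryption and decryption coincide

# End-to-end: only sender and recipient hold the session key.
session_key = secrets.token_bytes(32)
ciphertext = encrypt(session_key, b"see you at noon")

# A mandated backdoor means a second copy of the key exists
# ("under the doormat"). Whoever obtains it -- lawfully or not --
# decrypts the message exactly as the recipient would.
escrowed_copy = session_key
print(decrypt(escrowed_copy, ciphertext))
```

The point of the sketch is that nothing in the mathematics distinguishes the “good guys’” copy of the key from a stolen one: access is access.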
Extraterritoriality
Apple’s third area of concern is to do with “extraterritoriality”. Apple does not want to be compelled to comply with information warrants from governments regardless of where the company is based or where its data is held.
The complaint that it will have to cope with overlapping foreign and domestic laws sounds a bit generic, however. After all, internet giants like Apple operate internationally on a scale that surpasses every individual country.
Rather than sparking “serious international conflicts”, as Apple claims, this law suffers from the same intrinsic problems of extraterritoriality as any internet-related legislation. Just because the internet and the companies operating on it barely experience national boundaries does not mean national governments and international organisations should accept that it is an anarchic situation. What we are watching here is really a battle between an old world power and a new one.
Apple prefers to avoid making judgements on individual countries and their surveillance regimes. Contrast this with BlackBerry, which is pulling out of Pakistan because of excessive surveillance demands, even though it decided in 2013 to comply with quite similar requests from the Indian government.
Britain, however, might think twice about making technological demands of Apple which give competing world powers an edge – either by creating a precedent in surveillance demands, or through undermining overall security.
One glaring omission
Apple likes to present itself as a privacy advocate, and uses that as a selling point. For an overall conclusion, it is worth comparing Apple’s criticisms of the bill to those by privacy organisations and experts.
There is a glaring difference: Apple does not criticise the mass collection of communications data (or metadata). Apple notes that it still provides metadata when requested. So while end-to-end encryption provided by Apple protects the content, it does not protect the information on who has been communicating with whom.
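The content/metadata split can be pictured with a simple hypothetical message envelope (the field names and addresses below are illustrative, not any real protocol): even when the body is end-to-end encrypted, the routing fields must remain readable for the provider to deliver the message, and it is exactly those fields that make up communications data.

```python
# Hypothetical message envelope. End-to-end encryption protects the
# body, but the metadata needed for delivery stays visible to the
# provider -- and is therefore available for collection.
envelope = {
    "from": "alice@example.com",        # metadata: visible to provider
    "to": "bob@example.com",            # metadata: visible to provider
    "sent_at": "2015-12-21T09:30:00Z",  # metadata: visible to provider
    "body": b"\x8f\x1a\x03\x9c",        # ciphertext: provider cannot read
}

# Who talked to whom, and when, is recoverable without any key:
print(envelope["from"], "->", envelope["to"], "at", envelope["sent_at"])
```

This is why encrypting content says nothing about protecting communications data: the two live in different parts of the envelope.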
The draft law is built on the premise that communications data is less worthy of protection than contents. But experts in computer science and privacy tend to disagree with this. In his evidence to the joint parliamentary committee, East Anglia legal academic Paul Bernal argued that the collection of either type of data is “differently intrusive”.
Similarly, in their evidence session on this law for the Science and Technology committee, Ross Anderson and Joss Wright of the Oxford Internet Institute argued that communications data was often more useful than contents in law enforcement practice.
For all its championing of privacy, Apple is silent when it comes to these arguments. One reason for this may be that its business is not primarily about protecting communications data. So, while it lends powerful and convincing support to the technological arguments against encryption backdoors and bulk interference, Apple also remains a very smart business.
Eerke Boiten, Senior Lecturer, School of Computing and Director of Academic Centre of Excellence in Cyber Security Research, University of Kent
This article was originally published on The Conversation.