
"Thanks, but no thanks," the Apple versus FBI and U.S. Justice Department and the bigger questions

 

Professor Mark Skilton and Professor Irene Ng, Warwick University, UK

 

The ongoing conflict between Apple, the FBI and the U.S. Justice Department is emblematic of two larger questions: does encrypting data protect privacy? And if the FBI can hack a phone, is privacy dead?

 

The December 2015 terrorist attack in San Bernardino, California has been felt around the world, not only because of the terrible events that seem to proliferate in society but because it exposed the paradox of wanting both privacy and security.

 

That the FBI was able to get around the iPhone's security in this specific case seems to reinforce the cat-and-mouse game between the hacked, the hackers and government agencies. With the constant rise in data breaches for theft, extortion, espionage and terrorism, ever-stronger encryption will be needed to keep one step ahead. Yet in extreme cases such as San Bernardino, and in many legitimate legal requests for access to personal data, this runs up against equally legitimate concerns about surveillance and unauthorised use that the Edward Snowden affair so magnified in the public mind.

 

The tug-of-war between Apple and the FBI has only been temporarily resolved by a third-party hack that allowed the FBI to sidestep a legal battle that would likely have taken years. So what kind of world can tech users look forward to?

 

If Apple had won

 

If Apple had won the legal case, it would likely only have increased governments' determination to expand surveillance activities and to create laws compelling companies to hand over data. We are already seeing many tech companies protest against the extent of electronic surveillance permitted in the UK's Investigatory Powers Bill, which is before Parliament. In the US next month, the Senate Intelligence Committee is expected to introduce legislation that would require companies to help law enforcement agencies read encrypted information.

 

If the FBI had won

 

If the FBI and U.S. Justice Department had won the legal case, it would have felt as if the law could override company, community and individual data rights, and it would have set a worrying precedent for governments such as China and North Korea to demand similar access. Apple further argued that so-called “back door” methods of gaining access to data would weaken security for everyone who uses an iPhone by introducing unnecessary vulnerabilities.

 

Who is the keeper of my house?

 

The easiest way to describe the FBI v Apple saga is to consider the situation being played out in the physical world. The FBI needs access to the terrorist's physical belongings, which are of course in his house. If the terrorist owns his house, the FBI would simply get a court order, break the door down and gain access to the house and everything in it. If the terrorist rents his house, a court order is again needed to require the landlord to provide access, perhaps with the master key the landlord owns. In the case of FBI v Apple, however, Apple is neither the landlord nor the house owner. Apple is the company that made the front door lock, and it made the lock in such a way that if you try to break down the front door, the lock incinerates everything in the house and burns the house down. The FBI asked Apple to modify the lock so that it could get in, but Apple was not willing to comply, because doing so would compromise every front door lock it has supplied around the world, giving other governments or nefarious entities a way to break in once the lock can somehow be picked.
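
To make the analogy concrete, here is a toy sketch, emphatically not Apple's actual code, of such a lock: a passcode check that wipes the protected data after too many failed attempts. The ten-attempt limit and the class names are our own illustrative assumptions.

```python
from typing import Optional

class PasscodeLock:
    """Toy model of a lock that 'incinerates' the contents after too many failed tries."""

    MAX_ATTEMPTS = 10  # assumed limit, loosely mirroring an auto-erase policy

    def __init__(self, passcode: str, protected_data: bytes):
        self._passcode = passcode
        self._data: Optional[bytes] = protected_data
        self._failed_attempts = 0

    def unlock(self, attempt: str) -> Optional[bytes]:
        if self._data is None:
            raise RuntimeError("data has already been wiped")
        if attempt == self._passcode:
            self._failed_attempts = 0
            return self._data                 # correct passcode: hand over the contents
        self._failed_attempts += 1
        if self._failed_attempts >= self.MAX_ATTEMPTS:
            self._data = None                 # "incinerate everything in the house"
        return None
```

The FBI's problem, in these terms, was how to keep guessing passcodes without ever tripping the wipe.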

 

Somehow, the FBI finally figured out how to open the front door lock (not really that hard, if you ask some of the techies out there) without burning down the house, and the case has now been dropped.

 

Understanding data on the phone as analogous to physical belongings is probably a good place to start in considering the wider implications of the case and the state of play for personal-data-driven business models.

 

The Honest-not-Curious business model

 

Apple runs what we could term an honest-not-curious business model for personal data. It has gone to inordinate lengths in terms of security and privacy to ensure that no one, not even Apple staff, can get into your data if you don't let them. Apple's principle is that you've bought the phone, and with it some free iCloud space, and for the money you've paid it won't peek at your data, nor make money from the data on your phone or in the cloud.

 

The Honest-but-Curious business model

 

This is the more common business model, practised by the likes of Google and Facebook. In this model, programmes and algorithms keep your data secure, but privacy may be compromised because the information that passes through may be looked at. Individuals often allow this in return for free services. For example, you may wish to know the median income of your age group but not be willing to disclose your own income to others. This is the classic millionaires' problem, in which two millionaires want to know who is richer but refuse to share their data (Yao). Computer scientists have designed honest-but-curious protocols that allow this to happen in a way that no one really sees the data, yet intelligence can still result from it. Much of our personal data is used this way to help advertisers target ads at us. The analogy in the physical world is that you rent your house from a landlord who sends a little robot in every day to make a note of your belongings and what you have in your fridge and on your shelves. In return, you might not have to pay rent, and you might even get vouchers for the products you normally consume.
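
As a rough illustration of how "no one really sees the data, yet intelligence results from it" can work, the sketch below uses additive secret sharing rather than Yao's full garbled-circuit protocol: each person splits their income into two random-looking shares held by two non-colluding aggregators, and only a group statistic (here an average rather than a median, which needs heavier machinery) is ever reconstructed. The names and figures are purely illustrative.

```python
import secrets

MODULUS = 2**64  # shares are taken modulo a large number

def split_into_shares(value: int) -> tuple[int, int]:
    """Split a private value into two shares that sum to it modulo MODULUS."""
    share_a = secrets.randbelow(MODULUS)
    share_b = (value - share_a) % MODULUS
    return share_a, share_b

incomes = [42_000, 55_000, 61_000]       # each value stays with its owner

totals = [0, 0]                          # one running total per aggregator
for income in incomes:
    a, b = split_into_shares(income)
    totals[0] = (totals[0] + a) % MODULUS    # aggregator A sees only random-looking share a
    totals[1] = (totals[1] + b) % MODULUS    # aggregator B sees only random-looking share b

group_total = (totals[0] + totals[1]) % MODULUS
print("average income:", group_total / len(incomes))
```

Neither aggregator learns any individual's income; only their combined totals reveal the group figure, and only so long as they follow the protocol honestly, which is precisely where the "but curious" risk comes in.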

 

The challenge of the HBC model is, of course, mission creep. Since the landlord owns the key to the house, and the robot, what is to stop him going in more often at the behest of other firms, or looking at things you did not authorise him to look at? This is especially so when the FBI comes calling. Since the firm already has access, opening the door for the Feds is not illegal; in fact, a court order may not even be necessary, and we can see governments exploiting this.

 

How do we ensure the firm is incentivised to be more honest, rather than more curious? The current economic incentives clearly point towards firms being more curious. With growth numbers under scrutiny and profitability under pressure, and as long as there is no reputational risk (don't get caught), nothing stops firms from exploiting personal data even more than they currently do. Unfortunately, legislation won't help, because it would hamper the HBC business model, which drives much of the Internet economy. Clearly, there are more carrots for the firm to be more curious and none for it to be more honest, which makes HBC-driven firms less credible when they insist they are honest (why would they be?). The exception, of course, is when the firm's business model is clearly not built on the use of personal data, as with Apple, whose exhortations of honesty are credible and believable because being more honest sells more handsets.

 

The Malicious business model

 

And if the HBC firm becomes more curious, could it perhaps turn malicious? The malicious business model is interested only in getting a foot in the door, so to speak, so that once in, the firm can exploit your data for gain, often with your consent, because you have signed away your personal data in terms and conditions you never read, so keen were you to get the free rent.

 

Custody and access rights control

 

The wider implication of FBI v Apple, and indeed of personal data issues in general, is that we need new legislation that legitimately protects public safety, but not at the expense of more surveillance through these “back doors” and a consequent loss of trust.

 

One solution for this future could lie in a separation of concerns for personal data: for example, separating custody from access rights, so that the data itself and the rights to access it are kept and managed separately rather than wrapped up together in a phone or connected device. This means we may need not just better legal frameworks but new technologies and personal data platforms that still enable custodial rights, yet when the Feds show up and ask to have a peek, firms can credibly throw up their arms and say, “My access is limited to preset terms of use. Get a court order and go ask the owner, or their next of kin, yourself.”
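
As a sketch of what such a separation might look like, the toy code below (using the Fernet cipher from the open-source `cryptography` package; all class names are hypothetical, not an existing platform) has the custodian holding only ciphertext it cannot read, while a separate access-rights holder owns the key and checks preset terms of use before any decryption happens.

```python
from cryptography.fernet import Fernet  # pip install cryptography

class Custodian:
    """Stores encrypted records; holds no keys, so cannot hand over plaintext."""
    def __init__(self):
        self._records = {}

    def store(self, record_id: str, ciphertext: bytes) -> None:
        self._records[record_id] = ciphertext

    def fetch(self, record_id: str) -> bytes:
        return self._records[record_id]

class AccessRightsHolder:
    """Holds the key and the preset terms of use governing decryption."""
    def __init__(self, permitted_purposes: set[str]):
        self._cipher = Fernet(Fernet.generate_key())
        self._permitted = permitted_purposes

    def encrypt(self, plaintext: bytes) -> bytes:
        return self._cipher.encrypt(plaintext)

    def request_access(self, ciphertext: bytes, purpose: str) -> bytes:
        if purpose not in self._permitted:
            raise PermissionError(f"'{purpose}' is outside the preset terms of use")
        return self._cipher.decrypt(ciphertext)

rights = AccessRightsHolder(permitted_purposes={"backup-restore"})
custodian = Custodian()
custodian.store("msg-1", rights.encrypt(b"personal message"))

rights.request_access(custodian.fetch("msg-1"), "backup-restore")   # allowed
# rights.request_access(custodian.fetch("msg-1"), "surveillance")   # PermissionError
```

Under this arrangement a request for data has to go to the rights holder, not the custodian, which is exactly the position the firm wants to be able to point to when the court order arrives.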
