BY KATIE D’HONDT
Today, Apple released an open letter defying the U.S. government, the first of its kind. In the wake of the San Bernardino attack, a U.S. federal judge has ordered Apple to build software – software that does not yet exist – that would allow the government to break the encryption safeguarding the data of all Apple products. This would effectively create a ‘backdoor’ for the U.S. government to use at its discretion, opening the possibility of abuse and of circumventing traditional legal channels, as well as a new avenue of attack for criminals. It’s a piece of software that would bring America either closer to 1984 or closer to a safer state, depending on whom you ask.
It is a piece of software we are not ready for, regardless of where you stand on the privacy-versus-security debate.
Apple’s letter is defiant, protective, fear-mongering, misguided and spot-on all at once. More importantly, it’s the first battle in a long legal and policy war that will ultimately reshape the technological landscape, likely leaving data less secure. What does that mean for the average American – or any Apple product user? The assumption of privacy that allows you to text, message, search, and communicate freely through digital channels – or even travel freely and privately through the physical world, for those with location services activated – could evaporate. Your continued use of those technologies would depend on your level of trust in the government’s discretion and your confidence that the government can keep this technology out of malevolent hands.
Unlike many other debates, there is little to no gray area in this corner of the privacy-versus-security debate as it relates to encryption, the technology Apple uses to secure your data. Encryption is not easy to do, to understand, or to break, as the FBI made clear this month. The frustration of the FBI and the public in the wake of San Bernardino is at odds with technological reality, and it exposes a huge gap in public understanding of encryption. Current technologies have not just outsmarted the FBI; they’ve outsmarted everyone.
However, weakening these encryption technologies weakens them for everyone, not just for law enforcement. The San Bernardino terrorist’s iPhone cannot be cracked without unlimited brute force techniques (imagine cracking a password by trying every possible combination) and more time than the impending heat death of the universe allows. Is this a problem for law enforcement? Yes. Is the prospective creation of ‘backdoor’ software a problem for everyone? Yes.
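The “heat death of the universe” claim is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below is illustrative only – the guess rate of one quintillion keys per second and the 256-bit key size are assumptions for the sake of scale, not figures from the San Bernardino case:

```python
# Rough, illustrative estimate: worst-case time to brute-force a
# 256-bit key by trying every possible combination.
# ASSUMPTIONS (not from the case): an attacker testing 1e18 keys/second.
KEY_BITS = 256
GUESSES_PER_SECOND = 1e18
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

total_keys = 2 ** KEY_BITS                         # size of the keyspace (~1.2e77)
worst_case_years = (total_keys / GUESSES_PER_SECOND) / SECONDS_PER_YEAR

print(f"Worst case: {worst_case_years:.2e} years")
```

Even under these generous assumptions, the answer is on the order of 10^51 years – incomparably longer than the universe’s roughly 13.8-billion-year age, which is why the practical fight is over weakening the software around the encryption, not breaking the mathematics itself.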
In this debate, it’s fashionable to draw parallels between encryption and concepts in the non-digital world, as Apple does in its letter. The prospective software, the company claims, “would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”
Apple’s analogy is incomplete in a few key ways. First, it assumes that all locks can be changed to reinstate security; if this technology pervades the market, changing phone brands will do nothing to increase data security. The analogy also glosses over the law enforcement perspective: as it stands, no subpoena can make current technologies searchable, even when people have been killed or crimes have been committed. Finally, the analogy understates the finality and shock of creating this ‘key.’ Given low public knowledge of how the technology works, the creation of this ‘master key’ will go unnoticed by many – until they are the victims of a damaging data breach. The mandate forces the privacy landscape for Apple users to be binary: your data is either encrypted and secure or it is not.
Still, the biggest problem with analogies on this topic is not that they oversimplify complicated explanations of how the technology works. If anything, these analogies make the topic of encryption accessible to more people and generate a public discourse around the topic, which is desperately needed.
Rather, the problem is that these analogies are not used exclusively to explain how the technology works. They bleed into law and policymaking in unintended and forced ways – often leading courts to apply legal precedents built for comparing one physical-world case with another. This ultimately produces haphazard and poorly conceived policies for the digital world. Despite the amount of time we spend on our phones, the physical and digital worlds are not the same, and they require policymaking that accounts for those differences.
The Apple letter is spot-on in one way: the world is not ready for this technology. However, it’s only a matter of time until we’re forced into it, ready or not. Complying with this order would throw America into the deep end of data insecurity without a buoy, and building the software would paint a visible target on the back of every organization known to possess it. Unless policy solutions, public knowledge, and legal frameworks can preempt its creation, this is a piece of technology that – for the safety and security of digitally connected America – the U.S. government should not force Apple to create.
Katie D’Hondt is a second-year Master in Public Policy student at the Harvard Kennedy School. Originally from Michigan, she worked at the Partnership for Public Service, a D.C.-based think tank focusing on federal government workforce issues. She is interested in technology and employment policy.
Image by user Benjamin Child via Unsplash