Dropping The Gloves: The FBI vs Apple
Unless you live in a completely disconnected bubble, you’ve heard all about the recent battle between Apple and the FBI. You’ve heard the arguments from different sides; you’ve probably even debated on one side or the other. Some argued that Apple was right because nothing should come above privacy, while others maintained that some things outweigh privacy expectations and that calling the FBI’s request a backdoor was a stretch. Finally, there were some who believed the FBI was more interested in setting a legal precedent than in actually having Apple build the software.
The question that’s debated less frequently is whether the FBI made a mistake in fighting Apple publicly and then just as publicly announcing its triumph — that is, finding a hack without Apple’s help. It’s a question that goes deeper than simply personal privacy vs. national security.
Privacy and Encryption
To recap, the FBI needed to unlock the iPhone of one of the two shooters who left 14 people dead and 22 injured last December in San Bernardino. Because the phone was encrypted, its data would self-erase after 10 failed passcode attempts. According to Apple, the FBI’s request, which ended up in court, was to “make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation.” Complying with this request, Apple argued, would create software that does not currently exist and that could fall into the wrong hands, allowing anyone with physical access to unlock any iPhone.
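The auto-erase behavior described above can be sketched as a simple retry counter. Only the 10-attempt threshold comes from the article; the class, method names, and structure below are purely hypothetical illustration, not Apple’s implementation.

```python
# Hypothetical sketch of an auto-erase passcode policy: after 10 failed
# unlock attempts, the device destroys its data. Illustrative only.
MAX_ATTEMPTS = 10

class Device:
    def __init__(self, passcode):
        self._passcode = passcode
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess):
        # A wiped device can never be unlocked, even with the right code.
        if self.wiped:
            return False
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            # Self-erase: here we just mark the data as gone.
            self.wiped = True
        return False
```

This is the feature the FBI wanted disabled: without the wipe, a passcode could simply be brute-forced.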
Apple began encrypting iPhones when it introduced iOS 8 in 2014. In a nutshell, it works like this: The iPhone encrypts the data by scrambling it so it can’t be read without a 256-bit “key.” The key, a unique identifier for each specific device, is “burnt” into the silicon, can’t be bypassed and is not logged anywhere outside of the application processor.
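Apple’s actual scheme is proprietary, but the idea of tangling a secret with a device-unique key can be sketched with a standard key-derivation function. The 256-bit key size matches the article; the use of PBKDF2, the iteration count, and the variable names are illustrative assumptions.

```python
import hashlib
import os

# Illustrative sketch only: derive a 256-bit encryption key by tangling
# a user passcode with a device-unique identifier (standing in for the
# key "burnt" into the silicon). Not Apple's real, proprietary scheme.
DEVICE_UID = os.urandom(32)  # unique per device; never leaves the hardware

def derive_key(passcode: str) -> bytes:
    # PBKDF2 makes each derivation deliberately slow, and without the
    # device UID the passcode cannot be brute-forced off the device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

key = derive_key("123456")
assert len(key) == 32  # 32 bytes = 256 bits
```

Because the UID never leaves the processor, an attacker who copies the encrypted storage still has nothing to attack: every guess must run on the device itself, where the auto-erase limit applies.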
Cybersecurity professionals have long advocated encrypting data at rest as well as in transit as a way of protecting personally identifiable information, and not just on devices. With encryption, data stolen by cybercriminals in a breach would be useless, since a key would be required to read it. The U.S. government, at the same time, has been growing concerned because encryption hinders law enforcement’s access just as much as it hinders criminals’. Even worse, from the point of view of agencies like the FBI, it gives criminals and terrorists a leg up because they can “go dark.”
Although in the most recent case the FBI said the iPhone would be a one-time hack, Apple and other privacy advocates argued it would set a precedent not only for the U.S. government but also for foreign ones, including those in oppressive countries like China.
The encryption discussion is not unique to the United States. China passed a law last year requiring tech companies to hand over encryption keys when the government requests information. The United Kingdom proposed a bill last year that would have required intercept capabilities for encrypted communications. At the other end of the spectrum, Germany has been promoting encryption, even offering all its citizens a free email service that encrypts messages.
Apple vs. the FBI
Apple’s refusal to comply with the FBI’s request led the Justice Department to seek a U.S. District Court order forcing the company to comply. In the weeks preceding a scheduled hearing, the matter played out in the court of public opinion. The dispute became increasingly public as Apple revealed how it had offered the agency other solutions for obtaining the phone’s data, while the FBI insisted it was not trying to set a precedent for other cases.
There was no shortage of reaction from the media, other major tech players, advocacy groups, lawmakers and former intelligence officials. There was also plenty of speculation — including on Apple’s part — as to whether the National Security Agency already had the capability to break into an iPhone, considering its advanced surveillance capabilities.
And if the NSA could do it, why wouldn’t the FBI request its assistance, Apple asked. But according to Reuters, not all federal agencies supported the FBI’s side: there was no consensus within the government itself, and some NSA and Department of Homeland Security officials took Apple’s side.
Despite the FBI’s insistence that only Apple could offer a solution, the agency recently revealed it is working with one of the many outside parties that had offered to help hack the phone, and the court hearing was postponed.
FBI’s Tactical Mistake
Despite the FBI’s insistence that it wasn’t asking for a precedent and that the case was all about one specific phone, it’s clear that there’s much more going on. In fact, according to Apple, the agency has made similar requests on several other occasions. The San Bernardino case is different only because it’s much tougher to argue against fighting terrorism — but compliance with the request would, without a doubt, open the floodgates for future court orders.
The FBI, after all, has been trying to make its case against encryption for a while. FBI Director James Comey stopped short of telling Congress that a backdoor should be required of tech companies, but he did suggest a “front door” approach (which, like the proposed UK law, would mandate “intercept solutions”). And, as documents leaked by Edward Snowden revealed, the NSA deliberately weakened cryptography standards recommended by the National Institute of Standards and Technology that are used by both the private and public sectors.
The security community has been divided in this fight, but it’s fair to say that the FBI overplayed its hand by trying to force Apple, especially publicly, to compromise its own products, and by betting on its “fight against terrorism” card. The agency could have just as easily and quietly requested the assistance of other third parties without dragging Apple through the mud. Not to mention that if it believes so strongly that this fight is justified, it should instead advocate for new laws in the U.S. Congress.
And now that the whole world knows what only some may have suspected — that the iPhone is not the fortress Apple’s marketing makes it out to be — who really loses in the end?
Consumers may lose in the short run, considering that the method will likely be leaked by the vendor and that the FBI would also have to share it with local law enforcement agencies. At the same time, cyberattackers, including nation-state sponsored ones, will be very interested in getting their hands on the vulnerability that allowed the FBI to get in, and they’re very crafty when it comes to getting what they want.
Apple may not be in the losing corner for long, however, even if its PR takes a hit. It will, instead, emerge as a winner because it will be able to figure out the flaw and patch it. The FBI’s victory will be short-lived, since Apple will not only close the hole but also look for new ways to make its devices even more secure. So in the long run, consumers win too, because all this fist-fighting will result in more secure software.
Hopefully one thing that sticks in everyone’s minds, when the dust settles, is why building a backdoor into any security product is not a good idea. Even with robust security that’s built into a product such as the one built by Microsoft, a backdoor would negate all the efforts to create better security in the first place.
The FBI is certainly not the first federal agency to use a compelling story to make the case for giving up individual privacy in exchange for perceived security. But this is a reminder that the tradeoff will always be up for debate. One thing is clear, though: in breaking into an iPhone that was touted as impenetrable, the FBI has sent a strong signal to tech companies about their reverence for encryption. So the lingering question remains: was this battle, after all, about nothing but principles, on either side?
By Sekhar Sarukkai