On Monday, Attorney General William Barr made a very public appeal to Apple to unlock the iPhone of the shooter in last month's attack at a naval air station in Pensacola, Florida. Barr has been outspoken about his belief that tech companies have an obligation to provide access to encrypted devices when requested by law enforcement, and Apple has been steadfast in its position that it not only won't comply, but can't.
In the most prominent example, the company defied a court order to unlock the device belonging to the San Bernardino mass shooter. The FBI eventually accessed that device without Apple's help, working with a third-party security firm.
It's not hard to argue that Apple should do everything it can to help fight crime and terrorism, and to that end, the company has already turned over all of the data it had in its possession. That information was stored on Apple's iCloud servers. The iPhone, it says, is different because the company is unable to decrypt a device without the user's passcode, Face ID, or fingerprint (depending on the specific device).
In fact, Apple's transparency report says it has responded to more than 125,000 government requests for information, turning over what data it has when asked by law enforcement.
Both sides have a lot at stake in this battle. Law enforcement obviously has a vested interest in fighting crime and stopping terrorist attacks. No one questions that. The question is whether tech companies should be required to build encrypted devices with a backdoor. By the way, there is no such thing: if a device has a backdoor, it isn't truly encrypted.
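That point is concrete enough to sketch. In practice, a "backdoor" usually means key escrow: every message key is also wrapped under a master key held by a third party. The toy Python sketch below (hypothetical names, and a simplified XOR construction that is deliberately not real cryptography) illustrates why that escrow key is a single point of failure: it does nothing for the user, but whoever steals it can decrypt everything.

```python
# Toy illustration of key escrow -- simplified XOR construction,
# NOT real cryptography. A "backdoor" means a second key that can
# unwrap every message key, so whoever obtains that one key can
# read everything.
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a repeatable pseudo-random stream from a key (toy)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_with_stream(key: bytes, data: bytes) -> bytes:
    """XOR data against the keystream; applying it twice round-trips."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

user_key = hashlib.sha256(b"user passcode").digest()
escrow_key = hashlib.sha256(b"escrow master key").digest()  # the "backdoor"

def encrypt_message(plaintext: bytes):
    """Encrypt under a fresh message key, wrapped for the user AND the escrow holder."""
    msg_key = secrets.token_bytes(32)
    ciphertext = xor_with_stream(msg_key, plaintext)
    wrapped_for_user = xor_with_stream(user_key, msg_key)
    wrapped_for_escrow = xor_with_stream(escrow_key, msg_key)  # mandated extra copy
    return ciphertext, wrapped_for_user, wrapped_for_escrow

ct, w_user, w_escrow = encrypt_message(b"private message")

# Intended path: the user unwraps the message key and decrypts.
msg_key = xor_with_stream(user_key, w_user)
assert xor_with_stream(msg_key, ct) == b"private message"

# The problem: anyone who steals the single escrow key can do the
# same thing, for every message ever sent.
stolen_key = xor_with_stream(escrow_key, w_escrow)
assert xor_with_stream(stolen_key, ct) == b"private message"
```

This is the asymmetry at the heart of the argument: the escrowed copy adds nothing for the legitimate user, but it hands an attacker who obtains it the same access as law enforcement.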
At CES just last week, Apple's Senior Director of Global Privacy, Jane Horvath, said that "end-to-end encryption is critically important to the services we rely on." And with regard to fighting terrorism, she added that "building a backdoor to encryption is not the way we're going to solve those issues."
In addition, an Apple spokesperson told me:
We have always maintained there is no such thing as a backdoor just for the good guys. Backdoors can also be exploited by those who threaten our national security and the data security of our customers. Today, law enforcement has access to more data than ever before in history, so Americans do not have to choose between weakening encryption and solving investigations. We feel strongly encryption is vital to protecting our country and our users' data.
Meanwhile, The New York Times reports that, according to sources familiar with the company's position, Apple will refuse to comply with any effort to force it to break its encryption.
Barr has also called for legislation requiring tech companies to build in backdoors for law enforcement. While that might seem good for public safety, what happens when someone gains access to personal information like your health or financial data? What happens when someone can access photos of your family, or your messaging history?
Apple can't comply with the FBI, no matter how noble the cause, no matter how much the Attorney General protests. Because, while it's true that encryption means that some information won't be accessible to law enforcement, the alternative is that all of our information will be at risk. If there's a backdoor for the good guys, you better believe that the bad guys will figure out how to exploit it.
Which is the point.
And the Attorney General knows that to be the case. According to that same Times report, the FBI's top attorney had already sent a written request to Apple, to which the company responded with the information it could access on its servers. The current appeal is meant to pressure the company by drawing attention to a highly publicized case and painting Apple as being on the wrong side of terrorism.
No one wants to be on the side of terrorism, but supporting encryption isn't the same as enabling crime. Indeed, encryption prevents crime every day. And while events like those in Pensacola and San Bernardino are horrific tragedies, it would be another tragedy to lose the ability to protect our personal information. Apple knows this, and so does the Department of Justice.
Neither side is likely to back down, but Apple clearly has more at stake. Actually, we all do: there is no winner if all of our information is at risk.