Apple’s defiance of a court order last week to help the FBI unlock a suspected terrorist’s iPhone sets up what promises to be a long legal confrontation between the company and the U.S. Justice Department. In the meantime Apple is showing signs that it will further raise the stakes, dropping hints that it wants to create devices and services that are even more difficult to break into. Apple and FBI representatives both get to air their sides of the conflict when they testify before Congress on Tuesday about the need to balance security and privacy.
Any means of breaking into Apple phones that the company could build for the FBI could also be exploited by hackers, which is why Apple brass wants to create devices that even the company cannot access without user permission. Law enforcement investigating a mass shooting at a December 2, 2015, holiday party in San Bernardino, Calif., that killed 14 people and injured 22 wants Apple to disable instructions written into the iPhone’s software that delay passcode guesses and could delete data after a certain number of failed attempts.
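The delay-and-wipe behavior at issue can be sketched in a few lines. This is an illustrative model only, not Apple's actual code; the class name, the delay schedule and the ten-attempt wipe threshold are assumptions loosely based on Apple's published "Erase Data" setting:

```python
class PasscodeGuard:
    """Toy model of iPhone-style passcode protections (illustrative only)."""

    # Hypothetical delay (seconds) imposed after the Nth consecutive failure.
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}
    WIPE_THRESHOLD = 10  # modeled on the optional "Erase Data" setting

    def __init__(self, passcode, wipe_enabled=True):
        self._passcode = passcode
        self._wipe_enabled = wipe_enabled
        self.failures = 0
        self.wiped = False

    def attempt(self, guess):
        """Return (unlocked, delay_before_next_attempt)."""
        if self.wiped:
            return False, None  # keys discarded; data unrecoverable
        if guess == self._passcode:
            self.failures = 0
            return True, 0
        self.failures += 1
        if self._wipe_enabled and self.failures >= self.WIPE_THRESHOLD:
            self.wiped = True
            return False, None
        return False, self.DELAYS.get(self.failures, 0)
```

The FBI's request amounts to shipping a version where the escalating delays and the wipe check are removed, so guesses could be tried electronically at full speed.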
If Apple were to write those instructions directly into the iPhone’s processor, the company would be unable to change them later on, says Charlie Miller, a researcher whose past work has analyzed Apple iOS security. This has both positives and negatives, of course. Apple technicians would be unable to comply with court orders to modify customers’ devices, potentially avoiding another situation similar to the one they are in now. If the company mistakenly wrote flawed code at the chip level, however, it would be much harder to correct, he adds.
Programmers write a smartphone’s instructions into its hardware and software as well as its firmware—a type of software that provides instructions for how the device communicates and is typically stored in the device’s read-only memory (ROM). Software can be changed relatively easily via remote updates like the ones smartphone users receive regularly reminding them to download the latest version of an app or operating system. The specific threat of law enforcement or government having your phone and trying to compel a company to bypass its security protections “really has to be addressed at the hardware level,” says Justin Cappos, an assistant professor of computer science and engineering at New York University. “That way, Apple can say to the U.S. government or to governments in Iran, China and elsewhere that they can have a person’s phone but [the company] cannot provide access to the data on that phone.”
Apple began centralizing all the most sensitive data and cryptography keys on the iPhone in September 2013 with the introduction of a “Secure Enclave” coprocessor, which provides security functions separate from the device’s main application processor. Secure Enclave is a two-key system in that it requires both that coprocessor and the main processor to approve a passcode before the device is unlocked and its data is decrypted. “It isn’t perfectly clear to people—other than those working for Apple—what specifically can be changed in the Secure Enclave,” says Cappos, whose research includes security, operating systems and networks. The company has provided some clues. Apple can make changes to the Secure Enclave software, for example, without needing a customer’s passcode. “If you wanted to make the Secure Enclave even more secure, you would write more security features that execute at the chip level,” he adds.
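The two-key idea can be illustrated with a simple sketch, assuming (as Apple has described publicly) that the data key is derived from both the user's passcode and a secret unique to the device's secure hardware. This is not Apple's real key-derivation scheme; the function names and iteration count are invented for illustration:

```python
import hashlib

def derive_data_key(passcode: str, enclave_uid: bytes) -> bytes:
    """Entangle the passcode with a per-device secret (illustrative only).

    Because the device-unique secret never leaves the secure hardware,
    any brute-force attack must run on the phone itself rather than on
    a copy of the encrypted data.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), enclave_uid, 100_000)

def unlock(passcode: str, enclave_uid: bytes, expected_key: bytes) -> bool:
    """Unlock succeeds only when passcode AND device secret both match."""
    return derive_data_key(passcode, enclave_uid) == expected_key
```

The design consequence is the one Cappos describes: even with the correct passcode, the data cannot be decrypted off-device, because the second “key” lives only in the secure hardware.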
A challenge for Apple is to maintain the flexibility to update iPhone software and firmware when needed without creating opportunities for attackers or law enforcement to likewise manipulate code on the device. The Secure Enclave gives the company options, says Dan Guido, co-founder and CEO of Trail of Bits, a cybersecurity firm based in New York City. Apple might have to rapidly update iOS at times but the Secure Enclave needs to be updated much less frequently. “There are likely ways they can tie down the Secure Enclave firmware while retaining the flexibility to update iOS when needed,” he says. “For example, Apple can require the passcode to update the [Secure Enclave], or they can move certain security features on the [Secure Enclave] into the phone’s ROM.” Still another option would be to make the iPhone’s “device firmware update” mode—used to troubleshoot broken phones—“more destructive” if someone tampers with it as a way to break into the device, he adds.
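One of the options Guido mentions—requiring the passcode before the Secure Enclave's firmware can be changed—can be sketched as follows. All names here are illustrative; Apple has not published how its update gating works:

```python
class SecureEnclave:
    """Toy model of passcode-gated firmware updates (illustrative only)."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.firmware_version = 1

    def update_firmware(self, new_version: int, passcode_entered: str) -> bool:
        # Without the correct passcode the update is refused, so new
        # firmware could not be pushed silently onto a locked device.
        if passcode_entered != self._passcode:
            return False
        self.firmware_version = new_version
        return True
```

Under this design, a court order could compel Apple to sign new firmware, but the firmware still could not be installed on a phone whose passcode the government does not have.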
Apple is likely to extend its encryption efforts beyond the iPhone to include information in transit and has hired a well-respected messaging security expert to help. Frederic Jacobs tweeted last week that he will be joining Apple this summer as an intern with the company’s CoreOS security team. Jacobs is best known for his work developing the Signal secure messaging app at Open Whisper Systems. Signal rose to prominence a few years ago when whistleblower and former NSA contractor Edward Snowden said publicly that he uses the app to secure his own highly secretive communications. Signal is a free and open-source encrypted voice calling and instant messaging application for iOS and Android. “Apple has gone and hired one of the best in the encryption world right now,” says Dan Kaufman, founder and chief technology officer of Brooklyn Labs, a software company that primarily builds mobile apps for the iOS and Android operating systems. Signal is “virtually unhackable” and is the standard for encrypted messaging at this time, he adds. Apple might want to apply that level of encryption to iMessage, iCloud and potentially at the device level as well.
The Financial Times reported last week that Apple is working to strengthen the encryption of iCloud backups. Still, further securing iCloud backups will be very difficult, Guido says. “They could encrypt the iCloud backups on their servers with your passcode, Apple ID password or some other means but that will certainly cause a lot of grief from users,” he says. “Not many technology firms have figured out how to easily end-to-end encrypt user data on their Web servers.”
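The approach Guido describes—encrypting backups with a key derived from something only the user knows—can be sketched as follows. This is a deliberately simplified illustration using a toy keystream cipher; a real design would use an authenticated cipher such as AES-GCM, and nothing here reflects Apple's actual iCloud implementation:

```python
import hashlib

def _keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR against a SHA-256 counter keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def encrypt_backup(backup: bytes, passcode: str, salt: bytes) -> bytes:
    """Encrypt client-side so the server stores only ciphertext.

    The key is derived from the user's passcode; applying the same
    function again with the same passcode decrypts (XOR is symmetric).
    """
    key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)
    return _keystream_xor(backup, key)
```

The grief Guido anticipates follows directly from this structure: if the user forgets the passcode, the server holds nothing that can recover the backup.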
Apple v. itself
In the absence of specific regulations that force technology companies to make customer data available for law enforcement investigations, the main limitations on any of Apple’s efforts to bolster security would be customer reactions. “If an iPhone user cannot remember a passcode, enhanced security might mean they lose all of the information, images and video stored on their device—and that would not be popular with users,” says Miller, who spent nearly five years at the National Security Agency. If customers think that Apple is weakening its products and making personal data more vulnerable to court orders and hackers alike, however, they might be less willing to buy newer, better secured products or update the software on existing devices. One of the iPhone’s strengths is that its users tend to have a lot of trust in Apple and will readily update their smartphone software whenever new versions are issued, he adds. If iPhone users stop doing that, they will not get the legitimate security enhancements that Apple offers.
Apple balked at a court order compelling the company to help the FBI break into the locked iPhone 5c used by Syed Rizwan Farook who, along with his wife Tashfeen Malik, is suspected of carrying out the mass shooting in San Bernardino. Apple cited concerns that such actions would give law enforcement the legal precedent needed to compel the company to unlock iPhones as part of several other ongoing criminal investigations. FBI Director James Comey contends that the San Bernardino court order is not trying to set such a precedent.
Comey and Apple General Counsel Bruce Sewell make their respective cases to legislators today when they testify at a U.S. House of Representatives Judiciary Committee hearing on encryption issues. The March 1 hearing is just one of many efforts by lawmakers to address the underlying tug-of-war between tech companies and law enforcement. Some in Congress are proposing a “9/11-style” congressional commission that would discuss encryption, among other topics. Meanwhile, Sens. Dianne Feinstein (D–Calif.) and Richard Burr (R–N.C.) are working on a bill to force companies like Apple to cooperate or suffer punitive damages.