Apple has until tomorrow to either help law enforcement unlock a suspected terrorist’s iPhone or formally challenge a court order demanding that the company do so. For the past week CEO Tim Cook has made it clear that Apple has no intention of giving in to what the FBI has positioned as a technological compromise. The case has since become less about overcoming technological hurdles than about setting a legal precedent for how far tech companies must go to assist law enforcement. Right now there is no federal law requiring companies to maintain a key to unlock the encryption used on their devices.
Apple’s stance in the case is that submitting to this court order would give law enforcement the legal precedent needed to compel the company to unlock iPhones in any number of cases. FBI Director James Comey contends that the San Bernardino court order is not “trying to set a precedent or send any kind of message” but rather is intended to help investigators access the phone recovered from Syed Rizwan Farook and Tashfeen Malik’s black Lexus in San Bernardino, Calif. Police killed Farook and Malik in a shootout, leaving no one to supply the passcode and complicating efforts to obtain any relevant information stored on Farook’s phone.
Despite Comey’s comments, several cases involving suspected criminals and their encrypted iPhones await the outcome of the San Bernardino decision. Law enforcement has requested Apple’s assistance unlocking iPhones being held as evidence in a dozen other criminal cases across the U.S., The Wall Street Journal reports. In the meantime, according to The New York Times, Apple is working on security upgrades to its devices that would stymie law enforcement efforts to unlock them even if the company loses in court this time around.
There seems to be a consensus among cybersecurity experts that efforts to circumvent encryption ultimately create vulnerabilities. Programmers are often asked by law enforcement or government agencies to give them some way of bypassing normal authentication—aka a “backdoor”—that allows access to secure information, says Daniel Kaufman, founder and chief technology officer of Brooklyn Labs, a software company that primarily builds mobile apps for the iOS and Android operating systems. “With any software, and I’ve done this on many different platforms, it’s requested that you build a backdoor,” he says. “But once you build that backdoor, no matter how good you are, someone could eventually find it.”
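To illustrate Kaufman’s point, here is a minimal, hypothetical sketch in Python of why a backdoor in an authentication check is itself a vulnerability. It describes no real product; the bypass value is invented. The point is that the bypass ships inside the logic, so anyone who inspects or reverse-engineers the software can find and reuse it.

```python
# Hypothetical sketch only; no real product's code. The hardcoded bypass
# value below is invented for illustration.
import hmac
import hashlib

_BACKDOOR_DIGEST = hashlib.sha256(b"agency-override").hexdigest()  # invented secret

def verify_passcode(entered: str, stored_digest: str) -> bool:
    entered_digest = hashlib.sha256(entered.encode()).hexdigest()
    # Normal path: compare against the user's stored passcode digest.
    if hmac.compare_digest(entered_digest, stored_digest):
        return True
    # Backdoor path: a second, hidden credential also unlocks the data.
    # Once this branch ships, discovering it is a matter of disassembly or a leak.
    return hmac.compare_digest(entered_digest, _BACKDOOR_DIGEST)
```

However well the bypass is hidden, it exists in every copy of the software that ships, which is the basis of Kaufman’s warning that “someone could eventually find it.”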
The FBI has requested that Apple make a new version of its iPhone operating system that disables protections designed to keep people from guessing a device’s security passcode. This software would then be loaded onto the locked iPhone 5c used by Farook, who, along with his wife Malik, is suspected of carrying out a mass shooting at a December 2 holiday party last year in San Bernardino that killed 14 people and injured 22. Once installed, the software would deactivate the delay feature that prevents someone from rapidly and repeatedly guessing passcodes as well as the features that permanently encrypt or erase the phone’s data after a set number of unsuccessful attempts.
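Those protections amount to a lockout policy. The Python sketch below is a conceptual stand-in rather than Apple’s actual iOS code; the escalating delays and the ten-attempt wipe threshold are illustrative values (iOS offers erasure after repeated failures as an optional setting).

```python
# Conceptual lockout policy, not Apple's implementation. The delay schedule
# and wipe threshold are illustrative.
import time

MAX_ATTEMPTS = 10                                    # wipe threshold (illustrative)
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}   # seconds of forced wait

def wipe_device(state):
    state["wiped"] = True            # stand-in for destroying the encryption key

def try_passcode(guess, check, state):
    """check(guess) -> bool is the device's internal passcode test."""
    time.sleep(DELAYS.get(state["failures"], 0))     # escalating delay throttles guessing
    if check(guess):
        state["failures"] = 0
        return True
    state["failures"] += 1
    if state["failures"] >= MAX_ATTEMPTS:
        wipe_device(state)                           # data becomes unrecoverable
    return False

state = {"failures": 0, "wiped": False}
try_passcode("0000", lambda g: g == "1234", state)   # one failed guess recorded
```

Strip out the delay and the wipe, and a four-digit passcode’s 10,000 combinations can be run through by machine in short order, which is precisely the access the FBI is seeking.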
Led by Apple, the technology industry has indicated there is no middle ground when it comes to device security and data privacy. “Apple has the wherewithal to unlock the iPhone 5c in question,” Kaufman says. “Accessing a newer iPhone would require a new backdoor, but the same underlying logic would be used in both cases. An engineer with technological prowess could figure out how Apple is bypassing its security features and possibly reengineer it for other devices.”
Apple’s move to aggressively protect customer data using a passcode the company does not store came in September 2014 with the release of iOS 8. Google followed suit that November with Android 5.0, which likewise included default data encryption protected by a passcode that Google does not store. “People now have military-grade crypto on consumer devices,” says David Brumley, an associate professor of electrical and computer engineering at Carnegie Mellon University and head of the school’s CyLab Security and Privacy Institute. “The security industry is moving in the direction of basically self-destructing evidence.”
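The reason a passcode the company does not store matters is that the encryption key protecting the data is derived on the device from that passcode. The simplified Python sketch below shows the general idea; real devices additionally entangle the passcode with a hardware-fused secret (Apple’s per-device UID, for example), which this sketch only gestures at with a random stand-in.

```python
# Simplified illustration of passcode-derived encryption, not Apple's or
# Google's actual implementation.
import hashlib
import os

def derive_key(passcode: str, device_secret: bytes, salt: bytes) -> bytes:
    # Slow key derivation: the data key only exists when the passcode is entered.
    return hashlib.pbkdf2_hmac("sha256",
                               passcode.encode() + device_secret,
                               salt,
                               iterations=200_000,
                               dklen=32)

salt = os.urandom(16)
device_secret = os.urandom(32)   # stand-in for a key fused into the hardware
key = derive_key("1234", device_secret, salt)
# Neither the passcode nor the derived key ever leaves the device, so there is
# nothing on a company server for a court order to hand over.
```

Because no escrowed copy of the key exists, the only practical way in is to guess the passcode on the device itself, which is why the court order targets the guessing protections rather than the encryption.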
Apple and Google are more the exception than the rule. “Microsoft has engineered a backdoor into its Windows Phone,” Kaufman says. “Even though they haven’t made it publicly available, it’s fairly easy to find.” The company has also been accused of using flawed, government-supplied encryption algorithms in order to win government contracts, and Windows 10 includes a feature that automatically uploads a computer’s decryption key to Microsoft’s servers. Chinese companies Huawei Technologies, Lenovo and ZTE are known to have built backdoors into their products at the request of the Beijing government. That practice, along with a string of cyber attacks traced back to China in 2013, led the U.S. to bar the Commerce and Justice departments, NASA and the National Science Foundation from buying hardware produced in China by any company supported by its government.
With no middle ground when it comes to protecting devices via encryption, the San Bernardino case is more a matter for the courts and, eventually, Congress to decide. Apple is expected to argue that being forced to weaken its own encryption code is a violation of its right to free expression under the First Amendment of the U.S. Constitution. In the meantime Senate Intelligence Committee leaders are working on a bill that would require Apple and other companies to unlock their devices when a court orders them to do so. In the House, Rep. Ted Lieu (D–Calif.) has even written a letter to Comey requesting that the FBI withdraw its motion so Congress, rather than the courts, can address the privacy and security implications.
Congress’s involvement in the area of digital security could play out in any number of ways, according to a February 18 Congressional Research Service report by Kristin Finklea, a domestic security specialist at CRS. One option might be for lawmakers to update electronic surveillance laws to cover data stored on smartphones. Congress could also prohibit the encryption of data unless there is a way to provide law enforcement with access.
In the likely event that Apple does file an application for relief with the U.S. District Court for the Central District of California by Friday, the hearing would be set for March 22.