Syed Rizwan Farook, the male shooter in the December 2, 2015 San Bernardino terrorist attacks, carried an iPhone 5C that was owned by the county public health department, where he worked as an inspector. After the attack, the county consented to the FBI's search of Farook's phone, but the phone runs Apple's iOS 9 operating system, which encrypts the device's data by default. After two months of trying, the FBI hasn't been able to break through the phone's data security features.
The FBI believes the phone may hold data, such as contact lists, photographs, or instant messages, that could materially assist in the investigation and potentially identify others, in the United States and overseas, who assisted Farook. So, what to do?
The FBI went to a federal magistrate judge, who ordered Apple to help the FBI unlock the iPhone by disabling the feature that wipes the phone's data after 10 incorrect passcode attempts. That would allow the government to keep trying new combinations without deleting the data. Apple says only the phone's user can disable that feature, but the court order requires Apple to write software that would bypass it.
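To make concrete why that one feature matters, here is a deliberately simplified sketch in Python. It is purely illustrative, not Apple's actual security code, and it assumes a 4-digit numeric passcode for the sake of the example; the point is how an auto-wipe limit stops a brute-force attack, and what disabling it changes.

```python
# Illustrative sketch only (not Apple's implementation): how an auto-wipe
# limit defeats brute forcing, and what disabling it changes.
# Assumes, for the example, a 4-digit numeric passcode.

import itertools

def brute_force(check_passcode, wipe_after=10):
    """Try every 4-digit passcode in order.

    check_passcode: returns True when the guess is correct
    wipe_after:     failures tolerated before the device erases its data
                    (pass None to model the safeguard being disabled)
    """
    failures = 0
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if check_passcode(guess):
            return guess                      # unlocked
        failures += 1
        if wipe_after is not None and failures >= wipe_after:
            return None                       # data wiped; attack is over

secret = "7295"
print(brute_force(lambda g: g == secret))                   # None: only 10 of 10,000 codes tried
print(brute_force(lambda g: g == secret, wipe_after=None))  # '7295': every code can be tried
```

With the safeguard in place, an attacker gets at most 10 guesses out of 10,000 possible four-digit codes; with it removed, every code can be tried in sequence, which is exactly what the FBI wants to be able to do.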
Apple is resisting the court order, saying that such software would be a back door to the iPhone and is too dangerous to create. “Once created,” Apple CEO Tim Cook said, “the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.”
National security and counterterrorism specialists say Apple should be a “good corporate citizen,” comply with the court order, and help in the investigation of one of the worst terrorist attacks in U.S. history. Privacy advocates agree with Apple that the government is overreaching, and argue that the court decision could set a precedent that would undermine the privacy, and security, of everyone’s handheld devices. So Apple will appeal the court order, and no doubt other technology companies and interest groups will weigh in, in court and in the court of public opinion, about the propriety of the order.
We’ll have to see how the appeal plays out, but for now we can draw some conclusions. First, Apple’s default encryption system must be pretty robust if it can withstand two months of probes and hacking efforts by a highly motivated FBI. Second, in the post-Edward Snowden world, there is a huge amount of mistrust of our own government and an obvious unwillingness to hand it any code, key, or software that could then be used in another mass governmental data-gathering effort. And third, with cell phones now ubiquitous worldwide and serving as wallets, photo albums, Rolodexes, mailboxes, message centers, internet search devices, and home to countless apps, all in one handy device the size of a playing card, we’re going to see more and more of these collisions between data security and national security.