Over the last 24 hours, the Internet has been given something new to fight about. Following a federal judge's order that Apple embark on an odyssey to unlock the San Bernardino shooter's iPhone 5c, Apple CEO Tim Cook has responded publicly, and politely told the government to shove off. This further inflamed the ongoing encryption debate, which leads us to right now: a battle for the privacy of Apple's users is being fought at the highest levels -- between a multi-billion-dollar international corporation and a bastion of the "free" world, the US government. We here at MacNN
have three questions. One is for the public: where do you draw the line on freedom and privacy? The second is for Google and Samsung: where is your strong and definitive stand? The governments of the world are coming for you next. The third is for our elected officials: do you have any clue what this is about?
First, some background, just in case you weren't up to speed
On Tuesday, a US magistrate judge ordered Apple to comply with FBI requests
to help the law enforcement agency get into a San Bernardino County-owned iPhone 5c, running some version of iOS 9, used by one of the perpetrators of the mass shooting that happened in San Bernardino, California last December. Apple would not be required to override the passcode itself, but is being compelled to develop and give the FBI software that would prevent the iPhone from erasing itself after a number of unsuccessful login attempts, skip any time delay the OS imposes between attempts, and allow an electronic connection to enter passcodes in rapid succession -- letting the agency brute-force unlock the iPhone.
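To see why those three changes matter, consider some back-of-the-envelope arithmetic. Apple's iOS security documentation describes a hardware-entangled key derivation that takes roughly 80 milliseconds per passcode attempt; using that figure as an assumption (the exact cost on this particular device isn't public), a minimal sketch of the worst-case brute-force time looks like this:

```python
# Back-of-the-envelope estimate of brute-force time once the OS-level
# protections (auto-erase after ten failed tries, escalating delays)
# are removed. The ~80 ms floor per attempt is taken from Apple's iOS
# security documentation; treat it as an assumption, not a measured
# figure for this specific iPhone 5c.

SECONDS_PER_ATTEMPT = 0.08  # assumed hardware key-derivation cost

def worst_case_hours(passcode_digits: int) -> float:
    """Worst-case time, in hours, to try every numeric passcode."""
    attempts = 10 ** passcode_digits
    return attempts * SECONDS_PER_ATTEMPT / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: up to {worst_case_hours(digits):.1f} hours")
```

A four-digit passcode falls in well under an hour; even a six-digit one falls in about a day. Without the requested firmware, none of this arithmetic applies: the escalating software delays and the ten-attempt erase limit stop a brute-force run long before it completes.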
Later, writing about the San Bernardino events of December, Cook noted
the FBI's initial request for help and Apple's compliance, as "[Apple has] no sympathy for terrorists." While Apple has complied with valid subpoenas and search warrants for data in its possession, and has made engineers available to help with investigations, it is balking at the prospect of building "a backdoor to the iPhone" by creating a version of iOS with weaker security that can be installed on a target iPhone to retrieve data. Apple calls this a tool "we simply do not have, and something we consider too dangerous to create," lest it end up in "the wrong hands" after being produced, as it surely would.
The FBI is relying on an 18th-century law called the All Writs Act, which authorizes the United States federal courts to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." Apple has battled requests made under the Act before
, most notably in New York City, over a phone owned by a drug dealer. That decision is still pending.
There are several "outs" for Apple to make its case against the All Writs Act. The Act is only applicable when other judicial tools are not available, and must be "necessary and appropriate" to the particular case.
The Los Angeles US Attorney noted that despite the FBI having declared that the pair of shooters was radicalized into violence on their own, and was not part of any conspiracy or sleeper cell, the unlock demand was made to "exhaust every investigative lead in the case" -- though what is left to investigate hasn't been made clear -- and crassly made the claim on behalf of the victims' families in order to "learn everything we possibly can about the attack in San Bernardino." So, while the attorney claims that the investigation "needs" this data, it is not at all clear why, given that the FBI says there is no danger from a California terrorist cell to which the pair belonged.
As for other judicial tools, the FBI already possesses the device's location logs and records of calls made from it: the former obtained by subpoena from the carrier, the latter from the county itself. Additionally, for some reason, the county did not follow best practices for enterprise device management, and did not retain the ability to unlock the phone using the enterprise software deployment tools that Apple and others make available. Why this was not done is, again, unclear.
While questions are being asked about whether Apple can comply with the FBI's order, in all likelihood it can. The iPhone 5c lacks the "secure enclave" present on later devices, starting with the 5s, which enforces passcode-attempt delays in hardware. Apple has five days to challenge the ruling in an appeal, as spelled out in the court order. The two attackers -- a US-born citizen and his Pakistani permanent-resident wife -- were killed in the shootout, so they cannot be compelled to unlock the phone.
Those are the facts. Now, the questions.
Does the government fully understand what it is asking?
There has been a strong indication in the various public statements of government officials -- from FBI Director James Comey in particular, but including Judge Sheri Pym in this case, and right on through to White House spokesman Josh Earnest -- that all of them lack the technical understanding to consider the wider implications of what they are asking Apple (and, by inference, all tech companies) to do. Let's be perfectly clear on this: the government wants Apple, and, by extension, every device manufacturer, to create a hack into their products that lets law enforcement access the data on a given device at will.
This is a particularly odd case for the FBI to hang its hat on, given that there is no urgency to the data on the dead gunman's iPhone. The agency itself has said there is no reason to believe there was any conspiracy, alliance with other groups or individuals, or further danger to the public. With both attackers dead, there is no risk that any criminal information on the iPhone bears on an ongoing threat. Additionally, there is no reason to suspect that the phone holds viable information beyond what has already been obtained from the other attacker's smartphone and the subpoenaed records.
To put it in legal terms, the FBI wants to conduct a "fishing expedition" without any evidence to support their cause, and it wants to force an innocent party -- Apple -- to help it do so. In our view, the judge grievously erred by ordering Apple to comply with the unfounded request in the first place, and Apple has a strong case with this particular set of circumstances to win an appeal. Were the attackers still on the loose, Apple would face a tougher test, but the possibility of further shootings from this particular couple has been -- literally -- laid to rest.
More troubling than the shaky logic of the judge's decision is the credulous nature of the statements from the motion's defenders: they apparently believe Apple has something akin to a wand from Harry Potter
it can wave to change the firmware of the device, which handles the automatic erasure after a given number of passcode attempts; they further argue that even if provided such a tool, the government would only ever use it for this one case, and just this one time. Apparently the story of Pandora's Box
and the lesson it imparts are no longer taught in any of the schools these people attended.
While it isn't necessary for officials to have an engineer's level of understanding of how technology works, it would be useful to either seek the counsel of those who do, or at least bone up on some basic concepts of how security on digital devices works -- as well as making some time for a fresh perusal of the US Constitution, which explicitly states in its Fourth Amendment that the right of the people "to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."
It is not a stretch of judicial activism to extend "papers and effects" to include the personal data on a smartphone or other computing device unimagined in the 18th century. This is where we store that information, and quite literally the Fourth Amendment is all that stands between a functioning democracy and a dysfunctional police state -- which the US is already dangerously close to becoming. While we can't claim to know everything about this case, the FBI has not, as of yet, suggested any probable cause -- much less anything supported by oath or affirmation -- and has offered no specific description of what it expects to find or where on the device it intends to look, and thus completely fails the test put forth by the authors of the Constitution.
It is disheartening to think that ignorance of fundamental law, or fear, or simple cowardice on the part of the agency -- and in this particular ruling, Judge Pym -- can lead so easily to a complete dismemberment of an explicitly stated Constitutional right guaranteed to all citizens. The precedents being set here -- that companies can be forced to create security compromises in their products that reveal personal customer information without due cause, and that governments can override Constitutional protections through the use of a sufficient number of buzzwords like "terrorism" -- would have far-reaching effects that would effectively nullify any checks and balances that guard against abuse, not to mention the rights that distinguish the US as a free country.
This isn't about guarding the rights of some dead murderers; it is about whether the US, as a country and as a society, stands by its founding principles under any circumstances, or -- as Benjamin Franklin once put it
-- is willing to "give up essential liberty, to purchase a little temporary safety" and thus "deserve[s] neither."
Where are Google and Samsung on this?
Google and Samsung have to know that they're next: why they aren't standing up for Apple in a forceful way on this matter is beyond us. Yes, this afternoon Google CEO Sundar Pichai issued a series of tweets
that half-heartedly supported Cook's stand against Apple being forced to compromise its products, but the fact that these two smartphone makers, along with others who control massive amounts of gathered user data -- such as Facebook's Mark Zuckerberg -- have largely remained silent, or have paid little more than lip service to the concept of user data privacy, is deeply disturbing.
One particularly concerning possibility is that they are already acquiescent to surveillance requests and law enforcement demands for custom device firmware -- a theory supported by the curious lack of criticism from the FBI, NSA, and other such agencies of any major tech company other than Apple. Another possibility is that they are biding their time, and hoping the pointing finger never reaches their doorstep. Either way, it's not a good scenario.
Sundar Pichai, Sergey Brin, Larry Page, and Eric Schmidt (among others) had an opportunity to stand up today with pitchforks in hand, and Pichai's very carefully worded "defense" of Cook's letter notwithstanding, they didn't.
The silence is deafening.
Where is your line?
We've been battling back and forth in the forums on this issue, and there is no agreement to be had. A vocal few are advocating for some sort of secure Apple effort in the matter -- one that could also somehow escape discovery by others, or avoid leaving Apple's headquarters upon completion. This sort of ill-considered "magical thinking" -- espoused by at least one presidential candidate
-- is, in our view, dangerously naive at best.
The FBI is asking for custom firmware to break into this phone. We're not going to delve into the technical specifics of how to do this, but we believe that it is possible for Apple to do it, and smarter people than us
agree. We won't make analogies for this, because there are none; there are no parallels to what we give up if this battle is lost, because even the smallest iPhone carries a ridiculous amount of personal data.
As far as the request goes, the best case is that the FBI is asking for a tool to brute-force any device without a secure enclave -- still millions of devices in circulation. Tim Cook claims that this would have to be a more universal tool, one that would lay iOS bare -- right down to the financial history and fingerprint data currently safe from prying eyes on Apple's devices.
There are no assurances that, once created, this tool will not be used in every investigation regardless of import -- secretly, illegally, or "lawfully" once the laws are sufficiently changed or watered down -- White House assurances to the contrary notwithstanding. There are no guarantees that the tool will not escape into the wild and be used to bypass the locks on devices stolen on the streets. The way for Apple to guarantee user security, and the people's right to encryption now and in the future, is to stand firm and not develop this tool.
Should the tool get made, where does the avalanche stop? If this ruling isn't fought, and Apple concedes the fight, where is that line? At what point will the dangerous precedent not be used as a bludgeon by law enforcement and the courts to continue further development of the tool? "Terrorist group Y is using the iPhone 6," they'll say. "We need to get through an encrypted iPhone 7 running iOS 10" will be after that. How and when do you get off the slippery slope?
We have readers from all walks of life, holding just about every range of beliefs. All of us have surrendered so many freedoms and privacy rights in the name of security; at what point are we willing to stand up and say that our line has been reached? If this isn't your line, then what is?
As far as we are concerned, the line is well past us. If we don't take steps, it's going to get farther and farther away, and there won't be a damn thing we can do about it at that point. Law enforcement and the protection of public safety are important functions of government, but there was -- and remains -- strong reasoning as to why the Founders set up the rights of the people as proactive and preventative, and the powers of the government and justice system as largely reactive and limited: it was specifically
to prevent abuse in the name of "protecting" us.
When the government -- even for the noblest of reasons -- gains too much preventative authority, the result is the loss of liberty. "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated" -- do we believe that, or are we flexible on this?
Where we draw the line, and step off the slippery slope of eroding personal freedoms and government eavesdropping, is what is really being decided here.
-Mike Wuerthele with contributions from Charles Martin