Apple’s Reply to the FBI
Apple’s CEO Tim Cook has just released a message to Apple’s customers:
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.
This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
The Need for Encryption
Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.
All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.
Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.
For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
The San Bernardino Case
We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.
When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
The Threat to Data Security
Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
A Dangerous Precedent
Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.
We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.
While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.
Tim Cook
Your thoughts?
Published in Islamist Terrorism, Science & Technology
You guys can all argue this particular instance all day. But if they’re allowed to force Apple to do this after the fact, is it unreasonable to assume any encryption that the government wants to crack becomes de facto illegal? Will the FBI now be allowed to tell Apple or Microsoft or Google that their next OS has encryption that’s too strong and they must allow a backdoor before release?
Have you further considered that, if Apple is forced to do this, terrorists will simply switch to some other technology? The problem is that you think this is a single instance and only “now” matters, but the side effects down the road are legion.
Finally, do you see this extending to less egregious things? If someone shoplifts a big screen TV and they supposedly told their friends where it was hidden through iPhone instant messaging, should the FBI force Apple to decrypt that phone?
But what the government is asking would actually make millions of people less safe. Maybe not from terrorists, but from a whole host of other more common, if less lethal, threats. Isn’t the government basically asking Apple to create a means of opening up any phone? Isn’t this like asking a lock maker to make a key that can open any lock? In a limited sense it seems okay, but if and once such a key, and the means to make it, get out, then everyone’s home is open to future burglary.
The other question is whether the government can force Apple to make such a thing. I agree that if the government itself makes this device or program, that is within its power and right, maybe even its duty. But does Apple have to comply with what it thinks is a bad idea, for itself and its customers? What kind of service do we owe the government in such cases? Could Einstein have been legally forced to work on a nuke for the government?
The reason this one is easy is because all the past court orders held up. This is not some completely bizarre, out-of-nowhere court order.
Yes, they have reasonable grounds to search that phone, and they have searched it to the absolute extent of their current capability. They are now demanding a capability that does not currently exist be brought into being solely for their singular (and supposedly one time only) use. It is the ability to command creation which is in question here, not the authority to conduct a search.
Nope. Just that one. If I correctly understand the court order.
Our common defense is a military, not a law enforcement, function.
And should they promise to provide all the housing, clothing, and food we require we’d have no rational objection to such confiscation.
I’m sorry, but you don’t understand the issue. It is impossible to generate the “means” and have it apply to one and only one phone.
Actually, because of the level of effort demanded of an innocent private party, Apple, this order seems to go beyond past orders.
The court orders Apple to “advise the government of the reasonable cost of providing this service”, but does not order the government to pay the reasonable cost. Hmmm.
Kwhopper,
Exactly. The government is looking for a precedent in the current crypto war. The lawyers and bureaucrats will stream in after the breach is made.
Also, the analogies of breaking into houses don’t line up well with hacking one’s personal digital information. We’re talking about getting one’s info in near-realtime from the 24/7 trackers we call phones, which carry more kinds of information than are normally stored in a dusty office’s file cabinet.
Good luck keeping the government to its word once it has been allowed this power. Has government ever given back a power it has taken, short of revolution?
The FBI got a warrant to search one particular phone. There’s nothing wrong with that. Then they got an order for Apple to help them execute that warrant. Either Apple is able to do that, or it is not. What’s the issue? If it’s impossible, just tell them that. If not, why should they balk at it? This isn’t about whether they should make their phones hackable. It’s just about whether they can or cannot help with this warrant. What am I missing?
True, but once you create the process to undo such a device, all phones become at risk of having the process repeated on them. If you want to avoid temptation, you don’t start by licking the forbidden fruit. Also, it is not clear to me that such a request will be limited to this instance, or to the FBI. If Apple has to hack its own phone for the FBI in this case thanks to a legitimate warrant, won’t it have to do the same for all warrants?
It is also not clear to me why this is Apple’s job. They may be the only ones with the skill to do it, but what obligates them to? Pharmaceutical companies are the only ones with the skill to make the deadly cocktail needed for lethal injection. Can the government force them to make and sell those chemicals so it can fulfill a court order to execute a convicted murderer?
But is it possible to open up just that one? Or do you have to create a means of opening up all of them that you will only use on that one? Apple seems to think it is the latter.
Understanding the court order and understanding the technology are not the same thing. Yes, the order includes the command for the software to include the serial number of that specific phone, but how difficult is it to write in a new serial number?
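The commenter’s point can be made concrete. A minimal sketch, with entirely hypothetical names and identifiers: if the firmware’s “one phone only” restriction is just an identifier check compiled into the build, re-targeting the tool is a one-line change for whoever controls the source and the signing key.

```python
# Hypothetical sketch: a "one device only" restriction is just a constant
# compared at run time. Whoever can rebuild and re-sign the firmware can
# swap in any other identifier. All names here are invented for illustration.
AUTHORIZED_DEVICE_ID = "DEVICE-SERIAL-0001"  # baked in at build time

def firmware_allows(device_id: str) -> bool:
    """Return True only for the device this build was targeted at."""
    return device_id == AUTHORIZED_DEVICE_ID

print(firmware_allows("DEVICE-SERIAL-0001"))  # the targeted phone: True
print(firmware_allows("DEVICE-SERIAL-0002"))  # any other phone: False
```

The check constrains everyone except the firmware’s author; only the signing process, not the serial-number comparison itself, limits where such a tool can run.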
No, why would that be reasonable? They’re not asking Apple not to have encryption on their phones.
No, why would this entail that?
They might, but the technology they used here is Apple’s. It’s much for the better if terrorists have to resort to communicating by courier.
The legal precedent on this was set long ago — this isn’t new.
Since when is shoplifting a federal crime? (And even if it were, no: They need a warrant.)
Legally or morally? Morally, I’d say the obligation is clear. Legally, there’s lots of precedent. It’s not like manufacturing the drug cocktail for an execution.
Let me see if I understand correctly….
Tim Cook, enthusiastic supporter of one piece of governmental overreach (the Obergefell decision), now finds himself and Apple the target of another piece of governmental overreach. And he is deeply troubled.
Timmy lad … The shoe is not so comfortable on the other foot? Say it isn’t so!
Surprise surprise surprise! The government powerful enough to grant you whatever YOU wish is powerful enough to compel from you whatever IT wishes.
Give me a few moments to savor the richness of this sauce.
But that’s true to begin with, by your reckoning (which seems compatible with what I understand up to that point). What do the Feds stand to gain by showing this? Surely having the ability but not having it be common knowledge that they do is more useful. Seems excessively conspiratorial: I think it’s plausible, yes, that they don’t know how to do it.
No they don’t. Warrants cannot, in a free society, be made open ended. Warrants when it comes to prosecuting American citizens have to be specific in nature and subject to ending at a specified time. I don’t think either the NSA or the FBI have this.
A-frakkin’-men. So tired of timid short-range thinking.
Don’t miss the point. They’re asking Apple to redefine what encryption means. If a certain “encryption” cannot satisfy a court ordered warrant in the future, I’d call that “illegal.”
See above.
And yet, the act would likely still have happened.
You’re telling us it is. I’m not as sure it’s cut-and-dried in this case. I can see a minor class action suit from security-conscious iPhone users who perhaps bought the phone for this feature.
I’m going for the most absurd, level of government aside. Pick your favorite federal crime. Does a warrant now trump everything, even to the point of forced modification of legally created products? I don’t consider cracking into a safe a modification; that’s just destruction.
The software is ubiquitous. It could be employed on any iPhone.
Let’s say this goes to the Supreme Court tomorrow and Apple is ordered to comply. How does the government enact that compliance? Are you willing to have our government stand over software engineers as they code and shoot all those who refuse to touch the keyboard?
And again, my favorite question: to what end? How many lives must this information actually or hopefully save for government to be so empowered over the actions of citizens and the businesses they create?
Hi Claire. I think Mr. Walker may have a point. Think back on the IRS targeting scandal. That came to light because, out of the blue, the IRS admitted they had been doing it! They WANTED everyone to know (a) that they could and would do such a thing and (b) that there was nothing anyone could do about it. Same thing with wiretapping Associated Press reporters: they seem to WANT it known that they can and do engage in this stuff.
They are playing the “chilling effect” card.
The one thing Apple could do would be to create this firmware, allow the government to brute-force its way in, and then update everyone else’s software to allow exceptionally long passwords that would be impossible to brute-force. I think what the FBI is asking is reasonable to its mind, but to Apple’s mind it’s asking them to sacrifice all of their credibility when it comes to security. Security is a big deal to consumers, and something like this could really hurt Apple, no matter how reasonable the request. In some ways it’s like the Little Sisters of the Poor and Obamacare. The FBI isn’t asking Apple to break the password, but to provide the means so that the FBI can do so; but if Apple does this, it’s tantamount to breaking into one of their own phones, whether or not they do the work.
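The arithmetic behind that idea is straightforward. A sketch in Python, assuming a hypothetical on-device rate of about 12.5 guesses per second (roughly the ~80 ms per-attempt key-derivation delay reported for iPhones of that era; the exact rate is an assumption for illustration):

```python
GUESSES_PER_SECOND = 12.5  # assumed: ~80 ms per on-device attempt

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to try every possible passcode at the assumed rate."""
    return alphabet_size ** length / GUESSES_PER_SECOND

# A 6-digit numeric PIN falls in under a day of electronic guessing...
print(worst_case_seconds(10, 6) / 3600, "hours")

# ...but a 10-character alphanumeric passcode (62-symbol alphabet)
# would take on the order of billions of years to exhaust.
print(worst_case_seconds(62, 10) / (365.25 * 24 * 3600), "years")
```

This is why the commenter’s suggestion works at all: the software-level rate limit matters far less once the passcode space itself is astronomically large.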
Considering there isn’t one instance where any of this government overreach has stopped a terrorist attack, I am certainly more worried about government abuse of these technologies than I am about being killed in a terrorist attack.
Claire, you have rejected every analogy we have given, citing precedent. I will note that many of the analogies we threw out also had precedent, and some of them were cases where the Supreme Court said, in effect, “precedent has taken you this far, but it stops here.”
You also keep insisting that “this is only for one phone.” This ignores the obvious implications of the precedent set by winning the case, and the obvious abuses of new technology that almost always occur immediately after the creation of new technology.
The government can sit and twirl. Seriously, the NSA sat on joint standards committees under the guise of helping and purposely weakened encryption and security standards for the entire planet. Forget the economic cost of 9/11; the economic cost of their gross betrayal of public trust, weakening network security for the entire human race simply to make their own job easier, is staggering, and we are all still paying it.
The idea that an organization as notoriously talent-poor and slow to adjust to changing technologies as the federal government is going to be the only, or even the best, user of these “secret” backdoors would be laughable if it weren’t so terrifying that people actually believe it.
EDIT: The idea that this “backdoor” won’t be leaked before the FBI even gets a copy is laughable as well. Apple is a huge company, and its security engineers sure as hell aren’t going to like this. The very fact that Apple is admitting this is possible may already have done serious damage.
Legal analogies usually parallel having a landlord use a key to open a door, given a warrant, or a bonded locksmith in the case of owned property. I admit it’s tricky. Steve Gibson (grc.com, Spin-Rite, and the Security Now podcast) has a pretty gloomy take on this sort of thing.
I view encryption as the electronic second amendment, and view encryption as parallel with the second, rather than the fourth. I view a requirement to build backdoors into encryption methods to be so unbelievably insecure as to constitute outlawing locks on doors altogether.
Every system known to have a backdoor has been compromised. This is the nature of this thing.
Claire,
Lawfareblog.com has a post on this from last October, “The Other Big Encryption News Last Week.” Regarding the All Writs Act:
Not sure how this fits with the current decision. The Steptoe Podcast discusses this issue. Listening to it is why I’m sure the NSA will do all it can to crack everyone’s phones.