Apple’s Reply to the FBI

 

Apple’s CEO Tim Cook has just released a message to Apple’s customers:

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

The Need for Encryption

Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The Threat to Data Security

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
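
To make that concrete, here is a minimal sketch in Python using the third-party cryptography package (the message text is illustrative): the ciphertext is only as strong as the secrecy of the key, and anyone who obtains the key bytes can read the data.

```python
# Minimal illustration: an encrypted blob is only as safe as its key.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()                 # the "key": just a short string of bytes
ciphertext = Fernet(key).encrypt(b"messages, photos, locations")

# Without the key, the ciphertext is opaque. With it, anyone can read it,
# regardless of who they are or how the key was obtained.
someone_else = Fernet(key)
print(someone_else.decrypt(ciphertext))     # b'messages, photos, locations'
```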

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

A Dangerous Precedent

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
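
To put rough numbers on "thousands or millions of combinations," here is a back-of-the-envelope sketch in Python. It assumes roughly 80 milliseconds per attempt, the figure Apple's iOS Security Guide has cited for on-device passcode key derivation, and it assumes the escalating retry delays and the ten-attempt auto-erase are disabled, as the order requests; the passcode spaces are illustrative.

```python
# Back-of-the-envelope arithmetic for electronic passcode guessing, assuming
# ~80 ms per attempt (the key-derivation cost Apple's iOS Security Guide has
# cited) and no artificial retry delays or auto-erase.
SECONDS_PER_ATTEMPT = 0.080

def worst_case_seconds(keyspace: int) -> float:
    """Time to try every possible passcode, back to back."""
    return keyspace * SECONDS_PER_ATTEMPT

for label, space in [("4-digit PIN", 10**4),
                     ("6-digit PIN", 10**6),
                     ("8-char letters+digits", 36**8)]:
    s = worst_case_seconds(space)
    print(f"{label:<22} ~{s / 3600:>12,.1f} hours (~{s / 31_557_600:,.1f} years)")
```

The arithmetic is the point: a short numeric passcode falls within hours once guesses can be submitted electronically, while a long alphanumeric one does not.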

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Tim Cook

Your thoughts?

Published in Islamist Terrorism, Science & Technology

  1. Kwhopper Inactive
    Kwhopper
    @Kwhopper

    Claire Berlinski, Ed.

    Depends how hard that is to do and what’s in the safe. I’d go with common sense. If it can be done reasonably easily, and if I had good reason to think that inside the safe was information that might be extremely relevant to saving many Americans’ lives, of course. Wouldn’t you agree?

    You guys can all argue this particular instance all day. But if they’re allowed to force Apple to do this after the fact, is it unreasonable to assume any encryption that the government wants to crack becomes de facto illegal? Will the FBI now be allowed to tell Apple or Microsoft or Google that their next OS has encryption that’s too strong and they must allow a backdoor before release?

    Have you further considered that if Apple is forced to do this, why wouldn’t terrorists switch to some other technology? The problem is you think this is a single instance and only “now” matters but the side effects are legion down the road.

    Finally, do you see this extending to less egregious things? If someone shoplifts a big screen TV and they supposedly told their friends where it was hidden through iPhone instant messaging, should the FBI force Apple to decrypt that phone?

    • #91
  2. Valiuth Member
    Valiuth
    @Valiuth

    David Carroll: If you have no problem with the government’s order (thanks, Claire, for the link to it), you favor trusting the government with potentially freedom-trashing information in favor of promoting safety and security (i.e., potentially finding more terrorists). If you favor protecting the security of all iPhones from the government, you see freedom from potential government abuse as more important than helping the government identify some more potential terrorists.

    But what the government is asking would actually make millions of people less safe. Maybe not from terrorists, but from a whole host of other more common, if less lethal, threats. Isn’t the government basically asking Apple to generate a means of opening up any phone? Isn’t this like asking a lock maker to make a key that can open any lock? In a limited sense it seems okay, but if and once such a key, and the means to make it, get out, then everyone’s home is open to future burglary.

    The other question is whether the government can force Apple to make such a thing. Surely, if the government itself makes this device/program, that is within its power and right, maybe even its duty. But does Apple have to comply with what it thinks is a bad idea for itself and its customers? What kind of service do we owe the government in such cases? Could Einstein have been legally forced to work on a nuke for the government?

    • #92
  3. Claire Berlinski, Ed. Member
    Claire Berlinski, Ed.
    @Claire

    David Carroll: If the back door is created, the next court order is even easier.

    The reason this one is easy is that all the past court orders held up. This is not some completely bizarre, out-of-nowhere court order.

    • #93
  4. The King Prawn Inactive
    The King Prawn
    @TheKingPrawn

    Claire Berlinski, Ed.: How could this not seem like reasonable grounds to search his phone? Do we imagine the FBI can solve this by saying “radical Islamic terrorism” three times loudly?

    Yes, they have reasonable grounds to search that phone, and they have searched it to the absolute extent of their current capability. They are now demanding that a capability which does not currently exist be brought into being solely for their singular (and supposedly one-time-only) use. It is the ability to command creation which is in question here, not the authority to conduct a search.

    • #94
  5. Claire Berlinski, Ed. Member
    Claire Berlinski, Ed.
    @Claire

    Valiuth: Isn’t the government basically asking Apple to generate a means of opening up any phone?

    Nope. Just that one. If I correctly understand the court order.

    • #95
  6. The King Prawn Inactive
    The King Prawn
    @TheKingPrawn

    Claire Berlinski, Ed.:

    The King Prawn: for our own good

    But that wouldn’t in fact be in our own good. If they did that, we’d all starve. Stopping more terrorist attacks would be in our own good. We have a government to provide for the common defense. That’s the one legitimate purpose we all agree on, surely.

    Our common defense is a military, not a law enforcement, function.

    And should they promise to provide all the housing, clothing, and food we require, we’d have no rational objection to such confiscation.

    • #96
  7. Kwhopper Inactive
    Kwhopper
    @Kwhopper

    Claire Berlinski, Ed.

    Valiuth: Isn’t the government basically asking Apple to generate a means of opening up any phone?

    Nope. Just that one. If I correctly understand the court order.

    I’m sorry, but you don’t understand the issue. It is impossible to generate the “means” and have it apply to one and only one phone.

    • #97
  8. David Carroll Thatcher
    David Carroll
    @DavidCarroll

    Claire Berlinski, Ed.:

    David Carroll: If the back door is created, the next court order is even easier.

    The reason this one is easy is because all the past court orders held up. This is not some completely bizarre, out-of-nowhere court order.

    Actually, because of the level of effort demanded of an innocent private party, Apple, this order seems to go beyond past orders.

    The court orders Apple to “advise the government of the reasonable cost of providing this service”, but does not order the government to pay the reasonable cost.  Hmmm.

    • #98
  9. Tennessee Inactive
    Tennessee
    @Tennessee

    Kwhopper,

    “The problem is you think this is a single instance and only “now” matters but the side effects are legion down the road.”

    Exactly.  The government is looking for a precedent in the current crypto war.  The lawyers and bureaucrats will stream in after the breach is made.

    Also, the analogies about breaking into houses don’t seem to line up with hacking one’s personal digital information. We’re talking about getting someone’s info in near-realtime from the 24/7 trackers we call phones, which carry more kinds of information than is normally stored in a dusty office’s file cabinet.

    • #99
  10. The King Prawn Inactive
    The King Prawn
    @TheKingPrawn

    Claire Berlinski, Ed.: Forcing them to do it for a phone

    Good luck holding government to its word once it has been allowed this power. Has government ever given back a power it has taken, without revolution?

    • #100
  11. Bob W Member
    Bob W
    @WBob

    The FBI got a warrant to search one particular phone. There’s nothing wrong with that. Then they got an order to get Apple to help them execute that warrant. Either Apple is able to do that, or they are not. What’s the issue? If it’s impossible, just tell them that. If not, why should they balk at it? This isn’t about whether they should make their phones hackable. It’s just about whether they can or cannot help with this warrant. What am I missing?

    • #101
  12. Valiuth Member
    Valiuth
    @Valiuth

    Claire Berlinski, Ed.:

    Valiuth: Does the government have a right to force the maker of such a device to undo their work?

    They’re not forcing them to undo their work — forcing them to do that for all phones would amount to that. Forcing them to do it for a phone, — one very reasonably suspected of containing evidence about a heinous crime and information relevant to stopping another one — is quite another story.

    True, but once you generate the process to undo such a device, all phones become at risk of having the process repeated on them. If you want to avoid temptation, you don’t start by licking the forbidden fruit. Also, it is not clear to me that such a request will be limited to this instance, or to the FBI. If Apple has to hack its own phone for the FBI in this case thanks to a legitimate warrant, won’t it have to do the same for all warrants?

    It is also not clear to me why this is Apple’s job. They may be the only ones with the skill to do it, but what obligates them to? Pharmaceutical companies are the only ones with the skill to make the deadly cocktail needed for lethal injection. Can the government force them to make and sell those chemicals so it can fulfill a court order to execute a convicted murderer?

    • #102
  13. Valiuth Member
    Valiuth
    @Valiuth

    Claire Berlinski, Ed.:

    Valiuth: Isn’t the government basically asking Apple to generate a means of opening up any phone?

    Nope. Just that one. If I correctly understand the court order.

    But is it possible to open up just that one? Or do you have to create a means of opening up all of them that you will use on only that one? Apple seems to think it is the latter.

    • #103
  14. The King Prawn Inactive
    The King Prawn
    @TheKingPrawn

    Claire Berlinski, Ed.:

    Valiuth: Isn’t the government basically asking Apple to generate a means of opening up any phone?

    Nope. Just that one. If I correctly understand the court order.

    Understanding the court order and understanding the technology are not the same thing. Yes, the order includes the command for the software to include the serial number of that specific phone, but how difficult is it to write in a new serial number?
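
    A device-specific restriction of this kind typically amounts to a hardcoded identifier check inside the signed build. A purely hypothetical sketch (the constant and function name below are invented for illustration, not Apple's code):

```python
# Purely hypothetical sketch of a "device-locked" build; the identifier and
# function name are invented for illustration and are not Apple's code.
AUTHORIZED_DEVICE_ID = "EXAMPLE-SERIAL-0001"   # placeholder baked into the build

def allow_unlimited_passcode_attempts(device_id: str) -> bool:
    # The device lock is a single comparison against a constant. Whoever can
    # edit this constant and re-sign the build has a tool for a different
    # phone; the restriction lives in the build-and-sign process, not in the
    # technique itself.
    return device_id == AUTHORIZED_DEVICE_ID
```

    In other words, the safeguard is Apple's signing process, not anything intrinsic to the software.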

    • #104
  15. Claire Berlinski, Ed. Member
    Claire Berlinski, Ed.
    @Claire

    Kwhopper:

    But if they’re allowed to force Apple to do this after the fact, is it unreasonable to assume any encryption that the government wants to crack becomes de facto illegal?

    No, why would that be reasonable? They’re not asking Apple not to have encryption on their phones.

    Will the FBI now be allowed to tell Apple or Microsoft or Google that their next OS has encryption that’s too strong and they must allow a backdoor before release?

    No, why would this entail that?

    Have you further considered that if Apple is forced to do this, why wouldn’t terrorists switch to some other technology?

    They might, but the technology they used here is Apple’s. It’s much for the better if terrorists have to resort to communicating by courier.

    The problem is you think this is a single instance and only “now” matters but the side effects are legion down the road.

    The legal precedent on this was set long ago — this isn’t new.

    Finally, do you see this extending to less egregious things? If someone shoplifts a big screen TV and they supposedly told their friends where it was hidden through iPhone instant messaging, should the FBI force Apple to decrypt that phone?

    Since when is shoplifting a federal crime? (And even if it were, no: They need a warrant.)

    • #105
  16. Claire Berlinski, Ed. Member
    Claire Berlinski, Ed.
    @Claire

    Valiuth: They may be the only ones to have the skill to do it, but what obligates them to?

    Legally or morally? Morally, I’d say the obligation is clear. Legally, there’s lots of precedent. It’s not like manufacturing the drug cocktail for an execution.

    • #106
  17. Ekosj Member
    Ekosj
    @Ekosj

    Let me see if I understand correctly….

    Tim Cook, enthusiastic supporter of one piece of governmental overreach (the Obergefell decision), now finds himself and Apple the target of another piece of governmental overreach. And he is deeply troubled.

    Timmy lad … The shoe is not so comfortable on the other foot? Say it isn’t so!

    Surprise surprise surprise! The government powerful enough to grant you whatever YOU wish is powerful enough to compel from you whatever IT wishes.

    Give me a few moments to savor the richness of this sauce.

    • #107
  18. Claire Berlinski, Ed. Member
    Claire Berlinski, Ed.
    @Claire

    anonymous:

    So, they take a high-profile case and then, rather than crack the passcode themselves, force Apple, through a court order, to be complicit in bypassing their customers’ security, “just this one time”. But that one time makes Apple bend to the will of the snooper state, and demonstrate for all to see that its resistance has been broken. And when that happens, no iOS user can be confident their data are secure.

    But that’s true to begin with, by your reckoning (which seems compatible with what I understand up to that point). What do the Feds stand to gain by showing this? Surely having the ability but not having it be common knowledge that they do is more useful. Seems excessively conspiratorial: I think it’s plausible, yes, that they don’t know how to do it.

    • #108
  19. Robert McReynolds Member
    Robert McReynolds
    @

    Claire Berlinski, Ed.:

    Robert McReynolds: Get a warrant or go to hell.

    They have a warrant.

    No, they don’t. Warrants cannot, in a free society, be open-ended. Warrants for prosecuting American citizens have to be specific in nature and subject to expiration at a specified time. I don’t think either the NSA or the FBI has one here.

    • #109
  20. Owen Findy Inactive
    Owen Findy
    @OwenFindy

    Kwhopper: The problem is you think this is a single instance and only “now” matters but the side effects are legion down the road.

    A-frakkin’-men.  So tired of timid short-range thinking.

    • #110
  21. Kwhopper Inactive
    Kwhopper
    @Kwhopper

    Claire Berlinski, Ed.

    Kwhopper:

    But if they’re allowed to force Apple to do this after the fact, is it unreasonable to assume any encryption that the government wants to crack becomes de facto illegal?

    No, why would that be reasonable? They’re not asking Apple not to have encryption on their phones.

    Don’t miss the point. They’re asking Apple to redefine what encryption means. If a certain “encryption” cannot satisfy a court-ordered warrant in the future, I’d call that “illegal.”

    Will the FBI now be allowed to tell Apple or Microsoft or Google that their next OS has encryption that’s too strong and they must allow a backdoor before release?

    No, why would this entail that?

    See above.

    Have you further considered that if Apple is forced to do this, why wouldn’t terrorists switch to some other technology?

    They might, but the technology they used here is Apple’s. It’s much for the better if terrorists have to resort to communicating by courier.

    And yet, the act would likely still have happened.

    The problem is you think this is a single instance and only “now” matters but the side effects are legion down the road.

    The legal precedent on this was set long ago — this isn’t new.

    You’re telling us it is. I’m not as sure it’s cut-and-dried in this case. I can see a minor class action suit from security-conscious iPhone users who perhaps bought the phone for this feature.

    Finally, do you see this extending to less egregious things? If someone shoplifts a big screen TV and they supposedly told their friends where it was hidden through iPhone instant messaging, should the FBI force Apple to decrypt that phone?

    Since when is shoplifting a federal crime? (And even if it were, no: They need a warrant.)

    I’m going for the most absurd example; level of government aside, pick your favorite federal crime. Does a warrant now trump everything, even to the point of forced modification of legally created products? I don’t consider cracking into a safe a modification; that’s just destruction.

    • #111
  22. BrentB67 Inactive
    BrentB67
    @BrentB67

    Claire Berlinski, Ed.:

    Valiuth: Isn’t the government basically asking Apple to generate a means of opening up any phone?

    Nope. Just that one. If I correctly understand the court order.

    Once the software exists, it is not specific to that phone. It could be employed on any iPhone.

    • #112
  23. The King Prawn Inactive
    The King Prawn
    @TheKingPrawn

    Let’s say this goes to the Supreme Court tomorrow and Apple is ordered to comply. How does the government enforce that compliance? Are you willing to have our government stand over software engineers as they code and shoot all those who refuse to touch the keyboard?

    And again, my favorite question: to what end? How many lives must this information actually or hopefully save for government to be so empowered over the actions of citizens and the businesses they create?

    • #113
  24. Ekosj Member
    Ekosj
    @Ekosj

    Hi Claire. I think Mr Walker may have a point. Think back on the IRS targeting scandal. That came to light because, out of the blue, the IRS admitted they had been doing it! They WANTED everyone to know (a) that they could and would do such a thing and (b) that there was nothing anyone could do about it. Same thing with wiretapping Associated Press reporters … They seem to WANT it known that they can and do engage in this stuff.

    They are playing the “Chilling Effect”.

    • #114
  25. Brandon Shafer Coolidge
    Brandon Shafer
    @BrandonShafer

    The one thing Apple could do would be to create this firmware, allow the government to brute-force its way in, and then update everyone else’s software to allow exceptionally long passwords that would be impossible to brute-force. I think what the FBI is asking is reasonable to its mind, but to Apple’s, it’s asking them to sacrifice all of their credibility when it comes to security. Security is a big deal to consumers, and something like this could really hurt Apple, no matter how reasonable the request. In some ways it’s like the Little Sisters of the Poor and Obamacare. The FBI isn’t asking Apple to break the password itself, but to provide the means so that the FBI can do so; yet if Apple does this, it’s tantamount to breaking into one of their own phones, whether or not they do the work.

    • #115
  26. Robert McReynolds Member
    Robert McReynolds
    @

    David Carroll: Forget what is legal and what is illegal for a moment. This discussion is about balancing safety and security versus freedom.

    If you have no problem with the government’s order (thanks, Claire, for the link to it), you favor trusting the government with potentially freedom-trashing information in favor of promoting safety and security (i.e., potentially finding more terrorists). If you favor protecting the security of all iPhones from the government, you see freedom from potential government abuse as more important than helping the government identify some more potential terrorists.

    I guess in the long run, I fear government abuse (which would be more widespread and long lasting) more than I believe in the (probably remote) likelihood of the to-be-created back-door for (supposedly) one iPhone helping the government to stop the next terrorist bomber.

    Considering there isn’t one instance where any of the government overreach has stopped a terrorist attack, I am certainly more worried about government abuse of these technologies than I am about being killed in a terrorist attack.

    • #116
  27. Lazy_Millennial Inactive
    Lazy_Millennial
    @LazyMillennial

    Claire, you have rejected every analogy we have given, citing precedent. I will note that many of the analogies we threw out also had precedent, and some of them were cases where the Supreme Court said, in effect, “precedent has taken you this far, but it stops here.”

    You also keep insisting that “this is only for one phone.” This ignores the obvious implications of the precedent set by winning the case, and the obvious abuses that almost always occur immediately after a new technology is created.

    • #117
  28. Nyadnar17 Inactive
    Nyadnar17
    @Nyadnar17

    The government can sit and twirl. Seriously, the NSA sat on joint standards committees under the guise of helping and purposely weakened encryption and security standards for the entire planet. Forget the economic cost of 9/11; the economic cost of their gross betrayal of public trust in weakening network security for the entire human race, simply to make their job easier, is staggering, and we are all still paying it.

    The idea that an organization as notoriously talent-poor and slow to adjust to changing technologies as the Federal Government is going to be the only, or even the best, user of these “secret” backdoors would be laughable if it weren’t so terrifying that people actually believe that notion.

    EDIT: The idea that this “backdoor” won’t be leaked before the FBI even gets a copy is laughable as well. Apple is a huge company, and its security engineers sure as hell aren’t going to like this. The very idea that Apple is even admitting this is possible might have already done serious damage.

    • #118
  29. Ball Diamond Ball Member
    Ball Diamond Ball
    @BallDiamondBall

    Legal analogies usually parallel having a landlord use a key to open a door, given a warrant, or a bonded locksmith in the case of owned property.  I admit it’s tricky.  Steve Gibson (grc.com, Spin-Rite, and the Security Now podcast) has a pretty gloomy take on this sort of thing.

    I view encryption as the electronic Second Amendment, parallel with the Second rather than the Fourth. I view a requirement to build backdoors into encryption methods as so unbelievably insecure as to constitute outlawing locks on doors altogether.

    Every system known to have a backdoor has been compromised.  This is the nature of this thing.

    • #119
  30. Tennessee Inactive
    Tennessee
    @Tennessee

    Claire,

    Lawfareblog.com has a post on this from last October, “The Other Big Encryption News Last Week.”  Regarding the All Writs Act:

    “Magistrate Judge James Orenstein of the U.S. District Court for the Eastern District of New York signaled a certain unwillingness on Friday to sanction a legal maneuver increasingly employed by the federal government to break locked devices….he forewarned that he does not think the government or the courts have the legal authority to compel Apple to unlock its devices….

    The Supreme Court has made clear that the All Writs Act empowers federal courts to issue writs that are not otherwise governed by statute. Where a statutory scheme addresses the issue at hand, however, it is that statute that controls the court’s authority. Significantly, the Court has stated that the All Writs Act does not authorize courts to issue writs “whenever compliance with statutory procedures appears inconvenient or less appropriate.” Rather, the power to issue commands extends only so far as is necessary to effectuate orders a court has issued in the exercise of jurisdiction otherwise obtained.”

    …it is not even clear if the alternative of having Apple unlock the phone is technologically possible. The court cannot order Apple or any other company to do something that is impossible to achieve.

    Not sure how this fits with the current decision.  The Steptoe Podcast discusses this issue.  Listening to it is why I’m sure the NSA will do all it can to crack everyone’s phones.

    • #120