Apple’s iPhone Blunder


Can the United States government compel Apple to help break into the phone of Syed Rizwan Farook, who, along with his wife Tashfeen Malik, gunned down fourteen innocent people last December at the Inland Regional Center in San Bernardino? That question has sparked fireworks in recent days. The dispute arises because Apple has equipped its new iPhones with an encryption setting that erases the data contained on the phone once ten false passcode entries have been made. It was agreed on all sides that only Apple has the technology that might overcome this protection.
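In outline, the setting ties a failure counter to the destruction of the key material that protects the stored data. The following Python sketch is a simplified, hypothetical model for illustration only; Apple's actual check runs in dedicated hardware with escalating time delays between attempts, and the class and method names here are invented.

```python
# Simplified model of the erase-after-ten-failures setting.
# Illustrative only: the real check runs in secure hardware
# with escalating delays between attempts.

MAX_ATTEMPTS = 10

class Device:
    def __init__(self, passcode: str):
        self._passcode = passcode      # stands in for the derived key
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("data erased")
        if guess == self._passcode:
            self.failed_attempts = 0   # success resets the counter
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            # Key material destroyed; the data is unrecoverable.
            self.wiped = True
        return False
```

Once the tenth wrong guess lands, every later attempt fails even with the correct passcode, which is why the government cannot simply guess at the phone.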

In her short order of February 16, Magistrate Judge Sheri Pym instructed Apple to offer that assistance, pursuant to the All Writs Act, which dates back to the First Congress in 1789. The heart of the matter lies in whether the government—more specifically the FBI—can require computer companies to build in back doors to their systems to allow the government to enter.

I participated in hearings that then-Senator John Ashcroft held in March 1998, and spoke in opposition to the measure, along with Kathleen Sullivan, then a professor of law at Stanford Law School. The greatest risk of a built-in back door is that the government will not be the only party that enters through it. Back doors necessarily compromise the integrity of a security system. There were therefore serious constitutional as well as practical objections to these early proposals. It would be highly dangerous to allow the government to seize confidential data without first obtaining a search warrant, except in conditions of genuine emergency. And the theft of confidential data inflicts harms for which compensation from the government, assuming it were available, could never repair the damage or restore the confidence that people have in the system.

It should not be supposed, however, that the proposals bandied about in 1998 reflect the state of play on the ground today. The first myth to dispel is that the current case has anything to do with data privacy at all. On the day the order was issued, Apple CEO Tim Cook posted a strong message to his customers denouncing the government. Unfortunately, Cook gave away the privacy game when he noted that Apple had already cooperated with the government by turning over all data in its possession pursuant to a valid search warrant. The difficulty here is that the information sought from Farook’s iPhone had not been backed up, so the government could not conduct a simple search on its own to get it. Instead, it had to attack the encryption built into the phone itself.

In dealing with that issue, it is important to note that Farook did not own the phone; his employer did, and it gave consent to the search. This knocked out any Fourth Amendment claim that the government intended to perform some unreasonable search and seizure. It is true, but also inconsequential, that the legal situation would not materially change if Farook had used his personal password on his very own phone. The Fourth Amendment states, “no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” Clearly, these requirements were satisfied when the government identified the iPhone to be searched, knowing that its possessor had committed mass murder. One of the tragic gaps in Cook’s letter is that he ignores the strength of the government’s Fourth Amendment case. He also fails to explain why granting the government’s request necessarily compromises the privacy of millions when only one iPhone is at stake.

Cook skirted the Fourth Amendment issue. Instead, the gist of his claim is contained in the following sentence: “Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.” Every part of this sentence is misguided. First, there are thousands of government applications each year under the All Writs Act; there is no reason whatsoever why the government has to seek new legislation to cover a situation that is amply covered by current law.

Nor is Cook correct in insisting that the All Writs Act does not cover this particular case. The relevant portion of that short statute reads: “all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” Terms like “necessary,” “appropriate,” “usages,” and “principles” were chosen precisely because Congress did not want to pin judicial discretion to particular technologies known in 1789 or any subsequent date. That language is no more problematic than the text of the Fifth Amendment, which holds that the United States shall not deprive any person “of life, liberty or property, without due process of law.”

No one doubts that these terms can give rise to difficult cases at the margins. But it hardly follows that Apple’s case is one of them just because it hogs the headlines. Cook attempts in his letter to stake out a per se rule that it is somehow outside the scope of the All Writs Act to require any company to work with the government in overcoming technological barriers. The case law on this question is well settled, and the government brief has assembled an impressive list of precedents in which private parties have been required to assist the government in its legitimate enforcement efforts. These include “ordering a phone company to assist with a trap and trace device” or having a company “assist in accessing a cell phone’s files so that a warrant may be executed as originally contemplated.” It is also the case that Apple had assisted the government without complaint in over 70 other cases.

To be sure, the order here is more complex than those imposed in other cases. But the legitimate government interest in the data is also more compelling, so it is perhaps more accurate to say that what is truly unprecedented is that any company would seek to defy the government when the stakes are so high. Right now, Apple is worried that assisting the government will tarnish its brand. Cook may well be wrong about that. The better strategy might be to insist on the narrowness of the order, thereby avoiding the current soap opera. But Cook seems intent on turning the case into a heroic struggle, by making some dubious leaps of logic:

If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

This parade of horribles is unworthy of Apple. Judge Pym knows full well that a balancing test is implicit in any application of the All Writs Act. She therefore made it clear that Apple’s efforts are narrowly limited to the task of uncovering the requested information from this device, and no other; that Apple may limit the government to the remote access needed to recover the passcode by brute force; and that Apple could protest the order within five days if it believed compliance with its terms would be “unreasonably burdensome.” Her order does not pave the way for the government to unilaterally insist in the next case that Apple must disclose the private information of millions of people.

Remember that the central motif of the All Writs Act is one of balance and proportion. Allowing the government to win in one case under dramatic circumstances does not give it carte blanche to do whatever it wants in other cases. I strongly opposed the push for a mandatory built-in back door in 1998, and I would oppose one today. But, again, it is irresponsible hyperbole for Cook to write that “it would be wrong for the government to force us to build a backdoor into our products.” That broad request is nowhere to be found in the government’s plea for a specific fix, under which it need not be told anything about the technology that Apple will use to overcome the data-protection feature on this iPhone.

Other defenses of Apple’s legal position are no better. Writing in The New Yorker, Amy Davidson also resorts to improbable scenarios to find the government’s request dangerous. She writes: “If it can tell Apple, which has been accused of no wrongdoing, to sit down and write a custom operating system for it, what else could it do?”

Once again, this reasoning is flawed. Of course the All Writs Act applies to persons who are accused of no wrong. Persons accused of no wrong still have to comply with warrants or subpoenas in cases in which their noninvolvement is wholly beyond dispute. The government can and does often compel labor in these situations. The hard question, again, is whether the need for the evidence trumps any broad claim of privacy, which, given the specificity of the request to Apple, it surely does. Nor is Apple being asked to make an operating system. The government just wants to find the password without destroying the data. And as to the last question, the burden is always on the government to explain what it needs, not on any private party to parry an infinite set of government claims.

Davidson is on even weaker ground when she muses that the government might use its power to further violate the rights of individuals: “Could an imam, for example,” she writes, “be asked not only to tell what he knows but to manufacture an informant?” In posing her hypotheticals, Davidson takes no note whatsoever of the undisputed point that the only party to whom the government could turn for this assistance is Apple itself.

It is a commonplace of the law that anyone who has a monopoly over some public utility or common carrier can be required to offer service on reasonable and nondiscriminatory terms. The existence of the monopoly is the justification for the extra burden. The government’s case against Apple is perfectly analogous to the common carrier situation insofar as the services it demands cannot be competitively provided. But when it comes to finding informants, the government is just as able, indeed more able, to do that by itself, which is why it never makes such silly requests under the All Writs Act.

Davidson concludes her think piece with a misguided flourish. In using the All Writs Act, she argues, “the government is attempting to circumvent the constitutionally serious character of many questions about encryption and privacy.” Nonsense. There are no constitutional issues raised by this government demand, even if some such claims might arise in other circumstances. The Wall Street Journal reports that Apple has hired two “heavyweight” lawyers at the prestigious law firm Gibson Dunn, Theodore Olson and Theodore Boutrous. Both are great lawyers who will have their work cut out for them.

© 2016 by the Board of Trustees of Leland Stanford Junior University

Members have made 54 comments.

  1. Brian Clendinen (Member)

    I am glad you wrote this up. I keep trying to argue that this request, from a technical standpoint, is not “unreasonably burdensome” and is not a backdoor, just a workaround on the existing security software that only Apple has the ability to perform without making it a huge science project that could fail. Apple is winning the PR war right now even though they are completely in the wrong.

    • #1
    • February 22, 2016 at 12:51 pm
  2. EJHill (Contributor)

    Here I beg to differ with the esteemed professor.

    Is not the government demanding that a private company produce, gratis, a method for accessing information on the phone? It is not the Fourth Amendment rights of the terrorist that Apple is protecting; it is its own. Since when can any judge demand that someone provide labor or material support without compensation? Even men drafted into service can be compelled to serve, but not to serve for free.

    The public argument offered by Apple is probably just a smokescreen, as complaining about compensation may come across as mercenary, especially as it relates to terrorism.

    Apple could probably just take the phone into a back room and unlock it, thereby protecting any trade secrets and proprietary information, but that would violate the chain of custody of evidence and render information taken from the device inadmissible for the prosecution of others.

    • #2
    • February 22, 2016 at 1:03 pm
  3. TomJedrz (Member)

    Here is the problem.

    The government has demonstrated that it can’t be trusted to follow its own rules, nor can it be trusted to keep secrets. So, once this backdoor is created by Apple, ostensibly for use only once on this one phone, it exists. The likelihood that it will be used again is 100%. The likelihood that it will be used illegally in the future is very nearly 100%. And the likelihood that it will be compromised and become available to China, Russia, North Korea, and/or Iran is very nearly 100% as well.

    So, in my mind, the small potential gain from getting into this terrorist’s phone is overcome by the huge risk and likelihood that the crack tool created by Apple would become widely known and used.

    • #3
    • February 22, 2016 at 1:13 pm
  4. Matt Upton (Thatcher)

    Apple posted a FAQ about the issue that addresses some of the gaps in the first letter.

    • #4
    • February 22, 2016 at 1:30 pm
  5. ctlaw (Thatcher)

    Step 1: this writ requiring Apple write the unlocking tool and use it on this iPhone.

    Step 2: a writ requiring Apple to provide the unlocking tool to the Government to use independently of Apple.

    Step 3: use the tool without warrants on the circular logic ground that once the Government has the tool, iPhone owners no longer have an expectation of privacy.

    • #5
    • February 22, 2016 at 1:36 pm
  6. Joseph Stanko (Member)

    EJHill: Since when can any judge demand someone provide labor or material support without compensation?

    Since Congress passed the All Writs Act in 1789:

    Richard Epstein: The case law on this question is well settled, and the government brief has assembled an impressive list of precedents in which private parties have been required to assist the government in its legitimate enforcement efforts. These include “ordering a phone company to assist with a trap and trace device” or having a company “assist in accessing a cell phone’s files so that a warrant may be executed as originally contemplated.” It is also the case that Apple had assisted the government without complaint in over 70 other cases.

    • #6
    • February 22, 2016 at 1:37 pm
  7. Eric Hines (Inactive)

    What Epstein carefully elides is this part of the government’s pleading in response to Apple’s impertinent demurral from complying without its appeal being heard.

    Where Apple designed its software and that design interferes with the execution of search warrants, where it manufactured and sold a phone used by an ISIL-inspired terrorist, where it owns and licensed the software used to further the criminal enterprise, where it retains exclusive control over the source code necessary to modify and install the software, and where that very software now must be used to enable the search ordered by the warrant, compulsion of Apple is permissible under New York Telephone Co.

    This is plainly, dishonestly specious on the government’s part, and it is, by design, purely character assassination. As such, it has no place in a court brief. Comey’s FBI lawyers know this. Apple designed its software and…manufactured and sold a phone used by…millions of American citizens, where it owns and licensed the software used to further the private affairs of American citizens…. It is plain from the careful construction of the government’s argument that it intends to expand it to pry into all of our private affairs whenever it takes a notion to.

    And of course compliance with the order is a threat to other users of Apple products: the encryption, once broken, or a way once found to bypass entry controls, is permanently and everywhere defeated. The FBI’s IT personnel know this. So do the government’s NSA personnel. Neither can Apple make clear to the world that the tool does not apply to other devices or users without the personnel making such statements being guilty of lying. Breaking an encryption algorithm, or producing a way past its entry controls, permanently and everywhere destroys the security of that algorithm. The qualifier “without lawful court orders” is just as disingenuous.

    Only Apple will have access to such a tool, and it may destroy it after its use? Not so fast. Apple may destroy it after the government is through with its case. And there’s that chain of custody detail that EJHill raised above. That will be preserved, which means the government’s men will be present while “Apple” uses the tool. And it won’t only be the government’s lawyers, it’ll be the government’s encryption and forensics experts watching, to ensure Apple is using the tool as ordered. And those government IT folks will be watching, and recording, most carefully. They’ll have the tool themselves when all is said and done.

    Keep in mind, too, that this is the same Comey FBI that has been demanding companies be required to give to government decryption keys, knowing full well that such back doors irrevocably destroy the encryption.

    Eric Hines

    • #7
    • February 22, 2016 at 1:41 pm
  8. Tennessee (Inactive)

    “She therefore made it clear that Apple’s efforts are narrowly limited to the task of uncovering this requested information from this device, and no other…”

    That is not how this is going to work and we all know it. Remember your Princess Bride: “We are men of action. Lies do not become us.”

    They have to… Have. To… write code to do the work around. This is the key. Do you think they’ll be allowed to unwrite it? Unring the bell?

    NSA steals encryption keys. Google Gemalto. I doubt the code will go anywhere but straight into the government’s pocket.

    FBI Director Comey has already declared publicly that the FBI wants tech companies to change their business models–and this is his test case to make that happen.

    It is clear the FBI/NSA doesn’t want people to have secure private spaces.

    It isn’t only bad guys who need encryption and security.

    • #8
    • February 22, 2016 at 1:42 pm
  9. Joseph Stanko (Member)

    ctlaw:Step 1: this writ requiring Apple write the unlocking tool and use it on this iPhone.

    Step 2: a writ requiring Apple to provide the unlocking tool to the Government to use independently of Apple.

    Then Apple should keep their powder dry, comply with this current order, and then speak up if Step #2 occurs when they will be on solid legal and moral ground to resist instead of crying wolf now.

    • #9
    • February 22, 2016 at 1:43 pm
  10. Eric Hines (Inactive)

    Joseph Stanko:

    ctlaw:Step 1: this writ requiring Apple write the unlocking tool and use it on this iPhone.

    Step 2: a writ requiring Apple to provide the unlocking tool to the Government to use independently of Apple.

    Then Apple should keep their powder dry, comply with this current order, and then speak up if Step #2 occurs when they will be on solid legal and moral ground to resist instead of crying wolf now.

    And be denied their right to appeal the order? Will the electrons on the phone evaporate during the contest?

    Never mind that Step 1 makes the others irrelevant.

    Eric Hines

    • #10
    • February 22, 2016 at 1:45 pm
  11. ctlaw (Thatcher)

    Joseph Stanko:

    ctlaw:Step 1: this writ requiring Apple write the unlocking tool and use it on this iPhone.

    Step 2: a writ requiring Apple to provide the unlocking tool to the Government to use independently of Apple.

    Then Apple should keep their powder dry, comply with this current order, and then speak up if Step #2 occurs when they will be on solid legal and moral ground to resist instead of crying wolf now.

    As Eric stated, that’s not the only problem.

    • #11
    • February 22, 2016 at 1:47 pm
  12. Chris B (Member)

    TomJedrz: And the likelihood that it will be compromised and become available to China, Russia, North Korea, and/or Iran is very nearly 100% as well.

    Not so much. Apple has the ability to create custom builds of iOS that will only run on devices with specific hardware IDs. They do this all the time with test releases of new versions.

    Implementing the FBI’s requested changes requires accessing the device in firmware update mode. The iPhone hardware is designed in such a way that it will always check to see if the firmware being uploaded to it is signed with Apple’s unique security certificate. Any modification to a signed software package will change the hash value and render the signing invalid.

    It is therefore possible for Apple to remain completely in control over which phones are rendered accessible. Even with the source code (which no one is asking for), it won’t be possible for a third party to create a signed version that will actually run on another phone so long as no one is able to get their hands on Apple’s Certificate Authority server. If anyone ever manages that . . . Well, let’s just say Apple has bigger problems at that point.

    • #12
    • February 22, 2016 at 1:48 pm
  13. Tuck (Inactive)

    Richard Epstein: …Nor is Apple being asked to make an operating system. The government just wants to find the password without destroying the data….

    And the only way to do that is to create a custom operating system to allow it. So the first sentence is in error.

    Here’s how a computer-security expert explains it:

    “But that iPhone has a security flaw. While the data is encrypted, the software controlling the phone is not. This means that someone can create a hacked version of the software and install it on the phone without the consent of the phone’s owner and without knowing the encryption key. This is what the FBI, and now the court, is demanding Apple do: It wants Apple to rewrite the phone’s software to make it possible to guess possible passwords quickly and automatically.”

    The “phone’s software” is the operating system.
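    The “guess possible passwords quickly and automatically” step is trivial once the retry delays and the ten-attempt wipe are out of the way: a four-digit passcode falls to exhaustive search in at most 10,000 tries. A minimal Python sketch, with a hypothetical `check` function standing in for the phone's passcode test:

```python
# Exhaustive search over numeric passcodes. With the escalating delays
# and the erase-after-ten limit disabled, a 4-digit code takes at most
# 10,000 attempts. `check` is a hypothetical stand-in for the phone's
# own passcode test.

def brute_force(check, digits: int = 4):
    for n in range(10 ** digits):
        guess = f"{n:0{digits}d}"   # zero-padded, e.g. "0042"
        if check(guess):
            return guess
    return None
```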

    • #13
    • February 22, 2016 at 1:54 pm
  14. Tuck (Inactive)

    Joseph Stanko: Then Apple should keep their powder dry, comply with this current order, and then speak up if Step #2 occurs…

    It will be too late then, won’t it? Horse left barn.

    They’re doing the right thing standing up now, and they’re to be commended for protecting our Liberty…

    • #14
    • February 22, 2016 at 1:58 pm
  15. Joseph Stanko (Member)

    Chris B: Even with the source code (which no one is asking for), it won’t be possible for a third party to create a signed version that will actually run on another phone so long as no one is able to get their hands on Apple’s Certificate Authority server. If anyone ever manages that . . . Well, let’s just say Apple has bigger problems at that point.

    Just so. I thought this part of Apple’s FAQ was more than a little disingenuous:

    Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.

    Apple already has such a key that it uses to sign iOS updates. If Apple cannot be trusted to protect this key, then our iPhones are already vulnerable even if they win this fight with the FBI.

    • #15
    • February 22, 2016 at 1:59 pm
  16. Sash (Member)

    Here’s the conspiracy theory at our house:

    It’s an act to make terrorists exclusively use iPhones, so the government can access information more easily, rather than phones from outside the US, where the encryption might be too hard to get through.

    The proof of it will come in due time when the FBI guy who “forgot” the code remembers, and the FBI says never mind: Apple is off the hook, with the deception in place.

    Okay, maybe it’s crazy, but Judge Napolitano said the NSA already has a back door, why not use that?

    • #16
    • February 22, 2016 at 2:32 pm
  17. Trinity Waters (Thatcher)

    I’m fairly sure the professor is on sound legal footing in this post, but the existential problem for all of us is that our government is not.

    Should I just repost this to every thread involving the Apple issue?

    A few stipulations:

    Government is almost entirely incompetent, avaricious, and untrustworthy.

    What it is demanding of Apple, in compromising our privacy, cannot be undone by electing somebody smarter.

    This issue falls securely in the framework of “look, squirrel!”

    The phone in question was government property in the first place.

    I don’t trust the government. My security clearance has already resulted in my data hitting cyberspace. All the rhetoric I see spilled above is meaningless at the point of government’s gun.

    • #17
    • February 22, 2016 at 2:37 pm
  18. PHCheese (Member)

    Just a question. Are safe and vault manufacturers required to open for the government the safe of a private citizen who has lost the combination?

    • #18
    • February 22, 2016 at 2:39 pm
  19. EJHill (Contributor)

    Joseph Stanko: Apple already has such a key that it uses to sign iOS updates. If Apple cannot be trusted to protect this key, then our iPhones are already vulnerable even if they win this fight with the FBI.

    Are you saying that Apple can update the iOS without you entering your passcode?

    Joseph Stanko (quoting Epstein): It is also the case that Apple had assisted the government without complaint in over 70 other cases.

    Is it 70 cases with the exact same iOS?

    Also this is where I split with lawyers and judges. Precedent is an awful argument.

    [Cartoon: “Precedent”]

    • #19
    • February 22, 2016 at 2:52 pm
  20. Tuck (Inactive)

    EJHill: …Is it 70 cases with the exact same iOS?…

    They’ve never backdoored their software like the FBI is asking them to do here. They’re perfectly willing to respond to a search warrant to provide the data they have access to, as they’ve made clear. That’s what they’re referring to in those 70 cases.

    The cartoon’s hilarious.

    • #20
    • February 22, 2016 at 2:55 pm
  21. John Walker (Contributor)

    Tennessee: FBI Director Comey has already declared publicly that the FBI wants tech companies to change their business models–and this is his test case to make that happen.

    I don’t think this is even a test case. It’s a classic propaganda offensive, cherry-picking a high-profile terrorist incident to hammer down the protruding nail of Apple who, unlike many other technology companies, has been aggressive in defending its customers’ private data. (This is not just a matter of principle for Apple: success of their Apple Pay initiative depends upon customers’ perception that the financial accounts linked to their mobile devices are secure.)

    It is unlikely that two and a half months after the San Bernardino attacks, the perpetrator’s phone would contain ticking time bomb information needed to prevent future attacks. I also believe it highly probable that the FBI computer forensics people or NSA can get the information off the phone without the assistance of Apple. If they can’t, they might ask John McAfee for help.

    It seems to me that what’s really going on is that the panopticon snooper state is using this to force Apple to bend to its will and establish a precedent which will force other technology companies to comply.

    • #21
    • February 22, 2016 at 3:04 pm
  22. Linc Wolverton (Member)

    It seems to me there is an easier solution. Hand the phone over to Apple, have Apple decrypt the device and send the contents back to the government along with the phone in its original received condition–that is, with the password intact.

    • #22
    • February 22, 2016 at 3:11 pm
  23. Stoicous (Inactive)

    The problem is not that Apple Inc. is being compelled to work for free; Apple would probably volunteer to help the government without a court order, if the order were not a serious compromise of their entire company.

    Data encryption is a very important aspect of Apple phones, one of the things that makes them more favorable, to many people, than the open-source Android devices. Part of that security is its totality, that not even Apple has the capability to walking in the “backdoor”. However, if Apple were to have a backdoor, the security would be severely compromised, because the existence of a backdoor means it is possible to use it. It is like a real house: a house built with no doors would be much safer from lock-picking burglars than a house with a door, even if that door had many different locks on it. If there is a key, somebody is going to be able to break it.

    What this case comes down to is a citizen’s right to own a device that doesn’t have a backdoor. The FBI is seeking to use this high-profile and emotional case to set a precedent for the government having the right to acquire and utilize backdoors, hence nullifying that right.

    Terrorists having encrypted phones is not reason to end full data encryption for the entire public; just like public shootings don’t justify repealing the 2nd Amendment.

    • #23
    • February 22, 2016 at 3:13 pm
  24. deovindice556 (Member)

    Comment I’ve already posted on WSJ but too lazy to write a new one:

    People here are entirely missing the point. Prior to 2008 when Apple unlocked iPhones for the feds, iOS 7 didn’t have as robust security features as iOS 9 does. This is a GOOD thing. Strong encryption and national security go hand in hand. Let’s call this what it is: A power grab. The FBI doesn’t like you having technology it cannot access. Apple & co.’s attorneys are no dummies. We know 100% that this “one-time use” key will be used time and again. It will be appropriated, modified, and abused, enabling warrantless searches on innocent citizens. Once this special backdoor is created, it WILL be compromised by hackers. The FBI is hiding behind sanctimonious emotional appeals, smearing Apple as terrorist-enablers, and obviously the WSJ editorial board is falling for it as well. Keep the front door safe and we won’t have Uncle Stupid stifling innovation at the expense of national security.

    • #24
    • February 22, 2016 at 3:20 pm
  25. Joseph Stanko (Member)

    Stoicous: Terrorists having encrypted phones is not reason to end full data encryption for the entire public; just like public shootings don’t justify repealing the 2nd Amendment.

    I completely agree with you on that principle, but that’s not what this case is about. The judge did not order Apple to “end full data encryption” for all iPhones. The judge did not order Apple to modify future versions of iOS in any way.

    • #25
    • February 22, 2016 at 4:36 pm
  26. Joseph Stanko (Member)

    Stoicous: Part of that security is its totality, that not even Apple has the capability to walking in the “backdoor”.

    But Apple does have the capability to “walk in the backdoor.” Apple is not disputing that it has the technical ability to do what the judge ordered in this case, ergo, Apple already has the ability to unlock this phone — and presumably mine as well.

    Perhaps Apple should respond to this case by designing iOS 10 such that it truly has no “backdoors” and Apple will be unable to comply with future orders like this. I’d have no objection to that.

    • #26
    • February 22, 2016 at 4:41 pm
  27. Tennessee (Inactive)

    To John Walker:

    I agree with you.

    The horrible paradox is that terrorists do evil, government power expands (NSA, etc.)–reducing the good customs and expectations of a free people.

    Terrorism and government power become the two blades of a pair of scissors, opposing forces working to shred our free society.

    Intolerable.

    • #27
    • February 22, 2016 at 4:44 pm
  28. Joseph Stanko (Member)

    EJHill: Are you saying that Apple can update the iOS without you entering your passcode?

    Yes. That’s what the judge ordered Apple to do in this case, and Apple does not dispute that it has the ability to do so.

    • #28
    • February 22, 2016 at 4:44 pm
  29. Z in MT (Member)

    The backdoor already exists!!!!!

    Apple has all but admitted that they can do what the FBI asks and help them break into the phone. This means that there is already a security flaw in the iOS. Fortunately, Apple is the only one that can easily exploit the flaw.

    If the backdoor didn’t already exist, Apple could not do what the FBI is asking, and they would have responded, “It’s impossible for us to do what you are asking.”

    • #29
    • February 22, 2016 at 4:45 pm
  30. Z in MT (Member)

    Joseph Stanko:

    Stoicous: Part of that security is its totality, that not even Apple has the capability to walking in the “backdoor”.

    But Apple does have the capability to “walk in the backdoor.” Apple is not disputing that it has the technical ability to do what the judge ordered in this case, ergo, Apple already has the ability to unlock this phone — and presumably mine as well.

    Perhaps Apple should respond to this case by designing iOS 10 such that it truly has no “backdoors” and Apple will be unable to comply with future orders like this. I’d have no objection to that.

    Beat me to it, and better said.

    • #30
    • February 22, 2016 at 4:48 pm