Apple’s Reply to the FBI

 

Apple’s CEO Tim Cook has just released a message to Apple’s customers:

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

The Need for Encryption

Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The Threat to Data Security

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
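
A minimal sketch of that point, using the Python cryptography package’s Fernet recipe (a generic illustration of symmetric encryption, not Apple’s implementation): whoever holds the key, whether owner, investigator, or thief, decrypts the data identically, so the system is exactly as secure as the key’s secrecy.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key and encrypt a message with it.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"private conversation")

# Anyone who obtains the same key decrypts the ciphertext just as easily
# as the rightful owner; the math does not know who is asking.
print(Fernet(key).decrypt(ciphertext))  # b'private conversation'
```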

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

A Dangerous Precedent

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
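
To put rough numbers on that: Apple’s iOS security documentation of the period reportedly put the hardware key derivation at about 80 milliseconds per passcode attempt, so the sketch below is a back-of-the-envelope estimate (the 80 ms figure is an assumption for illustration, not a measurement) of how quickly electronic brute-forcing finishes once those security features are removed.

```python
# Back-of-the-envelope brute-force times, assuming roughly 80 ms of
# key-derivation work per passcode attempt (a figure reportedly cited in
# Apple's iOS security documentation) and no retry delays or erase.
PER_ATTEMPT_SECONDS = 0.080

for digits in (4, 6):
    combinations = 10 ** digits
    worst_case_hours = combinations * PER_ATTEMPT_SECONDS / 3600
    print(f"{digits}-digit passcode: {combinations:,} combinations, "
          f"worst case ~{worst_case_hours:.1f} hours")

# 4-digit passcode: 10,000 combinations, worst case ~0.2 hours
# 6-digit passcode: 1,000,000 combinations, worst case ~22.2 hours
```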

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Tim Cook

Your thoughts?

Published in Islamist Terrorism, Science & Technology

There are 185 comments.

  1. Robert McReynolds Member
    Robert McReynolds
    @

    I understand that this court order is specifically for the San Bernardino shooters, but the government in general, and the FBI specifically, has been demanding that Apple create this back door for years, and I think–as someone on here has already pointed out–this looks like the government’s attempt to make an example out of Apple for its intransigence. As many on here have voiced, I don’t think the ability to crack this one phone, once demonstrated, would be left at only that one phone. I certainly don’t think that the government is simply going to walk away once this one phone is cracked either. They have demonstrated that they want all the information they can get on every single American that has a cell phone, so why would this be any different?

    • #121
  2. Nyadnar17 Inactive
    Nyadnar17
    @Nyadnar17

    Bob W:The FBI got a warrant to search one particular phone. There’s nothing wrong with that. Then they got an order to get Apple to help them execute that warrant. Either Apple is able to do that, or they are not. What’s the issue? If it’s impossible, just tell them that. If not, Why should they balk at it? This isn’t about whether they should make their phones hackable. It’s just about whether they can or cannot help with this warrant. What am I missing?

    It is. It’s entirely about that. The specific help the FBI is requesting is a tool that would make every single product running iOS hackable.

    • #122
  3. The King Prawn Inactive
    The King Prawn
    @TheKingPrawn

    Robert McReynolds: They have demonstrated that they want all the information they can get on every single American that has a cell phone, so why would this be any different?

    Exactly. This is being handled as a law enforcement matter rather than as a military operation, so the precedent for use of the power is against those suspected or accused of crimes. That is probably on purpose. The government may actually want the data for the reasons stated, but they want the power for way more.

    • #123
  4. Fake John/Jane Galt Coolidge
    Fake John/Jane Galt
    @FakeJohnJaneGalt

    Here is the situation as I see it. The ability of an entity with enough resources, like the government, to break a smartphone is a given. They can do it and they probably are doing it. But it is not an easy thing or a cheap thing to do. It would require virtualization of the device and considerable resources just to do one device. It is not a technology that can readily be distributed at the street level. By making the changes the government is asking for, technology will be created that will allow any phone to be opened by most local law enforcement offices. As long as these entities follow the rules, this also is not a problem. The problem is that by changing the system in this way, anybody with a couple of thousand dollars to spend will be able to crack a smartphone using this same technology. This could be a company, a hacker, a parent, etc. Thus the security on the phone stops being like a padlock and starts to be sort of like a zipper. It will keep casual people out but not anybody with the slightest determination. Thus the phone will not be secure.

    • #124
  5. Tom Riehl Member
    Tom Riehl
    @

    anonymous:There is so much deception, manipulation, and game playing going on here that figuring out what is really happening is like opening up a matryoshka where each successive nested doll is telling you a different lie.

    First of all, what the FBI wants …

    In my opinion, what is really going on here at the centre of the nested dolls is that Apple has been a prominent opponent to the demands of the surveillance state, and the FBI has decided it’s time to pound down that protruding nail. So, they take a high-profile case and then, rather than crack the passcode themselves, force Apple, through a court order, to be complicit in bypassing their customers’ security, “just this one time”. But that one time makes Apple bend to the will of the snooper state, and demonstrate for all to see that its resistance has been broken. And when that happens, no iOS user can be confident their data are secure.

    You wrote the magic phrase: access to the source code.

    • #125
  6. Bob W Member
    Bob W
    @WBob

    Cook’s argument in his statement isn’t a legal argument. It’s a PR and “larger issue” argument. Maybe there’s something in the law that prevents the govt from impressing Apple into service to do something like this? Ok, make that argument. But saying it’s bad for business and has bad implications for their customers…sorry.

    • #126
  7. Matt Upton Inactive
    Matt Upton
    @MattUpton

    All these analogies break down because encryption isn’t the same as a traditional lock. Has there been any technology before high-bit encryption which rendered an object effectively invulnerable to all but the owner? Cook is wrong and right about precedent at the same time–there is legal precedent for this kind of company compulsion, but never has the compulsion undermined the fundamental technology itself.

    It’s the equivalent of “Hi, your unbreakable glass is amazing. Perfectly legal, but can you modify it just a little so only we can break it?” Sure, it is technologically trivial to make it breakable, but verging on impossible to make it breakable only by a single third party.

    • #127
  8. Doctor Robert Member
    Doctor Robert
    @DoctorRobert

    I don’t understand Apple’s fuss.

    The answer is simple.  They create whatever hack they need to access the terrorists’ phones, do so in their own environs, and then destroy it.  Don’t give it to the govt.

    • #128
  9. Locke On Member
    Locke On
    @LockeOn

    Well, Claire, if your time overseas has caused you to miss the collapse of American citizens’ trust in our government, you now have an object example to correct your view.  This with a crowd many of whom would be predisposed to sympathize with LE and military needs.

    There are two different threat profiles here:

    Terrorists:  Shoot or blow up dozens to hundreds of people if they can manage it.

    Government:  Surveils all citizens it can reach, and harasses some based on their political and religious beliefs, with no upper limit on the force or resources employed.

    This issue splits right down the middle of these two, so it’s revealing which we feel is the greater actual threat.

    • #129
  10. Matt Upton Inactive
    Matt Upton
    @MattUpton

    Bob Wainright (#126): It’s a PR and “larger issue” argument.

    Whether by conviction or strategy, Apple has made an issue of user privacy in a way not shared by most technology companies. While Google’s entire business model is based upon selling your information, Apple sells you widgets. When those widgets are private and secure, it’s a big selling point.

    Apple refuses to even share subscriber/purchaser information for apps and periodicals with the companies selling those products. When you subscribe to one magazine, you won’t get 100 letters begging you to subscribe to others offered by the publisher.

    We finally get a tech company that determines it’s in its own interest to protect user information. This is why Apple is specifically targeted, and why it is the only company to make a public response.

    *Edited to fix quote source. 

    • #130
  11. Valiuth Member
    Valiuth
    @Valiuth

    anonymous:I continue to believe that forcing Apple to comply pour encourager les autres in the technology business is as much of a motivation for this as getting information off a particular phone.

    That is exactly what strikes me as the most nefarious aspect of this standoff.

    Sometimes you just have to play things hard. Which in this case means the FBI and NSA cracking the iPhone on their own. If they had real intelligence that this was a life-or-death situation, I think they could have presented it to Apple up front as part of a request for aid; there is no indication that this is the case. But they want Apple to do their jobs for them. Apple said no, and now they want to make Apple pay. All too human a reaction.

    • #131
  12. David Carroll Thatcher
    David Carroll
    @DavidCarroll

    Sometimes you just have to play things hard. Which in this case means the FBI and NSA cracking the iPhone on their own. If they had real intelligence that this was a life-or-death situation, I think they could have presented it to Apple up front as part of a request for aid; there is no indication that this is the case. But they want Apple to do their jobs for them. Apple said no, and now they want to make Apple pay. All too human a reaction.

    It is the government as bully, rather than accomplishment by persuasion. No wonder we have lost trust in the government.

    • #132
  13. Nyadnar17 Inactive
    Nyadnar17
    @Nyadnar17

    Doctor Robert:I don’t understand Apple’s fuss.

    The answer is simple. They create whatever hack they need to access the terrorists’ phones, do so in their own environs, and then destroy it. Don’t give it to the govt.

    Couple reasons to fuss.

    1. If law enforcement can use a court order to force software companies to create tools that do not exist in order to help them facilitate a case, then there is pretty much nothing they can’t order any software company to do in the name of a purported investigation or surveillance of a target.
    2. There is no reason at all to believe that Apple could keep such a hack in house even if the FBI allowed it. Even if the hack were somehow kept from ever leaking, every single engineer who worked on the project or had access to that codebase would now be a security risk for the rest of their lives. They are not volunteer soldiers or government employees. These would be private-sector individuals who basically had a gun put to their heads.
    • #133
  14. Robert McReynolds Member
    Robert McReynolds
    @

    Locke On:Well, Claire, if your time overseas has caused you to miss the collapse of American citizens’ trust in our government, you now have an object example to correct your view. This with a crowd many of whom would be predisposed to sympathize with LE and military needs.

    There are two different threat profiles here:

    Terrorists: Shoot or blow up dozens to hundreds of people if they can manage it.

    Government: Surveils all citizens it can reach, and harasses some based on their political and religious beliefs, with no upper limit on the force or resources employed.

    This issue splits right down the middle of these two, so it’s revealing which we feel is the greater actual threat.

    I always thought the whole premise of Conservatism was distrust of government, hence the necessity that it be made small.

    • #134
  15. Instugator Thatcher
    Instugator
    @Instugator

    Claire Berlinski, Ed.:

    The King Prawn: Again, limiting principle? If government says the threat is high enough is there nothing which remains out of its power even philosophically if not practically?

    But they’re not randomly asking for help to monitor everyone’s phone. This phone belonged to someone who really did walk into a facility for the developmentally disabled and kill fourteen people in the name of an international terrorist group with the means, motive, and opportunity to do it again. How could this not seem like reasonable grounds to search his phone? Do we imagine the FBI can solve this by saying “radical Islamic terrorism” three times loudly?

    No, they aren’t. Apple asserts that the government is asking for the following:

    Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation.

    A new iPhone OS, that decrypts stuff – can be installed by a third party – oh and we promise not to keep a copy of it. PROMISE.

    Just like we promised to fairly vet non-profit 501c3 applications. Or serve healthcare needs of Vets.

    Guess what, Claire – the FBI can figure out things without having to decrypt the phone. It is called “investigating”. Besides, the perps are already dead and incapable of further carnage.

    Tell you what – when they turn over the Clinton file with a recommendation to prosecute (she is, after all, still alive and capable of really selling access assuming she wins in Nov) then maybe I’ll believe that they aren’t motivated by politics. But even then, no master key.

    • #135
  16. Robert McReynolds Member
    Robert McReynolds
    @

    Sympathizing with law enforcement and military entities has nothing to do with this.

    The notion that terrorism is enough to justify allowing the federal government to do as it pleases with society is probably the greatest threat posed by groups like al Qa’ida and ISIL. Law enforcement has always been constrained from operating on the premise that suspects are automatically guilty. I know this particular case does not point to that notion, but the whole encrypted-smartphone argument is not new, and it is not limited to phones held by specific people. The government wants all phone software to have a backdoor.

    The military component is a non-starter. I suspect that intelligence can be thrown into this category too. The whole notion of having an intelligence apparatus in the US is predicated on the idea that it not be used against US citizens without first going through a stringent process to prosecute one–a SINGLE–target. Since the revelations of Snowden, we now know that the government no longer operates under that restriction and believes it doesn’t have to. Furthermore, the government shows no signs of wanting to go back to that previous arrangement without being forced–so far it has been legislation that has compelled them. At what point will people arguing for the “security” side of this say enough? This does not offer security, because it hasn’t to date, but it does threaten the liberties of innocent American citizens.

    • #136
  17. Nyadnar17 Inactive
    Nyadnar17
    @Nyadnar17

    Claire Berlinski, Ed.:

    The King Prawn: Again, limiting principle? If government says the threat is high enough is there nothing which remains out of its power even philosophically if not practically?

    But they’re not randomly asking for help to monitor everyone’s phone. This phone belonged to someone who really did walk into a facility for the developmentally disabled and kill fourteen people in the name of an international terrorist group with the means, motive, and opportunity to do it again. How could this not seem like reasonable grounds to search his phone? Do we imagine the FBI can solve this by saying “radical Islamic terrorism” three times loudly?

    It’s not just this phone. I am sorry if I missed the post where you explained you understood this, but it’s one of the major cruxes of our opposition. What the FBI is asking for would not just give them access to this particular phone. It would give them access to every single iPhone and perhaps every single iOS device in existence.

    Not only that, they are asking for a software tool created by a team of civilian engineers. If the tool or the know-how were ever leaked, even for an instant, every single iOS device on the planet (including those used by our government and law enforcement officials) becomes vulnerable. The stakes here are enormous, and I haven’t quite gotten the sense you respect that (again, apologies if I missed a post).

    • #137
  18. MarciN Member
    MarciN
    @MarciN

    I’m on Apple’s side here, and I’m grateful they have publicized the issue.

    If Apple were to create this backdoor, the advantage we would gain in preventing terrorism would be fleeting–once it became public that this backdoor existed, the terrorists would find other means to communicate. This is why President Bush did not want the New York Times to publish the means with which the government was tracking money used for terrorism, because once the terrorists knew we were doing that, they would switch to something else. This is why burner phones enjoy such popularity in the criminal underworld.

    I’ve been watching a new show (in its second season), CSI: Cyber. It is about the FBI’s cyber crime division. If the Internet tools depicted on the show are at all real, I’d say we have more than enough ways to spy on Americans electronically.

    Furthermore, to create a new way for American citizens to get hacked–and if the backdoor existed, it would get hacked–would be insane, truly insane. Creating this new vulnerability is the most important reason to not ask Apple to do this.

    • #138
  19. Bob W Member
    Bob W
    @WBob

    Matt Upton:All these analogies break down because encryption isn’t the same as a traditional lock. Has there been any technology before high-bit encryption which rendered an object effectively invulnerable to all but the owner? Cook is wrong and right about precedent at the same time–there is legal precedent for this kind of company compulsion, but never has the compulsion undermined the fundamental technology itself.

    It’s the equivalent of “Hi, your unbreakable glass is amazing. Perfectly legal, but can you modify it just a little so only we can break it?” Sure, it is technologically trivial to make it breakable, but verging on impossible to make it breakable only by a single third party.

    Is that the proper analogy? Apparently, the phone as it is currently constructed can be hacked. The info on it is no longer legally private. The concern is that the tool to hack it could make it possible for the govt to invade other phones as well? The govt has guns and battering rams which they could use to violate the fourth amendment if they wanted to.

    • #139
  20. Western Chauvinist Member
    Western Chauvinist
    @WesternChauvinist

    Since we’re drawing analogies, I’d like to treat this as a Second Amendment issue. Guns are personal protection devices of the physical type. Encryption is a personal protection technology of the information type. Any technology which impairs these methods of self defense, especially in the hands of an increasingly all-encompassing government (which by its nature is Force and corruption), should be vigorously resisted. What if, in order to “save lives,” the ATF required Smith and Wesson to develop a technology which (remotely, not by trigger lock) neutralizes its guns? I’m fairly certain we’d all object.

    But, let’s take the FBI’s case in good faith for a moment and assume this is meant for our “protection” (getting those gun control vibes yet?). Tim Cook dissents:

    Criminals and bad actors will still encrypt, using tools that are readily available to them.

    Gee, this sounds like a gun rights argument. Let’s restate:

    Criminals and bad actors will still get guns, using black markets that are readily available to them.

    Who knew? Tim Cook is a Second Amendment guy!

    BTW, I suspected the technology already existed before I read anonymous’s comment. Now I’m sure it does. Which does make one wonder what the security kabuki is all about. I think it may be to exonerate Apple when all its customers find out, as if it tried its damnedest to resist.

    • #140
  21. Eric Hines Inactive
    Eric Hines
    @EricHines

    The privacy and the security of our private identities, of our finances, of our health records, of any aspect of our lives we find useful to protect from prying eyes are critical to our ability to engage with our neighbors and our businesses and our government free from threats or attack.

    The privacy of our communications, the security of our speech, must absolutely be preserved. There is no security at all without our individual liberties, of which speech is one, held secure.

    “Law-enforcement agencies” and this Federal judge know this full well. And they know full well the truth of Cook’s statement in his letter:

    We can find no precedent for an American company being forced to expose its customers to a greater risk of attack.

    Eric Hines

    • #141
  22. Nyadnar17 Inactive
    Nyadnar17
    @Nyadnar17

    Bob W:Is that the proper analogy? Apparently, the phone as it is currently constructed can be hacked. The info on it is no longer legally private. The concern is that the tool to hack it could make it possible for the govt to invade other phones as well? The govt has guns and battering rams which they could use to violate the fourth amendment if they wanted to.

    If we want to use the terms of your analogy, the “guns and battering rams” currently do not exist, and the court is ordering Apple to invent them.

    The analogy falls apart because we are talking about software and not hardware. Hardware is limited, and its production and use can be tracked and monitored by public watchdog groups. Software is unlimited, and its use cannot be tracked or monitored by anyone.

    • #142
  23. Pilli Inactive
    Pilli
    @Pilli

    anonymous:

    …In my opinion, what is really going on here at the centre of the nested dolls is that Apple has been a prominent opponent to the demands of the surveillance state, and the FBI has decided it’s time to pound down that protruding nail. So, they take a high-profile case and then, rather than crack the passcode themselves, force Apple, through a court order, to be complicit in bypassing their customers’ security, “just this one time”. But that one time makes Apple bend to the will of the snooper state, and demonstrate for all to see that its resistance has been broken. And when that happens, no iOS user can be confident their data are secure.

    This is the heart of the matter.  If Apple loses the trust of its customers, it will inevitably lose sales.  I doubt that it will lose all that much in U.S. sales but outside the U.S. it will lose vast amounts.

    How will the FBI compensate Apple for those losses?  The FBI doesn’t care and won’t.

    The real question here is: can the government command a company to build a product that, even if used only one time, would cost the company a substantial loss that may never be regained?

    I am not a fan of Apple.  But our government finds it way too easy to force its citizens to do things that are damaging and expensive to the citizen.

    • #143
  24. Bob W Member
    Bob W
    @WBob

    Nyadnar17:

    Bob W:Is that the proper analogy? Apparently, the phone as it is currently constructed can be hacked. The info on it is no longer legally private. The concern is that the tool to hack it could make it possible for the govt to invade other phones as well? The govt has guns and battering rams which they could use to violate the fourth amendment if they wanted to.

    If we want to use the terms of your analogy, the “guns and battering rams” currently do not exist, and the court is ordering Apple to invent them.

    The analogy falls apart because we are talking about software and not hardware. Hardware is limited, and its production and use can be tracked and monitored by public watchdog groups. Software is unlimited, and its use cannot be tracked or monitored by anyone.

    That’s the part that I’m not sure about … ordering Apple to do it.  I have no idea if there is precedent for that. And it’s really the only thing about this which bothers me.

    • #144
  25. Lazy_Millennial Inactive
    Lazy_Millennial
    @LazyMillennial

    As usual, I agree with Kevin D. Williamson:

    http://www.nationalreview.com/article/431491/apples-tim-cook-right-resist-governments-demand

    • #145
  26. Rodin Member
    Rodin
    @Rodin

    Most citizens don’t know or consider some of the broad powers the government already uses. For example, it can direct certain contractors to violate intellectual property rights and make use of such information in the conduct of a federal contract; it can interfere with private contracts to divert products promised to customers in order to fill a government need instead. Of course the government compensates entities for the resulting losses. But that doesn’t mean that ordinary and customary rights are not abridged. These are extraordinary powers typically limited to wartime necessity. So the demand on Apple is unusual only in that it arises from a criminal investigation rather than a wartime exigency. (“Wartime” in this context also included the Cold War and the arms race between the US and the USSR.)

    The “war on terror” of course falls variously in the post-criminal act investigative sphere and the imminent threat detection and prevention sphere. If WMDs were suspected as an imminent threat — specifically nuclear and biologicals — there might be broader public pressure on Apple to cave. Bad facts make for bad law.

    • #146
  27. Matt Upton Inactive
    Matt Upton
    @MattUpton

    Bob W: Is that the proper analogy? Apparently, the phone as it is currently constructed can be hacked. The info on it is no longer legally private. The concern is that the tool to hack it could make it possible for the govt to invade other phones as well? The govt has guns and battering rams which they could use to violate the fourth amendment if they wanted to.

    From what I understand, there are currently no tools to easily brute-force an iPhone passcode in any reasonable time. The concern is that anything Apple does to provide the govt with access will create a security vulnerability anyone can exploit. As Apple claims now, even they themselves cannot access users’ encrypted information. The phone itself would need to be changed to allow someone without the passcode access.

    Apple wouldn’t be building a better battering ram, so to speak; rather, Apple would be weakening the doors on its own products.

    • #147
  28. TKC1101 Member
    TKC1101
    @

    I assume Google has already caved on this?

    • #148
  29. Tennessee Inactive
    Tennessee
    @Tennessee

    Matt Upton:

    The concern is that anything Apple does to provide the govt with access will create a security vulnerability anyone can exploit.

    I think the newly developed access would inevitably escape Apple’s or the government’s hands–either through hacking or leaks–as discussed in a new Lawfare post by Robert Chesney. He says he’s not sure how to assess how likely it is that the new access would “get into the wild.”

    But the NSA hacked Gemalto to steal the encryption keys to its smart cards. Do we think they’d be slow to steal from Apple?

    • #149
  30. Bob W Member
    Bob W
    @WBob

    Apple would be creating an updated operating system which lacks security features, as I understand it. Just as it issues normal updates periodically, it would be creating a special one for this circumstance that lacks the security features, and updating the phone with this insecure operating system. So this will always be within Apple’s capability, unless it creates phones that can’t accept updates. It seems to me they could do this in a way that it couldn’t be copied or used for anything else, but what do I know.
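
    Bob W’s closing question, whether such a build could be made so it couldn’t be copied or used on anything else, maps onto the idea of tying a signed build to a single device’s unique identifier. The sketch below is a conceptual illustration only, not Apple’s actual signing or update mechanism; the device-ID field and key handling here are assumptions for illustration.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Stand-in for a vendor's firmware-signing key (illustrative only).
signing_key = Ed25519PrivateKey.generate()
verify_key = signing_key.public_key()

def sign_build(image: bytes, device_id: str) -> bytes:
    """Sign the build together with the one device ID it may run on."""
    return signing_key.sign(image + device_id.encode())

def device_accepts(image: bytes, signature: bytes, own_id: str) -> bool:
    """A device installs the build only if the signature covers its own ID."""
    try:
        verify_key.verify(signature, image + own_id.encode())
        return True
    except InvalidSignature:
        return False

weakened_build = b"ios-build-without-retry-limits"
sig = sign_build(weakened_build, device_id="ECID-TARGET-PHONE")

print(device_accepts(weakened_build, sig, "ECID-TARGET-PHONE"))     # True
print(device_accepts(weakened_build, sig, "ECID-SOME-OTHER-PHONE")) # False
```

    The catch, and the containment worry raised elsewhere in this thread, is that the pinning holds only as long as the signing key and the process around it do: whoever controls the key can re-sign the same weakened image for any other device ID.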

    • #150