Apple’s Reply to the FBI
Apple’s CEO Tim Cook has just released a message to Apple’s customers:
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.
This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
The Need for Encryption
Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.
All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.
Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.
For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
The San Bernardino Case
We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.
When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
The Threat to Data Security
Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
A Dangerous Precedent
Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
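For a sense of scale, here is a back-of-the-envelope sketch of why electronic passcode entry matters. The guess rate used below is an illustrative assumption, not a measured iPhone figure; the point is only that once the OS-enforced delays and wipe-after-ten-tries protections are removed, short numeric passcodes fall quickly.

```python
# Rough worst-case brute-force time for a numeric passcode, assuming
# security features are disabled and guesses are entered electronically.
# The guess rate is an illustrative assumption, not an Apple spec.

def brute_force_seconds(digits: int, guesses_per_second: float) -> float:
    """Worst-case time to try every passcode of the given length."""
    combinations = 10 ** digits
    return combinations / guesses_per_second

for digits in (4, 6):
    worst = brute_force_seconds(digits, guesses_per_second=12.5)  # assumed rate
    print(f"{digits}-digit passcode: {10**digits:,} combinations, "
          f"at most {worst / 3600:.1f} hours at 12.5 guesses/sec")
```

Even at a conservative assumed rate, a 4-digit passcode falls in well under an hour; only longer alphanumeric passcodes put the search out of practical reach, which is why the delay and erase features the order targets matter so much.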
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.
We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.
While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.
Tim Cook
Your thoughts?
Published in Islamist Terrorism, Science & Technology
It’s not the logic they’d be extending but the precedent and the technology.
Playing the ISIS card isn’t relevant. If it ain’t one thing, it’s another.
What the FBI really lacks in this is the Apple signature that will allow the phone to install the temporary OS into RAM and run the phone from that temporary installation. Why is the FBI not simply demanding Apple provide its signature so that the FBI can do the work it is demanding Apple to do with the materials provided by Apple? Is the FBI simply being lazy in this and commandeering Apple’s labor as well as its intellectual property? Temporary enslavement is still enslavement.
Yes, I agree. Until we die and go to Heaven, crime will be with us, and so will the need for law enforcement, prisons, and a justice system. And a military, for that matter.
That’s what it’s asking. And it’s happy for Apple to do that on Apple’s property, without showing them how it’s done. It just wants to know what’s on the phone.
The precedent is entirely on the FBI’s side, and they’re not asking for the technology, they’re happy to have Apple keep it. They want to know what’s on that phone.
Modifying an operating system is a complex task by nature. Software changes nearly always have unintended effects, so the end result must always be subject to extensive testing. The reality behind the curtain in the world of technology is unknown to most people. Many people would be involved, most of whom don’t have the security clearance Hillary had.
The basic mathematics, science, and extensive arcane knowledge necessary to evaluate technical choices limit the playing field to those with the expertise; hence Tim Cook overrules the FBI.
Open borders + encrypted phones = DISASTER.
I think if I were Apple, while appealing the decision I would indulge in some of what Glenn Reynolds (Instapundit) terms Irish Democracy. I would set up a software team with the worst program manager in my company, the most incompetent programmers I have, and the most inept software testers I can find, and say, “go for it.” I would make sure to include a SJW type in the team, who will file lots of complaints against other team members. (Who, of course, would have to be removed pending investigation.) I would also include the most anal-retentive, rules-driven QA person I could find, to reject everything for a do-over whenever a minor flaw is found.
I would also bill the government the highest rate allowable, and let the project roll. Whenever I got complaints from the FBI, I would reply, “We are working on it. We have 20 people on this project. It’s complicated.” After a year or so, the government would give up. If not, it’s easy money.
Seawriter
The same logic as the NSA used to monitor all cellular phones.
My thoughts:
One: If Apple doesn’t build the back door, someone else will.
Two: I’m not sure I believe Tim Cook when he says that such a thing doesn’t already exist. Either at Apple or somewhere else.
Three: Tim Cook is making a huge PR play. I wouldn’t trust him as far as I could throw him. I have no doubt he ‘cares’ about his customers’ privacy. But probably not as much as he cares about the viability of his company.
Four: I wonder why everyone keeps telling me my iPhone can be easily hacked?
Five: I think the ‘house’ analogy (and similar analogies assuming illegal surveillance will suddenly become the norm) is terribly flawed. The [stick in name of loathed gummint agency here] can illegally enter any house it wants to, any time it wants to, without a warrant, or even ‘probable cause.’ Because existing technology allows them to do so without much effort. To be legal, they have to get a warrant first. And, hopefully, usually, they do. If Apple does create the ‘master key’ for iPhones, someone (government? Apple?) would have the technology to illegally read any phone they wanted without much effort, without a warrant or even ‘probable cause.’ Hopefully, usually, they would get one first. (That doesn’t mean that I think Apple cooperating in this regard is an excellent idea, but I don’t think it should be dismissed because of an imperfect analogy).
Six: The list of things Tim Cook says we should fear if Apple cooperates (“the government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge”) is a hoot. Doesn’t he know that people are already doing most of these things, and are most of the way to doing the rest of them, quite well, without Apple’s help? “Track your location?” Seriously? Where’s he been? (Please don’t tell me, although some of you probably could).
Seven: I really, really, hope there’s nothing on that phone, or even that there’s no event taking place, or about to take place that will cause anyone to wonder if it’s related to something on that phone. Otherwise, Tim Cook’s PR coup is going to turn into a disaster of unimaginable proportions, and, in the aftermath, you can kiss our ‘rights’ goodbye, because no one will care.
Huh? You are losing me here. What is limiting the growth of our government? Where is it limited? Can you name any fiscal year in which our federal government spent less than the year prior?
Agree. Unfortunately, we have two political parties that prefer to change the second component rather than the first.
Remember the Context:
None of this redounds to freedom.
This actually makes my point. If the technology is there, the government will use it, even if the use is illegal.
Of course, as in the case with Lois Lerner’s IRS, any illegal use will be severely punished. Oh wait. The complete opposite of that.
Another analogy, hopefully closer to this particular situation.
If a safe manufacturer developed a lock with no current way to be picked would you advocate for the government to demand the manufacturer develop a method for picking it just to open a single safe to obtain its contents?
I’d cooperate as fully as I could with law enforcement, myself. ISIS v. the FBI? I’m with the FBI.
Depends how hard that is to do and what’s in the safe. I’d go with common sense. If it can be done reasonably easily, and if I had good reason to think that inside the safe was information that might be extremely relevant to saving many Americans’ lives, of course. Wouldn’t you agree?
So I should fret myself into a nervous wreck because they’re coming for my house?
Again, limiting principle? If government says the threat is high enough is there nothing which remains out of its power even philosophically if not practically?
I think most would agree. However, I have no expectation that any policy, law enforcement or otherwise, will change based on what is on that phone.
We are a PC-driven nation. We call terrorist attacks workplace violence. We can’t address ISIS without addressing Islam, and there is no interest in doing so.
What will happen is the FBI will get the contents of the phone and analyze communication patterns and then go to FISA court and say because we can’t target Muslims we have to monitor every cell phone looking for similar patterns.
Then, if in the monitoring of all cell phones for similar patterns other activity is detected, it is used for whatever purpose can be driven through FISA.
The reality ultimately being that this fishing expedition becomes a Trojan horse.
Forget what is legal and what is illegal for a moment. This discussion is about balancing safety and security versus freedom.
If you have no problem with the government’s order (thanks, Claire, for the link to it), you favor trusting the government with potentially freedom-trashing information in favor of promoting safety and security (i.e., potentially finding more terrorists). If you favor protecting the security of all iPhones from the government, you see freedom from potential government abuse as more important than helping the government identify some more potential terrorists.
I guess in the long run, I fear government abuse (which would be more widespread and long lasting) more than I believe in the (probably remote) likelihood of the to-be-created back-door for (supposedly) one iPhone helping the government to stop the next terrorist bomber.
Here’s an interesting article about the British position on this sort of thing. And another one.
GCHQ=Government Communication Headquarters, the British Intelligence outfit that involves itself in this sort of thing.
Whether you think the British response is more forward-thinking, or just another example of mind-numbed robots in the grip of an all-seeing and all-powerful State, it’s certainly different. And they are at least open about it.
No. How hard a thing is to accomplish should never be the limiting factor on government. It would be pretty darn easy for government to simply withhold the entire earnings of every worker in the nation for whatever purpose government can devise for our own good, but the ease with which it could be accomplished would not justify it philosophically, morally, or constitutionally.
The government having a key to your apartment, though, would not be a breach of your rights. You have a right not to be searched without cause, not a right against the government theoretically having access to your apartment. It would have that access even without a key, since it could literally take down your door.
The question is do you have a right to an object that is physically unsearchable? I would argue yes. Does the government have a right to force the maker of such a device to undo their work? This I’m not sure about. I imagine given current law yes, but I’m not sure I like it.
But they’re not randomly asking for help to monitor everyone’s phone. This phone belonged to someone who really did walk into a facility for the developmentally disabled and kill fourteen people in the name of an international terrorist group with the means, motive, and opportunity to do it again. How could this not seem like reasonable grounds to search his phone? Do we imagine the FBI can solve this by saying “radical Islamic terrorism” three times loudly?
Potential usefulness in the cause of “safety and security” also should not be the limiting factor on government power.
But that wouldn’t in fact be in our own good. If they did that, we’d all starve. Stopping more terrorist attacks would be in our own good. We have a government to provide for the common defense. That’s the one legitimate purpose we all agree on, surely.
Saving lives is all about trade-offs. If the government wanted to save thousands of American lives each year, it would make the automobile illegal.
In this case, we lose some privacy (and freedom) on the speculative chance of catching some terrorists. No one knows what is in that phone. Maybe nothing. Maybe plenty. If the back door is created, the next court order is even easier.
They’re not forcing Apple to undo its work; forcing it to do that for all phones would amount to that. Forcing it to do so for one phone, one very reasonably suspected of containing evidence about a heinous crime and information relevant to stopping another one, is quite another story.