What about Guarding the Front Door?

 

Yesterday, we learned that US Magistrate Judge Sheri Pym ordered Apple to help the FBI break into the iPhone of San Bernardino murderer Syed Rizwan Farook. Perhaps mindful of King Canute’s experience stopping the tide, the judge stopped short of commanding Farook’s phone to reveal the encrypted data directly, instead instructing Apple to reorient its developers from productive endeavors to undermining its own carefully constructed software- and hardware-based security architecture.

Not long ago, conservatives were appalled when Congress passed and the Supreme Court upheld a law requiring citizens to purchase a particular product — Obama-certified health insurance. Now we have a federal judge, drawing authority solely from the All Writs Act of 1789, ordering a private company to create a custom product for the government’s use. And what a product! If implemented, the judge’s order will create a weaponized piece of software capable of taking down the world’s second most valuable company. Collateral damage could include large swathes of our globally connected economy. Hyperbole? Read the relevant portion of the order for yourself and imagine what malicious hackers could do with such an app:

Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

Apple’s reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”) that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device’s flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade (“DFU”) mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.
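
To get a sense of what those three functions buy an attacker, here is a rough back-of-the-envelope sketch in Python. It assumes roughly 80 milliseconds per passcode attempt, the hardware key-derivation cost Apple has publicly described; that figure and the passcode lengths are illustrative assumptions, not numbers taken from the order. With the auto-erase limit and the software delays stripped away, short numeric passcodes fall in minutes to hours:

# Back-of-the-envelope estimate of unthrottled passcode guessing.
# Assumes ~80 ms per attempt (the hardware key-derivation cost Apple has cited);
# actual throughput over the device port would vary.
SECONDS_PER_ATTEMPT = 0.08

def worst_case(keyspace: int) -> str:
    seconds = keyspace * SECONDS_PER_ATTEMPT
    if seconds < 3600:
        return f"{seconds / 60:.1f} minutes"
    if seconds < 86400 * 365:
        return f"{seconds / 86400:.1f} days"
    return f"{seconds / (86400 * 365):.1f} years"

for label, keyspace in [
    ("4-digit PIN", 10**4),
    ("6-digit PIN", 10**6),
    ("6-character lowercase alphanumeric passcode", 36**6),
]:
    print(f"{label}: worst case about {worst_case(keyspace)}")

Only a long, high-entropy passcode survives that kind of unthrottled guessing, and very few people type one into a phone dozens of times a day.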

But don’t worry. We can count on the FBI to keep iBackDoor every bit as secure as its sensitive background investigation files. What could go wrong?

A troubling pattern emerges. The government, for reasons of political correctness, fails in its particular security responsibilities and responds by putting civil liberties at risk generally, out of a twisted sense of fairness. For instance, privacy concerns shielded the jihadist social media posts of San Bernardino shooter Tashfeen Malik from official scrutiny as she applied from Pakistan to join her husband Farook in soft target-rich Southern California, but nothing deters the FBI and a federal court from demanding that Apple risk the privacy of hundreds of millions of iPhone users.

I have a simple idea: Instead of conscripting a private company to open a back door into the smart phone used by millions, why not guard the front door into the United States?

Published in Law, Science & Technology

There are 52 comments.

  1. Bryan G. Stephens Thatcher
    Bryan G. Stephens
    @BryanGStephens

    Hear, hear. But conservatives like Charles Krauthammer think this is OK.

    Apple is right to stand up to them. Good for them.

    • #1
  2. The King Prawn Inactive
    The King Prawn
    @TheKingPrawn

    Kevin Williamson would agree:

    From the IRS to the ATF to the DEA to Hillary Rodham Clinton’s super-secret toilet e-mail server, the federal government has shown, time and again, that it cannot be trusted with any combination of power and sensitive information. Its usual range of official motion traces an arc from indifference through incompetence to malice.

    Where the federal government imagines that it gets the power to order a private firm to write software to do its incompetent minions’ jobs for them is anybody’s guess. Tim Cook and Apple are right to raise the corporate middle finger to this nonsense. Cook says that the software the FBI demands is “too dangerous to create” given the risk that it could fall into “the wrong hands.”

    Perhaps he is being polite, but the fact is that the FBI is the wrong hands…

    • #2
  3. Kozak Member
    Kozak
    @Kozak

    As someone whose personal data was hacked from the FEDS due to their carelessness, what could possibly go wrong?

    • #3
  4. Tennessee Inactive
    Tennessee
    @Tennessee

    Admiral Rogers of the NSA has made a myriad of speeches about the importance of cyber security. Now the government demands weakened security that could spread to the digital ecosystem.

    The vulnerabilities desired by the USA to fight bad guys will also be sought by Iran, China, etc. to oppress the good guys. It isn’t simply an issue of law enforcement or anti-terror.

    Am I wrong?

    • #4
  5. Valiuth Member
    Valiuth
    @Valiuth

    Yeah, I agree; this is a bad step, in my opinion, where the risks I think far outweigh the potential benefits. Generally I have taken a rather passive stance toward government bulk data collection, and even their surveillance abroad. It isn’t clear to me that metadata is in any way my property; it is simply information I give off. People can learn things about me through it, but they could do likewise by watching me go about my business in public, and the internet is a public space. As far as spying is concerned, it is the job of our government to spy on foreigners abroad and even here at home. Espionage has always been illegal, and necessary. We need to hire patriots and solid people to do it.

    This iPhone business, though, I think goes too far in all directions. If the government decided to use its resources and technical know-how to hack an iPhone, that is both their right and their business, assuming all is done in compliance with the law. Forcing a third party to do your work for you seems wrong and a dangerous precedent. This is not like asking a landlord for a key to a suspect’s home. That is done to spare the landlord the property damage that would result from the cops busting the door down to serve the warrant. It is in his interest to comply. This is like asking Coke to make a poisoned bottle to assassinate a terrorist.

    • #5
  6. Frank Soto Member
    Frank Soto
    @FrankSoto

    Completely agree with George on this.

    • #6
  7. Son of Spengler Member
    Son of Spengler
    @SonofSpengler

    Well put!

    • #7
  8. Frank Soto Member
    Frank Soto
    @FrankSoto

    Tennessee: Admiral Rogers of the NSA has made a myriad of speeches about the importance of cyber security. Now the government demands weakened security that could spread to the digital ecosystem.

    The vulnerabilities desired by the USA to fight bad guys will also be sought by Iran, China, etc. to oppress the good guys. It isn’t simply an issue of law enforcement or anti-terror.

    Am I wrong?

    You’re not wrong.

    There are a whole lot of members of government and law enforcement who don’t really understand what they are asking.  They don’t understand the dangers.

    This is an important issue that the Republicans need to get right.

    • #8
  9. Joseph Stanko Coolidge
    Joseph Stanko
    @JosephStanko

    George Savage: Read the relevant portion of the order for yourself and imagine what malicious hackers could do with such an app

    If this part is accurate:

    The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE.

    Then the software in question could only be used to break into that specific device, i.e. the iPhone formerly owned by Syed Rizwan Farook.  Wouldn’t do hackers much good at all.

    • #9
  10. Frank Soto Member
    Frank Soto
    @FrankSoto

    Joseph Stanko:

    George Savage: Read the relevant portion of the order for yourself and imagine what malicious hackers could do with such an app

    If this part is accurate:

    The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE.

    Then the software in question could only be used to break into that specific device, i.e. the iPhone formerly owned by Syed Rizwan Farook. Wouldn’t do hackers much good at all.

    This is effectively the FBI saying “make it so this can’t be abused”, to which Apple responded “That’s not how software works.”

    • #10
  11. The Cloaked Gaijin Member
    The Cloaked Gaijin
    @TheCloakedGaijin

    “As a society, we don’t allow phone companies to design their systems to avoid lawful, court-ordered searches.” — Senator Tom Cotton, Harvard lawyer and former U.S. Army captain

    Judges make these decisions according to this thing called — the U.S. Constitution.

    Companies don’t get to make the law just because they have lots of money, influence, cool gadgets, and most importantly — because they are run by liberals instead of conservatives.

    A certain number of people have to die first?

    Ask Brendan Eich if you want to live in a country where companies get to pick the winners and losers.

    The libertarians can make their case, but I think Senator Cotton might make a better one.

    The solution might be somewhere in the middle.

    • #11
  12. Tennessee Inactive
    Tennessee
    @Tennessee

    Former NSA Director Michael Hayden says FBI is wrong (not directly referencing this case, though…).

    Amazing. And from the man who started collecting all our phone records…

    • #12
  13. Frank Soto Member
    Frank Soto
    @FrankSoto

    The Cloaked Gaijin: “As a society, we don’t allow phone companies to design their systems to avoid lawful, court-ordered searches.” — Senator Tom Cotton, Harvard lawyer and former U.S. Army captain

    Judges make these decisions as according to this thing called — the U.S. Constitution.

    Companies don’t get to make the law just because they have lots of money, influence, cool gadgets, and most importantly — because they are run by liberals instead of conservatives.

    A certain number of people have to die first?

    Ask Brendan Eich if you want to live in a country where companies get to pick the winners and losers.

    The libertarians can make their case, but I think Senator Cotton might make a better one.

    The solution might be somewhere in the middle.

    The damage incurred by weakening our encryption and security is going to be far greater than that of lone wolf attackers.

    This isn’t an inherently libertarian argument.  Building master keys to bypass digital security is a colossal mistake.

    • #13
  14. Tennessee Inactive
    Tennessee
    @Tennessee

    I like Sen. Tom Cotton, but I’m amazed that so many conservatives like big government surveillance of all Americans.

    • #14
  15. Frank Soto Member
    Frank Soto
    @FrankSoto

    Tennessee: I like Sen. Tom Cotton, but I’m amazed that so many conservatives like big government surveillance of all Americans.

    Bypassing the discussions about how much spying is okay, this approach to spying is reckless.

    • #15
  16. Steve C. Member
    Steve C.
    @user_531302

    We had this argument 20 years ago, when the Clinton Administration was demanding tech companies build back doors into their devices in the name of “security and law enforcement.” They lost that fight then. They should lose it again. We have an NSA that has thousands of people and billions of dollars. If their capabilities are what we’ve been led to believe, then the government should be able to break into one phone.

    • #16
  17. Joseph Stanko Coolidge
    Joseph Stanko
    @JosephStanko

    Frank Soto: The damage incurred by weakening our encryption and security is going to be far greater than that of lone wolf attackers.

    This isn’t an inherently libertarian argument. Building master keys to bypass digital security is a colossal mistake.

    I agree with that in general, but it’s not clear to me if that’s what this specific court order requires.

    • #17
  18. Joseph Stanko Coolidge
    Joseph Stanko
    @JosephStanko

    Frank Soto: This is effectively the FBI saying “make it so this can’t be abused”, to which Apple responded “That’s not how software works.”

    Are you sure?  My understanding was that Apple does have the capability to sign software updates such that they would only work for a single device.  On a technical level that’s an entirely separate question from building any “master keys” or “backdoors” that would weaken encryption for all devices.

    • #18
  19. Stephen Dawson Inactive
    Stephen Dawson
    @StephenDawson

    Joseph Stanko:

    George Savage: Read the relevant portion of the order for yourself and imagine what malicious hackers could do with such an app

    If this part is accurate:

    The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE.

    Then the software in question could only be used to break into that specific device, i.e. the iPhone formerly owned by Syed Rizwan Farook. Wouldn’t do hackers much good at all.

    Except for establishing the precedent that law enforcement can force the company to break the code. Then the Manhattan DA can get a judge to order Apple do it for the 150 iPhones he has in his lockup … and so on (thanks to not-so-Lazy_Millennial for the link)

    • #19
  20. Joseph Stanko Coolidge
    Joseph Stanko
    @JosephStanko

    Stephen Dawson: Except for establishing the precedent that law enforcement can force the company to break the code. Then the Manhattan DA can get a judge to order Apple do it for the 150 iPhones he has in his lockup … and so on (thanks to not-so-Lazy_Millennial for the link)

    So you don’t think that even with a warrant the government should have access to the data?  That seems to be a separate argument from the OP’s claim that such access would create a backdoor that hackers could exploit.

    Are you against the power to issue warrants in general?  Should we eliminate all search warrants across the board?

    • #20
  21. Carey J. Inactive
    Carey J.
    @CareyJ

    Joseph Stanko:

    Frank Soto: This is effectively the FBI saying “make it so this can’t be abused”, to which Apple responded “That’s not how software works.”

    Are you sure? My understanding was that Apple does have the capability to sign software updates such that they would only work for a single device. On a technical level that’s an entirely separate question from building any “master keys” or “backdoors” that would weaken encryption for all devices.

    Once hackers (or foreign intel agencies) get their hands on the app, they can reverse engineer it and remove the single-device code. At that point, the app becomes a master key (or at least a lockpick) to any iPhone.

    • #21
  22. It's A Gas Member
    It's A Gas
    @ItsAGas

    We’ve had this conversation before and privacy won:

    https://www.epic.org/crypto/clipper/

    • #22
  23. George Savage Member
    George Savage
    @GeorgeSavage

    Carey J.:

    Once hackers (or foreign intel agencies) get their hands on the app, they can reverse engineer it and remove the single-device code. At that point, the app becomes a master key (or at least a lockpick) to any iPhone.

    Whatever one thinks of malicious hackers, as a group they are extremely clever.

    Carey J is exactly right. Apple engineers, operating in a more compartmentalized, security-conscious organization than the CIA, are sworn to secrecy concerning their employer’s products. They are also, collectively, probably the only people on earth capable of bypassing the layers of security integrated, onion-like, over the years into Apple’s hardware and software.

    Now, in a perfect storm for hackers, the Software-Engineering-by-Attorney branch of the US government is ordering Apple to assemble a unique cross-functional team tasked with comprehensively defeating its own otherwise impenetrable–indeed, unknowable–overlapping barriers. The sole remaining defense: a judicially decreed limitation to operate on a particular device serial number.

    Let’s tally the score: From an unknown but certainly daunting number of unidentified security hurdles to a single known breach point–the device ID. It’s hard to see how it could possibly get much better for hackers. And consider the prize: the world’s most popular smart phone platform.

    • #23
  24. Joseph Stanko Coolidge
    Joseph Stanko
    @JosephStanko

    Carey J.: Once hackers (or foreign intel agencies) get their hands on the app, they can reverse engineer it and remove the single-device code.

    Except the SIF is cryptographically signed using Apple’s private key, or at least that’s how I understood this part:

    providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”)

    Therefore hackers can’t modify it without invalidating the signature.  Unless they crack the crypto, in which case they can hack in anyway with or without this court order.
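
    Here is a toy sketch of that idea in Python, using the pyca/cryptography package. The key names and the embedded device-ID field are made up for illustration (this is not Apple’s actual image format or signing scheme), but the mechanics are the point: change any byte of a signed image, including the device identifier, and verification fails unless you hold the signing key.

    # Toy sketch only: not Apple's actual firmware format or signing scheme.
    # Requires the pyca/cryptography package; key names and the device ID are invented.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    vendor_key = Ed25519PrivateKey.generate()        # stands in for Apple's signing key
    device_public_key = vendor_key.public_key()      # what the device uses to check images

    image = b"SIF-payload||DEVICE_ID=F2LXXXXXXXXX"   # firmware bytes with an embedded device ID
    signature = vendor_key.sign(image)

    device_public_key.verify(signature, image)       # unmodified image: accepted

    retargeted = image.replace(b"F2LXXXXXXXXX", b"SOME-OTHER-ID")
    try:
        device_public_key.verify(signature, retargeted)  # edited device ID: rejected
    except InvalidSignature:
        print("re-targeted image rejected")

    In other words, the single-device restriction holds exactly as long as Apple’s signing key and the signed binary stay locked up.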

    • #24
  25. Joseph Stanko Coolidge
    Joseph Stanko
    @JosephStanko

    George Savage: Carey J is exactly right. Apple engineers, operating in a more compartmentalized, security-conscious organization than the CIA, are sworn to secrecy concerning their employer’s products. They are also, collectively, probably the only people on earth capable of bypassing the layers of security integrated, onion-like, over the years into Apple’s hardware and software.

    But strong crypto isn’t based on security by obscurity; it’s based on hard math. The best crypto algorithms (the kind Apple uses, as far as I’m aware) are not proprietary secrets; rather, they are published for peer review and implemented in open-source code that anyone can read. If a lot of smart people have reviewed an algorithm and concluded there’s no computationally feasible way to crack it without knowing the secret key, then you can be reasonably confident it’s secure.
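
    To put rough numbers on the “hard math” part, here is an illustrative calculation in Python. It assumes 256-bit keys (the class of key size Apple describes in its iOS security documentation) and a wildly generous guessing rate; both are assumptions for the sake of the arithmetic, not measurements.

    # Illustrative only: why "just crack the crypto" is not a practical route.
    # Assume a 256-bit key and a wildly generous trillion guesses per second.
    GUESSES_PER_SECOND = 10**12
    keyspace = 2**256
    seconds_to_exhaust = keyspace / GUESSES_PER_SECOND
    years_to_exhaust = seconds_to_exhaust / (3600 * 24 * 365)
    print(f"about {years_to_exhaust:.2e} years to exhaust the keyspace")  # on the order of 10**57

    Which is why practical attacks go after the passcode, the signing key, or the people, not the math.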

    • #25
  26. George Savage Member
    George Savage
    @GeorgeSavage

    Joseph Stanko:

    Carey J.: Once hackers (or foreign intel agencies) get their hands on the app, they can reverse engineer it and remove the single-device code.

    Except the SIF is cryptographically signed using Apple’s private key, or at least that’s how I understood this part:

    providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”)

    Therefore hackers can’t modify it without invalidating the signature. Unless they crack the crypto, in which case they can hack in anyway with or without this court order.

    You are still left defending myriad routes to a single point of failure. Crack the crypto, social-engineer a copy of the source code, or pay an Apple intern to share the framework for what you already know is the key to Apple’s kingdom, and voilà.

    Right now, no such key exists, and nobody knows whether or not it is possible to construct such a key, or even what level of effort is required to make the attempt. Once Apple builds what NR’s Kevin Williamson terms “FBiOS,” it is very likely to proliferate. Consider, for instance, that the most closely guarded secret of the 20th century, the design of the atomic bomb, lasted for only four years.

    • #26
  27. Stephen Dawson Inactive
    Stephen Dawson
    @StephenDawson

    Joseph Stanko:

    Stephen Dawson: Except for establishing the precedent that law enforcement can force the company to break the code. Then the Manhattan DA can get a judge to order Apple do it for the 150 iPhones he has in his lockup … and so on (thanks to not-so-Lazy_Millennial for the link)

    So you don’t think that even with a warrant the government should have access to the data? That seems to be a separate argument from the OP’s claim that such access would create a backdoor that hackers could exploit.

    Are you against the power to issue warrants in general? Should we eliminate all search warrants across the board?

    Former cop here. I’m okay with warrants. I’m not okay with conscription, which is principally what the OP was talking about.

    • #27
  28. Joseph Stanko Coolidge
    Joseph Stanko
    @JosephStanko

    George Savage: Right now, no such key exists

    What key are you talking about here?  I’m talking about the private key Apple uses today to sign their firmware and iOS updates to prove they are authentic updates from Apple that have not been tampered with by a man-in-the-middle whenever you download the latest iOS version.  That key exists today.  It’s not something new mandated by this judge.

    • #28
  29. MarciN Member
    MarciN
    @MarciN

    Most people sync all their devices–their home and work personal computers, their tablets, and their smartphones. It seems to me that entry into one would allow entry into all of them. A breach of security like this is going to put a big hole in online commerce. This is the very thing all of the tech companies have been trying to avoid.

    • #29
  30. Kozak Member
    Kozak
    @Kozak

    “Let’s tally the score: From an unknown but certainly daunting number of unidentified security hurdles to a single known breach point–the device ID. It’s hard to see how it could possibly get much better for hackers. And consider the prize: the world’s most popular smart phone”

    Remember DVD encryption? Years were spent agreeing on a standard and implementing it. And a group of Norwegian hackers found one unencrypted key and were able to bust it wide open with DeCSS…

    • #30