What about Guarding the Front Door?

 

Yesterday, we learned that US Magistrate Judge Sheri Pym ordered Apple to help the FBI break into the iPhone of San Bernardino murderer Syed Rizwan Farook. Perhaps mindful of King Canute’s experience stopping the tide, the judge stopped short of commanding Farook’s phone to reveal the encrypted data directly, instead instructing Apple to reorient its developers from productive endeavors to undermining its own carefully constructed software- and hardware-based security architecture.

Not long ago, conservatives were appalled when Congress passed and the Supreme Court upheld a law requiring citizens to purchase a particular product — Obama-certified health insurance. Now we have a federal judge, drawing authority solely from the All Writs Act of 1789, ordering a private company to create a custom product for the government’s use. And what a product! If implemented, the judge’s order will create a weaponized piece of software capable of taking down the world’s second most valuable company. Collateral damage could include large swathes of our globally-connected economy. Hyperbole? Read the relevant portion of the order for yourself and imagine what malicious hackers could do with such an app:

Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

Apple’s reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”) that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device’s flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade (“DFU”) mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

But don’t worry. We can count on the FBI to keep iBackDoor every bit as secure as its sensitive background investigation files. What could go wrong?
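
For the technically minded, here is a rough, back-of-the-envelope sketch of what those three functions buy an attacker. It is purely illustrative Python: submit_passcode is a hypothetical stand-in for whatever interface the ordered software would expose, and the 80-millisecond figure is roughly the per-guess delay Apple has described for its hardware-backed passcode key derivation. With auto-erase and the escalating delays gone, a four-digit passcode falls to exhaustive guessing in minutes:

    from itertools import product

    HARDWARE_DELAY_SECS = 0.08   # approximate per-attempt cost of the key derivation itself

    def brute_force_four_digits(submit_passcode):
        """Try every four-digit passcode. With auto-erase disabled there is no
        ten-strike wipe, and with no software delays the only cost is hardware time."""
        for digits in product("0123456789", repeat=4):
            guess = "".join(digits)
            if submit_passcode(guess):
                return guess
        return None

    # Toy stand-in for the SUBJECT DEVICE so the sketch runs end to end.
    def fake_device(correct="7531"):
        return lambda guess: guess == correct

    print("Recovered:", brute_force_four_digits(fake_device()))
    print("Worst case:", 10000 * HARDWARE_DELAY_SECS / 60, "minutes")   # about 13 minutes

A longer alphanumeric passcode changes the arithmetic dramatically, which is why the delays and the ten-try wipe matter most for the short numeric codes most people actually use.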

A troubling pattern emerges. The government, for reasons of political correctness, fails in its particular security responsibilities and responds by putting civil liberties at risk generally out of a twisted sense of fairness. For instance, privacy concerns shielded the jihadist social media posts of San Bernardino shooter Tashfeen Malik from official scrutiny as she applied from Pakistan to join her husband Farook in soft-target-rich Southern California, but nothing deters the FBI and a federal court from demanding that Apple risk the privacy of hundreds of millions of iPhone users.

I have a simple idea: Instead of conscripting a private company to open a back door into the smart phone used by millions, why not guard the front door into the United States?

Published in Law, Science & Technology

There are 52 comments.

  1. Tom Meyer, Ed. Member
    @tommeyer

    Good post, George.

    The government’s approach here is like trying to winterize your windows while leaving the front door open (if winterizing the windows somehow also made them more prone to being opened to the outside).

    • #31
  2. Son of Spengler Member
    @SonofSpengler

    I find this just galling. I realize that the professionals and the political officials are not always in synch, but just as the professionals are insisting we give them tools that erode our privacy, the political bosses demonstrate each day how little regard they have for national security.

    Don’t look at Facebook posts. Mosques are off-limits too.

    Sure, the Secretary of State can run official business off of a homebrew email server. One with weak security for most of her tenure, and zero security for the first three months.

    Let’s run weapons to Mexican drug cartels and Al-Qaeda-affiliated Libyan fighters. What could go wrong?

    When the government demonstrates its own commitment to a serious national security and anti-terrorism mindset — and the professionals go on record against their feckless bosses, even resigning if necessary — then maybe I’ll consider trusting them with increased power. Until then, no. No erosion of privacy rights for Potemkin security.

    • #32
  3. The Reticulator Member
    @TheReticulator

    This isn’t the real problem, anyway. The real problem is that the powerful home builders and real estate associations, despite benefiting from government tax preferences on their customers’ home loans, are resisting calls by the government to install mandatory listening devices in every room of the buildings where people live and work.  This handicaps the ability of the FBI and the DHS to fight terrorism.

    • #33
  4. Lazy_Millennial Inactive
    @LazyMillennial

    Stephen Dawson:

    Joseph Stanko:

    Stephen Dawson: Except for establishing the precedent that law enforcement can force the company to break the code. Then the Manhattan DA can get a judge to order Apple do it for the 150 iPhones he has in his lockup … and so on (thanks to not-so-Lazy_Millennial for the link)

    So you don’t think that even with a warrant the government should have access to the data? That seems to be a separate argument from the OP’s claim that such access would create a backdoor that hackers could exploit.

    Are you against the power to issue warrants in general? Should we eliminate all search warrants across the board?

    Former cop here. I’m okay with warrants. I’m not okay with conscription, which is principally what the OP was talking about.

    DING DING DING we have a winner. As Kevin D Williamson puts it,

    … the federal government’s answer is: “Why won’t those mean meanies at Apple do our jobs for us? So what if that means rendering many of their products entirely worthless and betraying the trust of millions of customers?”

    • #34
  5. dittoheadadt Inactive
    @dittoheadadt

    So someone can do Stuxnet, but we can’t break into one iPhone?  Boy, do I feel safe in the hands of our government.

    • #35
  6. George Savage Member
    @GeorgeSavage

    Son of Spengler:

    I find this just galling. I realize that the professionals and the political officials are not always in synch, but just as the professionals are insisting we give them tools that erode our privacy, the political bosses demonstrate each day how little regard they have for national security.

    Son of Spengler, you just made my point far more succinctly than I could. Thank you.

    • #36
  7. George Savage Member
    @GeorgeSavage

    Joseph Stanko:

    George Savage: Right now, no such key exists

    What key are you talking about here? I’m talking about the private key Apple uses today to sign their firmware and iOS updates to prove they are authentic updates from Apple that have not been tampered with by a man-in-the-middle whenever you download the latest iOS version. That key exists today. It’s not something new mandated by this judge.

    Joseph, I understand that installation of the new — let’s use Williamson’s terrific term — FBiOS would initially be contingent on a valid Apple signing certificate. However, this is now a single point of failure. If anyone anywhere can ever find a way around this one item, by any means, then he has a comprehensive "master key" enabling remote access to iPhones everywhere. And if our hypothetical hacker gains access to the source code, he also has a detailed map to iPhone security that will enable custom-coded, more comprehensive hacks.

    And given the extraordinary value of the Apple ecosystem, every criminal or government intelligence hacker on earth will devote himself to getting around this single remaining barrier.

    I prefer defense in depth: break through one barrier only to be confronted by another that you didn’t even realize was there.
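
    To see the single point of failure in toy form, here is a schematic sketch. It is not Apple's actual boot code: real code signing is asymmetric, and SIGNING_KEY below is a made-up stand-in for the secret Apple guards. The point is only that the device's entire trust decision collapses into one check, so any image bearing a valid signature, FBiOS included, sails through the lone gate:

        import hashlib, hmac

        SIGNING_KEY = b"hypothetical-apple-signing-secret"   # stand-in; the real key never leaves Apple

        def sign(image: bytes) -> bytes:
            # HMAC stands in here for a real (asymmetric) code signature.
            return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

        def device_boots(image: bytes, signature: bytes) -> bool:
            # The whole trust decision reduces to this single comparison.
            return hmac.compare_digest(sign(image), signature)

        normal_update = b"iOS 9.2.1"
        fbios = b"FBiOS: auto-erase off, no delays, electronic passcode entry"

        for image in (normal_update, fbios):
            print(image.decode(), "->", "boots" if device_boots(image, sign(image)) else "rejected")

    Defense in depth would mean that even a correctly signed image still faced further, independent barriers.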

    • #37
  8. Hank Rhody Contributor
    @HankRhody

    Son of Spengler: No erosion of privacy rights for Potemkin security.

    I often find that, having read halfway through a comment and already liked it, I am unable to like it again when I reach the end. Bravo!

    • #38
  9. Hank Rhody Contributor
    @HankRhody

    Let’s try an analogy here. An iPhone, broadly speaking, is analogous to a safe with a combination lock. The government can issue warrants for the contents of that safe, but they can’t force you to give up your combination. With a physical safe, the government could come in with a plasma torch and cut the thing open, but they lack that option to get into an iPhone, so they’re reduced to asking politely.

    What the government is ordering here is for the safe manufacturer to provide a master keyhole in the combination lock. The government is then provided with a master key they can use to open the safe if they have a warrant.

    The problem here is that now, anyone with a set of lockpicks can get in and get those documents too. You can make that lock more complicated, but eventually the bad guys are going to find a way in. Doing this makes every iPhone fundamentally less secure. As a matter of precedent it makes every electronic device less secure over the long term.
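
    A toy sketch of the same point, with a made-up MASTER_KEY standing in for the government's copy: once the lock accepts a second credential, the owner's security rests on a secret the owner neither holds nor can change.

        MASTER_KEY = "escrowed-with-the-government"   # hypothetical; not under the owner's control

        def safe_opens(combination, attempt, master_attempt=None):
            owner_path = attempt == combination
            master_path = master_attempt == MASTER_KEY   # the new keyhole every burglar now probes
            return owner_path or master_path

        print(safe_opens("38-12-7", attempt="00-00-0"))                              # False: wrong combination
        print(safe_opens("38-12-7", attempt="00-00-0", master_attempt=MASTER_KEY))   # True: the lock cannot tell a warrant from lockpicks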

    • #39
  10. Frank Soto Member
    @FrankSoto

    Hank Rhody: Let’s try an analogy here. An iPhone, broadly speaking, is analogous to a safe with a combination lock. The government can issue warrants for the contents of that safe, but they can’t force you to give up your combination. With a physical safe, the government could come in with a plasma torch and cut the thing open, but they lack that option to get into an iPhone, so they’re reduced to asking politely.

    What the government is ordering here is for the safe manufacturer to provide a master keyhole in the combination lock. The government is then provided with a master key they can use to open the safe if they have a warrant.

    The problem here is that now, anyone with a set of lockpicks can get in and get those documents too. You can make that lock more complicated, but eventually the bad guys are going to find a way in. Doing this makes every iPhone fundamentally less secure. As a matter of precedent it makes every electronic device less secure over the long term.

    The precedent is more concerning than the current case. If the courts rule that the All Writs Act means the government may force software companies to weaken their security through back doors so the government can get in, the battle over things like encryption is already lost.

    • #40
  11. Joseph Stanko Coolidge
    @JosephStanko

    George Savage: If anyone anywhere can ever find a way around this one item, by any means, then he has a comprehensive “master key” enabling remote access to iPhones everywhere.

    Would it enable remote access though?

    Normally when Apple releases an iOS update you must unlock your phone, agree to the prompt to install the update, and then re-enter your pass code before the phone will verify and install the update.  This whole process was designed to prevent hackers from pushing out phony updates as a way to break into the system.  If there’s a way to bypass that today, it’s a flaw and Apple should fix it.

    I assumed the technique in question here requires physically connecting the phone via USB cable to a computer.  If so, there’s no way to use it for remote access, which greatly limits the scope of potential abuse.
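
    To make that flow concrete, here is a minimal sketch of the gates described above. It is illustrative only; the function and its arguments correspond to no real Apple API.

        def apply_update(unlocked, user_agreed, passcode_reentered, signature_valid):
            """Every gate must pass before a normal, user-initiated update installs."""
            if not unlocked:
                return "refused: device is locked"
            if not user_agreed:
                return "refused: user did not accept the update prompt"
            if not passcode_reentered:
                return "refused: passcode not re-entered"
            if not signature_valid:
                return "refused: not signed by Apple"
            return "installed"

        print(apply_update(True, True, True, True))      # the ordinary over-the-air path
        print(apply_update(False, False, False, True))   # a locked phone refuses, valid signature or not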

    • #41
  12. Stoicous Inactive
    @Stoicous

    If you throw away the right to keep an encrypted device (unless it has a backdoor) because it could be used for terrorist activities, then you can’t possibly be surprised if someone else throws away your right to own a gun because it could be used for terrorist activities.

    • #42
  13. TechRhino Inactive
    @TechRhino

    Thank you, Mr. Savage! I get tired of all the things the government "has" to do to ordinary, law-abiding US citizens (or companies), ranging from making us take off our shoes to demanding access to all of our personal data, when most of it is only necessary because they refuse to take even basic steps to keep potential or even actual terrorists and criminals out of the country.

    • #43
  14. George Savage Member
    @GeorgeSavage

    Joseph Stanko:

    George Savage: If anyone anywhere can ever find a way around this one item, by any means, then he has a comprehensive “master key” enabling remote access to iPhones everywhere.

    Would it enable remote access though?

    . . .

    I assumed the technique in question here requires physically connecting the phone via USB cable to a computer. If so, there’s no way to use it for remote access, which greatly limits the scope of potential abuse.

    Joseph, from the judge’s order quoted in my post above:

    (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE

    Seems like remote access to me.

    • #44
  15. Tom Davis Member
    @TomDavis

    Now it looks like this passcode was changed after this guy died and while the iPhone was in government custody.  It appears that the government may have wanted to pick a very unappealing guy’s iPhone to set a precedent.  As the adage goes, “Hard cases make bad law” and our government tried to make this as hard a case as it could even if it had to cheat to do it.

    Do not trust these folks.

    • #45
  16. Joseph Stanko Coolidge
    @JosephStanko

    George Savage:

    (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE

    Seems like remote access to me.

    I read that as requiring it to support at least one of those protocols, not all of them.  Enabling support “via the physical device port” alone would meet the requirement.

    Regardless, this is enabled after installing the SIF. My question was whether physical access to the device port was needed to load this SIF via "Device Firmware Upgrade ("DFU") mode, recovery mode, or other applicable mode."

    • #46
  17. Joseph Stanko Coolidge
    @JosephStanko

    Hank Rhody: What the government is ordering here is for the safe manufacturer to provide a master keyhole in the combination lock. The government is then provided with a master key they can use to open the safe if they have a warrant.

    This is a useful metaphor.  The government could as you say order the safe manufacturer to add a master keyhole to next year’s model of safes.  This would make the safes less secure for all their customers.  This is a terrible idea, and I agree with everyone else here the government shouldn’t force Apple to add back doors to future products.

    Here’s the part I’m not following, though: in your safe analogy, this would not help the government break into last year’s model of safe, since it was built without a master keyhole.  Making next year’s model less secure won’t help break into a particular safe that was already built and locked.

    Same thing applies here: Farook’s phone is running some existing version of iOS.  Whatever the judge orders, Apple can’t build a time machine, go back in time, and add a backdoor to that version of iOS.  So, if Apple has a master key to unlock the phone, it means they already have one for existing iPhones.

    It seems to me the judge is essentially ordering Apple to hand over a key they already have, not to create something new.  Those are separate issues, right?

    • #47
  18. Carey J. Inactive
    @CareyJ

    Joseph Stanko:

    This is a useful metaphor. The government could as you say order the safe manufacturer to add a master keyhole to next year’s model of safes.

    Here’s the part I’m not following, though: in your safe analogy, this would not help the government break into last year’s model of safe, since it was built without a master keyhole. Making next year’s model less secure won’t help break into a particular safe that was already built and locked.

    Same thing applies here: Farook’s phone is running some existing version of iOS. Whatever the judge orders, Apple can’t build a time machine, go back in time, and add a backdoor to that version of iOS. So, if Apple has a master key to unlock the phone, it means they already have one for existing iPhones.

    It seems to me the judge is essentially ordering Apple to hand over a key they already have, not to create something new. Those are separate issues, right?

    There is no such key. The judge ordered them to create such a key and give it to the Feds. Once such a hack is created, it is inevitable that it will get into the hacker community. Just knowing that there is a way to beat Apple’s software will further spur efforts to break into iPhones.

    Once upon a time, "everybody knew" that a man couldn’t run a four-minute mile. Now four-minute miles are commonplace.

    • #48
  19. Joseph Stanko Coolidge
    @JosephStanko

    Tom Davis:Now it looks like this passcode was changed after this guy died and while the iPhone was in government custody. It appears that the government may have wanted to pick a very unappealing guy’s iPhone to set a precedent. As the adage goes, “Hard cases make bad law” and our government tried to make this as hard a case as it could even if it had to cheat to do it.

    Do not trust these folks.

    A San Bernardino county employee reset the password to Farook’s iCloud account, and that’s how the feds were able to get a copy of his last iCloud backup.  That’s a different password than the pass code to unlock the phone.

    This is another interesting point that I haven’t seen discussed much: the phone didn’t belong to Farook, it was his work phone, not a personal phone.  His employer (the county) is the legal owner of the phone and wants access to the data on it.

    • #49
  20. Stoicous Inactive
    @Stoicous

    Joseph Stanko:

    Tom Davis:

    ----------

    Do not trust these folks.

    ----------

    This is another interesting point that I haven’t seen discussed much: the phone didn’t belong to Farook, it was his work phone, not a personal phone. His employer (the county) is the legal owner of the phone and wants access to the data on it.

    It is not a matter of the FBI having the right to access the phone. Farook is a dead terrorist; of course they have the right to investigate his device if they can.

    The issue is the FBI essentially demanding that Apple Inc. be conscripted to help them, and that Apple Inc. violate their own policies and develop a backdoor for the government to use to access this data on this class of device. The goal of the FBI is to set a precedent whereby they can require devices to have a special backdoor for law enforcement, essentially nullifying your right to have a 100% encrypted device.

    It ends up being a lot like gun rights. The government demanding that back doors be made is going to lead to heavily compromised security across these devices, hurting righteous citizens who want to use encryption to protect things like trade secrets; and a black market for encryption will develop anyway for criminals.

    • #50
  21. Hank Rhody Contributor
    @HankRhody

    Joseph Stanko: Here’s the part I’m not following, though: in your safe analogy, this would not help the government break into last year’s model of safe, since it was built without a master keyhole. Making next year’s model less secure won’t help break into a particular safe that was already built and locked.

    Yeah, there’s no way to get the program onto the phone they’re trying to get into right now. But all of yesteryear’s iPhones? You gotta update them sometime. A physical safe doesn’t need the latest version to be secure. A program can have undetected security flaws that could also let someone in.

    Assuming you’re paranoid about this, you’ve got two options: leave your iPhone without critical security updates in the hope that no one has found a way to get in yet, or update to the new government-approved version and hope that no one without a warrant figures out a way to get in.

    • #51
  22. Hank Rhody Contributor
    @HankRhody

    Stoicous, TechRhino, I haven’t seen y’all around before. Welcome to Ricochet!

    • #52