
There's no backdoor to this [Encryption] thread


Posts

    Daedalus, Registered User regular
    edited February 2016
    Aioua wrote: »
    Daedalus wrote: »
    Aioua wrote: »
    jmcdonald wrote: »
    Apparently an Apple exec is saying this would work on all models.

    So, now the real reason for their foot-dragging comes out: their phones are not nearly as secure as they have intimated.

    which is kind of silly

    no shit, their phones aren't so secure that Apple themselves can't hack them

    They said in the past, "Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8," though this statement is no longer on their website.

    As a general principle, you should design your crypto system so that you yourself can't hack it. Taking the government argument out of the question for a moment, let's change the scenario to "Tim Cook wants to decrypt his nephew's phone without his nephew's consent, so he hands it to his engineering team and says that they'll get a million-dollar bonus when it's done. Can they do it?" If the answer is "yes", it wasn't secure in the first place.

    Oh, then in the past they were dirty dirty liars. I'm not even sure why people believed them. Unless the guess limiting was somehow built into the math of the crypto algorithm itself, it was possible to brute force. And since the private key was ultimately tied to a user-entered PIN, if the guess limiting is circumvented, brute-forcing is trivial.

    The math itself limits guesses to one every 80ms. Which is still a long time, really! Unless you use a six-digit numeric PIN, in which case it's like a day to brute-force the whole keyspace.
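
    A quick back-of-the-envelope check of that math (plain arithmetic, assuming a constant ~80ms per attempt, the figure quoted above):

```python
# Worst-case brute-force time at ~80 ms per guess, the per-attempt cost
# that the key-derivation math enforces (per the post above).
PER_GUESS_S = 0.080

for label, keyspace in [("4-digit PIN", 10**4), ("6-digit PIN", 10**6)]:
    total_s = keyspace * PER_GUESS_S
    print(f"{label}: {total_s / 3600:.1f} hours")
# 4-digit PIN: 0.2 hours
# 6-digit PIN: 22.2 hours  ("like a day", as stated)
```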

    Phoenix-D, Registered User regular
    Aioua wrote: »

    Oh, then in the past they were dirty dirty liars. I'm not even sure why people believed them. Unless the guess limiting was somehow built into the math of the crypto algorithm itself, it was possible to brute force. And since the private key was ultimately tied to a user-entered PIN, if the guess limiting is circumvented, brute-forcing is trivial.

    Right, but they could easily gate firmware updates behind a valid SE unlock, preventing the guess limiting from being worked around.

    Which is how they should have done it.
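
    A minimal sketch of the gating being described here (a hypothetical design, not Apple's actual update path; an HMAC stands in for the real public-key signature chain):

```python
import hashlib
import hmac
from dataclasses import dataclass

@dataclass
class Device:
    vendor_key: bytes         # stand-in for the vendor's signing key material
    user_authenticated: bool  # True only after a valid passcode entry (the "SE unlock")

def accept_firmware_update(dev: Device, image: bytes, sig: bytes) -> bool:
    # The image must carry a valid vendor signature (HMAC here for brevity)...
    signed_ok = hmac.compare_digest(
        hmac.new(dev.vendor_key, image, hashlib.sha256).digest(), sig)
    # ...and the device must already be unlocked, so firmware can't be
    # swapped out from under the guess limiter on a locked phone.
    return signed_ok and dev.user_authenticated
```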

    Aioua, Ora Occidens Ora Optima, Registered User regular
    edited February 2016
    Phoenix-D wrote: »

    Right, but they could easily gate firmware updates behind a valid SE unlock, preventing the guess limiting from being worked around.

    Which is how they should have done it.

    naw

    that would have made it harder

    but that shit is still software that could be circumvented

    if you want to say "we can't hack our users' data" then your users need proper-length encryption keys, such that the keyspace isn't trivial

    this precludes basically anything a user enters manually
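
    Rough numbers on what "trivial" means here (plain arithmetic; the 256-bit figure is just the size of a standard random AES key, not a claim about any particular phone):

```python
import math

# Entropy, in bits, of secrets a user might plausibly type by hand,
# versus a full-strength random key.
print(f"6-digit PIN:              {math.log2(10**6):.1f} bits")
print(f"10 printable-ASCII chars: {math.log2(95**10):.1f} bits")
print("random AES-256 key:       256.0 bits")
```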

    Daedalus, Registered User regular
    Aioua wrote: »

    naw

    that would have made it harder

    but that shit is still software that could be circumvented

    if you want to say "we can't hack our users' data" then your users need proper-length encryption keys, such that the keyspace isn't trivial

    this precludes basically anything a user enters manually

    One of the things Apple did right is that the key encrypting the phone's flash is derived partly from a non-extractable key in the phone's CPU (a fact enforced in hardware). So you still need to run through the keyspace on the device itself, rather than dumping the flash and working on it on some gigantic compute cluster in Utah or wherever.
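
    A minimal sketch of that construction, with assumed names (Apple's real derivation is more involved, and the UID is usable only inside the hardware AES engine, never readable by software):

```python
import hashlib

# Stand-in for the fused, non-extractable per-device secret.
DEVICE_UID = b"\x13" * 32

def derive_storage_key(passcode: str) -> bytes:
    # The device secret is an input to the KDF, so this loop can only run
    # on the phone itself: an attacker who dumps the flash has the
    # ciphertext but not DEVICE_UID, and so can't brute-force offline.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)
```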

    Phoenix-D, Registered User regular
    And someone changed the iCloud password after they got the phone.

    If that hadn't been the case, couldn't they just have triggered a backup and gotten the data that way?

    Aioua, Ora Occidens Ora Optima, Registered User regular
    Daedalus wrote: »

    One of the things Apple did right is that the key encrypting the phone's flash is derived partly from a non-extractable key in the phone's CPU (a fact enforced in hardware). So you still need to run through the keyspace on the device itself, rather than dumping the flash and working on it on some gigantic compute cluster in Utah or wherever.

    I agree! Apple can say with pride that, without the original phone, the data is unhackable (barring exploitable flaws in the crypto math or in the generation of the hardware IDs).

    I think that to match their claims you'd need a phone with both that hardware chip and some kind of hardware keyfob containing the user's half of a robust key. Unless you have both, no dice.
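
    One way that two-piece scheme could look (a hypothetical sketch, not a shipping design): derive the storage key from both the phone's internal secret and a secret that lives only on the fob, so that neither piece alone leaves anything smaller than a full-strength key to search.

```python
import hashlib

def storage_key(device_secret: bytes, fob_secret: bytes) -> bytes:
    # With either 32-byte half missing, an attacker faces a ~2^256 search
    # space rather than a PIN-sized one.
    return hashlib.sha256(device_secret + fob_secret).digest()
```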

    emnmnme, Registered User regular
    Phoenix-D wrote: »
    And someone changed the iCloud password after they got the phone.

    If that hadn't been the case, couldn't they just have triggered a backup and gotten the data that way?

    Apple would be happy to hand over the iCloud data to the authorities, but I don't think it existed.
    https://gma.yahoo.com/san-bernardino-shooters-apple-id-passcode-changed-while-234003785--abc-news-topstories.html
    Missing the opportunity for a backup was crucial because some of the information stored on the phone would have been backed up to iCloud and could potentially have been retrieved. According to court records, the iPhone had not been backed up since Oct. 19, 2015, one and a half months before the attack, which "indicates to the FBI that Farook may have disabled the automatic iCloud backup function to hide evidence."

    Phoenix-D, Registered User regular
    emnmnme wrote: »

    Apple would be happy to hand over the iCloud data to the authorities, but I don't think it existed.

    Actually I was thinking more along the lines of forcing auto backup to trigger / turning it back on, which seems easier than breaking into the phone directly.

    It's possible the "stop auto backup" method was just "don't bring it to authorized wifi points" as well.

    Daedalus, Registered User regular
    Aioua wrote: »

    I agree! Apple can say with pride that, without the original phone, the data is unhackable (barring exploitable flaws in the crypto math or in the generation of the hardware IDs).

    I think that to match their claims you'd need a phone with both that hardware chip and some kind of hardware keyfob containing the user's half of a robust key. Unless you have both, no dice.

    Even if the user used an alphanumeric password (and it was truly random), it could take years! But ain't nobody got time for that, I guess.
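
    For concreteness (the length and alphabet here are assumptions; the post only says "alphanumeric"):

```python
# A truly random 8-character alphanumeric password at ~80 ms per guess.
keyspace = 62 ** 8                         # a-z, A-Z, 0-9
years = keyspace * 0.080 / (3600 * 24 * 365)
print(f"worst case: ~{years:,.0f} years")  # ~553,911 years
```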

    programjunkie, Registered User regular
    Daedalus wrote: »

    They said in the past, "Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8," though this statement is no longer on their website.

    As a general principle, you should design your crypto system so that you yourself can't hack it. Taking the government argument out of the question for a moment, let's change the scenario to "Tim Cook wants to decrypt his nephew's phone without his nephew's consent, so he hands it to his engineering team and says that they'll get a million-dollar bonus when it's done. Can they do it?" If the answer is "yes", it wasn't secure in the first place.

    Yeah. To be clear, the problem isn't this specific case; it's the fact that breakable is breakable. The case we're talking about is a valid warrant on a murderer's phone that wasn't even really his. No one should disagree with the legitimacy of accessing that hyper-specific data, if it can be accessed. But again, the same technique can be applied to a retiree who is hacked to steal their life savings, a pro-democracy protester in China, US government officials being hacked by enemy intelligence, etc., etc.

    If Apple finds a way to break into the phone, they are morally obligated to bar the gate behind themselves so that the method can never be used again. Or, if that isn't possible, then inform all affected customers and offer them a reasonable upgrade price. But a lot of the people arguing in favor of this warrant are suggesting exactly the opposite of that, like:
    Nothing should be beyond the reach of a valid warrant.

    which means that no one in the entire world is allowed decent security, and I'd argue that is self-evidently not ideal.

    Phyphor, Building Planet Busters, Tasting Fruit, Registered User regular
    Daedalus wrote: »
    jmcdonald wrote: »
    Apparently an Apple exec is saying this would work on all models.

    So, now the real reason for their foot-dragging comes out: their phones are not nearly as secure as they have intimated.

    Now that's bullshit; I can't get OS-level software to consistently work on two different pieces of hardware when I'm actually trying to. Unless they mean the general technique of "firmware update removes barriers to brute-forcing the PIN on the phone", in which case, yeah, that means "secure enclave" was a useless but expensive engineering distraction. Well, we've all been there, they can try harder next time.

    The HW is probably similar enough between models.

    electricitylikesme, Registered User regular
    programjunkie wrote: »

    Nothing should be beyond the reach of a valid warrant.

    which means that no one in the entire world is allowed decent security, and I'd argue that is self-evidently not ideal.

    That's not the point, though. The point is that in this particular case Apple isn't objecting that they can't do it; they're saying they won't. That's quite different from actually having a security model that can't be broken at all.

    And that's the point: the warrant is valid, for something Apple can provide.

    Clipse, Registered User regular
    Phyphor wrote: »

    The HW is probably similar enough between models.

    The iPhone 5c in this case uses the A6 CPU; the iPhone 5s and later models use the A7 (or newer CPUs). The A7 was the first Apple iDevice CPU to incorporate the "Secure Enclave" (cite). So there is pretty good reason to believe that any workaround for the iPhone 5c will not apply to later versions of the iPhone (or other iDevices).

    LD50, Registered User regular
    Clipse wrote: »

    The iPhone 5c in this case uses the A6 CPU; the iPhone 5s and later models use the A7 (or newer CPUs). The A7 was the first Apple iDevice CPU to incorporate the "Secure Enclave" (cite). So there is pretty good reason to believe that any workaround for the iPhone 5c will not apply to later versions of the iPhone (or other iDevices).

    The workaround has nothing to do with the hardware or how the keys are stored, but with how often, and how many times, the OS will allow someone to try a PIN before wiping the phone. The FBI needs Apple to write the code because only Apple has the ability to sign it so that the phone will accept the update.
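
    A toy illustration of why the signature is the gate (using Ed25519 from the third-party `cryptography` package; the names are illustrative, and Apple's actual chain is its own signing infrastructure):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()      # private key: held only by the vendor
device_pubkey = vendor_key.public_key()        # public key: baked into every device

firmware = b"build with the retry limit removed"
signature = vendor_key.sign(firmware)          # only the key holder can produce this

try:
    device_pubkey.verify(signature, firmware)  # what the phone checks before updating
    print("device accepts the update")
except InvalidSignature:
    print("device rejects the update")
```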
