
There's no backdoor to this [Encryption] thread

  • Phyphor Building Planet Busters Tasting Fruit Registered User regular
    Orthanc wrote: »
    Polaritie wrote: »
    Actually, I'm not convinced that you couldn't reverse engineer any master key to an algorithm from source, and it's not practical to make a unique dupe key for every encryption that happens and save it with the government somewhere (actually impossible: unplug from the internet, encrypt locally, send the file).

    Bit of a tangent since I agree with your overall point, but it is absolutely possible to have a master key that's hard to reverse engineer from source, at least if you discount brute forcing the key.

    If you're not trying to hide the back door it's easy: include the law enforcement public key, then encrypt everything with that as well as the destination public key. Law enforcement has the private key, so it can decrypt the message just like the intended recipient can, but nobody else can, as they don't have the key. The fact that the public key is embedded in the source doesn't help reverse engineering, since it was public anyway. Ultimately the private key could be brute forced, but it's basically just a matter of making it big enough to resist brute force.

    If you're trying to be subtle about it things get a bit harder, but it's still possible. The key to the theorised Dual EC DRBG backdoor can't be easily reverse engineered, as recovering it involves solving the same discrete log problem that EC crypto is based on in the first place. It's been shown that if you generate the constants a particular way you can backdoor the algorithm; there's no explanation of how the NIST constants were generated, and it matches some of the projects hinted at in the Snowden docs. So it's probably backdoored, but no researcher has found the key and proved that yet.
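    The shape of that Dual EC backdoor can be sketched with a toy analogue. This uses modular exponentiation in place of elliptic-curve point multiplication, and every constant and name here is made up for illustration — it is not the NIST construction, just the same trapdoor structure: the public constants P and Q look innocent, but whoever knows the secret exponent relating them can turn one output block into the generator's internal state.

```python
# Toy analogue of the Dual EC DRBG backdoor, using modular exponentiation in
# place of elliptic-curve point multiplication. All constants and names are
# illustrative; this is not the NIST construction, only the shape of it.
p = 2**255 - 19                # a prime modulus; the "curve" stand-in

q = 5                          # public constant "Q"
secret_e = 0x5EC12E7           # designer's trapdoor; never published
P = pow(q, secret_e, p)        # public constant "P" = Q^e -- looks innocent

def drbg_step(state):
    """One generator step: returns (next_state, output_block)."""
    return pow(P, state, p), pow(q, state, p)

# Victim seeds the generator and emits two output blocks.
seed = 0xDEADBEEF
s1, out1 = drbg_step(seed)
s2, out2 = drbg_step(s1)

# Attacker sees only out1 but knows the trapdoor e:
#   out1^e = (Q^s)^e = (Q^e)^s = P^s = the next internal state.
recovered_state = pow(out1, secret_e, p)
assert recovered_state == s1

# Knowing the state, the attacker predicts every future output block.
predicted_out2 = pow(q, recovered_state, p)
assert predicted_out2 == out2
```

    Without secret_e, turning P and Q back into the trapdoor is a discrete log problem — which is exactly why no researcher has been able to extract the key from the published NIST constants and prove the backdoor exists.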

    The big problem is not that the keys can be reverse engineered from source, it's that whoever the backdoor is targeted at has to have the key for the back door to be useful. History has shown that securing such a key is very problematic and it will leak eventually. Then you would need to have some way of updating all copies of the algorithm to use a new key, which is never actually going to happen.

    As we know, software tends to live longer than expected, so any assumption about how big the key needs to be to resist brute forcing, or how long a process needs to live to allow updating in the event of disclosure of the master key, is bound to be wrong.

    You can't do anything that involves separate encryptions, as it is trivial to encrypt "haha screw you cops" with the law enforcement key and encrypt your actual message with the destination key - even ignoring the guarantee that the law enforcement key would leak immediately.

  • Phyphor Building Planet Busters Tasting Fruit Registered User regular
    Also, quantum computing is not a panacea either - it does break everything that currently relies on discrete log or factorization, yes, but it does not break symmetric crypto (it effectively halves the key length), and there are asymmetric algorithms that are as yet unbroken by it; they just suck compared to the current algorithms - long keys (megabits), slow execution time, etc. It's a short-term fix.
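    The "halves the key length" point is worth making concrete: Grover's algorithm searches N possibilities in roughly sqrt(N) quantum steps, so a 2^n symmetric keyspace costs about 2^(n/2) operations. A back-of-envelope check:

```python
# Why quantum attacks "halve" symmetric key length: Grover's algorithm
# searches N candidates in ~sqrt(N) steps, so a 2^n keyspace costs about
# 2^(n/2) quantum operations.
import math

def grover_security_bits(key_bits):
    """Effective security (in bits) of a symmetric key against Grover."""
    return key_bits / 2

assert grover_security_bits(128) == 64    # AES-128: marginal long-term
assert grover_security_bits(256) == 128   # AES-256: still comfortable
assert math.isqrt(2**128) == 2**64        # sqrt of the keyspace, exactly
```

    Which is why the usual post-quantum advice for symmetric crypto is simply "use 256-bit keys", while the public-key side needs entirely new algorithms.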

  • Polaritie Sleepy Registered User regular
    PantsB wrote: »
    Polaritie wrote: »
    LD50 wrote: »
    PantsB wrote: »
    The real issue is this conflict
    1. There's a legitimate and important need for governments (such as the US) to conduct electronic surveillance and intercept electronic communications
      • To protect national interests and security.
      • To prevent and provide evidence against violations of criminal law
    2. Encryption makes it very difficult for these needs to be met (if those communicating encrypt their data), and decrypting most encryption algorithms is not feasible with conventional means
    3. Even if larger software vendors provided a means by which their encryption could be circumvented, there are common algorithms that can be implemented with very little development time (hours) that make it nearly impossible to decrypt a message (such as RSA, you can implement that in an afternoon).

    There's nothing theoretically impossible about decrypting these messages; we just don't have a method yet. By definition, any message can be decrypted given a sufficient number of attempts. If we can use quantum computing to skip to the right answer (essentially), there may be a way for encryption to be secure against nearly everyone but not absolutely everyone.
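    The quoted aside that RSA "can be implemented in an afternoon" is roughly right — textbook RSA fits in a screenful. A toy version with tiny primes and no padding, for illustration only (real keys use 1024-bit-plus primes and padding, and this version is emphatically not secure):

```python
# Textbook RSA in a screenful, backing the claim that the core algorithm is
# an afternoon's work. Tiny primes, no padding: illustration only, NOT secure.
from math import gcd

# Key generation with toy primes.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
assert gcd(e, phi) == 1
d = pow(e, -1, phi)            # private exponent (modular inverse, 3.8+)

def encrypt(m, e, n):
    return pow(m, e, n)        # c = m^e mod n

def decrypt(c, d, n):
    return pow(c, d, n)        # m = c^d mod n

message = 42
ciphertext = encrypt(message, e, n)
assert decrypt(ciphertext, d, n) == message
assert ciphertext != message
```

    The hard part of real RSA isn't this arithmetic; it's key sizes, padding, and side channels — but the point stands that outlawing vendor implementations can't outlaw twenty lines of math.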

    I have some contentions with number 1. The government doesn't need, nor is it allowed, to open and read my mail. They're not allowed to break into my house and drill open my safe. They've never been able to just do these kinds of things without warrants, and they've never had trouble staying in control of national security.

    But that's the whole point - the government can do all those things, provided that they demonstrate that there is a legitimate government interest to a court and acquire a warrant. Part of the issue here is that there are people who want to make even that functionally impossible.

    If you go back some time, during the TorrentSpy case, there was a lot of criticism of the government penalizing them for interfering with the process of discovery, even though discovery rules are considered a legitimate use of government power.

    It's been possible to communicate in ways beyond the reach of warrants since forever though. Cryptography isn't new. It's certainly easier, but it's not new. In-person meetings, documents destroyed after delivery... this is all centuries-old.
    In what way are those beyond the reach of warrants? In person meetings can be overheard, including electronically. Documents can be intercepted and read - that would be directly analogous.

    They're beyond retroactive searches - if you have a warrant before it happens you're good. If not, you missed your chance. Which coincidentally also applies to modern encryption because if you have a warrant you can start doing things the NSA already does overseas and just hijack hardware (find the guy via cell, and just root his phone, etc.)

    Steam: Polaritie
    3DS: 0473-8507-2652
    Switch: SW-5185-4991-5118
    PSN: AbEntropy
  • PantsB Fake Thomas Jefferson Registered User regular
    Polaritie wrote: »
    PantsB wrote: »
    Polaritie wrote: »
    LD50 wrote: »
    PantsB wrote: »
    The real issue is this conflict
    1. There's a legitimate and important need for governments (such as the US) to conduct electronic surveillance and intercept electronic communications
      • To protect national interests and security.
      • To prevent and provide evidence against violations of criminal law
    2. Encryption makes it very difficult for these needs to be met (if those communicating encrypt their data), and decrypting most encryption algorithms is not feasible with conventional means
    3. Even if larger software vendors provided a means by which their encryption could be circumvented, there are common algorithms that can be implemented with very little development time (hours) that make it nearly impossible to decrypt a message (such as RSA, you can implement that in an afternoon).

    There's nothing theoretically impossible about decrypting these messages; we just don't have a method yet. By definition, any message can be decrypted given a sufficient number of attempts. If we can use quantum computing to skip to the right answer (essentially), there may be a way for encryption to be secure against nearly everyone but not absolutely everyone.

    I have some contentions with number 1. The government doesn't need, nor is it allowed, to open and read my mail. They're not allowed to break into my house and drill open my safe. They've never been able to just do these kinds of things without warrants, and they've never had trouble staying in control of national security.

    But that's the whole point - the government can do all those things, provided that they demonstrate that there is a legitimate government interest to a court and acquire a warrant. Part of the issue here is that there are people who want to make even that functionally impossible.

    If you go back some time, during the TorrentSpy case, there was a lot of criticism of the government penalizing them for interfering with the process of discovery, even though discovery rules are considered a legitimate use of government power.

    It's been possible to communicate in ways beyond the reach of warrants since forever though. Cryptography isn't new. It's certainly easier, but it's not new. In-person meetings, documents destroyed after delivery... this is all centuries-old.
    In what way are those beyond the reach of warrants? In person meetings can be overheard, including electronically. Documents can be intercepted and read - that would be directly analogous.

    They're beyond retroactive searches - if you have a warrant before it happens you're good. If not, you missed your chance. Which coincidentally also applies to modern encryption because if you have a warrant you can start doing things the NSA already does overseas and just hijack hardware (find the guy via cell, and just root his phone, etc.)

    No you can't. That's the whole point. There's literally no technological difference between intercepting a message with a warrant or without. Yes, if you can physically insert hardware into someone's computer or phone you can bypass encryption, but that's rarely the case.

    QEDMF xbl: PantsB G+
  • Orthanc Death Lite, Only 1 Calorie Off the end of the internet, just turn left. Registered User, ClubPA regular
    So this is interesting.

    Looks like if you're using Windows 10 disk encryption, the encryption key is sent to Microsoft, with no ability to turn this feature off.

    It appears to be more a considered usability feature than a back door - i.e. MS would rather have the perception problems of holding keys than the perception problems of "losing" someone's data when they forget the password.

    But still, not a great move.

    orthanc
  • electricitylikesme Registered User regular
    All these ban-encryption bills popping up at the moment are getting seriously worrying. The issue is very much not that it won't work, or even backdoors - it's that any encryption the authorities can't bypass is going to be made illegal, a la Britain's 'you must disclose passwords when asked' laws, which allow for indefinite detention on the supposition that you know the passwords or that encrypted volumes may exist (which can be made unprovable - what TrueCrypt did).

    I have a couple of gpg keys I've forgotten the passwords to, for example.

  • Calica Registered User regular
    All these ban-encryption bills popping up at the moment are getting seriously worrying. The issue is very much not that it won't work, or even backdoors - it's that any encryption the authorities can't bypass is going to be made illegal, a la Britain's 'you must disclose passwords when asked' laws, which allow for indefinite detention on the supposition that you know the passwords or that encrypted volumes may exist (which can be made unprovable - what TrueCrypt did).

    I have a couple of gpg keys I've forgotten the passwords to, for example.

    Wait, so under Britain's password disclosure laws, the police can put someone in jail for up to five years without even needing to prove that a crime took place, let alone that the defendant is guilty? Does the foundation of Britain's justice system actually allow for that, or is the law itself illegal?

  • electricitylikesme Registered User regular
    edited January 2016
    Calica wrote: »
    All these ban-encryption bills popping up at the moment are getting seriously worrying. The issue is very much not that it won't work, or even backdoors - it's that any encryption the authorities can't bypass is going to be made illegal, a la Britain's 'you must disclose passwords when asked' laws, which allow for indefinite detention on the supposition that you know the passwords or that encrypted volumes may exist (which can be made unprovable - what TrueCrypt did).

    I have a couple of gpg keys I've forgotten the passwords to, for example.

    Wait, so under Britain's password disclosure laws, the police can put someone in jail for up to five years without even needing to prove that a crime took place, let alone that the defendant is guilty? Does the foundation of Britain's justice system actually allow for that, or is the law itself illegal?

    Yep, this has been threatened to people on the basis of "you have TrueCrypt installed, you must have a secret hidden partition"

    There has to be an active investigation, but "terrorism" is pretty broad.

  • programjunkie Registered User regular
    Yeah, Britain's "well, you must be guilty of something, or we wouldn't be investigating you" stance is throwing out all decent conceptions of justice because a computer is involved.

    In good news, someone is competent:

    http://www.digitaltrends.com/computing/nsa-director-actually-says-encryption-backdoors-are-a-bad-idea/
    NSA director Admiral Michael Rogers has said that encryption is “foundational to the future” — and privacy shouldn’t be sacrificed for security.

    The statement is quite a contrast to the government’s usual pro-backdoor rhetoric, which has called for some kind of access into encrypted communications and has subsequently faced fierce opposition from civil liberties groups and tech companies.

    “Concerns about privacy have never been higher,” said Rogers in a speech Thursday to the Atlantic Council in Washington D.C. on the balance between privacy and security. He added that officials should be “trying to get all those things right, to realize that it isn’t about one or the other.”

    Rogers went so far as to say that arguing against encryption is a “waste of time” and added that large scale attacks on the government, like last year’s OPM breach, will only happen again if the government doesn’t act fast.

    Related: AT&T’s CEO says Congress, not Silicon Valley, should decide on encryption

    He stated that there has to be a balance found between preserving privacy and maintaining national security, such as monitoring terrorist threats. Just how exactly that can be accomplished is not clear and remains the bone of contention in the ongoing encryption debate.

    Rogers’ remarks are the other side of the black and white cookie from those of FBI director James Comey, who has repeatedly called for backdoors into encryption technology for law enforcement. In the wake of his frequent comments on the matter, organizations like the Electronic Freedom Foundation (EFF) have decried the idea that encryption can be broken and access allowed for law enforcement.

    Rogers’ comments this week may come as a surprise considering he is the director of the NSA, but he is not the first official to make remarks against weakening encryption. Michael Hayden, a former director of the agency, said last month that building encryption backdoors is a “weak security position” for the government to take.

    Nevertheless, government representatives continue to table anti-encryption laws, most notably Senator Diane Feinstein’s proposal that would make it mandatory for companies to decrypt data if a court order is in place.

    Since he actually has a clue what the fuck he is talking about, he realizes both that it won't help him against high-priority targets and that it will substantially damage America's already highly ineffective, potentially deadly cyber defense posture.

  • Orthanc Death Lite, Only 1 Calorie Off the end of the internet, just turn left. Registered User, ClubPA regular
    And to counter the NSA's unexpected display of having a clue...

    GCHQ advocates crypto with a bad back door for phones
    According to Steven J. Murdoch, a Royal Society University Research Fellow in the Information Security Research Group of University College, MIKEY-SAKKE contains a backdoor that allows communications to be decrypted in bulk. It can be activated by anyone who has access to a master private key that's responsible for generating intermediate private keys. Because the master key is required to create new keys and to update existing ones, network providers must keep the master key permanently available.

    orthanc
  • Calica Registered User regular
    edited February 2016
    Welp.

    FBI orders Apple to facilitate decryption of the San Bernardino shooter's phone by disabling or overriding the "lock this phone after 10 wrong passcode attempts" feature.
    Apple's reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

    Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade ("DFU") mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

    If Apple determines that it can achieve the three functions stated above in paragraph 2, as well as the functionality set forth in paragraph 3, using an alternate technological means from that recommended by the government, and the government concurs, Apple may comply with this Order in that way.


    The order also sets out that:

    To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order.

    Apple, obviously, says no.

    I can't wait to see how this gets resolved.

  • LostNinja Registered User regular
    I'm glad to see Apple was proactive in explaining to the public why they aren't going to do this, rather than waiting for public outcry to force them to.

  • Aioua Ora Occidens Ora Optima Registered User regular
    How about the FBI gets off their lazy asses and clones the drive before they start trying to crack it?

    I mean, c'mon dudes.

    life's a game that you're bound to lose / like using a hammer to pound in screws
    fuck up once and you break your thumb / if you're happy at all then you're god damn dumb
    that's right we're on a fucked up cruise / God is dead but at least we have booze
    bad things happen, no one knows why / the sun burns out and everyone dies
  • Jubal77 Registered User regular
    Ugh... stopped reading after McAfee and the "using obscure law from 1700s" garbage.

  • Daedalus Registered User regular
    Calica wrote: »

    Mr. McAfee's method will consist of him consuming enough "bath salts" that the phone's PIN just appears in his mind.

    (also, we have a few pages of argument on this subject in the surveillance thread, just fyi)

  • witch_ie Registered User regular
    I think the core of this question (beyond the technical) is whether digital privacy is something that we value as a society more than we value the national security that could be provided by unlimited access to all personal digital information, knowing that it's likely it won't just be our security agencies that will have that access.

    I believe the government being allowed to coerce (not incentivize) companies to develop anything that they have no plans to develop creates an environment that significantly limits our freedom as a nation, and is something that I would expect to hear being an issue in Russia, not the United States. From that perspective alone, I applaud Apple's response to this.

    With respect to the privacy question, I think this is really an important question for our country as a whole, even though tech companies, because of their proximity to this issue, may be the initial leaders in this discussion. Privacy in general is something that many people seem to hold dear, but beyond search and seizure without a warrant, we have no explicit right to it in the Constitution, despite what Griswold v. Connecticut and Roe v. Wade would have us believe.

    Although this is a bigger issue, I think this case, and other practices by the government and others infringing on the imaginary right of privacy of individuals, which have become better understood in recent years, provide us with an opportunity to think about where we want to draw the line and explicitly outline criteria for at least government access to our personal information and experiences.


  • Calica Registered User regular
    Daedalus wrote: »
    (also, we have a few pages of argument on this subject in the surveillance thread, just fyi)
    I didn't know that, sorry!

  • Calica Registered User regular
    witch_ie wrote: »
    I think the core of this question (beyond the technical) is whether digital privacy is something that we value as a society more than we value the national security that could be provided by unlimited access to all personal digital information, knowing that it's likely it won't just be our security agencies that will have that access.

    I believe the government being allowed to coerce (not incentivize) companies to develop anything that they have no plans to develop creates an environment that significantly limits our freedom as a nation, and is something that I would expect to hear being an issue in Russia, not the United States. From that perspective alone, I applaud Apple's response to this.

    With respect to the privacy question, I think this is really an important question for our country as a whole, even though tech companies, because of their proximity to this issue, may be the initial leaders in this discussion. Privacy in general is something that many people seem to hold dear, but beyond search and seizure without a warrant, we have no explicit right to it in the Constitution, despite what Griswold v. Connecticut and Roe v. Wade would have us believe.

    Although this is a bigger issue, I think this case, and other practices by the government and others infringing on the imaginary right of privacy of individuals, which have become better understood in recent years, provide us with an opportunity to think about where we want to draw the line and explicitly outline criteria for at least government access to our personal information and experiences.


    A person who has access to my devices also has access to my email, which they can easily use to commit identity theft. They could also trick my friends and family into installing malware. So no, I do not think the dubious benefit to national security is worth giving up privacy for.

  • override367 ALL minions Registered User regular
    I'm not a programmer, but why can't the FBI clone the phone a fuckoffload of times and attempt to break each clone separately in a virtual environment?

  • Phyphor Building Planet Busters Tasting Fruit Registered User regular
    edited February 2016
    I'm not a programmer, but why can't the FBI clone the phone a fuckoffload of times and attempt to break each clone separately in a virtual environment?

    Presumably the encryption keys are not trivially available (e.g., sitting on the flash itself), so that wouldn't work - there would be no reason to expect another phone to decrypt it, or the flash can't be extracted at all.

  • Daedalus Registered User regular
    I'm not a programmer, but why can't the FBI clone the phone a fuckoffload of times and attempt to break each clone separately in a virtual environment?

    The key to the encrypted data is derived from the user's PIN combined with a non-extractable key in the phone CPU, so if you want to break it by testing all possible PINs, you need to do it on the phone itself.
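    That's easy to sketch. The data key mixes the short PIN with a per-device secret that never leaves the hardware, so off-device brute force means guessing a 256-bit value, not a 4-digit PIN. The parameters and names below are illustrative (and the iteration count is lowered for the demo), not Apple's actual key derivation:

```python
# Sketch of PIN + device-key derivation: on the device, a 4-digit PIN falls
# in seconds (hence the 10-try wipe and escalating delays); off the device,
# the attacker lacks DEVICE_UID and the keyspace is effectively 256-bit.
# Names and parameters are illustrative, not Apple's actual KDF.
import hashlib
import os

DEVICE_UID = os.urandom(32)   # fused into the CPU; not extractable

def data_key(pin: str, device_uid: bytes) -> bytes:
    # Iteration count lowered for the demo; real KDFs use far more work.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_uid, 1_000)

real_key = data_key("1234", DEVICE_UID)

# On the device (with DEVICE_UID available), all 10,000 PINs fall quickly.
found = next(p for p in (f"{i:04d}" for i in range(10_000))
             if data_key(p, DEVICE_UID) == real_key)
assert found == "1234"

# Off the device there is no DEVICE_UID, so "trying all PINs" actually
# means brute-forcing a 256-bit secret -- not feasible.
```

    Which is why the FBI needs the attempts to run on the phone itself, and why the auto-erase and delay features are the only things standing in the way.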

  • override367 ALL minions Registered User regular
    The justice department seems to be escalating this

    what are the government's options if Apple just says fuck you no means no

  • Phoenix-D Registered User regular
    The justice department seems to be escalating this

    what are the government's options if Apple just says fuck you no means no
    Assuming no stays? Contempt of court time!

    This is where things like "exponentially increasing fines" start getting tossed around.

  • override367 ALL minions Registered User regular
    edited February 2016
    I'm not sure if it's smart for the US government to try and bully a company that can send a message to ~50 million citizens saying "the US government is trying to force us to break into your phones for them"

    but that's like, really weird, that the government could even force something like that

    It's not paying a fine or some simple task. What if the individual engineers refuse to do it? Can the government force them to stay employed at Apple and do it?

  • WiseManTobes Registered User regular
    Calica wrote: »
    Welp.

    FBI orders Apple to facilitate decryption of the San Bernardino shooter's phone by disabling or overriding the "lock this phone after 10 wrong passcode attempts" feature.
    Apple's reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

    Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File ("SIF") that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade ("DFU") mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery analysis.

    If Apple determines that it can achieve the three functions stated above in paragraph 2, as well as the functionality set forth in paragraph 3, using an alternate technological means from that recommended by the government, and the government concurs, Apple may comply with this Order in that way.


    The order also sets out that:

    To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order.

    Apple, obviously, says no.

    I can't wait to see how this gets resolved.

    Can't the government just do what the rest of us do when we get locked out of our phone?

    Or does the government not have a 12 year old nephew?

    Steam! Battlenet:Wisemantobes#1508
  • Daedalus Registered User regular
    edited February 2016
    The justice department seems to be escalating this

    what are the government's options if Apple just says fuck you no means no

    Apple has a range of options available to them in the US legal system, but when those run out, they're up shit creek unless they want to move out of the US entirely.

    edit:
    I'm not sure if it's smart for the US government to try and bully a company that can send a message to ~50 million citizens saying "the US government is trying to force us to break into your phones for them"

    but that's like, really weird, that the government could even force something like that

    It's not paying a fine or some simple task. What if the individual engineers refuse to do it? Can the government force them to stay employed at Apple and do it?
    This is why the FBI is using the San Bernardino terrorist's phone for this case rather than the phone of some random meth dealer. (I have no doubt that the FBI has a whole room full of iPhone 5Cs that they want opened after the exploit is developed for this one, of course.)

  • override367 ALL minions Registered User regular
    Daedalus wrote: »
    The justice department seems to be escalating this

    what are the government's options if Apple just says fuck you no means no

    Apple has a range of options available to them in the US legal system, but when those run out, they're up shit creek unless they want to move out of the US entirely.


    they are definitely capable of doing this too!

  • Phoenix-D Registered User regular
    I'm not sure if it's smart for the US government to try and bully a company that can send a message to ~50 million citizens saying "the US government is trying to force us to break into your phones for them"

    but that's like, really weird, that the government could even force something like that

    It's not paying a fine or some simple task. What if the individual engineers refuse to do it? Can the government force them to stay employed at Apple and do it?

    "Apple is protecting the terrorists" is a nasty punch the other way.

  • Daedalus Registered User regular
    edited February 2016
    What if the individual engineers refuse to do it? Can the government force them to stay employed at Apple and do it?

    In some hypothetical never-gonna-happen scenario where every one of Apple's engineers resign rather than develop the exploit, then I suspect that the FBI would take Apple's firmware signing key and the iOS 8 source code and hand them to some east coast security firm to develop the exploit instead; they'll have plenty of firms to choose from. (The iOS source isn't strictly necessary for this but makes development take a lot less time)

    Ultimately, Apple should have built a better system. The political argument has been done to death in the other thread, so let's talk technical details: the backdoor already exists, really. Or rather, the ability to develop the backdoor exists in the form of Apple's firmware signing private key. It's not good enough to develop a system where you have a backdoor but you won't give it to the government; you need to not have a backdoor in the first place. In this case it probably would have been enough to require the phone to be unlocked before a firmware update can be pushed! In the long term for their newer phones, they need to make sure they don't have some "hardware debugging convenience feature" in their secure processor.

    Or I guess they can go back to "lock your phone with a long password and the guy who steals it from the bar probably can't break into it maybe".
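    To make the "unlock before update" idea concrete, here's a toy sketch (all names are invented; this is not Apple's actual update path) of a loader that treats a valid vendor signature as necessary but not sufficient:

    ```python
    # Hypothetical firmware loader: a correctly signed image is still
    # rejected unless the user has unlocked the device first, so the
    # signing key alone can't be used against a locked phone.
    from dataclasses import dataclass

    TRUSTED_SIGNERS = {"vendor-fw-key"}  # stand-in for the real signing key

    @dataclass
    class Device:
        unlocked: bool
        firmware: bytes

    def verify_signature(image: bytes, signer: str) -> bool:
        # Placeholder for real cryptographic signature verification.
        return signer in TRUSTED_SIGNERS

    def apply_update(dev: Device, image: bytes, signer: str) -> bool:
        if not verify_signature(image, signer):
            return False        # unsigned/badly signed: always rejected
        if not dev.unlocked:
            return False        # signed but locked: also rejected
        dev.firmware = image
        return True

    phone = Device(unlocked=False, firmware=b"v1")
    assert not apply_update(phone, b"v2", "vendor-fw-key")  # locked: no
    phone.unlocked = True
    assert apply_update(phone, b"v2", "vendor-fw-key")      # unlocked: yes
    ```

    Under that rule, a court order to push new firmware to a locked phone gets you nothing, because the device itself refuses the update.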

    Daedalus on
  • Options
    Phoenix-DPhoenix-D Registered User regular
    This is specific to iOS 7 actually. I think 8 (and phones with the SE chip) did solve that.

  • Options
    DaedalusDaedalus Registered User regular
    Phoenix-D wrote: »
    This is specific to iOS 7 actually. I think 8 (and phones with the SE chip) did solve that.

    No, the phone in question is a 5C with iOS 8. The 5C doesn't have the SE chip so whether Apple can do a similar exploit on later phones is still an open question. (I'm kind of leaning towards "yes they can" the more I think about it, though)

  • Options
    FeralFeral MEMETICHARIZARD interior crocodile alligator ⇔ ǝɹʇɐǝɥʇ ǝᴉʌoɯ ʇǝloɹʌǝɥɔ ɐ ǝʌᴉɹp ᴉRegistered User regular
    Daedalus wrote: »
    Phoenix-D wrote: »
    This is specific to iOS 7 actually. I think 8 (and phones with the SE chip) did solve that.

    No, the phone in question is a 5C with iOS 8. The 5C doesn't have the SE chip so whether Apple can do a similar exploit on later phones is still an open question. (I'm kind of leaning towards "yes they can" the more I think about it, though)

    I've read conflicting things about whether the Secure Enclave chip is hard coded to 1) discard intermediate keys in the event of multiple failed login attempts and/or 2) incrementally increase the delay between multiple login attempts in rapid succession.

    every person who doesn't like an acquired taste always seems to think everyone who likes it is faking it. it should be an official fallacy.

    the "no true scotch man" fallacy.
  • Options
    Phoenix-DPhoenix-D Registered User regular
    edited February 2016
    Daedalus wrote: »
    Phoenix-D wrote: »
    This is specific to iOS 7 actually. I think 8 (and phones with the SE chip) did solve that.

    No, the phone in question is a 5C with iOS 8. The 5C doesn't have the SE chip so whether Apple can do a similar exploit on later phones is still an open question. (I'm kind of leaning towards "yes they can" the more I think about it, though)

    The last summary I read had about 5 paragraphs of "Apple asked if it had iOS 7". Do they have more than one going?

    Edit: yes yes they do.

    Phoenix-D on
  • Options
    jmcdonaldjmcdonald I voted, did you? DC(ish)Registered User regular
    Apparently an Apple exec is saying this would work on all models.

    So, now the real reason for their foot dragging comes out - their phones are not nearly as secure as they have intimated.

  • Options
    DaedalusDaedalus Registered User regular
    Feral wrote: »
    Daedalus wrote: »
    Phoenix-D wrote: »
    This is specific to iOS 7 actually. I think 8 (and phones with the SE chip) did solve that.

    No, the phone in question is a 5C with iOS 8. The 5C doesn't have the SE chip so whether Apple can do a similar exploit on later phones is still an open question. (I'm kind of leaning towards "yes they can" the more I think about it, though)

    I've read conflicting things about whether the Secure Enclave chip is hard coded to 1) discard intermediate keys in the event of multiple failed login attempts and/or 2) incrementally increase the delay between multiple login attempts in rapid succession.

    Secure Enclave doesn't discard anything, it just increases delays between guesses.

    I have two concerns. (Or rather, I would have two concerns if I owned an Apple phone and wanted it to be secure against Apple reading the data. I have two criticisms from a security engineering perspective.)

    1) In real life there's no such thing as "hard coded". There isn't, like, an actual hardware circuit enforcing a five-second delay between key derivation requests in that SE chip. It's a processor with a device-specific key running a microkernel. Apple can push software updates to it, and has. Why can't they push an update to it that reduces the delay to 0, just like they're doing to the 5C?

    2) The real worry isn't that Apple will be required to decrypt a dead terrorist's phone that's already in the FBI's possession; the worry is that they'll later be required by court order to push a similar update to some specific phone that doesn't belong to a dead person. If Apple can push a signed OS update to a specific phone, why can't they push one to your phone that just records your PIN as you type it in, and while we're at it also listens to whatever conversations you're having while it's in your pocket? What if I have a court order telling Apple to do that? They've already done the targeted-update part; they've shown that it's possible.
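    The "delays aren't hard coded" point is easy to see in miniature. A toy model (numbers invented, not Apple's actual schedule) of escalating guess delays enforced in software:

    ```python
    # Software-enforced guess throttling: the delay schedule is ordinary
    # mutable state in firmware, so a pushed update that replaces this
    # table with zeros removes the throttling entirely.
    DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600]  # seconds after N failures

    def delay_for_attempt(failed_attempts: int) -> int:
        # Clamp to the last (largest) delay once the table runs out.
        return DELAYS[min(failed_attempts, len(DELAYS) - 1)]

    assert delay_for_attempt(0) == 0        # first few guesses are free
    assert delay_for_attempt(4) == 60       # then delays kick in
    assert delay_for_attempt(100) == 3600   # capped at the maximum
    ```

    Nothing about the math of the crypto enforces `DELAYS`; it's policy code, and policy code can be updated by whoever holds the signing key.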

  • Options
    AiouaAioua Ora Occidens Ora OptimaRegistered User regular
    edited February 2016
    jmcdonald wrote: »
    Apparently an Apple exec is saying this would work on all models.

    So, now the real reason for their foot dragging comes out - their phones are not nearly as secure as they have intimated.

    which is kind of silly

    no shit their phones aren't so secure that Apple themselves can't hack them

    Apple would probably look better off if they were like "When we get a request for information off of a device, we demand to see a valid warrant, and we demand to be allowed to keep the device in-house so the compromising code never leaves our labs."

    e: (in case you aren't aware, this is already an option in the court order, but playing it off as their idea to protect the general security of the device would be nice optics)

    Aioua on
    life's a game that you're bound to lose / like using a hammer to pound in screws
    fuck up once and you break your thumb / if you're happy at all then you're god damn dumb
    that's right we're on a fucked up cruise / God is dead but at least we have booze
    bad things happen, no one knows why / the sun burns out and everyone dies
  • Options
    DaedalusDaedalus Registered User regular
    jmcdonald wrote: »
    Apparently an Apple exec is saying this would work on all models.

    So, now the real reason for their foot dragging comes out - their phones are not nearly as secure as they have intimated.

    Now that's bullshit; I can't get OS-level software to consistently work on two different pieces of hardware when I'm actually trying to. Unless they mean the general technique of "firmware update removes barriers to brute-forcing the PIN on the phone", in which case, yeah, that means "secure enclave" was a useless but expensive engineering distraction. Well, we've all been there, they can try harder next time.

  • Options
    DaedalusDaedalus Registered User regular
    edited February 2016
    Aioua wrote: »
    jmcdonald wrote: »
    Apparently an Apple exec is saying this would work on all models.

    So, now the real reason for their foot dragging comes out - their phones are not nearly as secure as they have intimated.

    which is kind of silly

    no shit their phones aren't so secure that Apple themselves can't hack them

    In the past they said "Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.", though this statement is no longer on their website.

    As a general principle, you should design your crypto system so that you yourself can't hack it. Taking the government out of the equation for a moment, let's change the scenario to "Tim Cook wants to decrypt his nephew's phone without his nephew's consent, so he hands it to his engineering team and says that they'll get a million-dollar bonus when it's done. Can they do it?" If the answer is "yes", it wasn't secure in the first place.

    Daedalus on
  • Options
    AiouaAioua Ora Occidens Ora OptimaRegistered User regular
    edited February 2016
    Daedalus wrote: »
    Aioua wrote: »
    jmcdonald wrote: »
    Apparently an Apple exec is saying this would work on all models.

    So, now the real reason for their foot dragging comes out - their phones are not nearly as secure as they have intimated.

    which is kind of silly

    no shit their phones aren't so secure that Apple themselves can't hack them

    In the past they said "Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.", though this statement is no longer on their website.

    As a general principle, you should design your crypto system so that you yourself can't hack it. Taking the government out of the equation for a moment, let's change the scenario to "Tim Cook wants to decrypt his nephew's phone without his nephew's consent, so he hands it to his engineering team and says that they'll get a million-dollar bonus when it's done. Can they do it?" If the answer is "yes", it wasn't secure in the first place.

    Oh, then in the past they were dirty dirty liars. I'm not even sure why people believed them. Unless the guess limiting is somehow built into the math of the crypto algorithm itself, brute forcing was always possible. And since the key is ultimately tied to a user-entered PIN, brute-forcing becomes trivial once the guess limiting is circumvented.
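    That last point is easy to demonstrate: with throttling gone, a 4-digit PIN space is only 10,000 keys. A rough sketch (the KDF parameters and salt are invented for the demo, not Apple's actual key derivation, which entangles a hardware UID):

    ```python
    # Why the PIN is the weak link: the PIN is the only secret input to
    # the key derivation, so the effective keyspace is just the PIN space.
    import hashlib

    SALT = b"device-unique-salt"  # stand-in for the hardware-entangled UID
    ITERATIONS = 200              # invented; kept tiny so the demo is fast

    def derive_key(pin: str) -> bytes:
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITERATIONS)

    def brute_force(target_key: bytes):
        # Exhaust all 10,000 four-digit PINs; without per-guess delays
        # this finishes in seconds even at realistic iteration counts.
        for n in range(10_000):
            pin = f"{n:04d}"
            if derive_key(pin) == target_key:
                return pin
        return None

    assert brute_force(derive_key("1234")) == "1234"
    ```

    Cranking up the KDF iteration count only scales the attack linearly; the exponential protection has to come from a longer secret or from guess limiting that can't be updated away.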

    Aioua on
    life's a game that you're bound to lose / like using a hammer to pound in screws
    fuck up once and you break your thumb / if you're happy at all then you're god damn dumb
    that's right we're on a fucked up cruise / God is dead but at least we have booze
    bad things happen, no one knows why / the sun burns out and everyone dies