There's no backdoor to this [Encryption] thread

Calica Registered User regular
Encryption has been one of the hot topics in politics lately, with Hillary Clinton, John McCain, and various Republican presidential candidates voicing support of adding a "back door" to encrypted systems, so that the government and law enforcement can access encrypted data. This is a bad idea for several reasons, the most important of which is that there's no such thing as a weakness that only the good guys can exploit. Tech companies are responding with the corporate equivalent of "WTF, No."

I don't actually believe that laws will be passed requiring back doors to be built into encryption systems. However, the fact that we're even talking about this highlights the growing gap in understanding between the people who build and run information systems, and the government that oversees them. The incompetence of politicians in general when it comes to tech is starting to cause breakdowns: see the de-facto fast lanes allowed by a loophole in the net neutrality regulations, or the UK government-mandated porn filters that also blocked sex education websites and resources for victims of rape and abuse, or Spain's failed mandatory link tax law.

The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.


Posts

  • AngelHedgie Registered User regular
    The problem is that there are two sides to this argument, though. I don't think the government should be forcing weaknesses into these systems that could be exploited by bad actors, but at the same time, there's the concern of inhibiting legitimate functions of the government, either accidentally or intentionally.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • shryke Member of the Beast Registered User regular
    Calica wrote: »
    Encryption has been one of the hot topics in politics lately, with Hillary Clinton, John McCain, and various Republican presidential candidates voicing support of adding a "back door" to encrypted systems, so that the government and law enforcement can access encrypted data. This is a bad idea for several reasons, the most important of which is that there's no such thing as a weakness that only the good guys can exploit. Tech companies are responding with the corporate equivalent of "WTF, No."

    I don't actually believe that laws will be passed requiring back doors to be built into encryption systems. However, the fact that we're even talking about this highlights the growing gap in understanding between the people who build and run information systems, and the government that oversees them. The incompetence of politicians in general when it comes to tech is starting to cause breakdowns: see the de-facto fast lanes allowed by a loophole in the net neutrality regulations, or the UK government-mandated porn filters that also blocked sex education websites and resources for victims of rape and abuse, or Spain's failed mandatory link tax law.

    The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.

    At the point where we can trust them to make that call for us.

    So never.

  • Quid Definitely not a banana Registered User regular
    Calica wrote: »
    The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.

    They have the right to do so if a law were unconstitutional or an action taken by them would be illegal. That's pretty much it.

  • AngelHedgie Registered User regular
    shryke wrote: »
    Calica wrote: »
    Encryption has been one of the hot topics in politics lately, with Hillary Clinton, John McCain, and various Republican presidential candidates voicing support of adding a "back door" to encrypted systems, so that the government and law enforcement can access encrypted data. This is a bad idea for several reasons, the most important of which is that there's no such thing as a weakness that only the good guys can exploit. Tech companies are responding with the corporate equivalent of "WTF, No."

    I don't actually believe that laws will be passed requiring back doors to be built into encryption systems. However, the fact that we're even talking about this highlights the growing gap in understanding between the people who build and run information systems, and the government that oversees them. The incompetence of politicians in general when it comes to tech is starting to cause breakdowns: see the de-facto fast lanes allowed by a loophole in the net neutrality regulations, or the UK government-mandated porn filters that also blocked sex education websites and resources for victims of rape and abuse, or Spain's failed mandatory link tax law.

    The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.

    At the point where we can trust them to make that call for us.

    So never.

    And that's the big thing - the tech industry doesn't have the best track record in this department. Again, at what point is it principled opposition, and at what point is it an attempt to evade legitimate regulation? This is an industry well known for not being fond of regulation of any kind, after all.

  • TL DR Not at all confident in his reflexive opinions of things Registered User regular
    edited December 2015
    Quid wrote: »
    Calica wrote: »
    The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.

    They have the right to do so if a law were unconstitutional or an action taken by them would be illegal. That's pretty much it.

    Yep. Email services have been shut down, for instance, for refusing to provide access to the government.

    There are perhaps technical solutions to some of these problems, however, such as structuring things in such a way that the host company does not have keys to private user data.
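
    The structure TL DR describes can be made concrete: if the client derives the encryption key from a user secret and only ever uploads ciphertext plus a salt, the host has nothing useful to hand over. A minimal stdlib-only sketch — the toy SHA-256 counter-mode cipher and all function names here are illustrative, not any real provider's scheme; a production client would use a vetted AEAD like AES-GCM or ChaCha20-Poly1305:

```python
# Sketch: client-side ("zero-knowledge") encryption. The service stores only
# the salt and ciphertext -- never the password or the derived key.
import hashlib, os

def derive_key(password: bytes, salt: bytes) -> bytes:
    # Slow KDF so the host (or a thief) can't cheaply brute-force passwords.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: SHA-256 in counter mode, XORed over the data.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

def client_encrypt(password: bytes, plaintext: bytes) -> dict:
    salt = os.urandom(16)
    key = derive_key(password, salt)
    # Only salt + ciphertext are uploaded; the key never leaves the client.
    return {"salt": salt, "ct": keystream_xor(key, plaintext)}

def client_decrypt(password: bytes, blob: dict) -> bytes:
    key = derive_key(password, blob["salt"])
    return keystream_xor(key, blob["ct"])
```

    A subpoena to the host yields only `salt` and `ct`; without the user's password there is nothing to decrypt with, which is exactly why this design drew government attention.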

  • zepherin Russian warship, go fuck yourself Registered User regular
    I feel like the government is about to make a terrible law, for no good reason. It's not like all of a sudden the terrorists will stop using encryption, like some kind of magic gotcha. The ones that use it will use encryption developed in other countries without back doors. The ones that don't use it are not really affected. There is no rational reason for shitty encryption.

  • TL DR Not at all confident in his reflexive opinions of things Registered User regular
    This is only about terrorism inasmuch as terrorism is a convenient excuse for expanded surveillance. There were no reports of using public wi-fi or encrypted communication in relation to the Paris attacks, for instance, but both of those things were in the sights of the French government's subsequent emergency legislation.

  • Calica Registered User regular
    AngelHedgie wrote: »
    The problem is that there are two sides to this argument, though. I don't think the government should be forcing weaknesses into these systems that could be exploited by bad actors, but at the same time, there's the concern of inhibiting legitimate functions of the government, either accidentally or intentionally.
    Quid wrote: »
    Calica wrote: »
    The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.

    They have the right to do so if a law were unconstitutional or an action taken by them would be illegal. That's pretty much it.

    The thing is, in the specific case of weakening encryption (though I'm sure we could come up with hypothetical others), I would find that unethical, even if the government were to require it. Complying with warrants from law enforcement for users' information is one thing; breaking encryption for everybody would be quite another.

    At my job, I have occasionally been asked to implement things in software that were either illegal or "merely" unethical (in that they would have created unacceptable security risks for our users). In most cases it was because the people asking for the changes didn't understand the implications. In every case, we all sat down, explained why whatever it was was a bad idea, and hashed out a better solution. That doesn't always work with governments, as shown by the examples above.

    I mean, really, you could apply the same question to any field. State governments imposing bad regulations on doctors have been terrible for women's health, for example. The best doctors either find ways around the laws (e.g., by giving government-mandated false information very quietly while playing music), or just ignore them outright. It seems to me the only difference is one of scale.

    (For the record, I'm not an anarchist or libertarian or anything like that. The "perfect" solution would be getting policymakers up to speed on everything they need to know.)

  • Trace GNU Terry Pratchett; GNU Gus; GNU Carrie Fisher; GNU Adam We Registered User regular
    edited December 2015
    as far as I'm concerned anything like this should require a warrant signed by a judge. By which I mean, if this backdoor does exist, there had better be some pretty hardline protection of civil liberties and privacy written into the laws.

  • Emissary42 Registered User regular
    edited December 2015
    From the Surveillance Thread:
    Emissary42 wrote: »
    Relevant article. A snippet:

    The argument against backdoors, however, has not changed since 1993. Back then, Whitfield Diffie—one of the creators of the Diffie-Hellman Protocol for secure key exchange—spoke to a congressional hearing about the "Clipper Chip," an encryption chip for digital voice communications announced by the Clinton administration. His main points remain relevant:
    • The backdoor would put providers in an awkward position with other governments and international customers, weakening its value.
    • Those who want to hide their conversations from the government for nefarious reasons can get around the backdoor easily.
    • The only people who would be easy to surveil would be people who didn't care about government surveillance in the first place.
    • There was no guarantee someone else might not exploit the backdoor for their own purposes.

    In my opinion, we're too reliant on truly secure computer systems to risk adding weaknesses like backdoors, and the genie is out of the bottle on encrypting things yourself (the second bullet point from my other post). Math is kind of impossible to censor.
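
    The "genie out of the bottle" point has a very short proof: the one-time pad, which is provably unbreakable when the key is truly random, as long as the message, and never reused, fits in a few lines of standard-library Python. Purely illustrative:

```python
# Sketch: a one-time pad. No backdoor mandate can stop a motivated party
# from writing this themselves -- the math is public and trivial.
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    key = secrets.token_bytes(len(plaintext))   # truly random, used once
    ct = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ct

def otp_decrypt(key: bytes, ct: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ct, key))
```

    The catch is key management — the key must be as long as the message and exchanged securely in advance — which is why real systems use ciphers like AES instead. But the point stands: the primitives are public knowledge.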

  • shryke Member of the Beast Registered User regular
    TL DR wrote: »
    This is only about terrorism inasmuch as terrorism is a convenient excuse for expanded surveillance. There were no reports of using public wi-fi or encrypted communication in relation to the Paris attacks, for instance, but both of those things were in the sights of the French government's subsequent emergency legislation.

    This assumes the government should only be reactive on this front, rather than preparing for the avenues a future situation might involve.

    Which isn't to say that there isn't also a degree of wanting to do it regardless, but there are completely legitimate reasons to expand capabilities beyond just what was used in the latest attack.

  • Athenor Battle Hardened Optimist The Skies of Hiigara Registered User regular
    edited December 2015
    What would this look like, though? A universal key? Registering your public gpg keys with a government keyring? Requiring the private keys to be turned over with the password in case of a warrant?

    I work for a state university. My life is awash in upholding FERPA, HIPAA, PCI, and all sorts of regulation. Just keeping communications in my department secure is a major headache, before even getting to the other IT shops on campus. And then there are the academics, who are fine with all information on the internet being public and damn what might be done with it; oh, and they have fought every attempt to enforce password security over the last 10 years.

    No. After watching PCI compliance regs morph, through misunderstanding, to the ridiculous point of requiring an entirely separate physical network for anything that may touch a credit card, I don't want the government requiring us to weaken our security, because they will undoubtedly think they are smarter than those of us who know how encryption works.

    He/Him | "A boat is always safest in the harbor, but that’s not why we build boats." | "If you run, you gain one. If you move forward, you gain two." - Suletta Mercury, G-Witch
  • TL DR Not at all confident in his reflexive opinions of things Registered User regular
    shryke wrote: »
    TL DR wrote: »
    This is only about terrorism inasmuch as terrorism is a convenient excuse for expanded surveillance. There were no reports of using public wi-fi or encrypted communication in relation to the Paris attacks, for instance, but both of those things were in the sights of the French government's subsequent emergency legislation.

    This assumes the government should only be reactive on this front, rather than preparing for the avenues a future situation might involve.

    Which isn't to say that there isn't also a degree of wanting to do it regardless, but there are completely legitimate reasons to expand capabilities beyond just what was used in the latest attack.

    That's a good point. I think there's an element of reactivity in a lot of our security that is ineffective - banning liquids on flights after an attempted attack using binary liquid explosives while still failing to catch most edged weapons, for instance. Preparing for the next threat is obviously a preferable solution.

    This just happens to be a case in which an authoritarian solution is being proposed by a government, which warrants careful scrutiny. The recent French issue is a good example because it's full of new "state of emergency" powers expanding the ability to conduct house arrests while limiting freedom of association.

    This isn't a nuanced technocratic response that includes encryption as part of an overall approach to disrupting known and predicted threat models, but rather one that includes measures which can be expected to be used to disrupt political dissidence as has been the case elsewhere.

  • Phoenix-D Registered User regular
    edited December 2015
    TL DR wrote: »
    Quid wrote: »
    Calica wrote: »
    The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.

    They have the right to do so if a law were unconstitutional or an action taken by them would be illegal. That's pretty much it.

    Yep. Email services have been shut down, for instance, for refusing to provide access to the government.

    There are perhaps technical solutions to some of these problems, however, such as structuring things in such a way that the host company does not have keys to private user data.

    This sort of thing - in the form of Apple's device encryption for iPhones - is exactly what started the most recent round of encryption bitching. So not really a solution.

    I have yet to see a proposal for an encryption backdoor that would actually work without fatally weakening security. And that's not a theoretical problem, either: FREAK was (indirectly) caused by attempting to block the export of high-grade encryption. Even years after the export rules became irrelevant, whoops, it's back to bite us in the ass.
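
    The FREAK failure mode is easy to demonstrate in miniature: once a mandated key size falls behind, factoring the modulus is routine. A toy sketch with a 64-bit modulus and textbook Pollard's rho — real "export-grade" keys were 512-bit, so the numbers here are just illustrative, but the principle is identical:

```python
# Sketch: deliberately weakened keys rot. A toy RSA-style modulus built
# from two 32-bit primes falls to Pollard's rho almost instantly; FREAK
# did the same thing to 1990s-mandated 512-bit export moduli at scale.
import math, random

def pollard_rho(n: int) -> int:
    # Returns a nontrivial factor of composite n (Floyd cycle variant).
    while True:
        x = y = random.randrange(2, n)
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n
            y = (y * y + c) % n
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:
            return d

# 64-bit toy "export grade" modulus from two well-known 32-bit primes.
p, q = 4294967291, 4294967311
n = p * q
f = pollard_rho(n)
print(sorted([f, n // f]))  # recovers p and q
```

    Modern 2048-bit moduli are far beyond any known factoring method — the vulnerability here is purely the mandated small size, which is the thread's whole objection to legislated weakness.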

  • Paladin Registered User regular
    This is the same government that lost the fingerprints of 5.6 million federal employees right? And we're going to give them the keys to all encryption software?

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed on seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five world series.
  • Orthanc Death Lite, Only 1 Calorie Off the end of the internet, just turn left. Registered User, ClubPA regular
    Phoenix-D wrote: »
    I have yet to see a proposal for an encryption backdoor that would actually work without fatally weakening security. And that's not a theoretical problem, either: FREAK was (indirectly) caused by attempting to block the export of high-grade encryption. Even years after the export rules became irrelevant, whoops, it's back to bite us in the ass.

    This is the big practical one for me. No matter what argument is made for or against back doors on security or moral grounds, it does not seem feasible to implement a backdoor that can only be used by "the good guys". FREAK is not the only example here: the recent Juniper backdoor (here for detail) looks to be baddies rekeying an existing backdoor rather than actually inserting one.

    The closest I've seen to a "Nobody But Us" backdoor is the WeakDH issue, though that one does not appear deliberate. Because of the computing power required it's limited to basically the US (maybe China and Russia) for now, but give it 5 years and a decent criminal syndicate will be able to exploit it. And of course vulnerable devices will still be around, because firmware is so rarely patched.
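
    A toy version of the WeakDH math: in a small enough group, baby-step/giant-step recovers a Diffie-Hellman secret exponent in roughly sqrt(p) work. The parameters below are throwaway illustrations; Logjam-scale attacks apply the analogous (much bigger) precomputation to the handful of shared 512-bit groups most servers used:

```python
# Sketch: discrete log by baby-step/giant-step. Security comes only from
# the group size -- shrink it (or share one group globally so the work can
# be precomputed once) and the secret exponent falls out.
import math

def dlog_bsgs(g: int, h: int, p: int) -> int:
    # Find x with g^x = h (mod p) in O(sqrt(p)) time and space.
    m = math.isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # table of g^j, j < m
    giant = pow(g, -m, p)                        # g^(-m) mod p (Python 3.8+)
    y = h
    for i in range(m):
        if y in baby:
            return i * m + baby[y]
        y = y * giant % p
    raise ValueError("no solution")

p, g = 1000003, 2            # toy prime; real DH groups are 2048+ bits
secret = 123456
assert dlog_bsgs(g, pow(g, secret, p), p) == secret  # exponent recovered
```

    Against a 2048-bit group this approach needs ~2^1024 steps, which is why "just make the group big and unique" is the fix — and why firmware frozen on small shared groups stays exposed.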

    Of course, if you have a keyed front door like is being asked for (e.g. encrypting the secret with a law enforcement key as well as that of the recipient), then you have the whole issue of who holds the law enforcement key. Does hardware sold in each country get a different key for that particular country? It's not going to help fight terrorism if the US still can't decrypt the message but the Syrians can.

    I like the analogy of the safe: while it might be more technically crackable than modern encryption, the normal approach is still to get a warrant, require me to surrender the key, and jail me for contempt of court if I don't.

    orthanc
  • The Ender Registered User regular
    Calica wrote: »
    Encryption has been one of the hot topics in politics lately, with Hillary Clinton, John McCain, and various Republican presidential candidates voicing support of adding a "back door" to encrypted systems, so that the government and law enforcement can access encrypted data. This is a bad idea for several reasons, the most important of which is that there's no such thing as a weakness that only the good guys can exploit. Tech companies are responding with the corporate equivalent of "WTF, No."

    I don't actually believe that laws will be passed requiring back doors to be built into encryption systems. However, the fact that we're even talking about this highlights the growing gap in understanding between the people who build and run information systems, and the government that oversees them. The incompetence of politicians in general when it comes to tech is starting to cause breakdowns: see the de-facto fast lanes allowed by a loophole in the net neutrality regulations, or the UK government-mandated porn filters that also blocked sex education websites and resources for victims of rape and abuse, or Spain's failed mandatory link tax law.

    The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.

    ...It seems like offering some sort of carte blanche privileges / expectations to tech companies doesn't really address the issue (that there are currently too many people in western governments who desperately want to believe it's still the 60s and refuse to acknowledge changes in culture, technology, the environment, etc), and I'm skeptical that this would offer much protection to the people that a hypothetical backdoor mandate would render vulnerable to law enforcement (I think it would mostly be minority kids who have cell phones?).


    I'm not sure what a good solution might be. The most direct approach would be just to vote out people with those regressive & aggressively ignorant attitudes, but somehow this rarely appears to pan out. :|

    With Love and Courage
  • Orthanc Death Lite, Only 1 Calorie Off the end of the internet, just turn left. Registered User, ClubPA regular
    The Ender wrote: »
    I'm not sure what a good solution might be. The most direct approach would be just to vote out people with those regressive & aggressively ignorant attitudes, but somehow this rarely appears to pan out. :|

    Whether this works or not depends on the scale you look at, I believe. In terms of US politics, it does seem that it's actually impossible to get elected if you have an understanding of the current environment.

    However, on a wider scale, I'm genuinely hopeful that countries that enact such policies will start to fall behind as tech and innovation move to more sensible locales, which in turn will gradually reverse the bias that seems to be present in the US currently.

    Of course the biggest risks to that are the forced export of US policy, as is happening with copyright through the TPP and similar (though that's probably another thread), and consumer apathy. It does seem that the biggest revelation out of Snowden was that everyone was already assuming the NSA was surveilling everyone, and that somehow that was OK.

  • The Ender Registered User regular
    edited December 2015
    Snowden also revealed that if you're relying on Big Tech to be your light in the darkness, then you've already lost.

    "We want you to build backdoors into all of your operating systems so that we can always spy on what black teenagers we've charged with petty crimes have been texting about protect the public from TERRORISM."

    "...Hm. Well, that is highly unethical. We'll do it only if we can reserve the right to use that same backdoor to harvest advertiser-relevant data from our users and use them for draconian DRM shenanigans."

    "Deal."


  • Polaritie Sleepy Registered User regular
    edited December 2015
    Emissary42 wrote: »
    From the Surveillance Thread:
    Emissary42 wrote: »
    Relevant article. A snippet:

    The argument against backdoors, however, has not changed since 1993. Back then, Whitfield Diffie—one of the creators of the Diffie-Hellman Protocol for secure key exchange—spoke to a congressional hearing about the "Clipper Chip," an encryption chip for digital voice communications announced by the Clinton administration. His main points remain relevant:
    • The backdoor would put providers in an awkward position with other governments and international customers, weakening its value.
    • Those who want to hide their conversations from the government for nefarious reasons can get around the backdoor easily.
    • The only people who would be easy to surveil would be people who didn't care about government surveillance in the first place.
    • There was no guarantee someone else might not exploit the backdoor for their own purposes.

    In my opinion, we're too reliant on truly secure computer systems to risk adding weaknesses like backdoors, and the genie is out of the bottle on encrypting things yourself (the second bullet point from my other post). Math is kind of impossible to censor.

    Ultimately this is the key thing - the only thing government action can actually do is weaken the security of normal activities. It can't actually make it easier to get at criminals/terrorists/etc., because nigh-unbreakable encryption schemes have been public for decades, and any bad actor with a modicum of sense would just grab source code circa today and bam.

    Furthermore, since crypto algorithms are more or less public by default (because otherwise they can't be audited, and a secret algorithm is never going to be trusted by any expert), any bad actor with access to a programmer would just remove the backdoor.

    Actually, I'm not convinced that you couldn't reverse engineer any master key to an algorithm from source, and it's not practical to make a unique duplicate key for every encryption that happens and save it with the government somewhere (actually, impossible: unplug from the internet, encrypt locally, send the file).

    I am of the firm opinion that the current agency heads are either lying through their teeth to the public (and should be fired for it) or incompetent (and don't know it, and so should be fired for it). There is no excuse for them trying to sell the public on handing all our PII to China et al. on a silver platter.

    Steam: Polaritie
    3DS: 0473-8507-2652
    Switch: SW-5185-4991-5118
    PSN: AbEntropy
  • override367 ALL minions Registered User regular
    edited December 2015
    shryke wrote: »
    Calica wrote: »
    Encryption has been one of the hot topics in politics lately, with Hillary Clinton, John McCain, and various Republican presidential candidates voicing support of adding a "back door" to encrypted systems, so that the government and law enforcement can access encrypted data. This is a bad idea for several reasons, the most important of which is that there's no such thing as a weakness that only the good guys can exploit. Tech companies are responding with the corporate equivalent of "WTF, No."

    I don't actually believe that laws will be passed requiring back doors to be built into encryption systems. However, the fact that we're even talking about this highlights the growing gap in understanding between the people who build and run information systems, and the government that oversees them. The incompetence of politicians in general when it comes to tech is starting to cause breakdowns: see the de-facto fast lanes allowed by a loophole in the net neutrality regulations, or the UK government-mandated porn filters that also blocked sex education websites and resources for victims of rape and abuse, or Spain's failed mandatory link tax law.

    The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.

    At the point where we can trust them to make that call for us.

    So never.

    AngelHedgie wrote: »
    And that's the big thing - the tech industry doesn't have the best track record in this department. Again, at what point is it principled opposition, and at what point is it an attempt to evade legitimate regulation? This is an industry well known for not being fond of regulation of any kind, after all.

    well I mean, what's the government going to do if google and apple refuse to comply? shut them down?

    there'd be riots

    I agree these guys need to be regulated, but what we're talking about here is the government mandating that everyone keep a key within 10 feet of all locked doors, it's nonsensically horrible

  • Madican No face Registered User regular
    shryke wrote: »
    Calica wrote: »
    Encryption has been one of the hot topics in politics lately, with Hillary Clinton, John McCain, and various Republican presidential candidates voicing support of adding a "back door" to encrypted systems, so that the government and law enforcement can access encrypted data. This is a bad idea for several reasons, the most important of which is that there's no such thing as a weakness that only the good guys can exploit. Tech companies are responding with the corporate equivalent of "WTF, No."

    I don't actually believe that laws will be passed requiring back doors to be built into encryption systems. However, the fact that we're even talking about this highlights the growing gap in understanding between the people who build and run information systems, and the government that oversees them. The incompetence of politicians in general when it comes to tech is starting to cause breakdowns: see the de-facto fast lanes allowed by a loophole in the net neutrality regulations, or the UK government-mandated porn filters that also blocked sex education websites and resources for victims of rape and abuse, or Spain's failed mandatory link tax law.

    The question for discussion is this: At what point do tech companies have the right - or even the responsibility - to refuse to follow laws intended to regulate them that are nonsensical or even harmful? How should the potential consequences of such disobedience (major perceived loss of government authority, for one thing) be weighed against the consequences of implementing a bad law? Discuss.

    At the point where we can trust them to make that call for us.

    So never.

    AngelHedgie wrote: »
    And that's the big thing - the tech industry doesn't have the best track record in this department. Again, at what point is it principled opposition, and at what point is it an attempt to evade legitimate regulation? This is an industry well known for not being fond of regulation of any kind, after all.

    well I mean, what's the government going to do if google and apple refuse to comply? shut them down?

    there'd be riots

I agree these guys need to be regulated, but what we're talking about here is the government mandating that everyone keep a key within 10 feet of all locked doors; it's nonsensically horrible

    Take them over or hamstring them.

  • Options
    OrthancOrthanc Death Lite, Only 1 Calorie Off the end of the internet, just turn left.Registered User, ClubPA regular
    Polaritie wrote: »
    Actually, I'm not convinced that you couldn't reverse engineer any master key to an algorithm from source, and it's not practical to make a unique dupe key for every encryption that happens and save it with the government somewhere (actually, impossible, because unplug from internet, encrypt locally, send file).

    Bit of a tangent since I agree with your overall point, but it is absolutely possible to have a master key that's hard to reverse engineer from source, at least if you discount brute forcing the key.

If you're not trying to hide the back door it's easy: all you need to do is include the law enforcement public key, then encrypt everything with that as well as with the destination public key. Law enforcement has the private key, so it can decrypt the message just like the intended recipient can, but nobody else can, as they don't have the key. The fact that the public key is embedded in the source doesn't help with reverse engineering it, as it was public anyway. Ultimately the private key could be brute forced, but it's basically just a matter of making it big enough to resist brute force.

If you're trying to be subtle about it things get a bit harder, but it's still possible. The key to the theorised Dual EC DRBG backdoor can't easily be reverse engineered, as it involves solving the same discrete log problem that EC crypto is based on in the first place. It's been shown that if you generate the constants a particular way you can backdoor the algorithm; there's no explanation of how the NIST constants were generated, and it matches some of the projects hinted at in the Snowden docs. So it's probably backdoored, but no researcher has found the key and proved it yet.

    The big problem is not that the keys can be reverse engineered from source, it's that whoever the backdoor is targeted at has to have the key for the back door to be useful. History has shown that securing such a key is very problematic and it will leak eventually. Then you would need to have some way of updating all copies of the algorithm to use a new key, which is never actually going to happen.

As we know, software tends to live longer than expected, so any assumption about how big the key needs to be to resist brute forcing, or how long a process needs to live to allow updating in the event of disclosure of the master key, is bound to be wrong.
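The non-subtle scheme described above can be sketched as a toy (stdlib Python only; the hypothetical XOR/SHA-256 "keystream" stands in for real public-key crypto like RSA or ECIES, purely to show the envelope structure of wrapping one session key for two recipients):

```python
# Toy sketch of hybrid encryption with an extra, escrowed recipient:
# the random session key is wrapped once for the intended recipient
# AND once for a law-enforcement "escrow" key. NOT real cryptography.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Encrypt/decrypt by XOR with a SHA-256-derived keystream (toy)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def seal(message: bytes, recipient_key: bytes, escrow_key: bytes) -> dict:
    session_key = secrets.token_bytes(32)
    return {
        "ciphertext": keystream_xor(session_key, message),
        # The same session key, wrapped twice:
        "wrapped_for_recipient": keystream_xor(recipient_key, session_key),
        "wrapped_for_escrow": keystream_xor(escrow_key, session_key),
    }

def open_with(envelope: dict, key: bytes, wrap_field: str) -> bytes:
    session_key = keystream_xor(key, envelope[wrap_field])
    return keystream_xor(session_key, envelope["ciphertext"])

recipient_key = secrets.token_bytes(32)  # stands in for the recipient's keypair
escrow_key = secrets.token_bytes(32)     # stands in for the government's keypair
env = seal(b"attack at dawn", recipient_key, escrow_key)
assert open_with(env, recipient_key, "wrapped_for_recipient") == b"attack at dawn"
assert open_with(env, escrow_key, "wrapped_for_escrow") == b"attack at dawn"
```

Note that the escrow key never decrypts the recipient's wrap or vice versa; both parties independently recover the same session key, which is the whole point of the design.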

    orthanc
  • Options
    PolaritiePolaritie Sleepy Registered User regular
    Orthanc wrote: »
    Polaritie wrote: »
Actually, I'm not convinced that you couldn't reverse engineer any master key to an algorithm from source […]

Bit of a tangent since I agree with your overall point, but it is absolutely possible to have a master key that's hard to reverse engineer from source, at least if you discount brute forcing the key. […]

    Oh dammit. I blame typing on an empty stomach for that.

    Steam: Polaritie
    3DS: 0473-8507-2652
    Switch: SW-5185-4991-5118
    PSN: AbEntropy
  • Options
    redxredx I(x)=2(x)+1 whole numbersRegistered User regular
    Orthanc wrote: »
    Polaritie wrote: »
    Actually, I'm not convinced that you couldn't reverse engineer any master key to an algorithm from source, and it's not practical to make a unique dupe key for every encryption that happens and save it with the government somewhere (actually, impossible, because unplug from internet, encrypt locally, send file).

    Bit of a tangent since I agree with your overall point, but it is absolutely possible to have a master key that's hard to reverse engineer from source, at least if you discount brute forcing the key.

    If you're not trying to hide the back door it's easy, all you need to do is include the law enforcement public key, then encrypt everything with that as well as the destination public key. Law enforcement has the private key so can decrypt the message, just like the intended recipient can, but nobody else can as they don't have the key. The fact the public key is embedded in the source doesn't help reverse engineering the key as it was public anyway. Ultimately the private key could be brute forced, but it's basically just a matter of making it big enough to resist brute force.

[…]

Duuuhh... Does that first description (basically standard PKI, with an extra key pair) generate two ciphertexts, or does the one ciphertext decrypt via either key? The latter is not my understanding of the normal function of asymmetric encryption.

    They moistly come out at night, moistly.
  • Options
    PolaritiePolaritie Sleepy Registered User regular
    redx wrote: »
    Orthanc wrote: »
    Polaritie wrote: »
[…]

Duuuhh... Does that first description (basically standard PKI, with an extra key pair) generate two ciphertexts, or does the one ciphertext decrypt via either key? The latter is not my understanding of the normal function of asymmetric encryption.

    It's basically encrypt twice, send the one encrypted using government public key to them. Defeated by unplugging the ethernet cord.

    Steam: Polaritie
    3DS: 0473-8507-2652
    Switch: SW-5185-4991-5118
    PSN: AbEntropy
  • Options
    redxredx I(x)=2(x)+1 whole numbersRegistered User regular
    Ewwww.... That's terrible for so many reasons. The government would have some box with an IP address, with all encrypted communication going to it?

    That is, like, 32 different flavors of pants on the head retarded.

    They moistly come out at night, moistly.
  • Options
    OrthancOrthanc Death Lite, Only 1 Calorie Off the end of the internet, just turn left.Registered User, ClubPA regular
    Polaritie wrote: »
    redx wrote: »
    Orthanc wrote: »
    Polaritie wrote: »
[…]

Duuuhh... Does that first description (basically standard PKI, with an extra key pair) generate two ciphertexts, or does the one ciphertext decrypt via either key? The latter is not my understanding of the normal function of asymmetric encryption.

    It's basically encrypt twice, send the one encrypted using government public key to them. Defeated by unplugging the ethernet cord.

There have been a few variants. With many PKI usages, the only thing actually encrypted using the asymmetric algorithm is a session key; the rest of the encryption is symmetric. So in those cases it's just the session key that's re-encrypted for the government.

And no, they don't send the government copy to the government; it's just sent along with the copy to the recipient, and they rely on existing interception mechanisms to get the ciphertext. The point of the back door is just to let them decrypt the ciphertext once they've got it, not to obtain it.

There are also good ways of stopping people from just stripping out the government-encrypted copy: specifically, you require that the decryption algorithm checks for it and validates it using a hash or similar. That way, if you remove the government copy the message is no longer considered valid, and so the intended recipient can't read it.
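That validity check can be sketched in a few lines (a toy sketch, not any real protocol; the `decrypt_fn` parameter and field names are hypothetical):

```python
# Toy sketch of a compliant client that binds the government-readable
# copy to the ciphertext with a hash, and refuses to decrypt if that
# copy has been stripped or tampered with.
import hashlib

def make_message(ciphertext: bytes, escrow_blob: bytes) -> dict:
    tag = hashlib.sha256(escrow_blob + ciphertext).hexdigest()
    return {"ciphertext": ciphertext, "escrow_blob": escrow_blob, "tag": tag}

def compliant_decrypt(msg: dict, decrypt_fn) -> bytes:
    expected = hashlib.sha256(msg["escrow_blob"] + msg["ciphertext"]).hexdigest()
    if msg["tag"] != expected:
        raise ValueError("escrow copy missing or tampered; refusing to decrypt")
    return decrypt_fn(msg["ciphertext"])
```

Stripping `escrow_blob` (or swapping in garbage) breaks the tag, so a compliant client won't read the message. As the rest of the post notes, this only constrains people running unmodified clients; a hacked client simply skips the check.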

Of course, as was pointed out earlier in the thread, any competent adversary would just get a programmer to remove both bits, allowing secure encrypted communication. But in doing so, they are no longer able to communicate with standard clients. You protect against this with laws etc., basically requiring companies like Apple to ship only the version of the algorithm that both adds and checks for the govt copy, and requiring it to live in tamper-resistant hardware. This doesn't stop truly bad actors, as they just use hacked clients, but it does prevent regular users from circumventing it, and so does allow for routine mass surveillance. As you said earlier, all the government can do is weaken encryption for regular users, not really prevent bad actors.

    What I'm describing is basically the Clipper chip from the last round.

In terms of the relevance of Dual EC DRBG, this blog explains it much better than I would. But basically, the random number generator is backdoored, meaning an attacker can predict what random numbers will come out. This often renders the whole PKI crypto layer irrelevant, as you can just guess the session key or similar.
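The shape of that attack can be shown with a deliberately simplified toy (the real Dual EC DRBG hides the state behind an elliptic-curve discrete log rather than this hypothetical XOR leak, but the consequence is the same: one observed output plus the trapdoor predicts everything after it):

```python
# Toy illustration of a Dual-EC-style backdoor: the generator's first
# output leaks its internal state to whoever holds a trapdoor key,
# after which every future output is predictable.
import hashlib
import secrets

TRAPDOOR = secrets.token_bytes(32)  # known only to whoever planted the backdoor

class BackdooredPRNG:
    def __init__(self):
        self.state = secrets.token_bytes(32)
        self.first = True

    def next_block(self) -> bytes:
        if self.first:
            self.first = False
            # First output "accidentally" encrypts the state to the trapdoor holder.
            return bytes(a ^ b for a, b in zip(self.state, TRAPDOOR))
        self.state = hashlib.sha256(self.state).digest()
        return self.state

rng = BackdooredPRNG()
leak = rng.next_block()  # e.g. a nonce observed on the wire
# Attacker with TRAPDOOR recovers the state and predicts the next output:
recovered = bytes(a ^ b for a, b in zip(leak, TRAPDOOR))
assert hashlib.sha256(recovered).digest() == rng.next_block()
```

If those "random" blocks are then used as session keys, the trapdoor holder can decrypt the traffic without touching the cipher itself.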

    Another blog by Matthew Green gives an excellent summary of the different ways of building back doors.

    Note: Still saying that this is a terrible idea, just explaining how it's done.

    orthanc
  • Options
    LD50LD50 Registered User regular
    edited December 2015
Even with key escrow à la the Clipper chip, there are ways to subvert it such that the government key can't decrypt the message like it's supposed to, and without a client knowing the message has been tampered with. Especially in software. The Clipper chip was supposed to be undefeatable because it was done in hardware, where a user couldn't tamper with it, but even then people found ways around it.

    LD50 on
  • Options
    PantsBPantsB Fake Thomas Jefferson Registered User regular
    Calica wrote: »
    Encryption has been one of the hot topics in politics lately, with Hillary Clinton ... voicing support of adding a "back door" to encrypted systems, so that the government and law enforcement can access encrypted data
It should be noted that Hillary Clinton has explicitly not supported a backdoor, as the link says. Rather, she hopes for a technological solution, even if it requires a large-scale, "Manhattan Project-like" research effort.

The inference being (IMO) quantum computing. Such hardware would be prohibitively expensive for all but large institutions for the medium term at least, and sufficiently developed quantum computing would render most encryption relatively trivial to decrypt. If possible (and as I understand it this is now an engineering challenge rather than a question of possibility), this would be the middle ground: governments (especially if sufficiently large devices were explicitly regulated) could legitimately access messages while encryption retained its general utility.
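One caveat worth attaching to this: the quantum speedup depends on the algorithm being attacked. Shor's algorithm breaks RSA/EC-style public-key crypto outright, but for symmetric ciphers Grover's algorithm only square-roots the search space, so doubling the key size restores the margin. A back-of-envelope model:

```python
# Rough work-factor model: classical brute force of an n-bit key costs
# ~2**n trials; Grover's quantum search costs ~2**(n/2). Shor's algorithm
# (not modeled here) breaks public-key schemes in polynomial time.
def brute_force_ops(key_bits: int, quantum: bool = False) -> int:
    return 2 ** (key_bits // 2) if quantum else 2 ** key_bits

assert brute_force_ops(128) == 2 ** 128                 # classical: infeasible
assert brute_force_ops(128, quantum=True) == 2 ** 64    # Grover: uncomfortable
assert brute_force_ops(256, quantum=True) == 2 ** 128   # why AES-256 survives
```

So "most encryption becomes trivial" really means "today's public-key layer becomes trivial"; symmetric encryption with bigger keys would likely stay out of reach.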

    QEDMF xbl: PantsB G+
  • Options
    PantsBPantsB Fake Thomas Jefferson Registered User regular
    edited December 2015
    The real issue is this conflict
    1. There's a legitimate and important need for governments (such as the US) to conduct electronic surveillance and intercept electronic communications
      • To protect national interests and security.
      • To prevent and provide evidence against violations of criminal law
    2. Encryption makes it very difficult for these needs to be met (if those communicating encrypt their data), and decrypting most encryption algorithms is not feasible with conventional means
3. Even if larger software vendors provided a means by which their encryption could be circumvented, there are common algorithms that can be implemented with very little development time and that make it nearly impossible to decrypt a message (RSA, for instance, can be implemented in an afternoon).

There's nothing theoretically impossible about decrypting these messages; we just don't have a method yet. By definition, any message can be decrypted given a sufficient number of attempts. If we can use quantum computing to (essentially) skip to the right answer, there may be a way for encryption to be secure against nearly everyone but not absolutely everyone.
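The "sufficient number of attempts" point can be made concrete with a toy cipher whose key is deliberately tiny (the cipher here is a hypothetical XOR pad, not anything real): a 16-bit key falls to at most 65,536 tries, and each extra bit doubles the work, which is exactly why 128-bit keys are beyond exhaustive search.

```python
# Brute-forcing a toy cipher with a 16-bit key: trivially fast here,
# but the same loop over a 128-bit keyspace would take ~2**112 times longer.
import hashlib

def toy_encrypt(key: int, plaintext: bytes) -> bytes:
    """XOR the plaintext with a pad derived from the 16-bit key (toy)."""
    pad = hashlib.sha256(key.to_bytes(2, "big")).digest()
    return bytes(a ^ b for a, b in zip(plaintext, pad))

secret_key = 0xBEEF
ciphertext = toy_encrypt(secret_key, b"hello")

# Exhaustive search over the whole 16-bit keyspace:
found = next(k for k in range(2 ** 16)
             if toy_encrypt(k, ciphertext) == b"hello")
assert toy_encrypt(found, ciphertext) == b"hello"
```

(XOR is its own inverse, so "encrypting" the ciphertext with the right key recovers the plaintext.)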

    PantsB on
    QEDMF xbl: PantsB G+
  • Options
    redxredx I(x)=2(x)+1 whole numbersRegistered User regular
I don't really have a problem with the government creating new technology that can break encryption. It's slightly scary because 1) I don't trust the government and 2) other governments and potentially bad actors will achieve the same goals.

Actual backdoors, particularly anything with essentially a single key, are just pants-on-the-head crazy to me. I do information security, and over the last two years there have been a huge number of vulnerabilities tied to flawed, weak, or backwards-compatible-with-weak implementations of transport encryption. To intentionally create flaws and hope that no one finds them, and that companies and end users can respond quickly, is ridiculous (even more so with mobile).

Laws are laws, and companies operating in (providing services to users in) a country need to follow them, but it is irresponsible to the point of being unethical to expose users and a company to this sort of known risk. I can't help but think such a law would encourage companies to move offshore as much as possible. It's hard to overstate how much damage would result from hackers being able to break into basically any communication on the fly, which is essentially what we're talking about.

    They moistly come out at night, moistly.
  • Options
    milskimilski Poyo! Registered User regular
Clinton's Manhattan Project statement shows either that she does not understand encryption particularly well, or that she understands but thinks the details would be too wonkish for her base in the space of a debate question. It does show she has advisors saying that a government backdoor is not a good idea.

If the Clinton administration wants to invent quantum computers or prove P = NP, well, those have the same problem as encryption: math is neutral. It'd be a paradigm shift, but one I'd rather the US be ahead on. Though those are huge assumptions about what is actually possible in the nearish term.

    I ate an engineer
  • Options
    TL DRTL DR Not at all confident in his reflexive opinions of thingsRegistered User regular
    redx wrote: »
    I don't really have a problem with the government creating new technology that can break encryption. It's slightly scary due to 1) I don't trust the government and 2) other governments and potentially bad actors will achieve the same goals.

[…]

This seems reasonable: a technological arms race in which exploits and other mechanisms are used, revealed, and/or expended by governments and other actors in accordance with their priorities (which we hope, but can't assume, are good), while their opponents (criminals, terrorists, political dissidents, corporate entities protecting trade secrets) invest in a level of protection that is 'good enough'. It is problematic that a given actor could intercept encrypted data and just sit on it until a flaw is discovered or sufficient computing power is attained to crack it, but that at least affords the protection of timeliness.

    Back doors, as you said, are not a good solution even if your perspective is that privacy rights are anachronistic.

It would seem that the solution to government overreach in this regard is to educate the public both on the importance of privacy and encryption and on the mechanisms by which they can make use of it. Surely we aren't going to get Craig in Accounting to use PGP when he can't even be made not to click phishing links in his email, but it may be possible to shift the tone of the conversation.

  • Options
    LD50LD50 Registered User regular
    PantsB wrote: »
    The real issue is this conflict
    1. There's a legitimate and important need for governments (such as the US) to conduct electronic surveillance and intercept electronic communications
      • To protect national interests and security.
      • To prevent and provide evidence against violations of criminal law
[…]

    I have some contentions with number 1. The government doesn't need, nor is allowed, to open and read my mail. They're not allowed to break into my house and drill open my safe. They've never been able to just do these kinds of things with it warrants, and they've never had trouble staying in control of national security.

  • Options
    The EnderThe Ender Registered User regular
    PantsB wrote: »
    Calica wrote: »
    Encryption has been one of the hot topics in politics lately, with Hillary Clinton ... voicing support of adding a "back door" to encrypted systems, so that the government and law enforcement can access encrypted data
    It should be noted that Hillary Clinton has explicitly not supported a backdoor as the link says. Rather she hopes for technological solution, even if it requires a large scale "Manhattan Project-like" research project to do so.

    The inference being (IMO) quantum computing. Such hardware would be prohibitively expensive for all but large institutions for the medium term at least. And sufficiently developed quantum computing would render most encryption relatively trivial to decrypt. If possible, and it appears to be an engineering issue/challenge rather than a question of the possibility at this point (as I understand it), this would be the middle ground of allowing governments (especially if sufficiently large devices were regulated explicitly) to have the ability to legitimately access messages while retaining the general utility of encryption.

The invention of quantum computing (it's perhaps worth noting that quantum computing is strictly theoretical at this point in time, and some experts in computer engineering have begun to question whether it's even possible) isn't the same as the creation of atomic weapons, though, because atomic weapons are necessarily limited to whoever can access both fissile material and a means to assemble that fissile material into a bomb (though, even granted these restrictions, proliferation admittedly became a problem rather quickly).

Quantum computers would no doubt become mass-market items a few years out from their creation, much like conventional computers. So, after a few years of tender & loving warrantless state surveillance, you're either back to square one, because people can now make encryption complex enough to defeat quantum processing power, or encryption is a thing of the past and any 4chan user can break into anything they like. So, which is worse for U.S. national security: the government being unable to decrypt some messages that (ostensibly) would let police / military forces intervene against TERRORISM!, or 4chan being able to break into / edit / exploit whatever they want, however they want, whenever they want?

    With Love and Courage
  • Options
    milskimilski Poyo! Registered User regular
I think that's a false comparison, because if quantum computing is possible it's going to happen whether the US invests in it or not, and it would be better for security if we are on the cutting edge rather than behind others.

    I ate an engineer
  • Options
    Phoenix-DPhoenix-D Registered User regular
    LD50 wrote: »
    PantsB wrote: »
    The real issue is this conflict
    1. There's a legitimate and important need for governments (such as the US) to conduct electronic surveillance and intercept electronic communications
      • To protect national interests and security.
      • To prevent and provide evidence against violations of criminal law
[…]

    I have some contentions with number 1. The government doesn't need, nor is allowed, to open and read my mail. They're not allowed to break into my house and drill open my safe. They've never been able to just do these kinds of things with it warrants, and they've never had trouble staying in control of national security.

I'm presuming that was supposed to read "without warrants"? Because warrants can authorize a lot of things, and those are definitely both on the list.

  • Options
    AngelHedgieAngelHedgie Registered User regular
    LD50 wrote: »
    PantsB wrote: »
    The real issue is this conflict
    1. There's a legitimate and important need for governments (such as the US) to conduct electronic surveillance and intercept electronic communications
      • To protect national interests and security.
      • To prevent and provide evidence against violations of criminal law
[…]

    I have some contentions with number 1. The government doesn't need, nor is allowed, to open and read my mail. They're not allowed to break into my house and drill open my safe. They've never been able to just do these kinds of things with it warrants, and they've never had trouble staying in control of national security.

    But that's the whole point - the government can do all those things, provided they demonstrate a legitimate government interest to a court and acquire a warrant. Part of the issue here is that there are people who want to make even that functionally impossible.

    Going back a while, in the TorrentSpy case there was a lot of criticism of the government for penalizing the site over its interference with the discovery process, even though discovery rules are considered a legitimate use of government power.
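Point 3 in the quoted list claims RSA can be implemented in an afternoon. A minimal textbook sketch bears that out; the toy primes below are chosen only for illustration, and real use requires huge primes, padding (e.g. OAEP), and a vetted library:

```python
# Textbook RSA with toy parameters -- illustrates how small the core
# algorithm is. Insecure by design: real keys use ~1024-bit primes and
# padded messages; never deploy unpadded RSA.

p, q = 61, 53              # toy primes
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # Euler's totient: 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)    # c = m^e mod n

def decrypt(c: int) -> int:
    return pow(c, d, n)    # m = c^d mod n

message = 42
ciphertext = encrypt(message)
print(decrypt(ciphertext) == message)  # True
```

That brevity is exactly why mandated backdoors in vendor products can't reach determined adversaries: the math fits on a napkin.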

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
    Polaritie Sleepy Registered User regular
    edited December 2015
    LD50 wrote: »
    PantsB wrote: »
    The real issue is this conflict
    1. There's a legitimate and important need for governments (such as the US) to conduct electronic surveillance and intercept electronic communications
      • To protect national interests and security.
      • To prevent and provide evidence against violations of criminal law
    2. Encryption makes it very difficult for these needs to be met (if those communicating encrypt their data), and decrypting most encryption algorithms is not feasible with conventional means
    3. Even if larger software vendors provided a means by which their encryption could be circumvented, there are common algorithms that can be implemented with very little development time (hours) that make it nearly impossible to decrypt a message (such as RSA, you can implement that in an afternoon).

    There's nothing theoretically impossible about decrypting these messages; we just don't have a method yet. In principle, any key can be recovered given enough guesses. If quantum computing lets us (essentially) skip to the right answer, encryption may end up secure against nearly everyone, but not absolutely everyone.

    I have some contentions with number 1. The government doesn't need, nor is allowed, to open and read my mail. They're not allowed to break into my house and drill open my safe. They've never been able to just do these kinds of things with it warrants, and they've never had trouble staying in control of national security.

    But that's the whole point - the government can do all those things, provided they demonstrate a legitimate government interest to a court and acquire a warrant. Part of the issue here is that there are people who want to make even that functionally impossible.

    Going back a while, in the TorrentSpy case there was a lot of criticism of the government for penalizing the site over its interference with the discovery process, even though discovery rules are considered a legitimate use of government power.

    It's been possible to communicate in ways beyond the reach of warrants forever, though. Cryptography isn't new; it's certainly easier now, but it's not new. In-person meetings, documents destroyed after delivery... all of this is centuries old.
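The point that cryptography predates computers is easy to demonstrate: the 16th-century Vigenère cipher needs nothing but a keyword and pen and paper. A sketch, using the classic ATTACKATDAWN/LEMON textbook example (the keyword and message are illustrative, not from this thread):

```python
# The Vigenere cipher (16th century): shift each letter of an uppercase
# A-Z message by the corresponding letter of a repeating keyword.
# Pre-computer cryptography in a dozen lines.

def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    sign = -1 if decrypt else 1
    out = []
    for i, ch in enumerate(text):
        shift = ord(key[i % len(key)]) - ord("A")
        out.append(chr((ord(ch) - ord("A") + sign * shift) % 26 + ord("A")))
    return "".join(out)

ct = vigenere("ATTACKATDAWN", "LEMON")
print(ct)                                   # LXFOPVEFRNHR
print(vigenere(ct, "LEMON", decrypt=True))  # ATTACKATDAWN
```

Vigenère resisted cryptanalysis for roughly three centuries, which is the historical version of the same arms race the thread is debating.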

    Polaritie on
    Steam: Polaritie
    3DS: 0473-8507-2652
    Switch: SW-5185-4991-5118
    PSN: AbEntropy