  • Fancy New Terms, Same Old Backdoors: The Encryption Debate in 2019


    Almost every week, we hear about another corporate data breach or government attack on privacy. For anyone who wants real privacy online, encryption is the essential component.

    Governments around the world keep trying to break encryption, seeking to enhance the power of their law enforcement agencies. They’ve tried for years to require companies to build backdoors into encrypted software and devices, which would enable them to listen in on potentially any digital conversation. The FBI has coined a phrase, “going dark,” that it has used since the late ’90s to describe its “problem”—the lack of an omnipresent, all-powerful surveillance tool.

    But encryption with special access for a select group isn’t some kind of superpower—it’s just broken encryption. The same security flaws used by U.S. police will be used by oppressive regimes and criminal syndicates.

    The only innovation in 2019 has been rhetorical—anti-encryption authorities are determined not to call a backdoor a backdoor. Instead, we saw a proposal from UK intelligence agency GCHQ to add “ghost” listeners to encrypted messaging applications. Later in the year, we saw a revival of “key escrow,” a long-discredited attempt to square the circle on encryption.

    Other approaches included ideas like “client-side scanning,” which is also sometimes called “endpoint filtering” or “local processing.” This array of terms describes a system where a messaging application maintains end-to-end encryption, but before users upload images or other content, that content is first checked locally against a set of “hashes,” or fingerprints, of known contraband. These strategies have been proposed as solutions to the problem of child exploitation images, a problem that the DOJ highlighted frequently in the latter half of 2019 as it tried to reframe the use of encryption as enabling criminal behavior.
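    To make the mechanism concrete, here is a minimal sketch of such a local check, using only Python's standard library. Everything here is invented for illustration (the function name, the blocklist entry, which is just the SHA-256 of the bytes "test"); real proposals use perceptual hashes designed to survive re-encoding, not exact cryptographic hashes.

```python
import hashlib

# Hypothetical blocklist of SHA-256 fingerprints of known contraband files.
# (The single entry below is sha256(b"test"), purely for demonstration.)
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_before_upload(data: bytes) -> bool:
    """Return True if the content's fingerprint matches the blocklist."""
    fingerprint = hashlib.sha256(data).hexdigest()
    return fingerprint in BLOCKLIST
```

    The message itself would still be end-to-end encrypted after this check, which is exactly why critics argue that the check itself becomes the new point of surveillance.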

    The promise of end-to-end encryption is, ultimately, a simple value proposition: it’s the idea that no one but you and your intended recipients can read your messages. There’s no amount of wordsmithing that can get around that. It’s high time to start convening conferences and panels of experts to research and publish ideas about how effective law enforcement can co-exist with tools for privacy and strong encryption, rather than trying to break them.
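    That value proposition can be illustrated with a toy example: a one-time pad, where only the two endpoints hold the random key and the carrier sees nothing but ciphertext. This is a deliberately simplified Python sketch (real messaging apps use authenticated public-key protocols such as the Signal protocol, not one-time pads):

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR one-time pad: unbreakable when the key is random, at least
    # as long as the message, and never reused.
    assert len(key) >= len(plaintext)
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Only the two endpoints hold `key`; the service relays ciphertext only.
key = secrets.token_bytes(32)
ciphertext = encrypt(key, b"meet at noon")
assert decrypt(key, ciphertext) == b"meet at noon"
```

    Without the key, the ciphertext reveals nothing; any "special access" necessarily means someone other than the endpoints holds key material.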

    Keeping Promises on Encryption

    Government pressure hasn’t caused tech companies to abandon encryption, at least not yet. In March, Facebook CEO Mark Zuckerberg publicly embraced end-to-end encryption for all of Facebook’s messaging products. That sounds great, in theory, but the proof is in the pudding—we still don’t know how Facebook might seek to monetize an end-to-end encrypted service. There are also policy and competition concerns about the company’s intention to merge WhatsApp, Instagram, and Facebook Messenger.

    But those policy concerns might be rendered moot if the company backpedals under the glare of increasing government demands. In October, top law enforcement officials in the U.S., U.K., and Australia called on Zuckerberg to simply stop his plan to encrypt the merged messenger products. Again waving the flag of child safety, law enforcement agencies in these three countries made clear their ultimate goal: access to every conversation, on every digital device. Civil society hasn’t been silent. We joined together with more than 100 other NGOs to write our own letter urging Facebook to proceed with its plans. In December, Facebook itself signaled it won’t bow to that pressure.

    The stakes couldn’t be higher. Whichever way the social media giant moves on encryption, other companies are sure to follow.

    Source: Fancy New Terms, Same Old Backdoors: The Encryption Debate in 2019 | Electronic Frontier Foundation


  • Deputy Attorney General Rosenstein’s “Responsible Encryption” Demand is Bad and He Should Feel Bad

    Deputy Attorney General Rod Rosenstein delivered a speech on Tuesday about what he calls “responsible encryption.” It misses the mark, by far.

    Rosenstein starts with a fallacy, attempting to convince you that encryption is unprecedented:

    Our society has never had a system where evidence of criminal wrongdoing was totally impervious to detection, especially when officers obtain a court-authorized warrant. But that is the world that technology companies are creating.

    In fact, we’ve always had (and will always have) a perfectly reliable system whereby criminals can hide their communications with strong security: in-person conversations. Moreover, Rosenstein’s history lesson forgets that, for about 70 years, there was an unpickable lock. In the 1770s, engineer Joseph Bramah created a lock that remained unpickable until 1851. Installed in a safe, the owner could ensure that no one could get inside, or at least not without destroying the contents in the process.

    Billions of instant messages are sent and received each day using mainstream apps employing default end-to-end encryption. The app creators do something that the law does not allow telephone carriers to do: they exempt themselves from complying with court orders.

    Here, Rosenstein ignores the fact that Congress exempted those app creators, as “electronic messaging services,” from the Communications Assistance for Law Enforcement Act (CALEA). Moreover, CALEA does not require telephone carriers to decrypt communications when users hold the keys. Instead, Section 1002(b)(3) of CALEA provides:

    (3) Encryption. A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.

    By definition, when the customer sends end-to-end encrypted messages—in any kind of reasonably secure implementation—the carrier does not (and should not) possess the information necessary to decrypt them.

    With his faulty premises in place, Rosenstein makes his pitch, coining yet another glib phrase to describe a backdoor:

    Responsible encryption is achievable. Responsible encryption can involve effective, secure encryption that allows access only with judicial authorization. Such encryption already exists. Examples include the central management of security keys and operating system updates; the scanning of content, like your e-mails, for advertising purposes; the simulcast of messages to multiple destinations at once; and key recovery when a user forgets the password to decrypt a laptop.

    As an initial matter, “the scanning of content, like your e-mails, for advertising purposes” is not an example of encryption, “responsible” or otherwise. Rosenstein’s other examples are just describing systems where the government or another third party holds the keys. This is known as “key escrow,” and, as well explained in the Keys Under Doormats paper, the security and policy problems with key escrow are not only unsolved, but unsolvable.
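    The structural flaw is easy to see in code. In the hypothetical sketch below (all names invented for illustration), every session key must be deposited with an escrow agent, so the escrow database becomes a single high-value target: anyone who obtains it, whether by warrant, insider abuse, leak, or hack, can read every message. The toy XOR cipher stands in for any real symmetric scheme.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy stand-in for a real symmetric cipher.
    return bytes(k ^ d for k, d in zip(key, data))

escrow_db = {}  # the escrow agent's copy of every session key

def send_escrowed(msg_id: str, plaintext: bytes) -> bytes:
    key = secrets.token_bytes(len(plaintext))
    escrow_db[msg_id] = key            # mandatory deposit with a third party
    return xor_cipher(key, plaintext)  # ciphertext goes to the recipient

ciphertext = send_escrowed("m1", b"private note")
# Whoever holds escrow_db can decrypt without either endpoint's consent:
assert xor_cipher(escrow_db["m1"], ciphertext) == b"private note"
```

    Protecting that database perfectly, forever, against every adversary is precisely the unsolved problem the Keys Under Doormats authors identified.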

    Perhaps sensitive to criticism of the government’s relentless attempts to rename backdoors, Rosenstein claims, “No one calls any of those functions a ‘back door.’ In fact, those capabilities are marketed and sought out by many users.” But critics of backdoors have fairly consistently called key escrow solutions “backdoors.” And any reasonable reader would call Google’s ability to access your email a backdoor, especially when that backdoor is used by unauthorized parties such as Chinese hackers.

    Such a proposal would not require every company to implement the same type of solution. The government need not require the use of a particular chip or algorithm, or require any particular key management technique or escrow. The law need not mandate any particular means in order to achieve the crucial end: when a court issues a search warrant or wiretap order to collect evidence of crime, the provider should be able to help.

    This is the new DOJ dodge. In the past, whenever the government tried to specify “secure” backdoored encryption solutions, researchers found security holes; most famously, the Clipper Chip was broken quickly and thoroughly.

    So now, the government refuses to propose any specific technical solution, choosing to skate around the issue by simply asking technologists to “nerd harder” until the magical dream of secure golden keys is achieved.

    Rosenstein attempts to soften his demand with an example of a company holding private keys:

    A major hardware provider, for example, reportedly maintains private keys that it can use to sign software updates for each of its devices. That would present a huge potential security problem, if those keys were to leak. But they do not leak, because the company knows how to protect what is important.

    This is a fallacy for several reasons. First, perfect security is an unsolved problem. No one, not even the NSA, knows how to protect information with zero chance of leaks. Second, the challenge of protecting a signing key used only to sign software updates is far smaller than that of protecting a system that must be able to produce decryption keys on demand, at the push of a button, for millions of users around the globe.

    Rosenstein then attempts to raise the stakes to near apocalyptic levels:

    If companies are permitted to create law-free zones for their customers, citizens should understand the consequences. When police cannot access evidence, crime cannot be solved. Criminals cannot be stopped and punished.

    This is a bit much. For a long time, people have had communications that were not constantly available for later government access. For example, when pay phones were ubiquitous, criminals used them anonymously, without a recording of every call. Yet, crime solving did not stop. In any case, law enforcement has been entirely unable to provide solid examples of encryption foiling even a handful of actual criminal prosecutions.

    Finally, in his conclusion, Rosenstein misstates the law and misunderstands the Constitution.

    Allow me to conclude with this thought: There is no constitutional right to sell warrant-proof encryption. If our society chooses to let businesses sell technologies that shield evidence even from court orders, it should be a fully-informed decision.

    This is simply incorrect. Code is speech, and courts have recognized a Constitutional right to distribute encryption code. As the Ninth Circuit Court of Appeals noted:

    The availability and use of secure encryption may … reclaim some portion of the privacy we have lost. Gov’t efforts to control encryption thus may well implicate not only the First Amendment rights … but also the constitutional rights of each of us as potential recipients of encryption’s bounty.

    Here, Rosenstein focuses on a “right to sell,” so perhaps the DOJ means to distinguish “selling” under the commercial speech doctrine, and argue that First Amendment protections are therefore lower. That would be quite a stretch, as commercial speech is generally understood as speech proposing a commercial transaction. Newspapers, for example, do not face weaker First Amendment protections simply because they sell their newspapers.

    The Department of Justice has said that it wants to have an “adult conversation” about encryption. This is not it. The DOJ needs to understand that secure end-to-end encryption is a responsible security measure that helps protect people.

    Source: Deputy Attorney General Rosenstein’s “Responsible Encryption” Demand is Bad and He Should Feel Bad | Electronic Frontier Foundation



  • Australian PM Calls for End-to-End Encryption Ban, Says the Laws of Mathematics Don’t Apply Down Under

    “The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia,” said Australian Prime Minister Malcolm Turnbull today. He has been rightly mocked for this nonsensical claim, which foreshadows moves to require online messaging providers to give law enforcement backdoor access to encrypted messages. He explained that “We need to ensure that the internet is not used as a dark place for bad people to hide their criminal activities from the law.” It bears repeating that Australia is part of the secretive Five Eyes spying and information-sharing alliance.

    But despite the well-deserved mockery that ensued, we shouldn’t make too much light of the real risk this poses to Internet freedom in Australia. It’s true enough, for now, that a ban on end-to-end encrypted messaging in Australia would have absolutely no effect on “bad people,” who would simply abandon the major platforms and their weakened encryption in favor of other apps that use strong end-to-end encryption based on industry-standard mathematical algorithms. Instead, a ban would hurt ordinary citizens who rely on encryption to keep their conversations secure and private from prying eyes.

    However, as similar demands are made elsewhere around the world, more and more app developers might fall under national laws that require them to compromise their encryption standards. Users of those apps, who may have a network of contacts who use the same app, might hesitate to shift to another app that those contacts don’t use, even if it would be more secure. They might also worry that using end-to-end encryption would be breaking the law (a concern that “bad people” tend to be far less troubled by). This will put those users at risk.

    If enough countries go down this misguided path, which sees Australia following in the footsteps of Russia and the United Kingdom, the result could be a new international agreement banning strong encryption. Indeed, the Prime Minister’s statement is explicit that this is exactly what he would like to see. It may seem an unlikely prospect for now, given strong statements at the United Nations level in support of end-to-end encryption, but we truly can’t know what the future will bring. What seems like a global accord today might well start to crumble as more and more countries defect from it.

    We can’t rely on politicians to protect our privacy, but thankfully we can rely on math (“maths”, as Australians say). That’s what makes access to strong encryption so important, and Australia’s move today so worrying. Law enforcement should have the tools they need to investigate crimes, but that cannot extend to a ban on the use of mathematical algorithms in software. Mr Turnbull has to understand that we either have an internet that “bad people” can use, or we don’t have an Internet. It’s actually as simple as that.

    Source: Australian PM Calls for End-to-End Encryption Ban, Says the Laws of Mathematics Don’t Apply Down Under | Electronic Frontier Foundation