Matthew Green Jan 31
I retweeted this post yesterday, but I urge people who care about encryption online to read it, because this legislation should scare you. I am going to follow with a (much less thorough and accurate) summary thread below.
1. So in case you haven’t been paying attention, there’s been a bit of a struggle going on between law enforcement and the tech industry over encryption. TL;DR: law enforcement doesn’t like it. The FBI even made a website:
Because of warrant-proof encryption, the government often cannot obtain the electronic evidence necessary to investigate and prosecute threats to public and national safety, even with a warrant or...
— FBI (@FBI)
2. The problem is that US law makes end-to-end encryption legal. Laws like CALEA require wiretap capability for some providers, but only if they have the decryption keys. Similarly, providers are exempted from liability for many types of “bad” content.
3. US law enforcement (and friendly senators) have been trying to change this for a while. Attempts to legislate crypto backdoors have been unsuccessful. Weirdly, people don’t seem to prioritize the government reading their mail.
4. Since demanding backdoors has gone nowhere, US AG William Barr and others recently switched strategies. In an open letter, they demanded that Facebook delay encryption plans because it would hinder filtering of “child sexual abuse material” (CSAM).
5. This filtering already happens in some networks, like Facebook. It involves scanning every (unencrypted) picture and video you send *in real time* to see if it contains child pornography. Any hits are reported to an agency called NCMEC.
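A minimal sketch of what that server-side scanning amounts to (hypothetical names; real deployments match NCMEC-derived *perceptual* hashes such as Microsoft's PhotoDNA, which survive re-encoding, rather than exact cryptographic digests):

```python
import hashlib

# Hypothetical fingerprint blocklist. In a real deployment this would be
# a database of perceptual hashes of known CSAM, not SHA-256 digests.
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_upload(image_bytes: bytes) -> bool:
    """Fingerprint an unencrypted upload and check it against the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in BLOCKLIST

# Every image a user sends is fingerprinted in real time; in a real
# system any hit would be reported to NCMEC.
scan_upload(b"test")         # on the toy blocklist above -> flagged
scan_upload(b"holiday.jpg")  # not on the list -> passes through
```

The crucial design property: the provider must be able to see the plaintext of every image to compute the fingerprint at all.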
6. End-to-end encryption disrupts this CSAM scanning process, because, well, let’s be honest, these scanners are a mass surveillance system — one with a specific (well-meaning) intent — and end-to-end encryption is designed to *stop* mass surveillance.
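To see the mechanics, here is a toy sketch (illustrative names; the XOR keystream stands in for real authenticated encryption like the Signal protocol): once content is encrypted end-to-end, the server's fingerprint check only ever sees ciphertext, which matches nothing on any blocklist.

```python
import hashlib

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy XOR keystream cipher, for illustration only -- real E2E
    # messengers use authenticated encryption (e.g. the Signal protocol).
    keystream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, keystream))

# The provider's blocklist entry for some known image:
known_bad = hashlib.sha256(b"flagged-image-bytes").hexdigest()

# The sender encrypts before upload; only the recipient holds the key.
ciphertext = toy_encrypt(b"flagged-image-bytes", key=b"shared-secret")

# The server can fingerprint only what it sees -- the ciphertext --
# so the match fails even though the underlying image is on the list.
server_sees = hashlib.sha256(ciphertext).hexdigest()
print(server_sees == known_bad)  # False: the scanner is blinded
```

This is not a bug in the scanner; it is the point of end-to-end encryption: no party between sender and recipient can inspect the content.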
7. I am *deeply* skeptical of Barr’s motivation here. After several years of opposing encryption on very different grounds (criminals, terrorists) and asking for access only with a warrant, suddenly making a hard right turn and saying “think about the children” — feels cynical.
8. Technically, the request is also a radical new ask. Previously, law enforcement wanted “exceptional access” — meaning only occasionally would they need to decrypt things. But CSAM scanning can’t be “exceptional”. It has to scan every single image you send.
9. All of this has just been a prelude to the newly proposed legislation that the post above discusses. This legislation is being introduced by Senators Graham and Blumenthal, and it reads like a “backdoor” attempt to squash end-to-end encryption.
10. The basic strategy of this law is to make providers (Apple, Facebook, Google etc.) criminally liable for CSAM, unless they comply with a set of “recommended best practices” for detecting the stuff. But who determines those practices, and is encryption one of them?
11. In short, the bill establishes an unelected commission, which must consist of “4 law enforcement reps, 4 tech industry reps, 2 reps of child safety organizations, and 2 computer scientists/software engineering experts”. They’ll decide what the best practices are.
12. The commission has to consider privacy and security. But that consideration is all they’re required to do. And even if they do recommend encryption: the AG can just override whatever they decide. And those problems are the tip of the iceberg.
13. This thread has been long and I want to end it on a different note. There are a number of thoughtful people, including notably @alexstamos, who feel that tech providers need to work harder to find ways to square this circle: i.e., allow encryption and CSAM detection to co-exist.
14. It is really hard for me to look at this kind of legislation (and the underlying, constantly shifting law enforcement strategy) and say “yes, these people are working with good intent to solve a problem, let’s make things easier for them.”
15. “Let’s build encryption systems that are somehow compatible with (currently well-intentioned) mass surveillance, and hand them over to politicians who have displayed no consistent principles in seeking this capability” does not feel like the winning move in this game. //END