By Mike Masnick

Senators Richard Burr and Dianne Feinstein are not giving up that quickly on their ridiculous and technically ignorant plan to outlaw real encryption. The two have now penned an op-ed in the WSJ that rehashes the same talking points they’ve offered before, without adding anything new. Instead, it just continues to make statements that show how incredibly ignorant they are. The piece is called Encryption Without Tears (and may be paywalled, though by now everyone knows how to get around that), which already doesn’t make any sense. What they’re pushing for is ending basic encryption, which will lead to many, many tears.

It starts out with their standard ridiculous line, pretending that because a company builds end-to-end encryption, it’s acting “above the law.”

In an increasingly digital world, strong encryption of devices is needed to prevent criminal misuse of data. But technological innovation must not mean placing individuals or companies above the law.

People have gone over this time and time again: this is not about anyone being “above the law.” It’s about whether or not companies can be forced to directly undermine the safety and security of their products (and the public). A paper shredder can destroy evidence. A paper shredder maker is not “above the law” when it decides not to build a system for piecing back together the shreds.

And speaking of “above the law,” I still don’t see Feinstein or Burr commenting on the FBI/DOJ announcing that it will ignore a court order to reveal how it hacked into computers over Tor. That is being above the law. That involves a situation where a court has asked for information that the FBI absolutely has. The FBI is just saying “nope.” If Burr and Feinstein are really worried about being “above the law,” shouldn’t they worry about this situation?

Over the past year the two of us have explored the challenges associated with criminal and terrorist use of encrypted communications. Two examples illustrate why the status quo is unacceptable.

I love this. They give two examples that have been rolled out a bunch in the last few weeks. The attack in Garland, Texas, where the attackers supposedly exchanged some messages with potential ISIS people, and the case of Brittney Mills, who was tragically murdered, and whose case hasn’t been solved. Law enforcement has Mills’ smartphone, but no one can get into it. Of course, it took nearly two years of fretting before law enforcement could dig up these two cases, and neither makes a very strong argument for why we need to undermine all encryption.

It’s a simple fact that law enforcement never gets to have all of the evidence. In many, many, many criminal scenarios, that’s just the reality. People destroy evidence, or law enforcement doesn’t find it, or law enforcement just doesn’t understand it. That’s not the end of the world. This is why we have police detectives, who are supposed to piece together whatever evidence they do have and build a picture for a case. Burr and Feinstein are acting as if, in the past, law enforcement was immediately handed all the evidence. That’s never been the way it works. Yes, law enforcement doesn’t get access to some information. That’s how it works.

You don’t go and undermine the very basis of computer security just because law enforcement can’t find a few pieces of evidence.

Our draft bill wouldn’t impose a one-size-fits-all solution on all covered entities, which include device manufacturers, software developers and electronic-communications services. The proposal doesn’t define the technological solutions or tell businesses how to solve the problem.

This is also misleading. The bill requires an end to real encryption. That’s it. Real encryption means that only one person has the key. This is what Burr and Feinstein don’t seem to get. They seem to think it’s trivial to leave a key with Apple or whoever. But as basically every crypto expert has explained, it is not. Doing so creates a vulnerability… and worse, it’s a vulnerability that cannot be patched. That’s hellishly dangerous. Sure, the bill doesn’t tell them exactly how to do this, but it does make it clear: you cannot offer real encryption, you can only offer something that can be hacked. That’s a problem.
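To make concrete what “only one person has the key” means, here’s a toy sketch in Python. The cipher is a throwaway XOR one-time pad, chosen only because it fits in a few lines; it is not real cryptography and not any scheme named in the bill. The point it illustrates stands regardless of the cipher: the ciphertext is unreadable without the key, and the only way to grant anyone else access is for a second copy of that key to exist, which is precisely the unpatchable vulnerability the experts describe.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy one-time-pad: XOR each byte of data with the key."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # only the user holds this

ciphertext = xor(message, key)
# Without the key, the ciphertext is indistinguishable from random noise.
# With the key, decryption is trivial:
assert xor(ciphertext, key) == message

# "Exceptional access" requires a second copy of the key to exist
# somewhere (Apple, an escrow agent, wherever). Anyone who obtains
# that copy -- court order or criminal -- can decrypt just the same:
escrowed_key = key  # the mandated extra copy
assert xor(ciphertext, escrowed_key) == message
```

The assertions pass identically for the user’s key and the escrowed copy: mathematically, there is no such thing as a key that only works for “the good guys.”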

We want to provide businesses with full discretion to decide how best to design and build systems that maintain data security while at the same time complying with court orders.

We want to provide businesses with full discretion to decide how best to travel back in time, in order to prevent crimes.

Seriously: this is basically the same thing that Burr and Feinstein are saying here. They’re asking for something that’s impossible, and acting like it’s a routine suggestion. If they need to comply with these All Writs Act style orders, they cannot build systems that maintain data security. That’s a fact. It’s mind-boggling that Burr and Feinstein still can’t understand this.

Critics in the industry suggest that providing access to encrypted data will weaken their systems. But these same companies, for business purposes, already maintain and have access to vast amounts of encrypted personal information, such as credit-card numbers, bank-account information and purchase histories.

Argh. This paragraph shows that whatever poor staffer Burr and Feinstein assigned to write this drivel doesn’t understand even the first thing about what he or she is talking about. Storing passwords, credit card info, bank account info, etc. is a totally different thing. Passwords, in particular, aren’t encrypted at all in the reversible sense — they’re hashed, precisely so that even those companies cannot reveal them. This example makes the opposite point of the one Burr and Feinstein think it makes. Companies hash passwords and encrypt credit card info and the like so that they’re not storing the plaintext, and there’s no easy way for anyone to get at it. This protects user data, and the companies cannot actually produce the plaintext. They’re comparing hashes. That’s what keeps it safe.

If we received a court order demanding our users’ passwords, we couldn’t provide them. Because they’re hashed. We don’t know our users’ passwords and can’t give them to you. When someone logs in to our website, we compare a hash of their password to our stored hash, and if they match, we let them in. But we don’t know what their password is. So this is a terrible example that actually goes against what Burr and Feinstein are saying. Those stores of information would be illegal under this bill!
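A minimal sketch of the login flow described above, using only Python’s standard library (PBKDF2 here stands in for whatever hardened scheme a real site would use): the site stores a random salt and a one-way hash, and login works by re-deriving the hash and comparing. The plaintext password is never stored, so there is nothing to hand over.

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a one-way hash; the plaintext is never stored anywhere."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive the hash from the login attempt and compare.
    The site learns whether the hashes match -- never the password."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("hunter2")
assert verify("hunter2", salt, stored)       # correct password: hashes match
assert not verify("wrong-guess", salt, stored)  # wrong password: no match
# There is no function that turns (salt, stored) back into "hunter2".
```

The whole design goal is that the stored data is useless for recovering the secret — which is exactly the property the Burr–Feinstein bill would make unlawful to offer.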

We are not asking companies to provide law enforcement with unfettered access to encrypted data. We aren’t even asking companies to tell the government how they gain access to this encrypted data. All we are doing is asking companies to find a way to keep their data secure while also cooperating with law enforcement in terrorism and criminal investigations.

Again, that last line is impossible. They’re asking the impossible — and in the process, making everyone less safe. The only way to provide such info to law enforcement is to no longer keep the data truly secure. And the big concern is not unfettered access for law enforcement, but rather whatever this backdoor means for those with malicious intent, who will be very, very, very focused on finding these vulnerabilities and exploiting them.

President Obama said earlier this year, “You cannot take an absolutist view on this.” We agree—and believe that strong data security and compliance with the justice system don’t have to be mutually exclusive.

Because you don’t know what you’re talking about.

American technology companies have done some amazing things that are the envy of the world. We think that finding a way to achieve both goals simultaneously is not beyond their capabilities.

So, in the end, despite basically every cryptography expert telling them this is impossible, Burr and Feinstein come back with “NERD HARDER, NERDS!”
