Hiding Keys Under the Mat: Governments Could Ensure Universal Insecurity

Gonzalo Álvarez Marañón    17 December, 2020

The doorbell rang. “Who could that be?” wondered Brittney Mills as she struggled to get off the couch; eight months of pregnancy were beginning to hinder her movements. “Don’t move,” she said as she passed her 10-year-old daughter, who was sitting in front of the TV. When Brittney opened the door, two bullets left her bleeding out on the floor. Her daughter ran to hide in the bathroom when she heard the shots. Brittney’s baby died a few hours later, and the killer was never found. The authorities turned to her iPhone for incriminating evidence but were unable to unlock it. They turned to Apple, but the company claimed it could not get into the smartphone: its contents were encrypted, and without her unlock passcode the keys could not be recovered.

This real case, which occurred in April 2015 in Baton Rouge, Louisiana, along with many others, such as that of Syed Farook, the San Bernardino shooter who killed 14 people and injured 22, has pitted authorities against Apple and reopened an old debate: should encryption technology be available to everyone, with the consequent risk of obstructing criminal investigations?

You May Not Be Aware of It, But You Use the Most Robust Cryptography That Has Ever Existed

When you use messaging apps such as WhatsApp, iMessage or Telegram; video-conferencing applications such as FaceTime or Zoom; or email services such as ProtonMail or clients that implement OpenPGP, you are using end-to-end encryption: the communication between your device and the other person’s device is fully encrypted, and no one, not even the service provider, can read its content.

What’s more, the information on your smartphone is encrypted at rest using a master key generated inside the device from your unlock code, a key that never leaves the device. You can likewise encrypt your laptop’s disk with a master key derived from your password.
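The derivation step can be sketched as follows. This is a simplified illustration only: real devices such as iPhones entangle the passcode with a hardware-bound unique ID inside a secure enclave, which this sketch omits. The function name, salt and iteration count are illustrative, not any vendor’s actual parameters.

```python
# Sketch: deriving a device master key from a short unlock code.
# Illustrative only; real devices mix in a hardware key that cannot
# be extracted, so the derivation cannot be run off-device.
import hashlib

def derive_master_key(passcode: str, device_salt: bytes) -> bytes:
    # A deliberately slow KDF makes brute-forcing the short passcode costly.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 100_000)

key = derive_master_key("1234", b"per-device-salt")
assert len(key) == 32  # 256-bit master key, derived and used on-device
```

The slow key-derivation function is the point: even a 4-digit passcode becomes expensive to guess when each guess must pay the full KDF cost, and impossible to guess off-device when a hardware secret is mixed in.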

In the end, it turns out that everyday consumer products incorporate cryptography so powerful that no one can break it. And of course, it is not just you using a smartphone or laptop, but also criminals, terrorists and murderers. The authorities watch helplessly as mountains of smartphones, tablets and computers pile up in courtrooms, priceless evidence locked inside that no one can access.

Should cryptography be banned, or the security measures of current technological devices be weakened? Some governments are proposing a halfway measure: keeping a copy of the keys in escrow (key escrow).

The Idea Behind Key Escrow

On paper, the concept is quite simple: a trusted authority keeps a copy of the encryption keys used by every device in the country. In other words, the aim is that the bad guys don’t have access to citizens’ information but the good guys do, of course only in exceptional cases.

There are precedents for this idea going back to the 1990s. At the time, the US government still classified cryptography as a munition, so its export was prohibited unless it was weakened, to 40-bit symmetric keys and 512-bit RSA keys. For the computing power of the day this did not even seem so bad, since it was assumed that no one other than the NSA could break such keys. What could possibly go wrong?

You don’t have to look very far. Consider the SSL/TLS protocol. Browsers and websites were forced to include export-grade cipher suites. You know the famous computing adage: “if it works, don’t touch it”. So, 20 years later, TLS implementations still supported the weakened suites even though the export restrictions had been lifted in 2000. Then, in 2015, came the FREAK and Logjam attacks, which allowed an attacker to downgrade a connection to those export cipher suites, leaving its cryptography so weak it could be broken with modest resources. Ironically, the FBI and NSA websites were among those affected.
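A back-of-the-envelope calculation shows why export-grade key sizes are hopeless against modern hardware. The guessing rate below is an illustrative, conservative assumption for commodity hardware, not a benchmark:

```python
# How long does an exhaustive search of a 40-bit export keyspace take?
keyspace = 2 ** 40            # ~1.1 trillion candidate keys
guesses_per_second = 10 ** 9  # assumed rate for commodity hardware (illustrative)

seconds = keyspace / guesses_per_second
print(f"{keyspace:,} keys -> ~{seconds / 60:.0f} minutes to exhaust")
```

Under these assumptions the whole keyspace falls in well under a day, and the 512-bit RSA moduli used by export suites were factored in a matter of hours on rented cloud machines during the FREAK research.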

Back in the 1990s, the NSA also tried to restrict the cryptographic capabilities of communications devices by another route: the Clipper chip. The idea was that any telephone or communications device that wanted to use cryptography would incorporate the Clipper chip with a pre-assigned cryptographic key, a copy of which would be held in escrow by the government. If a government agency deemed it necessary to intercept a communication, it would retrieve the key and decrypt the traffic. Fortunately, it never took hold: it failed to meet its requirements and was shown to be hackable. If you are curious about the history of this chip, I recommend the chapter on its rise and fall in Steven Levy’s book Crypto.

Another question that arises: who holds the keys? A central key server is a single point of failure. If attackers managed to break into it, they would obtain the keys of the entire population and could decrypt every citizen’s communications. Obviously, storing them all in one place does not seem a good idea. Each service provider could be forced to store its own customers’ keys, but how many would do so safely?

Alternatively, the keys could be distributed among several government agencies, each of which would have to contribute its share of the key when needed. Of course, implementing such a key-sharing system is not easy at all, and the master key is vulnerable again from the moment it is reconstructed.
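The simplest form of this sharing, where every agency must cooperate, can be sketched with plain XOR. This is a toy illustration of the idea only: a real deployment would also need authenticated storage, auditing, and share refresh, none of which appear here.

```python
# Sketch: n-of-n key splitting with XOR. Each agency holds one share;
# reconstructing the key requires every share, and any subset of fewer
# than n shares is statistically indistinguishable from random noise.
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:  # fold the random shares into the final share
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    key = shares[0]
    for s in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

master = secrets.token_bytes(32)
parts = split_key(master, 3)
assert combine(parts) == master  # all three agencies together recover the key
```

The weakness the text points out is visible here: the moment `combine` runs, the full master key exists in one place again, and whoever holds that machine holds the key.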

Another option would be to resort to threshold cryptography, but the field is still immature, far from having universally accepted, robust algorithms.
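The threshold idea, where any k of n parties suffice rather than all n, is classically illustrated by Shamir’s secret sharing. The toy below is for intuition only: it treats the key as an integer, uses a fixed demonstration prime, and is in no way constant-time or production-grade.

```python
# Toy Shamir 2-of-3 secret sharing: the secret is the constant term of a
# random line over a prime field; any 2 of the 3 points recover it, and
# a single point reveals nothing about it.
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic happens modulo P

def make_shares(secret: int, k: int = 2, n: int = 3):
    # Random polynomial of degree k-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

secret = 31337
shares = make_shares(secret, k=2, n=3)
assert reconstruct(shares[:2]) == secret  # any two shares suffice
```

Note that this sharing scheme is not yet threshold *cryptography* proper: the harder, less mature problem the text alludes to is performing decryption or signing with the shares directly, so that the full key never has to be reassembled anywhere.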

Moreover, even if such algorithms existed, the chosen solution would require major changes to the cryptographic protocols and applications of every product and service. They would all have to be rewritten, with the consequent appearance of new vulnerabilities and flaws.

There are still many open questions: should these changes be implemented at the operating-system level in iOS, Android, Linux, Windows, macOS, etc.? Would every developer of an application that uses encryption be obliged to hand over keys in escrow? Would all users be obliged to use these backdoors? How long would the migration take, and what would happen to legacy applications?

So far we have talked about safeguarding “the key” as if there were only one key per user or per device. The reality is quite different: both for encryption at rest and for encryption in transit, a multitude of constantly rotating keys are used, derived from master keys which themselves can rotate. No two WhatsApp messages are encrypted with the same key. Your device’s key chain is updated every time you change your password or access code. In fact, it is not even clear which key or keys should be escrowed, or how escrowed keys could be kept up to date so that they would serve any purpose if ever needed.
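The per-message rotation can be sketched with a one-way key “ratchet” of the kind popularised by the Signal protocol (which WhatsApp builds on). This is a simplified illustration of the chain-key step only, with illustrative constants; the real protocol adds Diffie-Hellman ratcheting on top.

```python
# Sketch of a hash ratchet: each message key is derived from a chain key,
# and the chain key is advanced with a one-way function. Escrowing any
# single key tells you nothing about the keys derived before it.
import hmac, hashlib

def advance(chain_key: bytes) -> tuple[bytes, bytes]:
    message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    next_chain = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return message_key, next_chain

ck = hashlib.sha256(b"shared-root-key").digest()  # illustrative root
k1, ck = advance(ck)  # key for message 1
k2, ck = advance(ck)  # key for message 2
assert k1 != k2       # no two messages encrypted with the same key
```

Because `advance` is one-way, even a party who obtains today’s chain key cannot walk it backwards to recover yesterday’s message keys, which is precisely what makes “keep a copy of the key” ill-defined for systems like this.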

In summary, quoting the comprehensive work of a group of cryptographers from the 1990s (“The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption”, Abelson et al., 1997), key escrow would have an impact along at least three dimensions:

  • Risk: A failure in the key recovery mechanisms can jeopardise the security of current encryption systems.
  • Complexity: Even if key recovery could be made reasonably transparent to end users, a fully functional key recovery infrastructure is an extraordinarily complex system, with many new entities, keys, operational requirements and interactions.
  • Economic cost: No one has yet described, let alone demonstrated, a viable economic model that accounts for the true costs of key recovery.

So far, we have assumed that a government would only use the escrowed keys for criminal investigations. What guarantee do you have that it will not use them against its own citizens? Things get even more complicated.

And what about criminals? For them it would be as simple as creating their own messaging or data-encryption apps with secure cryptography, revealing their keys and passwords to no one. If you mandate key escrow, only the criminals will use unescrowed keys.

Key escrow systems are inherently less secure, more expensive and more difficult to use than similar systems without a recovery function. Mass deployment of escrow-based infrastructures to meet law-enforcement specifications would require significant sacrifices in security and convenience and a substantial increase in costs for all users. Moreover, building a secure infrastructure of the massive scale and complexity such a system would require is beyond current experience and expertise in the field and may well introduce ultimately unacceptable risks and costs.

Current products and services that aim to be secure already contain enough security flaws without us deliberately introducing cryptographic vulnerabilities into future products and services by design. So far, every attempt has failed miserably. It seems they will keep failing in the future.
