"It's Encrypted"
These two words are a trigger. They seem innocuous enough, but the devil is in the details, and the details of cryptography are notoriously byzantine. It's tempting to allow ourselves to be soothed by these words, but they fall well short of an assurance.
In 2002, Bill Gates sent a memo inside Microsoft introducing his vision for Trustworthy Computing (not to be confused with Trusted Computing, which is a distinct concept). In addition to Security and Privacy, the Availability goal in that memo has since been likened to a light switch: you don't think about whether the power will arrive when you flip the switch, you simply trust that it will. It's hard to see the forest for the trees sometimes, but when you consider the immense complexity and level of engineering in the products we depend on fifteen years later (tablets, laptops, mobile phones, streaming music systems, to name a few), it is a remarkable achievement.
The light switch analogy is instructive. Power generation and distribution is a gargantuan field, and it consumes spectacular amounts of money to provide you with the simplicity and affordability of that switch. Similarly, achieving the level of stability and performance we enjoy in our computing devices has consumed millions of productivity hours and billions of dollars. Privacy, the often forgotten cousin of Security and Availability in the original memo, is expected to varying degrees by different audiences, but is nonetheless depended on in our daily lives.
"It's Encrypted" sounds like the light switch - problem solved, it's encrypted, I can rest. It is a rare day indeed where that claim is picked apart that I do not find (sometimes significant) issues with the claim. If not done correctly, badly implemented security (but encryption especially) can be worse than no security.
"It's Encrypted" sounds like the light switch - problem solved, it's encrypted, I can rest. It is a rare day indeed where that claim is picked apart that I do not find (sometimes significant) issues with the claim. If not done correctly, badly implemented security (but encryption especially) can be worse than no security.
Cryptography is an esoteric field. As I write this, Mozilla is due to start warning users of websites that continue to use certificates signed with the SHA-1 algorithm. This is likely to be Greek to the average user, but to security professionals this is a reasonable step in the march forward, and if naming and shaming is required to prod those last holdouts into the future, then at least users are informed that these providers are not up to snuff. This stuff is ubiquitous, and few users realise that their $150 smartphone is loaded with some of the most advanced cryptography available to the average Joe. They depend on it often more than they know, so interventions can be useful.
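To make this concrete, here is a minimal sketch of how an administrator might check which hash algorithm signed a server's certificate. Python and the third-party cryptography package are my choices for illustration, and the hostname is invented; nothing here is mandated by Mozilla or anyone else.

import ssl
from cryptography import x509

def signature_hash(host: str, port: int = 443) -> str:
    pem = ssl.get_server_certificate((host, port))      # fetches the leaf certificate only
    cert = x509.load_pem_x509_certificate(pem.encode())
    return cert.signature_hash_algorithm.name           # e.g. 'sha1' or 'sha256'

print(signature_hash("example.com"))                    # illustrative hostname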
The problems are rarely in the ciphers themselves, and handshakes and protocols are continually improved to counter new threats while enabling new functionality. The biggest headache (and the most frequent downfall of "It's Encrypted" claims) is key management. As Bruce Schneier reminds us in Applied Cryptography: "In the real world, key management is the hardest part of cryptography."
Private keys in a PKI are often treated with trivial regard. The most recognisable form for administrators is the PFX, or .p12, file. Incorrectly referred to as a "certificate file" (a wildly inappropriate shorthand), these few kilobytes are handled as arbitrary packages a system needs in order to function, rather than as critical pins in the security apparatus. The format (PKCS#12) was designed for key archival, yet these files are routinely sent via e-mail or left on widely accessible drives to ease administration and troubleshooting.
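To illustrate why the shorthand is so inappropriate, the sketch below (Python with the cryptography package; the filename and passphrase are invented) shows that whoever holds the .p12 file and its passphrase holds the private key itself, not merely a certificate.

from cryptography.hazmat.primitives.serialization import pkcs12

with open("webserver.p12", "rb") as f:        # a file routinely found on shares and in mailboxes
    blob = f.read()

# The bundle yields the private key, the certificate and any chain certificates.
key, cert, extra_certs = pkcs12.load_key_and_certificates(blob, b"Password123")
print(type(key))                              # an RSA or EC private key object: the secret itself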
Pre-Shared Keys, on the other hand, are frequently recorded when there is no need to keep them at all: if a link has problems, generate a new key and discard any record. Only something like storage corruption would make a saved copy useful, and at that point you have bigger problems. TLS 1.2's ephemeral key exchanges embody this principle, but many IPsec VPN administrators have yet to learn the lesson, and pre-shared keys are still written down as though they will be needed again, undermining the confidentiality and integrity of the link.
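The ephemeral idea is easy to sketch. The toy below (Python with the cryptography package; it is an illustration, not a TLS or IPsec implementation) derives a shared session key from freshly generated X25519 key pairs, after which there is simply nothing worth writing down.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def session_key(own_private, peer_public) -> bytes:
    shared = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"illustrative session").derive(shared)

alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()   # fresh per session
assert session_key(alice, bob.public_key()) == session_key(bob, alice.public_key())
# Both sides now hold the same session key; the private halves never need to be stored.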
I worked on a proposal for a customer who demanded full encryption of all storage volumes at a cloud hosting provider, and quickly determined that the theory and practice of these cryptosystems precluded a robust solution. Of course, it could be fudged (and eventually was, over my objections), but it took me a while to realise that the actual integrity of the system was not the point. The point was merely to be able to make the claim "It's Encrypted".
The eventual implementation made use of "self-decryption". This is not a thing. By all means, systems rely on obfuscation to hide key material and defeat trivial intrusions, but if a system knows how to decrypt its own resources, an attacker with access to that system can achieve the same goal. The Trusted Platform Modules built into modern mobile devices, acting as on-board hardware security modules, offer a slick answer here: they attest that the system is in a trustworthy state and only then release the decryption keys. A virtual environment has no such opportunity.
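A toy sketch makes the problem obvious (Python; every name and value below is invented for illustration): if the key, however obfuscated, sits on the same system as the data, anyone who can read that system can repeat the decryption.

import base64
from cryptography.fernet import Fernet

OBFUSCATED_KEY = base64.b64encode(Fernet.generate_key())   # "hidden" key shipped with the system

def self_decrypt(ciphertext: bytes) -> bytes:
    key = base64.b64decode(OBFUSCATED_KEY)                  # the system recovers its own key...
    return Fernet(key).decrypt(ciphertext)

stored = Fernet(base64.b64decode(OBFUSCATED_KEY)).encrypt(b"database password")
print(self_decrypt(stored))            # ...and so can anyone who reads the code or the disk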
Another issue in the eventual implementation was the use of a central key repository built from COTS components. Since the designers had no real training or experience in cryptography or security principles, they made the fatal mistake of placing too much trust in the very systems they were trying to protect. Every server is hosted at the provider and, on startup, requests access to a database of keys over CIFS, with authentication handled by the computer's AD account. Because the single database is decrypted in its entirety by any participating server, every server has access to the keys of every other. This is disastrously insecure: the cloud provider has administrative access to these systems, and it is precisely against the provider's view that the cryptosystem is supposed to protect; ISVs and integrators are also routinely given administrative credentials to install and test their products, again allowing a nefarious actor among them to retrieve as many keys as they please.
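For contrast, here is a sketch of the property that design lacked (Python with the cryptography package; names are illustrative, and this is not the vendor's product): wrap each server's data key under that server's own key-encryption key, so that compromising one machine does not unwrap every key in the repository.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def wrap(kek: bytes, data_key: bytes) -> bytes:
    nonce = os.urandom(12)
    return nonce + AESGCM(kek).encrypt(nonce, data_key, None)

def unwrap(kek: bytes, wrapped: bytes) -> bytes:
    return AESGCM(kek).decrypt(wrapped[:12], wrapped[12:], None)

# Each server holds only its own key-encryption key; the repository holds only wrapped keys.
keks = {name: AESGCM.generate_key(bit_length=256) for name in ("web01", "db01")}
repo = {name: wrap(kek, os.urandom(32)) for name, kek in keks.items()}

print(unwrap(keks["web01"], repo["web01"]))    # web01 recovers its own data key
# unwrap(keks["web01"], repo["db01"])          # raises InvalidTag: web01 cannot read db01's key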
But hey, "It's Encrypted".