The great thing about encryption is that it turns your precious information into meaningless random gobbledegook, utterly useless to any adversary or bystander. The problem with encryption is that your precious information becomes useless to you too.

While encryption can protect data at rest against attack (and encryption of data in transit is table stakes), it usually means your encrypted data cannot be searched, filtered, sorted, analysed or reported on. It simply can't be used.

Curiously, if we accept the conventional framing of cybersecurity as a mix of Confidentiality, Integrity and Availability, then encrypted data at rest fails the test: it is not actually "secure" thanks to its unavailability!

Yet there are ways to encrypt your data and treat it too. One of the most important and fastest-growing categories of Privacy Enhancing Technology (PET) is protection for data in use (as distinct from data in transit and data at rest).

For many years, a number of esoteric and highly technical approaches have been available for hiding or de-identifying confidential data while retaining at least some of its utility. Best known amongst these is probably Differential Privacy, a process which judiciously injects noise into data sets such as medical records, corrupting the identifying details of the individuals concerned while preserving important statistical properties. Thus the data is still useful to researchers. Controversy remains over just how useful Differential Privacy-treated data is; there is a clear trade-off, in that the more noise is added, the less accurate the data becomes, even statistically.
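
To make the trade-off concrete, here is a minimal Python sketch of the Laplace mechanism that underlies many Differential Privacy deployments. The toy dataset, the choice of query (a mean) and the epsilon budget are all illustrative assumptions on my part, not a production implementation.

```python
# Minimal sketch of the Laplace mechanism behind Differential Privacy.
# The records, bounds and epsilon are illustrative assumptions.
import numpy as np

ages = np.array([34, 45, 29, 61, 50, 38, 42])  # toy "medical records"
true_mean = ages.mean()

epsilon = 0.5                  # privacy budget: smaller = more privacy, more noise
sensitivity = 100 / len(ages)  # one record can move the mean by at most this,
                               # assuming ages are bounded in [0, 100]

# Laplace mechanism: perturb the query result with noise scaled to sensitivity/epsilon
noisy_mean = true_mean + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

print(f"true mean: {true_mean:.2f}, DP-protected mean: {noisy_mean:.2f}")
```

Run it a few times and the tension is obvious: shrink epsilon for stronger privacy and the reported mean wanders further from the truth.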

The main conceptual competitor to Differential Privacy has been homomorphic encryption, a class of cryptographic algorithms in which mathematical operations can be performed on the ciphertext to produce valid outputs that are themselves still encrypted. Thus some degree of data processing remains possible with homomorphically encrypted data, such as reporting or sorting. In theory, Fully Homomorphic Encryption (FHE) leaves the ciphertext amenable to every mathematical operation and therefore fully functional data processing.
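
The idea is easiest to see in miniature. Textbook RSA happens to be multiplicatively homomorphic: multiplying two ciphertexts yields a valid encryption of the product of the plaintexts. The tiny hand-picked primes below are an assumption for readability; this is insecure by design and a far cry from a modern FHE scheme.

```python
# Toy demonstration of homomorphic structure in textbook (unpadded) RSA:
# E(a) * E(b) mod n = E(a * b). Insecure parameters, for illustration only.
p, q = 61, 53
n = p * q                # modulus (3233)
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent (requires Python 3.8+)

def encrypt(m: int) -> int: return pow(m, e, n)
def decrypt(c: int) -> int: return pow(c, d, n)

a, b = 7, 6
product_ct = (encrypt(a) * encrypt(b)) % n  # arithmetic on ciphertexts only
assert decrypt(product_ct) == a * b         # decrypts to 42, yet a * b was
print(decrypt(product_ct))                  # never computed in the clear
```

The catch, and the reason FHE took decades to achieve, is supporting addition and multiplication together, arbitrarily many times, without the ciphertext degrading.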

As a security professional, I have always been reluctant to recommend homomorphic encryption, because the cryptographic research community hasn't yet had time to subject these algorithms to the rigorous peer review and testing that distinguishes truly proven methods like RSA, AES and ECC. The theoretical feasibility of FHE was only first demonstrated in 2009, barely a decade ago. It's hard to say with confidence when security-certified homomorphic encryption will be commercially available. I feel many advocates of homomorphic encryption within the privacy community over-estimate its maturity. If security and confidentiality are truly your top priority, then it's probably safer to retain control of the data in a safe place where processing can continue to be performed locally or by a trusted service provider.

Which brings us to location. If information assets can be confined to a trusted safe place, within which desired data processing can be carried out (perhaps on the owner's behalf), then we may have superior protection for data in use. This is the approach of the newish Confidential Computing Consortium, launched by Google, Intel and Microsoft in 2019, with dozens more members joining since then.

“Confidential Computing” as promoted by the group uses secure hardware elements, specifically from the Trusted Execution Environment (TEE) standards family, within which data can be protected both at rest and in use. Think of a TEE as a secure black box, with network connections carrying data and instructions for processing that data. The TEE has self-contained cryptographic functions, including key generation and key lifecycle management. It can exchange keys with external parties so that encrypted data sent into the box can be safely decrypted inside, processed normally, and re-encrypted before being returned.
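
Here is a minimal Python sketch of that round trip. The key handling, the toy "processing" step and the use of the pyca/cryptography package are my own assumptions, and a real TEE would perform remote attestation before any keys were exchanged; treat this as a schematic, not an implementation.

```python
# Schematic of the TEE round trip: encrypt in, decrypt inside the boundary,
# process in the clear, re-encrypt out. Key sizes and workload are assumptions.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

enclave_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
client_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# 1. The client encrypts its data to the enclave's public key and sends it in.
inbound = enclave_key.public_key().encrypt(b"21", OAEP)

# 2. Inside the security boundary: decrypt, process normally, re-encrypt.
plaintext = enclave_key.decrypt(inbound, OAEP)
result = str(int(plaintext) * 2).encode()  # ordinary computation on plaintext
outbound = client_key.public_key().encrypt(result, OAEP)

# 3. Only ciphertext ever crosses the boundary; the client decrypts the answer.
assert client_key.decrypt(outbound, OAEP) == b"42"
```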

Ideally in my view, a TEE should be a physically separate, stand-alone box, so that there is a definite security boundary with only one way in and one way out. But individual hardware security modules (HSMs) are expensive, so virtual TEEs are a popular offering. This option is not unlike the "Cloud HSM" services now available from several providers, including AWS and Microsoft.

On the other hand, recent years have brought hybridization of Trusted Platform Modules (TPMs) and microcontroller units (MCUs), as in AMD's EPYC™ chips, greatly reducing the cost of high-reliability, high-speed hardware-based cyber security.

If a TEE falls into the wrong hands (virtually or physically), an attacker won't be able to access the contents without presenting the proper authentication credentials (which, generally speaking, ought to be cryptographically verified keys held in a smartcard or USB token). Thus it is reasonably safe to situate the TEE in a third-party data centre and access Confidential Computing as a service. In fact, high-grade data centres have better security than most businesses can afford, and are really the best place for a TEE or HSM.

It is straightforward (if somewhat expensive) to security-certify TEE hardware to standards such as FIPS 140-2 or Common Criteria (ISO/IEC 15408). Within a TEE's secure, access-controlled and tamper-resistant environment, verified tamper-resistant software can carry out proven conventional encryption processes. All mainstream algorithms are available in certified HSMs today, and more options are steadily emerging in advanced MCUs.

In this blog I’ve only touched on a few approaches to protecting data in use.  There are others such as Multi-Party Computation (and another I learned of just this month!).  For now, I will just summarise my opinion of homomorphic encryption versus Confidential Computing in secure hardware environments:

  • There is less likelihood of vendor lock-in with the way the Confidential Computing Consortium is set up. Data is likely to be intrinsically portable between service providers and partners, at the data controller’s discretion.
  • Cryptography is a slow and painstaking field. Homomorphic encryption is still new, with many years of peer review ahead of it before homologation.
  • Unless some of the new algorithms prove inherently quantum-safe, the search for quantum-resistant homomorphic encryption could become a very niche pursuit; I always urge enterprises to situate themselves in the algorithmic mainstream, avoiding novel or esoteric solutions.
  • Confidential Computing can promise full-function data processing today, whereas fully homomorphic encryption algorithms (in which any mathematical operation is possible) are not yet clear-cut and remain in development.
  • In principle Confidential Computing is faster and less risky to adopt because existing software and workloads should be portable without modification into a trusted execution environment. No new algorithms need to be used.