Prime Factors Blog

How Enterprises Bring Public Cloud, Encryption, and Value Together

Posted by Jeff Cherrington on Oct 17, 2014 12:10:00 PM

Enterprises are compelled to examine the unit cost advantages of processing with large public cloud providers like Amazon Web Services and Rackspace. Frequently, however, the processes that would return the greatest cost reductions if migrated to such clouds involve sensitive or regulated data. Giving a third party direct access to such data introduces new risks and new complications in conversations with regulators, auditors, and, most particularly, customers.

These complications become plain from even a glance at the terms offered by the cloud provider – for example, section 4.2 of the Amazon AWS Customer Agreement:

"You are responsible for properly configuring and using the Service Offerings and taking your own steps to maintain appropriate security, protection and backup of Your Content, which may include the use of encryption technology to protect Your Content from unauthorized access and routine archiving Your Content."

Plainly, when processing in a public cloud, encryption is needed.

Enterprises that find a way to enjoy the financial benefit of the cloud without compromising data privacy protection gain a significant competitive advantage. Extracting the value of securely migrating high-volume critical processing to the cloud relies on principles that are simple to state and potentially devilish in their implementation details.

The foundation, as discussed in a prior post and recommended by Amazon above, is that all sensitive or regulated data must be protected (i.e., encrypted) before it passes from the enterprise. Looking deeper into the available technology, it is plainly apparent that this is the easy part. The harder part is using that protected data in the cloud when it is needed, without giving the cloud provider’s staff access to it or to the decryption key.

For example, assume an enterprise wants to move its customer lifecycle tracking application, in which each customer’s interactions with the company are recorded, to a public cloud. Some of the customer data should be considered sensitive: email, home, and shipping addresses, plus phone numbers that might be fraudulently used for spearphishing; information about the payment means used for transactions; and so on. The enterprise’s IT group works with a cloud provider and ensures that the sensitive fields are encrypted before the application migrates to the cloud infrastructure, that the decryption key remains solely under the enterprise’s control, and that no copies of that key are shared with the cloud provider’s staff. Great start – but what happens the first time the enterprise needs that application to perform its function?
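
To make that concrete, here is a minimal sketch of field-level protection in Python, using the open-source cryptography package’s AES-GCM support. The record layout and field names are hypothetical; the point is the shape of the approach, in which sensitive fields are replaced with ciphertext before anything is handed to the cloud provider:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Hypothetical set of fields the enterprise classifies as sensitive.
    SENSITIVE_FIELDS = {"email", "home_address", "shipping_address",
                        "phone", "payment_token"}

    def encrypt_record(record, key):
        """Return a copy of the record with its sensitive fields encrypted."""
        aesgcm = AESGCM(key)
        protected = {}
        for field, value in record.items():
            if field in SENSITIVE_FIELDS:
                nonce = os.urandom(12)                 # unique nonce per encryption
                ct = aesgcm.encrypt(nonce, value.encode(), field.encode())
                protected[field] = (nonce + ct).hex()  # only ciphertext leaves the enterprise
            else:
                protected[field] = value               # non-sensitive fields stay clear
        return protected

    # The key is generated and retained inside the enterprise; it never
    # accompanies the migrated data.
    key = AESGCM.generate_key(bit_length=256)
    customer = {"customer_id": "C1001", "email": "pat@example.com",
                "phone": "+1-555-0100", "last_order": "2014-10-01"}
    migratable = encrypt_record(customer, key)

Passing the field name as associated data binds each ciphertext to its field, so a value cannot be silently swapped from one column to another.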

Another prior blog post, The Three Laws of Data in the Cloud, laid out the requirements for suitable architectures, paraphrased as:

  1. Never allow the cloud provider access to your decryption keys
  2. Never allow the decryption key to be written to any persistent storage at the cloud provider
  3. Conclusively wipe decryption keys in cloud applications once an intended use is complete

The successful model for enterprises computing in the cloud, then, decrypts the needed data only at the time it is used, by passing the decryption key, which remains under the sole control of the enterprise, into the active memory of the application in the cloud. The decryption key is held in a protected memory space, unavailable to the cloud provider’s staff. Only the sensitive data fields required for the task-at-hand are decrypted, the clear text representations of that data are likewise used only in active memory and, when processing is complete, that active memory space is conclusively wiped.
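
Here is a minimal sketch of that workflow, continuing the Python example above. Note the hedge in the comments: a garbage-collected runtime like CPython can only make a best-effort wipe, since the runtime may hold other copies of the bytes, so a production implementation would want lower-level control of memory:

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def with_decrypted_field(protected_hex, field, key_buf, use):
        """Decrypt one sensitive field with an ephemeral key, pass the clear
        text to the `use` callback, then overwrite our copy of the key."""
        blob = bytes.fromhex(protected_hex)
        nonce, ct = blob[:12], blob[12:]
        try:
            aesgcm = AESGCM(bytes(key_buf))  # key material lives only in this frame
            return use(aesgcm.decrypt(nonce, ct, field.encode()).decode())
        finally:
            for i in range(len(key_buf)):    # best-effort conclusive wipe of our copy
                key_buf[i] = 0

The key arrives as a mutable bytearray (see the transfer sketch below) precisely so that it can be zeroed in place once the task-at-hand completes.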

This implies a few things architecturally:

  • A means must be established, managed solely from the enterprise, to control access to the decryption keys even within the cloud provider’s infrastructure
  • There must be a means to securely transfer an ephemeral copy of the decryption key from the enterprise to the cloud provider (a sketch follows this list)
  • The key use and decryption process must be integrated into application processing in the cloud in a manner that prevents the cloud provider’s staff from accessing the decryption key or the decrypted data
  • Workflows must include steps that conclusively wipe clear text versions of sensitive data from memory after processing is complete – this may need to include interim wipes for long-running processes, so that sensitive data is not left lingering in active memory after it is no longer needed
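
As one illustration of the second point, an ephemeral key copy might be pulled over mutually authenticated TLS from a key service hosted inside the enterprise, so that the enterprise alone decides when, and to whom, keys are released. The endpoint, certificate paths, and response format below are all hypothetical:

    import ssl
    import urllib.request

    def fetch_ephemeral_key(key_id):
        """Fetch a key over mutually authenticated TLS, returning it as a
        mutable bytearray so it can be wiped in place after use (Law 3).
        The key is held only in process memory and is never written to the
        cloud provider's persistent storage (Law 2)."""
        ctx = ssl.create_default_context(cafile="/etc/enterprise/ca.pem")
        ctx.load_cert_chain("/etc/enterprise/client.pem")  # enterprise-issued identity
        url = "https://keys.enterprise.example/v1/keys/" + key_id
        with urllib.request.urlopen(url, context=ctx) as resp:
            return bytearray(bytes.fromhex(resp.read().decode()))

The returned buffer can be handed directly to with_decrypted_field() above, so the same copy of the key is conclusively wiped the moment processing completes.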

Putting these functions in place, supported by appropriate logging for audit review, establishes the basis for the enterprise to harvest the public cloud’s lower unit costs and more favorable accounting for critical processing, without sacrificing required data privacy.

We will be discussing this topic in at least a couple more posts in the coming weeks, as well as in a live webinar on October 28, when Prime Factors' Sr. Software Engineer, Keith Bucher, and I present Your Key to Safe Cloud Computing: Cryptographic Key Management, Data Protection, and the Cloud. The session expands on the obstacles and opportunities of cryptographic key management in the cloud. Click on the image below to register – space is limited.

Register for Your Key to Safe Cloud Computing Webinar