Engineers at Automation Logic (AL) are fortunate to be involved with a multitude of projects across various client sites, some of which are within the public sector. These projects make extensive use of Amazon Web Services (AWS), whether to meet application or networking infrastructure requirements. Events held by AWS are therefore a great way to gain insight into how its services can be applied to the projects engineers are currently assigned to.
I went to an AWS Public Sector Transformation Day on the 21st of November with a few other AL engineers. We all agreed that it was an excellent showcase of what AWS is capable of, as well as a great source of ideas on how to apply the best practices presented to our own client sites.
The morning consisted of two presentations, one on the importance of Agile in facilitating change within the public sector, and one on the wider digital transformation happening amongst government departments. The day then split into two tracks, made up of six presentations each – a Technology and Innovations Track, presenting technical showcases of AWS services, and a Modernising Government Leadership Track, presenting ways in which a platform-centric approach can be used to introduce modernity into the digital transformation process.
This post will focus on one presentation within the Technology and Innovations Track, covering the ways in which AWS enables the encryption of sensitive data. Encryption is a key element of the security best practices upheld across all of AL’s client sites.
AWS Encryption and Key Management for Data at Rest
This presentation was given by Tim Rains, the Regional Leader of the Security and Compliance Business Acceleration department at AWS. It opened by identifying the need to balance accessibility against security when designing high-level security controls. AWS’s implementation of this, Identity and Access Management (IAM), allows access permissions or restrictions to be set quickly via a web user interface, or when provisioning infrastructure using various infrastructure-as-code tools.
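As an illustration of how such access restrictions look in practice, the sketch below is a minimal IAM policy document granting read and write access to the objects in a single S3 bucket; the bucket name `example-reports-bucket` is hypothetical.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::example-reports-bucket/*"
    }
  ]
}
```

A policy like this can be attached to a user, group, or role through the web console or an infrastructure-as-code tool; anything not explicitly allowed is denied by default.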
The presentation then introduced a lower-level, key-based form of encryption: symmetric key encryption. Here, plaintext data is encrypted with a key, and the same key is used by another party to decrypt the previously encrypted data. This is shown in the diagram below:
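The same-key-both-ways idea can be sketched with a toy XOR cipher. This is purely illustrative (a one-time pad, not the AES-256 that AWS services actually use), but it shows the defining property of symmetric encryption: one shared key performs both operations.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the key; applying it twice restores the original."""
    return bytes(b ^ k for b, k in zip(data, key))

plaintext = b"sensitive record"
key = secrets.token_bytes(len(plaintext))  # the shared secret both parties hold

ciphertext = xor_cipher(plaintext, key)    # sender encrypts with the key...
recovered = xor_cipher(ciphertext, key)    # ...receiver decrypts with the same key

assert recovered == plaintext
```

Anyone holding the key can both encrypt and decrypt, which is exactly why how that key itself is stored and shared becomes the next problem to solve.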
The concept of envelope encryption was then introduced. Here, the data is encrypted with a data key (D1); D1 is then encrypted with a key encryption key (KE1), KE1 with another key encryption key (KE2), and so on, until the final key encryption key (KEn) is encrypted by a master key. This allows for multiple layers of encryption security, with the benefit of only having to keep track of one master key. This is shown in the diagram below, with arrows between keys signifying “is encrypted by”:
The presentation then shifted to AWS’s solutions for key management via the AWS Key Management Service (KMS), covering both client-side encryption, where data is encrypted before being sent to cloud storage, and server-side encryption, where data is encrypted after it arrives in cloud-based storage.
For client-side encryption, a tool called the Amazon S3 Encryption Client is used to securely encrypt the contents of Amazon Simple Storage Service (S3) buckets. Within the tool, data keys (D1s) are generated on a per-object basis. Any key encryption keys up to KEn are subsequently generated and encrypted using the process outlined above, with KEn then encrypted using a customer-provided master key. Metadata written alongside each S3 object records which master key to use when the customer later decrypts it. The same approach can be applied to attribute values within databases via the AWS DynamoDB Encryption Client, which can additionally generate signatures over attributes to detect any further changes to database contents. Both encryption clients are also fully compatible with AWS KMS, should one prefer to have keys generated and managed there.
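The client-side flow can be sketched end to end with an in-memory “bucket” and the toy XOR cipher (one wrapping layer rather than a full key chain, and not the real S3 Encryption Client API): a fresh data key per object, the wrapped key and the master key’s identifier stored in the object’s metadata, and decryption driven entirely by that metadata.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher standing in for AES."""
    return bytes(b ^ k for b, k in zip(data, key))

# Customer-held master keys, looked up by identifier.
master_keys = {"master-2019": secrets.token_bytes(32)}

def put_encrypted(bucket: dict, obj_key: str, plaintext: bytes, master_key_id: str):
    d1 = secrets.token_bytes(len(plaintext))      # fresh data key per object
    bucket[obj_key] = {
        "body": xor_cipher(plaintext, d1),        # object body encrypted with D1
        "metadata": {
            "wrapped_key": xor_cipher(d1, master_keys[master_key_id]),
            "master_key_id": master_key_id,       # names the master key for later
        },
    }

def get_decrypted(bucket: dict, obj_key: str) -> bytes:
    obj = bucket[obj_key]
    master = master_keys[obj["metadata"]["master_key_id"]]  # metadata says which key
    d1 = xor_cipher(obj["metadata"]["wrapped_key"], master)
    return xor_cipher(obj["body"], d1)

bucket = {}
put_encrypted(bucket, "report.csv", b"top secret rows", "master-2019")
assert get_decrypted(bucket, "report.csv") == b"top secret rows"
```

Because the master key never leaves the customer, the storage service only ever sees ciphertext, which is the essential guarantee of client-side encryption.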
For server-side encryption using S3 with customer-provided keys, the key and the data to be encrypted are both uploaded to S3, where the data is encrypted and the key then deleted. Metadata written to the object upon encryption ensures that the customer-provided key required on a future request matches the one used to encrypt the object before decryption proceeds. Other methods of server-side encryption involving S3-managed keys were also mentioned, but were not discussed in depth.
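That check-then-decrypt flow can be simulated as follows. This is a sketch of the idea rather than S3’s actual mechanism: the “server” keeps only a hash of the supplied key so it can verify later requests, and the key itself is discarded after encryption.

```python
import hashlib
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher standing in for AES."""
    return bytes(b ^ k for b, k in zip(data, key))

store = {}  # stands in for server-side S3 storage

def sse_c_put(obj_key: str, plaintext: bytes, customer_key: bytes):
    # Server encrypts with the supplied key, then discards the key,
    # keeping only a digest in metadata to verify future requests.
    store[obj_key] = {
        "body": xor_cipher(plaintext, customer_key),
        "key_digest": hashlib.sha256(customer_key).hexdigest(),
    }

def sse_c_get(obj_key: str, customer_key: bytes) -> bytes:
    obj = store[obj_key]
    if hashlib.sha256(customer_key).hexdigest() != obj["key_digest"]:
        raise PermissionError("supplied key does not match the one used to encrypt")
    return xor_cipher(obj["body"], customer_key)

key = secrets.token_bytes(32)
sse_c_put("invoice.pdf", b"%PDF-1.4 example", key)
assert sse_c_get("invoice.pdf", key) == b"%PDF-1.4 example"
```

The trade-off against client-side encryption is clear from the sketch: the plaintext and the key both pass through the server, so the customer is trusting the provider to encrypt and to delete the key as promised.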
The presentation provided a fantastic introduction to the world of data encryption using AWS services, as well as emphasising how vital encryption is to good security practice. That said, a more detailed explanation of server-side encryption using S3-managed keys would have been an interesting point to expand upon. With many more similarly detailed and useful presentations throughout the event, I’d wholly recommend attending an AWS Public Sector Transformation Day to anyone who works in DevOps and is keen to gain an insight into the capabilities of the cloud-based tools they use every day.