Managing Secrets in Terraform


Terraform is one of the leading tools for managing and implementing infrastructure as code. It does more than configuration management: the platform can orchestrate entire environments using declarative code rather than procedural scripts, and it requires no complex setup process to get started.

Terraform has a few aces up its sleeve, including its support for immutable infrastructure, so you don't have to manually destroy and rebuild every instance when you push updates to the cloud. Persistent storage and other long-lived resources can also be managed from within Terraform. As an added bonus, there are multiple ways to integrate Terraform into existing pipelines.

The real challenge is keeping infrastructure as code secure, and that means managing strings, parameters, and, most importantly, secrets meticulously. You cannot simply hard-code the credentials for your master user as username = user and password = pass. So how can secrets be managed in Terraform?

Basic Principles

Before we get to how best to manage secrets in Terraform, there are a few basic principles to get out of the way, starting with the fact that you must never put secrets in your .tf files. This is a big no, regardless of how the secrets are added. .tf files are shared, committed to version control, and can easily end up public, which poses a serious security risk to the entire cloud infrastructure.

You must also avoid storing secrets as plain text. This too is a rule that many infrastructure administrators and developers still neglect. No matter how secure the source file may be, adding secrets as plain text creates a security hole you cannot always plug. This includes secrets stored in .tfvars files.

It is also worth mentioning that any machine that has cloned your repository may already have a copy of your secrets on its local hard drive. If you have stored secrets in your .tf files or in plain text before, the first thing to do is revoke and regenerate those keys. The same applies to build servers such as Jenkins, which keep their own copies of the repository.

Securing Terraform

The next thing to prepare is Terraform itself. You need to make sure your Terraform setup is running securely, and that starts with keeping your .tfstate files out of reach, since state files record resource attributes, including secrets, in plain text. At the very least, be very strict about who has access to .tfstate files. It is also recommended to encrypt these files and to manage your Terraform state carefully.

The latter is very important. Remote backends like S3 and GCS support native encryption, but encryption alone is not enough if the .tfstate files are accessible from the outside. There are two main ways to isolate your state files. The first is isolation through workspaces: you create and select a dedicated workspace for each environment so that everything doesn't pile up in the default one.
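A minimal sketch of both pieces, assuming an S3 bucket named my-terraform-state and a DynamoDB lock table named terraform-locks (both placeholder names), is a remote backend with encryption enabled plus a dedicated workspace per environment:

terraform {
  backend "s3" {
    bucket         = "my-terraform-state"     # placeholder bucket name
    key            = "infra/terraform.tfstate"
    region         = "us-east-1"
    encrypt        = true                     # encrypt the state file at rest
    dynamodb_table = "terraform-locks"        # optional: state locking
  }
}

# Create (and switch to) a dedicated workspace instead of using "default"
terraform workspace new staging

# Switch back to it later
terraform workspace select staging

With a remote, encrypted backend, state never sits in version control, and each workspace keeps its own state object inside the bucket.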

Another way to secure Terraform state is to isolate it using a suitable file layout. In essence, you create a separate directory, with its own backend configuration, for each environment, so that state files are stored separately from production code and other resources. This too adds an extra layer of control and makes it easier to protect each environment's state.
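One possible layout, shown purely as an illustration, gives each environment its own directory and backend configuration, kept apart from the reusable modules:

live/
  staging/
    main.tf       # staging-only resources
    backend.tf    # staging state configuration
  production/
    main.tf
    backend.tf
modules/
  database/
    main.tf
    variables.tf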

Managing Secrets: The Best Methods

Now that Terraform is secure and access to sensitive files is managed carefully, it is time to secure the secrets themselves. There are multiple ways to do this, but the most common is to pass secrets through variables: first declare the variables that need to be passed in, then configure your resources to read the secrets from them.

If you use var.username and var.password as the variables for your database username and password, your Terraform configuration should look something like this:

resource "aws_db_instance" "example" { engine = "mysql" engine_version = "5.7" instance_class = "db.t2.micro" name = "securesecrets" # Set the secrets from variables username = var.username password = var.password } The export command will help you pass variables to Terraform, and you only need to add terraform apply to make sure the variables are used. You can then use password management tools like Pass or 1Password to store secrets as needed. You can integrate a good password management tool for automated secrets delivery too.

The second method is using encryption to secure files. In many cases, this is an easier method to implement because providers like Google and Amazon now offer built-in tools for managing secrets and encryption. You can, for instance, use AWS KMS to encrypt the files containing your secrets. The aws kms encrypt command will do most of the hard work for you.

Generate a db-creds.yml.encrypted file with your secrets stored in it, then decrypt it in Terraform and parse the YAML with yamldecode. Once configured, you can reference secrets through expressions such as local.db_creds.username, and you never have to store secrets as plain text again. Protecting the secrets is also a lot easier when the encrypted file is readable only by select users.
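As a rough sketch, assuming a KMS key aliased terraform-secrets (a placeholder) and a plain-text db-creds.yml containing username and password keys, the encryption step could look like this:

# Encrypt the plain-text file; the output is base64-encoded ciphertext
aws kms encrypt \
  --key-id alias/terraform-secrets \
  --plaintext fileb://db-creds.yml \
  --output text \
  --query CiphertextBlob > db-creds.yml.encrypted

The aws_kms_secrets data source then decrypts the file, and yamldecode turns the result into a usable object:

data "aws_kms_secrets" "creds" {
  secret {
    name    = "db"                                            # label for the decrypted value
    payload = file("${path.module}/db-creds.yml.encrypted")   # base64 ciphertext from the command above
  }
}

locals {
  # Parse the decrypted YAML so secrets can be read as local.db_creds.username, etc.
  db_creds = yamldecode(data.aws_kms_secrets.creds.plaintext["db"])
}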

The only downside to this method is that you have to decrypt and re-encrypt the file whenever you need to make changes. This is a price worth paying, especially since you should not be changing secrets that frequently. Tools like SOPS and Terragrunt make integrating secrets this way easier too, and they give you more flexibility in how the files are stored.
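For reference, a typical SOPS round trip with a KMS key (the ARN below is a placeholder) looks roughly like this; SOPS encrypts only the values, so the file remains diff-friendly, and editing it through sops handles the decrypt-and-re-encrypt cycle for you:

# Encrypt the values in db-creds.yml using a KMS key
sops --encrypt --kms arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID db-creds.yml > db-creds.enc.yml

# Later: decrypts the file, opens your editor, and re-encrypts on save
sops db-creds.enc.yml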

The last method is using secret stores, tools designed specifically for storing secrets. AWS Secrets Manager is a good example. With Secrets Manager, you can even use native IAM policies to manage access in a granular way. Instead of yamldecode, you use jsondecode to integrate the stored secrets with Terraform.
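Assuming the secret is stored in Secrets Manager under the name db-creds (a placeholder) as a JSON object with username and password keys, the Terraform side might look something like this:

data "aws_secretsmanager_secret_version" "db_creds" {
  secret_id = "db-creds"   # name or ARN of the secret in Secrets Manager
}

locals {
  # Parse the JSON secret string into an object
  db_creds = jsondecode(data.aws_secretsmanager_secret_version.db_creds.secret_string)
}

resource "aws_db_instance" "example" {
  engine         = "mysql"
  engine_version = "5.7"
  instance_class = "db.t2.micro"
  name           = "securesecrets"

  username = local.db_creds.username
  password = local.db_creds.password
}

Because the secret lives in Secrets Manager, rotation and access control happen outside the codebase, and IAM policies decide exactly who and what can read it.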

As long as you don't store secrets as plain text, you take the extra steps to secure Terraform itself, and you follow one of these three methods for managing secrets, you dramatically reduce the risk of your cloud infrastructure being compromised by leaked credentials. The methods discussed above also make managing and rotating secrets easier in the long run, resulting in higher development and deployment efficiency.

Caylent provides a critical DevOps-as-a-Service function to high growth companies looking for expert support with Kubernetes, cloud security, cloud infrastructure, and CI/CD pipelines. Our managed and consulting services are a more cost-effective option than hiring in-house, and we scale as your team and company grow. Check out some of the use cases, learn how we work with clients, and read more about our DevOps-as-a-Service offering.


Mauricio Ashimine
