Running Terramate with Terraform on GitHub Actions

Context

In the last blog, we spoke about Terramate and Stacks and how to implement them, and then we learned how to leverage Globals to generate Terraform code.

In this blog, we will understand Terramate in depth with a CI/CD pipeline.

Terramate is a code generator and orchestrator for Terraform that adds powerful capabilities such as code generation, stacks, orchestration, change detection, globals, and more.

Our focus in this guide is on demonstrating how to use Terramate in your GitHub Actions CI/CD pipeline to create AWS infrastructure from a Globals definition.

The goal of this blog is to achieve CI/CD with Terramate for your AWS infrastructure: as soon as you change your Terramate Globals configuration in your GitHub repo, GitHub Actions generates the code for three environments (Development, Staging, and Production) and applies the infrastructure (an S3 bucket) in your AWS account. In other words, you change the Globals definition once, and the pipeline applies the change to every stack.

As soon as you change your Globals definition file, a single push to the GitHub repo updates all three environments.

Agenda

  • Understanding Terramate

    • Terramate and its key features.

    • The Globals definition in Terramate.

  • Creating the Root Stack and required files

    • Walk through the process of building the primary stack using Terramate.
  • Implementing the GitHub Actions CI/CD Pipeline

    • Steps to integrate Terramate into the GitHub-Actions pipeline.

    • How the pipeline automates the deployment process across different environments.

  • Conclusion

    • Key takeaways from the blog post.
  • Relevant Links

    • Link to GitHub repo and other relevant documents.

Understanding Terramate

Terramate boosts Terraform with Stacks, Orchestration, Git Integration, Code Generation, and Data Sharing. It optimizes the Developer Experience (DX), streamlines workflows, and reduces time spent on infrastructure code. Unlike others, it generates native Terraform code for seamless integration.

Key Features:

  1. Stacks: Isolated units for efficient, collaborative, and safer IaC projects.

  2. Orchestration: Orders command execution based on criteria such as tags or changes.

  3. Git Integration & Change Detection: Orchestrate your stacks so only stacks that have changed within a specific pull request are executed.

  4. Code Generation: DRY code via automatic backend and provider configurations.

  5. Data Sharing: Effortless inheritance and sharing of data across stack hierarchy.

  6. Globals: One single file to generate multiple files for multiple environments.

We will use Globals heavily in this blog, so please go through our previous blog and the Terramate documentation for a better understanding.

For more details about Terramate, see our previous blog.

For Terramate Globals, see the official documentation.
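To make the Globals hierarchy concrete, here is a minimal sketch (the file paths and values are illustrative, not from this project): a global defined in a parent directory is inherited by every stack below it, and a child stack can override it.

```hcl
# terramate.tm.hcl (repo root) — inherited by all stacks below
globals {
  env        = "default"
  aws_region = "ap-northeast-1"
}

# Development/env.tm.hcl — a child stack can override an inherited global
globals {
  env = "development" # overrides the parent's value for this stack only
}
```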

Let’s get started…

Creating the Root Stack and required files

What do we mean by Root stack?

The root stack is the main stack in which the other three stacks (Development, Staging, and Production) are defined, just like a folder that contains other folders. Here we name the root stack “S3”.

  • Create an empty GitHub repo and clone it to your local.

  • Open your favorite IDE, we are using VS Code here.

  • Open the terminal in your IDE.

  • Type “terramate create S3”. This creates a stack named “S3” that will serve as the root stack; then change into that directory (“cd S3” in the terminal).

We are using the git-bash terminal here.

Remember to rename the root stack’s “stack.tm.hcl” to a more relevant name, as it will contain all your Globals; we are naming it “main.tm.hcl” for now.

  • Now create three more stacks inside “S3”, named Development, Staging, and Production, representing each environment stage in your organization (in your case there may be more stacks, but for this example we use three). Each stack is responsible for spinning up an S3 bucket.

Initially, each stack contains just one file, stack.tm.hcl (which is generated automatically); we will generate all the other required Terraform files.
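For reference, an auto-generated stack.tm.hcl looks roughly like this (the ID shown is an example only; Terramate generates a unique ID, and exact fields can vary by version):

```hcl
stack {
  name        = "Development"
  description = "Development"
  id          = "3271f37c-0e08-4b59-b205-1ee61c2cd08f" # example; yours will differ
}
```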

  • Now in the root’s main.tm.hcl add the below code:
globals {
  aws_provider_version = "4.27.0"
  aws_region           = "ap-northeast-1"
}

generate_hcl "provider.tf" {
  content {
    terraform {
      required_providers {
        aws = {
          source  = "hashicorp/aws"
          version = global.aws_provider_version
        }
      }
    }

    provider "aws" {
      region = global.aws_region
    }
  }
}

generate_hcl "main.tf" {
  content {
    resource "aws_s3_bucket" "Infrasity" {
      acl = "private"

      tags = {
        Name = "S3"
      }

      versioning {
        enabled = true
      }
    }
  }
}

generate_hcl "backend.tf" {
  content {
    terraform {
      backend "s3" {
        bucket  = "<bucket name>" # this bucket must already exist
        key     = "<location to store>" # use a distinct key per stack so states don't collide
        region  = "<your_region>"
        encrypt = true
      }
    }
  }
}
  • The code above generates three files in each stack: provider.tf defines the provider (AWS in our case), main.tf contains the S3 bucket configuration, and backend.tf holds the backend configuration.

Your main.tm.hcl file should look like this:

  • Now generate the Terraform code for all the stacks with the “terramate generate” command in the terminal. It generates all the configuration files defined in “main.tm.hcl” (in our case).

This is how your working directory looks after generating the configuration files.
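Assuming the steps above, the layout should look roughly like this (file names per the generate_hcl blocks; the Staging and Production stacks contain the same generated files as Development):

```text
S3/
├── main.tm.hcl
├── Development/
│   ├── stack.tm.hcl
│   ├── provider.tf
│   ├── main.tf
│   └── backend.tf
├── Staging/
│   └── ...
└── Production/
    └── ...
```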

Implementing the GitHub Actions CI/CD Pipeline

  • To run everything in the pipeline, create a folder “.github/workflows” in the root of the repo, and inside it create the files “terramate-plan.yml” and “terramate-apply.yml”. These will run your CI/CD pipeline on GitHub Actions.

Use these simple commands in your terminal to create the folder (.github/workflows) and the two files:

  1. “cd ..” to come out of the “S3” stack/folder.

  2. “mkdir .github” to make a directory named “.github”.

  3. “cd .github” to change into the “.github” directory.

  4. “mkdir workflows” to make a directory named “workflows” inside “.github”.

  5. “cd workflows” to change into “workflows”.

  6. “touch terramate-plan.yml” to make a file named “terramate-plan.yml”. This file runs on pull requests to the repo.

  7. “touch terramate-apply.yml” to make a file named “terramate-apply.yml”. This file runs on pushes to the main branch.

You can do either the above steps or just do it manually using your IDE GUI.
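If you prefer, the steps above collapse into two commands run from the repo root (“mkdir -p” creates the nested directories in one go):

```shell
# Create the nested workflows directory and both (empty) workflow files at once
mkdir -p .github/workflows
touch .github/workflows/terramate-plan.yml .github/workflows/terramate-apply.yml
ls .github/workflows   # should list both YAML files
```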

This is how your directory looks like after all the above steps:

Why are we using GitHub Actions (a CI/CD pipeline)?

Because as soon as you push changes to the GitHub repo, GitHub Actions runs the pipeline and updates your infrastructure.

So as soon as we change the S3 configuration in the root main.tm.hcl and run the “terramate generate” command locally, the changes propagate to all three environments.

Add the below code to your terramate-plan.yml file:

name: Preview Terraform Plan

on:
  pull_request:

jobs:
  preview:
    name: Plan
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          ref: ${{ github.head_ref }}
          fetch-depth: 0

      - name: setup go
        uses: actions/setup-go@v3
        with:
          go-version: "1.20"

      - name: Terramate install
        run: go install github.com/terramate-io/terramate/cmd/...@latest

      - name: Terramate List changed stacks
        id: list
        run: |
          echo "results<<COUT" >>$GITHUB_OUTPUT
          terramate list --changed >>$GITHUB_OUTPUT
          echo "COUT" >>$GITHUB_OUTPUT

      # The steps below run only if at least one stack has changed; otherwise nothing happens
      - name: Configure AWS Credentials
        if: steps.list.outputs.results
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ap-northeast-1

      - name: Create Terraform plan on changed stacks
        if: steps.list.outputs.results
        run: |
          terramate run --changed -- terraform init
          terramate run --changed -- terraform validate
          terramate run --changed -- terraform plan -out out.tfplan

      - name: Preview Comment For reviewer
        if: steps.list.outputs.results
        run: |
          terramate run --changed -- terraform show -no-color out.tfplan

      - name: Inform about no Changed Stacks
        if: (!steps.list.outputs.results)
        run: |
          echo "### No changed stacks."
  • The config above works in two ways:

    • It checks for changes with the “terramate list --changed” command, which lists all the changed stacks in the pushed branch that would be applied once the branch is approved.

    • It runs Terraform plan with “terramate run --changed -- terraform plan”: “terramate run” executes the command that follows it across the stacks, “--changed” restricts it to stacks with changes, and “terraform plan” plans the changes with Terraform.

    • For more info about “terramate run” and “--changed”, see the Terramate documentation.

Your “terramate-plan.yml” file should look like this now:

  • The config file checks all the changes in the pushed branch; after the repo’s reviewer accepts the infrastructure changes, the pipeline runs automatically via the “terramate-apply.yml” config file, which runs on every push to the main branch.

Add the below code to your terramate-apply.yml file:

name: Run Terraform Apply

on:
  push:
    branches:
      - main

jobs:
  preview:
    name: Apply
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          ref: ${{ github.head_ref }}
          fetch-depth: 0

      - name: setup go
        uses: actions/setup-go@v3
        with:
          go-version: "1.20"

      - name: Terramate install
        run: go install github.com/terramate-io/terramate/cmd/...@latest

      - name: Terramate List changed stacks
        id: list
        run: |
          echo "results<<COUT" >>$GITHUB_OUTPUT
          terramate list --changed >>$GITHUB_OUTPUT
          echo "COUT" >>$GITHUB_OUTPUT

      - name: Configure AWS Credentials
        if: steps.list.outputs.results
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ap-northeast-1

      - name: Terramate init on changes
        if: steps.list.outputs.results
        id: init
        run: |
          terramate run --changed -- terraform init

      - name: Terramate apply on changes
        if: steps.list.outputs.results
        id: apply
        run: |
          terramate run --changed -- terraform apply -auto-approve
  • The above config is a simple implementation of the terraform apply command through Terramate, applied only to the infrastructure changes after each merge to the main branch.

    • It uses “terramate run --changed -- terraform apply” to apply only the changes made to the infrastructure, for example changes to the S3 bucket tags or lifecycle required by the organization.

With this approach there are two scenarios. If the reviewer is happy with the changes, they approve the merge request after validating them; the “terramate-apply.yml” config file then runs and the changes are applied to the infrastructure. If the plan output on the pull request shows changes that were not supposed to happen, such as a destroy or an unwanted configuration change, the reviewer can decline the merge request and prevent them.

  • Now push all the work to GitHub and see the magic of the GitHub Actions CI/CD pipeline.

Always create branches when fixing or working on infrastructure code.

Note: Don’t forget to add your AWS credentials as secrets in your GitHub Actions settings for a seamless pipeline run. If you need help, see GitHub’s documentation on encrypted secrets.

There could be always two scenarios:

  1. Approving the PR

  2. Denying the PR

Approving the PR:

When the change is intended and required by the organization and the repo reviewer, such as a change to the tags of the “S3” bucket.

  • Make a branch from the working code, named “bugfix”.

  • Go to pull requests and check for the bugfix branch:

  • Click on it, and you will find “plan” is already running for you.

  • It checks all the changes the bugfix branch would make when merged into the main branch, and the reviewer can see the changes it would potentially make to the infrastructure.

  • On completion of the Plan job, the reviewer can view all the proposed infrastructure changes in the job output intended for review:

  • And can decide whether to approve or not.

  • In this case, it’s making a small change in the infrastructure:

  • That change seems acceptable, so the reviewer will approve it.

Denying the PR:

Whenever you work in a Git-based environment, work with branches and avoid making changes directly on the main branch.

  • For this scenario, we name our branch “features”; it was created by a developer.

  • The developer made changes to the infrastructure file and pushed them to GitHub on the “features” branch.

  • On reviewing this branch in the pull request, it was found that it destroys some important infrastructure, such as our S3 bucket, which would do serious damage to the organization.

  • Since the pull request shows what changes the branch would make when merged into the main branch, the reviewer can deny it and avoid unwanted hazards to the organization.

Check that your infrastructure is fine in AWS

  • Now let’s visit the AWS account and search for “S3” in the search bar.

  • Here you have your three separate S3 buckets deployed in your AWS account; everything looks fine here.

  • Everything seems fine, and the new tag has been added to each of the buckets:

Your complete Directory should look like this now:

Conclusion

Let’s walk through what we have done so far,

  1. We made an empty repo and cloned it to our local.

  2. We added a root stack named “S3”.

  3. We created 3 more stacks in this root stack, named

    1. Development

    2. Staging

    3. Production

Representing the different stages in an organization.

  4. And pushed everything to the GitHub repository, after generating all the configuration files with the “terramate generate” command.

  5. Made terramate-plan.yml and terramate-apply.yml files in the root to run all this on GitHub Actions: terramate-plan.yml on pull requests and terramate-apply.yml on pushes to the main branch.

  6. In the end, we learned how to utilize Terramate to automate our AWS infrastructure deployment.

Relevant Links

We understand that managing complex infrastructure can be tough, and making small changes without a lengthy process is important. That's why we recommend checking out Terramate and its examples.

Terramate has simplified our Terraform experience, and we believe it can do the same for you. So why wait? Give Terramate a try and improve your Terraform experience today!

Also, join their Discord channel for all future Updates about Terramate.
