Sunday, November 21, 2021

Continuous Integration for Infrastructure as Code using Azure DevOps, Terraform & Docker

 How DevOps & Infrastructure as Code can supercharge your cloud deployments

Most companies aspire to release their products and services faster and more reliably, and agile IT systems and solutions are an essential ingredient.

However, the reality is that many companies manage IT infrastructure using a combination of manual effort, complicated processes and a bit of hope and prayer.

As an Azure infrastructure architect with Telstra Purple, I have the opportunity to help a diverse range of organisations automate and streamline their cloud solutions.

Across these engagements, one of the biggest factors holding teams back has been a lack of automation within their cloud infrastructure stack.

Application teams had often begun to implement DevOps methodologies; however, without automated infrastructure, application deployment and testing still required a lot of manual and repetitive effort.

Quite simply, they had continuous integration but not continuous delivery.

This blog describes four key components required to power a fully automated, modern and streamlined infrastructure as code continuous integration and delivery pipeline.

Core Components

Infrastructure as Code (IaC)

One of the most critical but often overlooked components allowing customers to effectively implement DevOps and automate cloud systems is infrastructure as code.
IaC describes infrastructure in a code-based format which is stored in a version control system alongside existing application source code and deployed using continuous delivery and deployment pipelines.
Infrastructure as Code provides many benefits including…

  • Tight coupling between applications and their execution environments.
  • Code testing of infrastructure in the form of unit testing, functional testing and integration testing.
  • Application and infrastructure scalability.
  • Reduced code duplication.
  • More flexibility with disaster recovery solutions.
  • Immutable infrastructure preventing configuration drift.
  • Simplified change management (more standard changes).
  • Deployment speed, reducing a project's time to market.
  • More efficient use of IT staff time and resources.
  • Shorter and tighter development and testing feedback loops.
  • Cost savings as infrastructure is shut down when not in use.
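To make this concrete, the snippet below is a minimal, illustrative Terraform definition of an Azure resource group; the resource names are examples only. The point is that the infrastructure is described entirely in code that can be version controlled, reviewed and deployed through a pipeline.

# A minimal, illustrative example of infrastructure described as code
resource "azurerm_resource_group" "example" {
  name     = "rg-iac-demo"
  location = "australiaeast"
}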

Docker Containers
A Docker image is a lightweight, standalone, executable package of software which includes everything needed to run an application: code, runtime, system tools, system libraries and settings.
Docker makes it easy to create, deploy, and run portable, isolated and self-contained units of application code.

Infrastructure code executed inside containers allows developers to package up self-executing infrastructure modules, including dependencies and libraries, and ship them out as a self-contained unit which can rapidly be configured, deployed and updated.

Versioned Modules
Packaging infrastructure and dependencies inside containers becomes especially flexible when a solution-focused, modularised approach is applied to infrastructure code. Containers become infrastructure building blocks, connected and arranged in flexible configurations.

Containers are versioned and tagged making them easy to reference and reconfigure.
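For example, an execution environment image can be given an explicit version tag when it is built and pushed, so a specific infrastructure configuration can be referenced later (the registry name and version below are illustrative; the image itself is built in the steps that follow):

docker build -t iacrepo.azurecr.io/terraform-exec-env:1.0.0 .
docker push iacrepo.azurecr.io/terraform-exec-env:1.0.0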

Continuous Delivery
Integrating modularised infrastructure as code containers into a CI/CD system such as Azure DevOps provides the ability to tightly couple application code and the dependent underlying infrastructure into an automated end to end deployment solution.

Building a CI/CD Pipeline

Now it’s time to configure these core components into a continuous integration and delivery infrastructure as code pipeline.

Create a Terraform execution environment Docker container

Terraform relies on a number of core components and provider plugins which change frequently. Keeping these updated on each developer's PC is time-consuming and can cause inconsistencies.

A Docker image containing all of the necessary deployment utilities including Terraform, environment variables and other configurations is a fundamental component of a continuous delivery pipeline.

Step 1. Install Docker Desktop
Docker can be run on any machine type. For this example, I'll be using a Mac.
1. Download Docker from https://hub.docker.com/editions/community/docker-ce-desktop-mac/
2. Save the Docker.dmg file and launch it
3. Docker will then be available to run from the terminal

Step 2. Create a Terraform execution environment Dockerfile

A Dockerfile describes the configuration of a Docker image. This sample code builds a Docker image including the Terraform bundle. Ensure a file called terraform-bundle.hcl sits alongside the Dockerfile.
terraform-bundle is a Go-based tool for creating a Terraform installation package which incorporates the provider plugins configured in the terraform-bundle.hcl file.

# build step to create a Terraform bundle per our included terraform-bundle.hcl
FROM golang:alpine AS terraformbundler
ENV TERRAVER=v0.12.16
RUN apk --update add git unzip openssh-client && \
go get -d -v github.com/hashicorp/terraform && \
git -C ./src/github.com/hashicorp/terraform checkout $TERRAVER && \
go install ./src/github.com/hashicorp/terraform/tools/terraform-bundle
COPY terraform-bundle.hcl .
RUN terraform-bundle package -os=linux -arch=amd64 terraform-bundle.hcl && \
mkdir -p terraform-bundle && \
unzip -d terraform-bundle terraform_*.zip

What is the code actually doing? 

  • Pulls a base Alpine Golang image from Docker Hub and labels it as terraformbundler.
  • Adds git, unzip and openssh-client utilities using the apk package manager.
  • Downloads and checks out terraform.
  • Uses Go to install the terraform-bundle. 
  • Copies the local terraform-bundle.hcl file to the current folder inside the container. 
  • Runs terraform-bundle package and unzips the contents into a terraform-bundle folder. 
  • terraform-bundle.hcl - this file lists the Terraform version and provider plugins installed inside the Docker image.
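As a minimal example, the terraform-bundle.hcl might look like the following; the provider list and version constraints are illustrative only and should be adjusted to match your project.

terraform {
  # Core Terraform version to bundle - should match TERRAVER in the Dockerfile
  version = "0.12.16"
}

providers {
  # Provider plugins to include in the bundle (example versions only)
  azurerm = ["~> 1.37"]
  random  = ["~> 2.2"]
}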

Step 3. Create a container registry to store the container

A container registry is used to store the Docker image. In this example Azure Container Registry (ACR) is used, but Docker Hub can also be used.

Create Azure container registry

az acr create --resource-group myResourceGroup --name iacrepo --sku Basic

Login to container registry

az acr login --name iacrepo

 Step 4. Build a Dockerfile into an image

Once written, the Dockerfile needs to be built into an image, which can then be launched as a running container.

The -t flag tags the image in the standard repository/image format and the . signifies the path to the Dockerfile. Make sure to tag the image with the registry's login server name so that it can be pushed to that registry.

docker build -t iacrepo.azurecr.io/terraform-exec-env .

Step 5. Push container to ACR

docker push iacrepo.azurecr.io/terraform-exec-env

Developing infrastructure using the Terraform Execution Environment

Once the execution environment container has been built, it must be integrated into the infrastructure code development process.

There are three main ways to inject source code into a running container.

  1. Using the ADD command in the Dockerfile to add in a folder at build time
  2. Using the Docker cp (copy) command to inject your source code folder into a running container
  3. Mounting a source code repository as a volume inside the Docker container.

Mounting a volume is the most sensible and flexible method for this approach.

Terraform code is usually developed in an IDE outside the container, but always executed inside the container.

Step 1. Run Dockerfile and mount code repository as a volume

The following command creates a running container incorporating all the necessary Terraform components and the mounted repo, and launches an interactive bash shell session.

docker run -it -v "/Users/username/repo:/home/repo" iacrepo.azurecr.io/terraform-exec-env /bin/bash

Step 2. Run Terraform inside Docker container

The following commands:

  1. Change to the directory of the mounted volume code repository
  2. Set execution permissions on the bundled terraform executable (which may be necessary)
  3. Initialise the Terraform project, then run a plan and apply

cd /home/repo
chmod ugo+rwx /go/terraform-bundle/terraform
/go/terraform-bundle/terraform init
/go/terraform-bundle/terraform plan
/go/terraform-bundle/terraform apply

 It makes sense to add terraform to the system path so it can be called without specifying the full path.
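For example, the bundle directory could be appended to the PATH, either in the current shell session or baked into the execution environment Dockerfile:

export PATH=$PATH:/go/terraform-bundle    # in the running container's shell
# or equivalently, as a line in the execution environment Dockerfile:
ENV PATH="/go/terraform-bundle:${PATH}"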

Incorporating continuous delivery

Managing infrastructure code using the same tools and practices as traditional software development helps integrate the previously separate development and operation teams.

Infrastructure as code allows continuous delivery to be incorporated into continuous integration.

Application code can now be automatically deployed onto the necessary infrastructure each time it is compiled and built.

To facilitate this process, each Terraform project needs its own custom Dockerfile (separate from, but built upon, the Terraform execution environment).

Step 1. Create a custom project specific Dockerfile

The Dockerfile, located in the Terraform project root, builds a container with the latest source code and executes a plan or apply along with input variables, passed in as a build argument.

This creates a versatile and flexible infrastructure container which is integrated into Azure DevOps. For those familiar with Jenkins, it performs a similar function to a Jenkinsfile.

# Pull the base terraform execution environment image from the Azure container registry.
FROM iacrepo.azurecr.io/terraform-exec-env
# build argument used to pass and customise the Terraform command (e.g. plan or apply)
ARG terraform_cmd=plan
# persist the build argument as an environment variable so it is available when the container runs
ENV TF_CMD=${terraform_cmd}
# make folder where source code will be injected into
RUN mkdir -p /home/repo/
# set the working directory to this location
WORKDIR "/home/repo/"
# inject source code into container during build
COPY ./ /home/repo/
# initialise Terraform using a remote backend.
# Azure connection details for test subscription are stored in uat.tfvars
RUN /go/terraform-bundle/terraform init -backend-config uat.tfvars && \
/go/terraform-bundle/terraform validate
# run terraform automatically when the container starts, using the command supplied as a build argument
ENTRYPOINT /go/terraform-bundle/terraform ${TF_CMD}

Continuous integration and delivery for infrastructure code

Developing infrastructure code should follow the same methods and best practices as application code.

Applying continuous integration and delivery principles to infrastructure solutions ensures that code is regularly built, inspected and tested, and allows multiple developers to contribute to the same solutions.

After code is checked into a developer branch a pull request is raised to merge into a test branch.

A developer should paste the output of the Terraform plan into the pull request so it can be inspected by a reviewer.

Once the pull request is reviewed and approved, code is merged and a Terraform plan is run, the output of which can be compared to the expected results.

Once the CI build has completed successfully a Terraform release pipeline is queued. A gate can be configured to require approval before the release pipeline is executed and built.

Approval of the release pipeline build is subject to a successful code merge and acceptance of the Terraform plan.

The release pipeline is similar to the build pipeline except that a Terraform apply instead of plan is run.

Once the release pipeline has completed executing, infrastructure is built and can be inspected and tested.

The following section documents the Azure DevOps steps needed to configure continuous integration and delivery as described.
 

Step 1. Create a continuous integration build pipeline

The build is triggered once a pull request to merge branches is raised and accepted. The pipeline builds a Dockerfile into an image containing the Terraform project code and pushes it to Azure Container Registry.

There is a final Terraform plan step which outputs the results of a Terraform plan. This is useful for comparing against the plan steps a developer provided in the pull request.

The arguments section provides Terraform with command line arguments, in this case plan with the path to the input variable file.
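As a rough sketch, an azure-pipelines.yml for this build stage could look like the following. The service connection name, image repository name, branch name and variable file are assumptions and should be adjusted to suit your project.

trigger:
  branches:
    include:
      - test

pool:
  vmImage: ubuntu-latest

steps:
  # Build the project Dockerfile; the build argument bakes in the Terraform command to run
  - task: Docker@2
    displayName: Build Terraform project image
    inputs:
      containerRegistry: 'acr-service-connection'   # assumed ACR service connection name
      repository: 'terraform-project'               # assumed image repository name
      command: build
      Dockerfile: '**/Dockerfile'
      arguments: '--build-arg terraform_cmd="plan -var-file uat.tfvars"'
      tags: $(Build.BuildId)

  # Push the image to Azure Container Registry
  - task: Docker@2
    displayName: Push Terraform project image
    inputs:
      containerRegistry: 'acr-service-connection'
      repository: 'terraform-project'
      command: push
      tags: $(Build.BuildId)

  # Final plan step: running the container executes terraform plan and prints the results
  # (the registry host must match the service connection's login server)
  - script: docker run iacrepo.azurecr.io/terraform-project:$(Build.BuildId)
    displayName: Terraform plan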

Step 2. Set up a build validation branch policy

Enabling build validation ensures that the Docker build triggered by a pull request must succeed before the changes from the dev branch are merged into the test branch.
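If you prefer to script this rather than use the Azure DevOps portal, a build validation policy can be created with the azure-devops CLI extension along these lines; the organisation, project, repository ID and build definition ID below are placeholders you would need to supply.

az repos policy build create \
  --blocking true \
  --enabled true \
  --branch test \
  --repository-id <repository-guid> \
  --build-definition-id <build-definition-id> \
  --display-name "Terraform Docker build validation" \
  --queue-on-source-update-only false \
  --manual-queue-only false \
  --valid-duration 0 \
  --org https://dev.azure.com/<organisation> \
  --project <project>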

Step 3. Create a continuous delivery release pipeline

Once a successful Docker build and ACR push has occurred, infrastructure can be applied and built.

A continuous integration trigger must be enabled to ensure the release pipeline is run after a successful build. An approval gate can be added so that the deployment must be approved before it is executed.

The approver is usually the person who accepted and approved the initial dev to test pull request. 

The arguments section in the release pipeline passes a Terraform apply instead of a plan.
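Conceptually, the release stage re-runs the same project Dockerfile with an apply command baked in. Outside of Azure DevOps the equivalent commands might look like this (the image name, tag and variable file are illustrative only):

docker build -t iacrepo.azurecr.io/terraform-project:release --build-arg terraform_cmd="apply -auto-approve -var-file uat.tfvars" .
docker run iacrepo.azurecr.io/terraform-project:release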

 

The final end to end workflow

Once the core components are in place the new infrastructure development process looks like this.

1. An infrastructure developer runs a bash shell inside the containerised execution environment and mounts their Terraform project code as a volume.

2. Terraform code is always run from inside the container, which ensures that all developers are creating and testing code in identical environments.

3. When code is ready for testing and branch merging, a pull request is raised. The container is tagged with a version number or tag which allows specific infrastructure configurations to be easily referenced.

4. The developer copies the output of a terraform plan command into the pull request, which allows an approver to understand what a particular pull request will achieve.

5. Once the pull request is accepted, an Azure DevOps CI pipeline builds a container, executes a Terraform plan command and pushes the container into an Azure Container Registry (ACR).

6. After the plan is inspected, compared and accepted, a release pipeline is approved to perform a Terraform apply which deploys the infrastructure into a test environment.

7. Once the infrastructure deployment is successfully reviewed and tested it can be confidently integrated into existing application continuous delivery pipelines.

Conclusion

Continuous integration and delivery for infrastructure code has proved very beneficial to our customers as application and infrastructure teams now subscribe to the same methodologies, paving the way for tighter integration and more seamless DevOps adoption.

Time to market and reliability of cloud infrastructure deployments have significantly improved.

If your business would like to take advantage of the latest cloud technologies or has been struggling with persistent challenges, please reach out and set up a complimentary strategy session with one of our specialist consultants. 

