Azure Automation is an Azure service that lets you automate processes from within Azure. An automation account manages several other resources in order to achieve this. In this article I'll go over these features and how you can use them to automate a simple task.
Case Study
You are managing an application that constantly collects log files. To save storage costs, you have to clean the storage container regularly, say once every week. How can you automate this in Azure?
Everything I'll be showing can be done via the Azure portal, but we will be using Terraform instead.
Step 1 — Create your Azure Automation Account
As always with Azure, first create the resource group where all related resources will reside, followed by the automation account. I will explain the SystemAssigned identity bit next:
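Below is a minimal sketch of both resources; the names and location are placeholders, and the azurerm provider is assumed to be configured already.

```hcl
# Resource group that will hold everything involved in the task
resource "azurerm_resource_group" "logging" {
  name     = "rg-log-cleanup" # placeholder name
  location = "westeurope"     # placeholder location
}

# Automation account with a system-assigned managed identity
resource "azurerm_automation_account" "log_cleanup" {
  name                = "aa-log-cleanup" # placeholder name
  location            = azurerm_resource_group.logging.location
  resource_group_name = azurerm_resource_group.logging.name
  sku_name            = "Basic"

  identity {
    type = "SystemAssigned"
  }
}
```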
Step 2 — Set up Access
We used the system-assigned identity for access above. Some older automation accounts use service principals instead, but this brings up issues such as:
- Having to renew certificates every year to maintain access.
- The service principal is given access to the entire subscription, whereas a managed identity can simply be given access to only what it needs.
And more…
Using managed identities with automation accounts is a newer and very useful feature. We can use either user-assigned or system-assigned managed identities.
In this case, a system-assigned identity suffices, since it will not be used by anything other than this automation account; it is created while deploying the automation account (in the Terraform script above).
Now all we need to do is assign it the required role, using Terraform. Here, I will simply give it access to the whole resource group, which is where I'll be placing all the resources involved in the task:
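A sketch of that role assignment follows; the Contributor role is an assumption, and a narrower built-in role scoped to just the storage account would work as well.

```hcl
# Give the automation account's managed identity access to the resource group
resource "azurerm_role_assignment" "automation_rg_access" {
  scope                = azurerm_resource_group.logging.id
  role_definition_name = "Contributor" # assumption; a narrower role also works
  principal_id         = azurerm_automation_account.log_cleanup.identity[0].principal_id
}
```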
Note: in order for the runbook to access the storage account that contains the logs, that storage account must also be in this `logging` resource group.
Azure Automation Runbook
We first need to create this resource with Terraform, so that Terraform can manage it along with the others.
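A sketch of the runbook resource follows; the names are placeholders, and the content is read from a local script file that we will create next.

```hcl
# The runbook resource, published from the local script described below
resource "azurerm_automation_runbook" "log_cleaning" {
  name                    = "LogCleaning"
  location                = azurerm_resource_group.logging.location
  resource_group_name     = azurerm_resource_group.logging.name
  automation_account_name = azurerm_automation_account.log_cleanup.name
  runbook_type            = "PowerShellWorkflow"
  log_verbose             = true
  log_progress            = true
  description             = "Deletes log blobs from the logging storage container"

  # Reads the script from files/LogCleaning.ps1 next to the Terraform files
  content = file("${path.module}/files/LogCleaning.ps1")
}
```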
You would create a folder named `files` in the same directory as your Terraform files, and then create the script `LogCleaning.ps1` inside it.
This is what your PowerShell Workflow script would look like:
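The sketch below assumes the storage details (the parameter names `StorageCredentialsName`, `StorageAccountName` and `ContainerName` are my own) are passed in through the schedule, that the Az.Storage module is available in the automation account, and that every blob in the container can simply be deleted.

```powershell
workflow LogCleaning
{
    param (
        # All three values are passed in through the schedule (see below)
        [Parameter(Mandatory = $true)] [string] $StorageCredentialsName,
        [Parameter(Mandatory = $true)] [string] $StorageAccountName,
        [Parameter(Mandatory = $true)] [string] $ContainerName
    )

    # Fetch the credential asset that holds the storage account key
    $StorageCredentials = Get-AutomationPSCredential -Name $StorageCredentialsName

    InlineScript {
        # Build a storage context from the account name and key
        $key     = ($Using:StorageCredentials).GetNetworkCredential().Password
        $context = New-AzStorageContext -StorageAccountName $Using:StorageAccountName -StorageAccountKey $key

        # Parameters for the blob listing, kept together in a dictionary
        $Params = @{
            Container = $Using:ContainerName
            Context   = $context
        }

        # Delete every blob currently sitting in the log container
        Get-AzStorageBlob @Params | Remove-AzStorageBlob
    }
}
```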
This is a very simple script; you could add extras such as a try/catch block or a way to keep the last few log files. You can be as creative as you'd like!
Storage Credentials
You would notice the `StorageCredentialsName` parameter in the runbook. This is used to access the storage account where the logs live. We can set up these credentials using the Terraform block below, assuming the storage account that stores these files was already created in another Terraform block named `log_storage`:
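A sketch of that block; the asset name and description are placeholders, and the account key comes from the `primary_access_key` attribute that the azurerm provider exposes on storage accounts.

```hcl
# Stores the storage account's name and primary key as an Automation
# credential asset, so the runbook can read them at run time
resource "azurerm_automation_credential" "storage_credentials" {
  name                    = "StorageCredentials" # placeholder asset name
  resource_group_name     = azurerm_resource_group.logging.name
  automation_account_name = azurerm_automation_account.log_cleanup.name
  username                = azurerm_storage_account.log_storage.name
  password                = azurerm_storage_account.log_storage.primary_access_key
  description             = "Access key for the storage account holding the logs"
}
```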
Automation Schedule and Automation Job Schedule
This is what determines how often the runbook executes.
You would also notice the `Params` dictionary in the runbook. There are many ways to pass input parameters to a runbook, but I will be passing them through the schedule in this example, since we already need one.
We create a schedule first and then link it to the runbook using a job schedule.
We're again assuming the storage container that holds the logs was created earlier, in a Terraform block named `log_container`.
We also need to get your subscription ID first, using the snippet below:
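One way to do this, assuming the azurerm provider is already authenticated, is the `azurerm_subscription` data source:

```hcl
# Exposes the subscription the provider is authenticated against;
# its ID is available as data.azurerm_subscription.current.subscription_id
data "azurerm_subscription" "current" {}
```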
Now we will proceed as follows:
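The sketch below creates the weekly schedule and links it to the runbook with a job schedule; the timing values are assumptions, and the parameter keys are the lowercase names of the runbook parameters from the sketch above.

```hcl
# Runs once every week; day and timezone are assumptions
resource "azurerm_automation_schedule" "weekly" {
  name                    = "weekly-log-cleanup"
  resource_group_name     = azurerm_resource_group.logging.name
  automation_account_name = azurerm_automation_account.log_cleanup.name
  frequency               = "Week"
  interval                = 1
  week_days               = ["Sunday"]
  timezone                = "Etc/UTC"
  description             = "Triggers the log cleanup runbook once a week"
}

# Links the schedule to the runbook and passes the input parameters.
# Keys must be the lowercase names of the runbook's parameters; the
# subscription ID from the data source above could be passed the same
# way if your script needs it.
resource "azurerm_automation_job_schedule" "log_cleanup" {
  resource_group_name     = azurerm_resource_group.logging.name
  automation_account_name = azurerm_automation_account.log_cleanup.name
  schedule_name           = azurerm_automation_schedule.weekly.name
  runbook_name            = azurerm_automation_runbook.log_cleaning.name

  parameters = {
    storagecredentialsname = azurerm_automation_credential.storage_credentials.name
    storageaccountname     = azurerm_storage_account.log_storage.name
    containername          = azurerm_storage_container.log_container.name
  }
}
```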
And now you just need to run `terraform plan` followed by `terraform apply`, and you'd have your logs cleaned up every week without you doing a thing!