NashTech Insights

Create Module in Private Terraform Registry with Azure Pipeline

Rahul Miglani

Terraform Cloud provides a place to store your Terraform data: tfstate files live in workspaces, and complete module packages (all the .tf files) live in the registry. The registry can be public or private, and here we will use the private registry to create a Terraform module.


Prerequisites:

  1. A Terraform Cloud ID
  2. An Azure DevOps ID
  3. A GitHub ID
  4. An understanding of APIs

Basic Setup for Terraform Cloud Module

We set up each environment one by one before creating the pipeline.
1. Create a token on Terraform Cloud to access the API:
1.1 Go to Settings –> Teams –> Team API Token.
1.2 Click Create a team token.
1.3 Select the number of days the token should stay valid, then click to generate the token.
Now save the token for future use.
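To make the later API calls concrete, here is a minimal sketch of keeping the saved token at hand. The variable name TERRAFORM_REGISTRY_TOKEN matches the one used in the pipeline later in this article; the token value shown is a placeholder, not a real token.

```shell
# The generated team token is used as a bearer token in every API call below.
# Locally it can live in an environment variable; in Azure DevOps it would be
# a secret variable in a variable group instead.
export TERRAFORM_REGISTRY_TOKEN="xxxxxx.atlasv1.example-not-a-real-token"

# Sanity check: the variable is set and non-empty.
echo "token is set: ${TERRAFORM_REGISTRY_TOKEN:+yes}"
```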

[Screenshot: Terraform Cloud team settings page]
[Screenshot: generated Terraform Cloud token]

2. Now we need to check that the token has access to create modules:
2.1 We can do a GET request and inspect the access.
In the following image, I used Postman to run a GET request on the organization endpoint, "https://app.terraform.io/api/v2/organizations/<organisation-name>".

Since we generated a token earlier, it is used as the bearer token, and the header "Content-Type: application/vnd.api+json" is set.

[Screenshot: Terraform module GET request output in Postman]

As you can see, "data.attributes.permissions.can-create-module" is listed as true, so we can create a module with this token. Once created, the module becomes the storage location for all versions of the Terraform module packages.
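The same permission check can be scripted instead of done in Postman. The curl call below (shown as a comment) targets the organizations endpoint from the Terraform Cloud API; the response body used here is an abbreviated, illustrative sample, reduced to the one attribute we care about:

```shell
# The real check would be (token and organization name as set up earlier):
#
#   curl --header "Authorization: Bearer $TERRAFORM_REGISTRY_TOKEN" \
#        --header "Content-Type: application/vnd.api+json" \
#        https://app.terraform.io/api/v2/organizations/<organisation-name>
#
# An abbreviated, illustrative response body:
response='{"data":{"attributes":{"permissions":{"can-create-module":true}}}}'

# Extract the flag with grep/cut (no jq needed for a quick check):
can_create=$(printf '%s' "$response" | grep -o '"can-create-module":[a-z]*' | cut -d: -f2)
echo "$can_create"
```

If the value printed is true, the token is allowed to create modules and the pipeline below will work.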

3. Connect GitHub to Azure DevOps
3.1 To use the repository with all the source code, you need to create a service connection for GitHub.
3.2 Create a personal access token on GitHub with repo access.
3.3 Open Service connections in the project settings and choose GitHub.
3.4 Select the personal access token method.
3.5 Authorize.
Alternatively, we can add the token directly to our pipeline without going through the project settings, as done here.

[Screenshot: GitHub personal access token]
[Screenshot: authorizing the GitHub connection]
[Screenshot: adding the GitHub access token]

Creation of Module

  1. We add the repository in the Azure pipeline to access the YAML file we created.
  2. We then modify the contents of the pipeline as needed, such as the name and location of pipeline.yml.

After authorization, Azure DevOps lets us select the required repository.

Now we add the location and save the pipeline.

[Screenshot: pipeline identification]

Now run the pipeline. The pipeline requires a module.json file; we use the following content.

[Screenshot: Terraform module JSON]
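The exact JSON from the screenshot is not reproduced in the text. As a sketch, a minimal module.json modeled on the Terraform Cloud registry-modules API could look like the following; the module name "network" and provider "azurerm" are placeholder values, not taken from the article:

```shell
# Write a minimal module.json payload for the registry-modules POST request.
# "network" and "azurerm" are example values only.
cat > module.json <<'EOF'
{
  "data": {
    "type": "registry-modules",
    "attributes": {
      "name": "network",
      "provider": "azurerm",
      "registry-name": "private"
    }
  }
}
EOF
echo "module.json written"
```

The "registry-name": "private" attribute matches the private registry this article targets.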

At first the registry contained no module; after the run, one module has been created.

[Screenshot: empty Terraform registry]
[Screenshot: Terraform module in the registry]

We took our references from the Terraform Cloud documentation.

The script in the pipeline is used in the following way:

jobs:
  - job: "TerraformPackaging"
    steps:
      - checkout: self
        displayName: Clean Checkout
        clean: true
      - script: |
          curl --header "Authorization: Bearer $(TERRAFORM_REGISTRY_TOKEN)" \
               --header "Content-Type: application/vnd.api+json" \
               --request POST \
               --data @${{ parameters.JSON_MODULE_LOCATION }} \
               https://app.terraform.io/api/v2/organizations/$(TF_ORGANIZATION)/registry-modules | jq -r .
        displayName: "Terraform Registry Module Creation"
        name: "ModuleCreation"
It is a POST request that uses the information in the module.json file to tell Terraform Cloud to create the module and the version we want. The variables used inside the URL are taken from a variable group defined in the pipeline.
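After the pipeline runs, the result can be verified with a GET on the same registry-modules endpoint. This is a sketch: the curl call is shown as a comment, and the response body below is an abbreviated, illustrative sample with a placeholder module named "network":

```shell
# To list the organization's registry modules (token and organization as above):
#
#   curl --header "Authorization: Bearer $TERRAFORM_REGISTRY_TOKEN" \
#        --header "Content-Type: application/vnd.api+json" \
#        https://app.terraform.io/api/v2/organizations/<organisation-name>/registry-modules
#
# An abbreviated, illustrative response listing one module:
response='{"data":[{"attributes":{"name":"network","provider":"azurerm"}}]}'

# Pull out the module name to confirm it now exists in the registry:
name=$(printf '%s' "$response" | grep -o '"name":"[^"]*"' | cut -d'"' -f4)
echo "$name"
```

An empty data array before the run and a populated one afterwards corresponds to the two registry screenshots above.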

Rahul Miglani

Rahul Miglani is Vice President at NashTech and Heads the DevOps Competency and also Heads the Cloud Engineering Practice. He is a DevOps evangelist with a keen focus to build deep relationships with senior technical individuals as well as pre-sales from customers all over the globe to enable them to be DevOps and cloud advocates and help them achieve their automation journey. He also acts as a technical liaison between customers, service engineering teams, and the DevOps community as a whole. Rahul works with customers with the goal of making them solid references on the Cloud container services platforms and also participates as a thought leader in the docker, Kubernetes, container, cloud, and DevOps community. His proficiency includes rich experience in highly optimized, highly available architectural decision-making with an inclination towards logging, monitoring, security, governance, and visualization.
