How do I use an imported terraform.tfstate file in an Azure DevOps setup? I mean, how can we connect to the same terraform.tfstate file that was imported on a local machine? How do we migrate the terraform.tfstate file to Azure Storage and connect to it in the 'terraform init' phase instead of creating a new terraform.tfstate? (The main purpose is to import the existing resources into terraform.tfstate and then use that imported terraform.tfstate file in Azure DevOps.)
Hi,
You need to create a state backend in Azure Storage first.
Here is an article on how to do that: Store Terraform State Files in Azure Remote Backend - Learn IT And DevOps
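If you prefer the CLI over the portal, a minimal sketch of creating that backend with the Azure CLI could look like the following. The names match the backend block further down; the location and SKU are my own assumptions:
# create the resource group, storage account and blob container that will hold the state
az group create --name myrggroup --location westeurope
az storage account create --name mystorageaccount --resource-group myrggroup --sku Standard_LRS
az storage container create --name terraformbackend --account-name mystorageaccount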
Then, you need to update your code. The Azure Storage backend configuration should be added to every root module. This is the backend block inside the terraform block. In the article's example, it is this bit:
terraform {
  backend "azurerm" {
    resource_group_name  = "myrggroup"
    storage_account_name = "mystorageaccount"
    container_name       = "terraformbackend"
    key                  = "deploy.tfstate"
  }
  ...
}
Then you run terraform apply. Terraform will notice that the backend configuration has changed and offer to copy the state file from your local disk. Double-check your backend block content and accept. Add a line in your .gitignore to stop automatically committing that file if present (I actually use https://www.toptal.com/developers/gitignore/api/terraform to make my life easier). Delete the terraform.tfstate files. Order beer and pizza. Repeat for each root module.
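For reference, the relevant patterns from that generated .gitignore are along these lines (a sketch, not the full generated file):
# keep local state and backups out of the repo once the backend is remote
*.tfstate
*.tfstate.*
.terraform/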
Hi Ohmer,
Thanks for the quick reply.
I still have a question: once we update the backend config to use Azure Storage, how will the next Azure resource provisioning happen using Azure DevOps?
How will Azure DevOps know that it does not have to create a new tfstate file but should use the existing one, as shown by you in the terraformbackend container?
Here is the link I am using for Azure DevOps:
Thanks
How you run Terraform and where you store your state are two independent things. After you add the backend block and commit, it is the same stack, and Terraform will access Azure Storage whether you run it from the CLI or inside an Azure DevOps pipeline container.
Where there is an impact is credentials: regardless of how you run it, the Terraform process itself must have the appropriate credentials to run successfully. By adding a state backend, more permissions are required. This might work in your terminal because the permissions are already in your context (often some environment variables), but it might fail in Azure DevOps because the privileges are not there. Credentials are often static values defined as pipeline variables which are exported to environment variables, so it is the same interface from Terraform's point of view; this is not ideal but very common.
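As a sketch, the variables in question are typically the ones the azurerm provider and backend read from the environment; in a pipeline they are usually mapped from secret variables (the values below are placeholders, not real credentials):
# service principal credentials read by the azurerm provider and backend
export ARM_CLIENT_ID="<service-principal-app-id>"
export ARM_CLIENT_SECRET="<service-principal-secret>"
export ARM_SUBSCRIPTION_ID="<subscription-id>"
export ARM_TENANT_ID="<tenant-id>"
# optional: storage account access key if the backend does not authenticate via the service principal
export ARM_ACCESS_KEY="<storage-account-access-key>"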
Can you please share an example or URL where someone has migrated a local state file to an Azure container and set up an Azure DevOps pipeline using the migrated state file instead of creating a new state file?
My problem is that I have a few resources already created manually in an Azure resource group. I need to import these resources into a local state file, then migrate it to an Azure container and set up a DevOps pipeline for automated CI/CD deployment of infra as code.
It would be a big help if you could share a few links I can refer to.
Thanks
You can import your resources into your state before or after moving the backend. I would import first, make sure there is no drift, and then move. If you have no drift after moving the backend, it means you achieved the expected result. If you do it the other way around, you will need to make sure you have the same drift before and after.
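For example, importing a manually created resource group into the local state could look like this (the resource address and the Azure resource ID are hypothetical; adjust them to your own resources, and make sure the matching resource block exists in your code first):
# attach the existing Azure object to the resource declared in your code
terraform import azurerm_resource_group.existing /subscriptions/<subscription-id>/resourceGroups/my-existing-rg
# verify there is no drift before moving the backend
terraform plan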
I couldn't find an example that describes how to migrate from local to Azure Storage. There is an example of migrating from local to the Terraform Cloud remote backend, though (link below). If you adapt that process to use Azure Storage (different backend block), that's what I described. Note that I made an error in my description: you need to run terraform init before terraform apply to copy the state file to the remote backend.
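Concretely, after adding the backend block, a sketch of that migration step looks like this:
# re-initialize with the new azurerm backend; Terraform detects the change
# and asks whether to copy the existing local state to Azure Storage
terraform init
# on recent Terraform versions you can request the migration explicitly
terraform init -migrate-state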