Sharing state between pipelines

Hi! I'm faced with the following issue:
I have a project that consists of two products (x and y) which use a lot of the same Azure resources but have slightly different needs. In different use cases I might need to deploy just x, just y, or both x and y. Since they share many of the same resources, I need to place them in the same resource group, and I want to use a single state file to maintain the project.

I have set up three pipelines, one for the common resources and one each for x and y, all initialized against the same backend blob. However, after running the common one I would like to apply the resources for x and y, but the corresponding pipelines overwrite the state and delete the common resources before deploying the specific x or y resources.

How can I, if possible, add to the common state blob from the pipelines of x and y?
Best regards!

Hi @Henrikkiaer!

As far as I understand, you have the following setup from the Terraform point of view:

- project “common” with its own Terraform code in a separate folder
- project “X” with its own Terraform code in a separate folder
- project “Y” with its own Terraform code in a separate folder

where all three TF projects (folders) use their own unique codebases,

and an azurerm backend config with the same settings for all three TF projects.
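For example, a shared backend block like this, repeated in all three folders, would make every project read and write the same state file (a minimal sketch; the storage account, container, and key names are placeholders, not values from your setup):

```hcl
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-tfstate"                 # placeholder names
    storage_account_name = "sttfstate"
    container_name       = "tfstate"
    key                  = "project.terraform.tfstate"  # same key in all three folders
  }
}
```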

In such a case, each TF project does not know about the others' codebases, and since all three share the same state, when an apply is executed for one project, Terraform destroys the resources that are not described in that project's codebase.

If my understanding is correct, then to mitigate your issue, you might try three different states (one per project) together with the terraform_remote_state data source, so that products X and Y can read the information about their common resources.
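A minimal sketch of that approach, assuming the common project has its own state key (e.g. common.terraform.tfstate) and exposes what X and Y need as outputs; all resource and backend names below are placeholders, not taken from your configuration:

```hcl
# In the "common" project: expose the shared resources as outputs
output "resource_group_name" {
  value = azurerm_resource_group.common.name   # placeholder resource in the common codebase
}

# In project X (and likewise in Y): use a distinct state key in its own
# backend block, then read the common project's outputs via remote state
data "terraform_remote_state" "common" {
  backend = "azurerm"

  config = {
    resource_group_name  = "rg-tfstate"                  # placeholder backend settings
    storage_account_name = "sttfstate"
    container_name       = "tfstate"
    key                  = "common.terraform.tfstate"    # key of the common project's state
  }
}

resource "azurerm_storage_account" "x" {
  name                     = "stproductx"                # placeholder
  resource_group_name      = data.terraform_remote_state.common.outputs.resource_group_name
  location                 = "westeurope"
  account_tier             = "Standard"
  account_replication_type = "LRS"
}
```

With separate state keys, an apply in the X or Y pipeline can no longer delete the common resources; those pipelines only read the common outputs instead of managing that state.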
