Hello!
My team is trying to use Terraform in conjunction with Jenkins to manage some of our infrastructure, and we are running into a problem with `terraform plan` and remote state. We have set up a remote backend using an S3 bucket that is accessible both from Jenkins and from our local machines.
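For reference, a remote S3 backend of the kind described is typically declared like this (the bucket name, key, and region below are placeholders, not our real values):

```hcl
terraform {
  backend "s3" {
    bucket = "example-terraform-state"   # placeholder bucket name
    key    = "infra/terraform.tfstate"   # placeholder state file path
    region = "us-east-1"                 # placeholder region
  }
}
```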
Our process is to run `terraform plan` locally to view changes and have Jenkins run the actual apply. It appeared to be working, except that running `terraform plan` locally shows that it will create the whole architecture, with nothing deleted or modified, so it doesn't seem to be checking against the remote state. Running `terraform state show RESOURCE`, however, does reference the remote state and shows an accurate representation of the resource.
I confirmed this by setting `TF_LOG` to `DEBUG` and running both the `state show` and `plan` commands: while `state show` accesses the AWS S3 bucket, `plan` does not. This means we cannot see what changes will happen before running the Jenkins job. (The Jenkins job itself is correctly updating the architecture rather than recreating it.)
How can we get `terraform plan` to run against the remote state?
It is worth noting that we are using workspaces and are selecting the same workspace both locally and on Jenkins. We tried to start from a clean slate locally by deleting the `.terraform` directory, running `terraform init`, and then selecting the workspace, but we still have the same discrepancy between `terraform plan` and `terraform state show`.
FIGURED IT OUT
The problem was that we set additional environment variables for one of the providers. Locally we were targeting a different provider configuration, which is why the plan wanted to create everything from scratch. If you run into a similar issue, check which variables you are passing via `TF_VAR_` environment variables and make sure they match the ones set in your CI tool. :)
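One quick way to catch this class of mismatch is to dump the `TF_VAR_` variables in each environment and diff them. A minimal sketch (the variable name below is purely illustrative, not one of our real variables):

```shell
# Hypothetical provider-related variable, exported for illustration
export TF_VAR_provider_account="staging"

# List all TF_VAR_ variables in this shell, sorted for easy diffing.
# Run the same command locally and on the Jenkins agent, then compare
# the two outputs to spot any variable that differs between them.
env | grep '^TF_VAR_' | sort
```

Any variable that appears in one environment but not the other (or with a different value) is a candidate for the kind of divergence we hit.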