Migrating production and staging state files to AWS S3 backend

Hi.

We have a working setup that lets us plan and apply against a production and a staging environment.

Our local directory structure keeps all the .tf files in the project root, with production and staging sub-directories containing the per-environment .tfvars, .tfplan, .tfstate, and .tfstate.backup files.
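For reference, the layout described above looks roughly like this (the individual file names are assumed):

```
project-root/
├── main.tf
├── variables.tf            # ...and the rest of the shared .tf files
├── production/
│   ├── production.tfvars
│   ├── production.tfplan
│   ├── terraform.tfstate
│   └── terraform.tfstate.backup
└── staging/
    ├── staging.tfvars
    ├── staging.tfplan
    ├── terraform.tfstate
    └── terraform.tfstate.backup
```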

Everything except the .tfplan files is committed to a repo.

We have a backend set up (an S3 bucket and a DynamoDB table for state locking).
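A minimal sketch of the backend block we're assuming here — bucket, region, and table names are placeholders, and the key shown is for the production environment:

```hcl
terraform {
  backend "s3" {
    bucket         = "my-terraform-state"          # placeholder bucket name
    key            = "production/terraform.tfstate"
    region         = "eu-west-1"                   # placeholder region
    dynamodb_table = "terraform-locks"             # placeholder lock table
    encrypt        = true
  }
}
```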

We want to migrate the current states to the backend.

And here we are stuck.

The terraform init -reconfigure command does not seem to let us specify the location of the existing .tfstate file (production/terraform.tfstate or staging/terraform.tfstate).
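For concreteness, this is the gap: init only considers the default local state in the working directory, and the state-path flag we would want does not exist (the second command below is hypothetical):

```shell
# What we ran:
terraform init -reconfigure

# What we would want, but init has no such option:
terraform init -reconfigure -state=production/terraform.tfstate
```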

Are there any good blogs around on handling multiple environments the way I have described? Essentially the same resources, just different sizes/names driven by the .tfvars files.

Regards,

Richard Quadling.

I don’t have a blog.

Are you using different, unique keys when configuring the backend for each environment? The bucket name can be the same, but you'll need different keys so your state objects remain separate.
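For example, with partial backend configuration you can supply the key per environment at init time, and an existing local state file can then be uploaded explicitly. This is a sketch, not a tested recipe — the keys shown are assumptions:

```shell
# One key per environment via partial backend configuration:
terraform init -reconfigure -backend-config="key=staging/terraform.tfstate"

# With the backend initialised, an existing local state file can be
# uploaded explicitly (terraform state push overwrites remote state,
# so use with care):
terraform state push staging/terraform.tfstate
```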

Yes, we do. I think we found a solution: delete the locally cached backend state file in the .terraform directory.

That way, each time we run the process locally, we get the correct backend state for the environment we are working on.
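That workaround could be sketched as a small wrapper — the script name and key layout here are hypothetical:

```shell
#!/bin/sh
# switch-env.sh (hypothetical name): re-bind the local backend cache
# to the given environment. Usage: ./switch-env.sh production
set -eu
ENV="$1"   # "production" or "staging"

# Drop the cached backend state so init doesn't reuse the previous key.
rm -f .terraform/terraform.tfstate

# Re-initialise against this environment's key so subsequent plan/apply
# commands read the right remote state.
terraform init -reconfigure \
  -backend-config="key=${ENV}/terraform.tfstate"
```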