Configure backend remote S3 bucket access

I'm using a GitOps-style deployment for some AWS SSM resources, deployed with Terraform from a cloud-based SCM tool.

I want to store the state file in a remote S3 bucket in a different account from the one I'm deploying to, but I'm not sure how to allow the access or how to define this in Terraform.

I'm connecting to the target account via an OIDC connection defined in the pipeline, which then assumes a role within that account.

How can I configure Terraform to put the state file in a different AWS account?

It is possible to point your remote state backend at resources that are not in the same AWS account. Check the documentation on the S3 backend, in particular the sections on Credentials and Shared Configuration and on Multi-account AWS Architecture.
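To allow the access itself, the role your pipeline assumes in the deployment account needs S3 permissions on the state bucket, and the bucket (in the state account) needs a resource policy that trusts that role. A minimal sketch of such a bucket policy — the account ID, role name, bucket name, and key prefix are all placeholders for your own values:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowDeployRoleToListStateBucket",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:role/deploy-role" },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::org-terraform-state"
    },
    {
      "Sid": "AllowDeployRoleToReadWriteState",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:role/deploy-role" },
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::org-terraform-state/ssm/*"
    }
  ]
}
```

The deploy role also needs a matching identity policy in its own account granting the same S3 actions on that bucket ARN, since cross-account S3 access requires both sides to allow it.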

You should also become familiar with Partial Configuration, as you will likely need it.

In short, the backend configuration is where the backend state store is configured; it can use a different location, different credentials, even a different cloud platform from the one you deploy resources to. As far as Terraform is concerned, they are completely separate things.
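That separation looks roughly like this — a sketch only, assuming Terraform 1.6+ (which supports the nested `assume_role` block on the S3 backend); bucket names, regions, and role ARNs are placeholders:

```hcl
terraform {
  # State lives in the *state* account, using its own credentials
  backend "s3" {
    bucket = "org-terraform-state"
    key    = "ssm/terraform.tfstate"
    region = "eu-west-1"

    # Have the backend assume a role in the state account
    assume_role {
      role_arn = "arn:aws:iam::222222222222:role/terraform-state-access"
    }
  }
}

# The provider is configured independently and targets the *deployment* account
provider "aws" {
  region = "eu-west-1"

  assume_role {
    role_arn = "arn:aws:iam::111111111111:role/deploy-role"
  }
}
```

If the credentials your pipeline's OIDC role already holds can reach the state bucket directly (e.g. via the bucket policy above), you can drop the backend's `assume_role` block entirely and let it use the ambient credentials.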

You will likely want to check a partial configuration for the backend in with the code. Your deployment tool then passes in the remaining parameters (via environment variables, command-line flags, or key-value pairs in a file) to complete that configuration and set exactly where the state is stored and which credentials are used for that execution.
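Concretely, the checked-in file declares the backend type and nothing else, and the pipeline fills in the rest at `init` time with `-backend-config` — values shown are placeholders:

```hcl
# backend.tf — checked in with the code; details deliberately left blank
terraform {
  backend "s3" {}
}
```

```shell
terraform init \
  -backend-config="bucket=org-terraform-state" \
  -backend-config="key=ssm/terraform.tfstate" \
  -backend-config="region=eu-west-1"
```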

A common use case is a pipeline that deploys to both non-prod and prod environments: when deploying to non-prod, the pipeline sets parameters pointing at one backend state store; when the same pipeline deploys to production, it sets different values to direct state storage to a separate backend, segregated from the non-prod state.
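One way to wire that up is a `*.tfbackend` file per environment, selected by the pipeline — file names and values below are illustrative:

```hcl
# nonprod.s3.tfbackend
bucket = "org-tf-state-nonprod"
key    = "ssm/terraform.tfstate"
region = "eu-west-1"

# prod.s3.tfbackend
bucket = "org-tf-state-prod"
key    = "ssm/terraform.tfstate"
region = "eu-west-1"
```

The pipeline then runs `terraform init -backend-config=prod.s3.tfbackend` (or the non-prod file) depending on the target environment.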

Hope that helps

Happy Terraforming!