In our company we have a multi-account approach in AWS, meaning we have a different account for each environment (common, staging, production).
When I first started structuring my directories I followed the best-practice guidelines from HashiCorp, which meant separating into different directories per service, per region, per account.
Inside the tf-files.tf I have all the code I need to manage my infrastructure, but as of now I'm not writing my own custom modules to replicate the behavior; I'm simply copy-pasting the code and changing names.
Why did I choose that? Because I'm working with Terraform Cloud remote execution and wanted to keep things simple and not create module repositories in our GitHub.
My question is: is there a better way to manage my infrastructure code, in terms of making a change in one environment and having it apply identically in the other environments? Are modules my only option?
Terraform supports both local and remote module sources. If you prefer to keep everything in a single version control repository then you can specify a relative path as the source inside your module block. In that case Terraform will not need to download the module source code from anywhere else, because it should already be on disk from checking out the root module’s own source code.
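As a sketch of what that looks like, here is a hypothetical root module for a staging environment that reuses a shared module from elsewhere in the same repository via a relative path (the module name, path, and variables are illustrative, not from your setup):

```hcl
# environments/staging/main.tf (hypothetical layout)

module "network" {
  # Relative path from this root module to the shared module
  # directory within the same repository; no registry or
  # separate module repository is needed.
  source = "../../modules/network"

  # Per-environment differences become input variables,
  # replacing the copy-paste-and-rename step.
  environment = "staging"
  cidr_block  = "10.1.0.0/16"
}
```

Each environment's root module then becomes a thin wrapper that only sets the values that differ, so a change made once inside `modules/network` is picked up by every environment on its next plan.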
Since you are using Terraform Cloud remote operations, you’ll need to make sure your workspace’s “working directory” setting is set to the relative path from the repository root to your root module. Terraform Cloud relies on that setting to understand where the root module lives in relation to any other modules in the same repository, so that it can make sure to include the right set of your source code directories in the bundle of source code sent to the remote execution environment.
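For example, under an assumed single-repository layout like the one below, each Terraform Cloud workspace would have its working directory set to the corresponding environment path, while the shared modules live one level up:

```
repo-root/
├── modules/
│   └── network/
│       ├── main.tf
│       └── variables.tf
└── environments/
    ├── staging/       # workspace working directory: environments/staging
    │   └── main.tf
    └── production/    # workspace working directory: environments/production
        └── main.tf
```

Because the working directory points below the repository root, Terraform Cloud uploads the whole repository checkout, so the `../../modules/network` relative source resolves correctly during remote runs.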