Loop over files before uploading to S3

I have multiple files under some root directory, let’s call it module/data/.
I need to upload this directory to the corresponding S3 bucket. All this works as expected with:

resource "aws_s3_bucket_object" "k8s-state" {
      for_each = fileset("${path.module}/data", "**/*")
      bucket = aws_s3_bucket.kops.bucket
      key    = each.value
      source = "${path.module}/data/${each.value}"
      etag   = filemd5("${path.module}/data/${each.value}")
    }

The only thing left is that I need to loop over all the files recursively and replace markers (for example !S3!) with values from the Terraform module's variables.
Similar to this, but across all files in directories/subdirectories:

replace(file("${path.module}/launchconfigs/aws_launch_configuration_masters_user_data"), "#S3", aws_s3_bucket.kops.bucket)

So the question in one sentence: how do I loop over files and replace parts of them with values from Terraform variables?
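
To make it concrete, something along these lines is roughly what I'm after (an untested sketch: !S3! is just an example marker, and it uses content/md5 instead of source/filemd5 so the substituted text is what gets uploaded):

locals {
  # Read every file under data/, replacing the marker with the bucket name.
  rendered = {
    for f in fileset("${path.module}/data", "**/*") :
    f => replace(file("${path.module}/data/${f}"), "!S3!", aws_s3_bucket.kops.bucket)
  }
}

resource "aws_s3_bucket_object" "k8s-state" {
  for_each = local.rendered

  bucket  = aws_s3_bucket.kops.bucket
  key     = each.key
  content = each.value
  etag    = md5(each.value)
}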

Hi @dmytronasyrov!

The functionality you are describing sounds similar to the template directory module, which can scan over a local directory and render some of the files it finds as Terraform template files, before preparing a data structure that you can use with for_each on aws_s3_bucket_object.

That module is currently in a transitional state moving from being a personal project of mine to being a HashiCorp project, so it hasn’t yet landed at its final source location of hashicorp/dir/template, but maybe you can use it from its GitHub repository for now or you can refer to it as an example for how to implement something similar that is tailored to your exact needs.

The module is now installable as hashicorp/dir/template! :tada:
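
To sketch how it might be wired into the example from your question (the module label kops_state_files and the s3_bucket_name template variable are just placeholders; template files under data/ would use the module's .tmpl suffix and reference ${s3_bucket_name} instead of a !S3! marker):

module "kops_state_files" {
  source = "hashicorp/dir/template"

  base_dir = "${path.module}/data"

  # Values made available to the rendered templates.
  template_vars = {
    s3_bucket_name = aws_s3_bucket.kops.bucket
  }
}

resource "aws_s3_bucket_object" "k8s-state" {
  for_each = module.kops_state_files.files

  bucket       = aws_s3_bucket.kops.bucket
  key          = each.key
  content_type = each.value.content_type

  # Only one of these two is set per file, depending on whether it was
  # rendered as a template (content) or passed through unchanged (source_path).
  source  = each.value.source_path
  content = each.value.content

  etag = each.value.digests.md5
}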