AWS - Upload full folder to bucket

Goal: I am using Terraform to configure a Cloud9 environment. I want two folders, A and B, to be available to the user when they open the environment, along with all of the contents within those folders.

Problem: As there does not seem to be a way to add files directly to Cloud9 (using Terraform, that is), my plan was to upload the needed files to an S3 bucket and simply have the user download the contents of the bucket into the environment, which is easy to do with a single command.

My problem is that uploading a whole folder to the bucket is much less simple. I have tried looping over the folder contents, but only the top-level files are copied, not the folders within them. Is there a way to achieve this? The folders contain too many files to create each subdirectory by hand and upload the files that way.
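For illustration, a loop along these lines only picks up top-level files, since a single-level `fileset` pattern like `"*"` does not recurse into subdirectories (the bucket and folder names here are just placeholders, not my real config):

```hcl
# Sketch only: copies the files directly inside "A", but silently skips
# anything in subfolders because the "*" pattern does not recurse.
resource "aws_s3_object" "folder_a" {
  for_each = fileset("${path.module}/A", "*")

  bucket = "my-example-bucket" # placeholder
  key    = "A/${each.value}"
  source = "${path.module}/A/${each.value}"
  etag   = filemd5("${path.module}/A/${each.value}")
}
```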

Does anyone know of a way I can upload the folders A and B without a lot of scripting overhead?

Hi @tformuser,

The Terraform module hashicorp/dir/template might help here. In particular, refer to the “Uploading Files to Amazon S3” section of its documentation for an example of using this module in conjunction with the aws_s3_bucket_object resource type. (If you are using a recent version of the hashicorp/aws provider then you should use aws_s3_object instead of aws_s3_bucket_object, but the same principle applies either way.)

This module does have the option of rendering the files in the given directory as templates, but if you don’t need that then you can just avoid including any files whose names have the .tmpl suffix (or change the template_file_suffix input variable), and then it will treat all files as opaque static files and return their paths verbatim.
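A minimal sketch following that documentation section might look like the following (the bucket name, module label, and directory path are placeholders to adapt to your setup):

```hcl
module "folder_files" {
  source  = "hashicorp/dir/template"
  version = "~> 1.0"

  # Point this at the local directory whose tree you want to mirror into S3.
  base_dir = "${path.module}/A"
}

resource "aws_s3_object" "folder_a" {
  # One object per file the module found, keyed by its path relative to base_dir.
  for_each = module.folder_files.files

  bucket       = "my-example-bucket" # placeholder
  key          = "A/${each.key}"
  content_type = each.value.content_type

  # The module sets exactly one of these per file: source_path for static
  # files on disk, content for rendered templates.
  source  = each.value.source_path
  content = each.value.content

  # Re-upload the object whenever the underlying file changes.
  etag = each.value.digests.md5
}
```

You could repeat the same pattern for folder B (or point base_dir at a common parent directory), and then a single `aws s3 sync` from within the Cloud9 environment will pull the whole tree back down.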


Thanks, that’s doing what I need! I just need to refine the commands a little more and it will suit my use case.