S3 File Upload using Terraform Cloud

I’m moving my local Terraform setup to Terraform Cloud, and I’d like to upload files to S3 from TF Cloud. Is there a correct way to do this? I’m currently getting an error that ‘no file exists’.

# main.tf
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "3.28.0"
    }
  }
  required_version = "~> 0.14"

  backend "remote" {
    organization = "tnorlund"

    workspaces {
      name = "gh-actions-demo"
    }
  }
}
...
module "python_layer" {
  source      = "./LambdaLayer"
  type        = "python"
  path        = ".."
  developer   = "Tyler Norlund"
  bucket_name = module.layer_bucket.bucket_name
  stage       = var.stage
}
...
# LambdaLayer/main.tf
# Adds a NodeJS or Python Lambda Layer

# Upload the compressed code to the S3 bucket
resource "aws_s3_bucket_object" "object" {
  bucket = var.bucket_name
  key    = var.type == "nodejs" ? "nodejs.zip" : "python.zip"
  source = var.type == "nodejs" ? "${var.path}/nodejs.zip" : "${var.path}/python.zip"
  etag   = var.type == "nodejs" ? filemd5("${var.path}/nodejs.zip") : filemd5("${var.path}/python.zip")
  tags = {
    Project   = "Blog"
    Stage     = var.stage
    Developer = var.developer
  }
}

# Use the uploaded code as the Lambda Layer's code
resource "aws_lambda_layer_version" "layer" {
  layer_name = var.type == "nodejs" ? "analytics_js" : "analytics_python"
  s3_bucket = var.bucket_name
  s3_key = aws_s3_bucket_object.object.key

  description = var.type == "nodejs" ? "Node Framework used to access DynamoDB" : "Python Framework used to access DynamoDB"
  compatible_runtimes = var.type == "nodejs" ? ["nodejs12.x"] : ["python3.8"]
  source_code_hash = var.type == "nodejs" ? filebase64sha256("${var.path}/nodejs.zip") : filebase64sha256("${var.path}/python.zip")
}

Hey Tyler,

I guess this is related to the value of var.path, which is not valid in a different context (CLI vs. Cloud). Have you tried using the special value path.module? See References to Values - Configuration Language - Terraform by HashiCorp.
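For example, a minimal sketch of what I mean (assuming the ZIP files are shipped inside the LambdaLayer module directory instead of being reached through var.path):

```hcl
# LambdaLayer/main.tf (sketch)
# path.module resolves to the directory containing this module,
# which stays valid whether the run happens locally or in TF Cloud.
resource "aws_s3_bucket_object" "object" {
  bucket = var.bucket_name
  key    = "${var.type}.zip"
  source = "${path.module}/${var.type}.zip"
  etag   = filemd5("${path.module}/${var.type}.zip")
}
```

The key point is that filemd5() runs during plan, so the file has to exist at a path that is valid inside the remote run’s uploaded configuration directory.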

Cheers
Matthieu

I ended up uploading everything to an S3 bucket. This seems to work fine.

Yeah, I was going to agree with @ohmer that it looked like it was related to the value of var.path. I have several modules I wrote and use personally that include aws_lambda_function resources, and I use ${path.module} for path references. I also typically use archive_file data sources to build the ZIP file itself. This has worked fine within TF Cloud.
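Roughly like this (a sketch, not your exact setup — the src/ directory and names are illustrative, and it assumes the layer source files live inside the module so TF Cloud uploads them with the configuration):

```hcl
# Build the ZIP from files shipped with the module itself, so the
# remote run can find them via path.module.
data "archive_file" "layer" {
  type        = "zip"
  source_dir  = "${path.module}/src"
  output_path = "${path.module}/python.zip"
}

resource "aws_lambda_layer_version" "layer" {
  layer_name          = "analytics_python"
  filename            = data.archive_file.layer.output_path
  source_code_hash    = data.archive_file.layer.output_base64sha256
  compatible_runtimes = ["python3.8"]
}
```

With filename pointing at the archive_file output there’s no separate S3 upload step to get wrong, which sidesteps the missing-file problem entirely for layers under the 50 MB direct-upload limit.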