How to update lambda when code artifact is in S3

I am trying to change my deploy scripts so that a lambda is updated from a jar stored in S3.

Right now I am deploying the jar to an S3 bucket, along with a second file containing the SHA-256 of the jar. I’m generating the hash with

openssl dgst -sha256 -binary "./target/$FILENAME.jar" | openssl enc -base64 > "./target/$FILENAME.sha256"
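As a runnable aside on that hash format (throwaway file paths, nothing here is from my real scripts): `source_code_hash` expects the base64-encoded SHA-256 of the artifact, and note that `openssl enc -base64` appends a trailing newline:

```shell
# Demonstrate the hash format with a throwaway file instead of a real jar.
printf 'dummy' > /tmp/demo.jar
# Base64-encoded SHA-256, the format source_code_hash expects.
openssl dgst -sha256 -binary /tmp/demo.jar | openssl enc -base64 > /tmp/demo.sha256
# The output is 44 base64 characters plus a trailing newline -- that newline
# matters if the file's contents are later compared to a computed hash verbatim.
cat /tmp/demo.sha256
```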

I have a data element described as

data "aws_s3_bucket_object" "java_lambda_code_sha" {
  bucket = var.artifacts_s3_bucket
  key    = "my-jar.1.0-SNAPSHOT.sha256"
}

And then my lambda uses it as

resource "aws_lambda_function" "the_lambda" {
  depends_on = [data.aws_s3_bucket_object.java_lambda_code_sha]

  s3_bucket        = var.artifacts_s3_bucket
  s3_key           = "my-jar.1.0-SNAPSHOT.jar"
  source_code_hash = data.aws_s3_bucket_object.java_lambda_code_sha.body
  # ...
}

I’m not sure what I’m doing wrong. When I update the jar, I generate and upload a new .sha256 file, but when I run an apply, the lambda is not updated.

How should I be doing this? Thanks!

I had some similar issues, if I remember correctly. We defined the initial rollout of the Lambda, along with all the AWS config, in Terraform. For code updates we just used our CI system rather than Terraform, because I couldn’t get it working right away, and I also wanted something a little lighter for our CI process. The CI system just makes an AWS CLI call. Not perfect, but it’s simple and it works.

aws lambda update-function-code --function-name (name) --zip-file fileb://(zip-file) > /dev/null
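In case a Terraform-only approach is still wanted: two things can silently break the original setup. The `aws_s3_bucket_object` data source only populates `body` when the object has a text content type, and `openssl enc -base64` writes a trailing newline, while `source_code_hash` expects the bare base64-encoded SHA-256. A minimal sketch under those assumptions (the `function_name`, `role`, `handler`, and `runtime` values are placeholders, and the .sha256 object would need to be uploaded with e.g. `--content-type text/plain`):

```hcl
data "aws_s3_bucket_object" "java_lambda_code_sha" {
  bucket = var.artifacts_s3_bucket
  key    = "my-jar.1.0-SNAPSHOT.sha256"
}

resource "aws_lambda_function" "the_lambda" {
  function_name = "the-lambda"          # placeholder
  role          = var.lambda_role_arn   # placeholder
  handler       = "com.example.Handler" # placeholder
  runtime       = "java11"

  s3_bucket = var.artifacts_s3_bucket
  s3_key    = "my-jar.1.0-SNAPSHOT.jar"

  # chomp() strips the trailing newline from the openssl output so the value
  # matches the bare base64 SHA-256 that source_code_hash is compared against.
  source_code_hash = chomp(data.aws_s3_bucket_object.java_lambda_code_sha.body)
}
```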