Issues with using npm packages in TFE deployed Google Cloud Function

I am currently working on deploying a Google Cloud Function and associated resources (bucket, archive, etc.) using Terraform, running my terraform commands from Git Bash on Windows 10. I have a good amount of Terraform and AWS experience, but GCP is new to me. I've written many AWS Lambdas using Terraform, though it has been about 2.5 years since I last did.

The Cloud Function uses Node.js and Express; however, I am having problems accessing express in my deployed function. I actually want to use TypeScript, but after running into some issues I decided to create a simple bare-bones function with plain JavaScript first. Once I figure this out I'll update it to TypeScript.

Project Structure:

├── terraform
│   ├── modules
│   │   └── function
│   │       ├──
│   │       ├──
│   │       └──
│   ├──
│   ├──
│   ├──
│   └──
├── src
│   └── index.js
├── package.json
└── node_modules

Relevant Terraform:

data "archive_file" "source" {
  type        = "zip"
  source_dir  = "../src" # Here is the potential issue
  output_path = "/tmp/function-${local.timestamp}.zip"
}


When executing terraform apply in the root terraform directory, I get the following:

Error: Error waiting for Creating CloudFunctions Function: Error code 3, message: Function failed on loading user code. This is likely due to a bug in the user code.

And within the GCP Logs Explorer:

Provided module can’t be loaded.

Did you list all required modules in the package.json ?

“Detailed stack trace: Error: Cannot find module ‘express’”

Resolution Efforts

You can see above in my archive_file source_dir that I am pointing to the src folder containing my index.js.

After doing some digging, I realized that because the package.json containing my express dependency is not in that src directory but at the project root, it does not get included in the zipped folder.
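The simplest fix I can think of (a sketch against the layout above, shown here in a scratch directory) is to keep the function's package.json next to index.js, so both land at the root of the zip:

```shell
# Sketch of the restructure, done in a scratch copy of the layout
# from this post (adjust paths for the real project):
mkdir -p /tmp/gcf-demo/src
cd /tmp/gcf-demo
echo '{ "dependencies": { "express": "^4" } }' > package.json

# Keep the manifest next to index.js so both end up at the zip root.
# The GCF Node.js runtime runs `npm install` against the package.json
# at the zip root on deploy, so node_modules itself need not be zipped.
mv package.json src/package.json
```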

However, if I set the archive_file source_dir to the project root, terraform apply tries to read the terraform.tfstate file produced in the root terraform directory, and it gives me this error (this also occurs if I move index.js to root and update my package.json main):

Error: error archiving directory: error reading file for archival: read C:\Users\<user>\<proj>\test-project\terraform\terraform.tfstate: The process cannot access the file because another process has locked a portion of the file.
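One workaround I'm considering (a sketch only — `excludes` paths are relative to `source_dir`, and excluding whole directories may depend on the hashicorp/archive provider version in use) is to archive the project root but exclude the terraform directory:

```
data "archive_file" "source" {
  type        = "zip"
  source_dir  = ".."   # project root, so package.json is included
  output_path = "/tmp/function-${local.timestamp}.zip"

  # Paths are relative to source_dir. Directory excludes may require
  # a recent hashicorp/archive provider.
  excludes = [
    "terraform",     # keeps terraform.tfstate out of the zip
    "node_modules",  # GCF installs dependencies from package.json
  ]
}
```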


  1. How can I include npm dependencies that are in my package.json in my zipped GCP function folder?
  2. Do I have to include node_modules directly in the archive file? Or will the GC Function automatically recognize the package.json in the zipped dir, and install my dependencies?
  3. Would using a remote Terraform backend in GCP (e.g. a GCS bucket) vs the local backend resolve the .tfstate issue?

I’ve created an SO question here: node.js - Issues with using npm packages in Google Cloud Function - Stack Overflow

I’d greatly appreciate some help, and hope I provided good enough information! Thank you all!