Authentication while using a module stored in a GCS bucket in Terraform Cloud

Hi All, I am trying to use a GCS bucket as the source of my Terraform module:

module "cai" {
  source = "gcs::"
}

I am able to download the module locally when I run terraform init by setting GOOGLE_APPLICATION_CREDENTIALS to the path of my service account key file. However, when I publish my configuration to Terraform Cloud using the API, I get the following error:

│ Could not download module "cai" ( source code from
│ "gcs::":
│ dialing: google: could not find default credentials. See
│ for more information.

I do not want to upload the service account key file itself to Terraform Cloud, since I cannot mark the file as sensitive. Instead, I’ve provided the contents of the file as a sensitive environment variable, GOOGLE_CREDENTIALS, as described here. However, this doesn’t authenticate terraform init when it runs on Terraform Cloud: it is unable to authenticate using this variable and seems to expect GOOGLE_APPLICATION_CREDENTIALS to be set to the path of a credentials file. Is there any way terraform init can be made to use the contents of the GOOGLE_CREDENTIALS environment variable?

I would enable debugging by setting the TF_LOG environment variable to DEBUG, then run terraform init again and see what’s going on.

Sharing the DEBUG logs here for reference:
run-BVGsTvLf2X53CD56-plan-log.txt (3.1 KB)

Just to clarify, I am not running this locally; instead, I am trying to start a new plan on Terraform Cloud after pushing my configuration using the cloud API. I’ve followed these steps:

  1. Created an org on Terraform cloud after signing up.
  2. Programmatically created a new workspace using the client.Workspaces.Create API.
  3. Set a new environment variable, GOOGLE_CREDENTIALS, in the newly created workspace using the client.Variables.Create API. The value of this variable is set to the contents of the JSON key file after removing the newline characters.
  4. Created and uploaded a new configuration to the workspace using the client.ConfigurationVersions.Create/Upload APIs. This configuration uses a Terraform module published in a GCS storage bucket.
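For step 3, the newline stripping can be done safely with Go's json.Compact from the standard library rather than ad-hoc string replacement; the resulting one-line string is what would go into the variable's value. A minimal sketch (the helper name and sample key are mine, not from any client library):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

// flattenKey compacts a service-account JSON key into a single line so it
// can be stored as the value of a sensitive GOOGLE_CREDENTIALS variable.
// json.Compact only removes insignificant whitespace, so the escaped \n
// sequences inside the private_key string are preserved.
func flattenKey(raw []byte) (string, error) {
	var buf bytes.Buffer
	if err := json.Compact(&buf, raw); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	key := []byte("{\n  \"type\": \"service_account\",\n  \"project_id\": \"demo\"\n}\n")
	flat, err := flattenKey(key)
	if err != nil {
		panic(err)
	}
	fmt.Println(flat) // prints {"type":"service_account","project_id":"demo"}
}
```

The compacted string is then passed as the Value in client.Variables.Create, with the variable's category set to env and Sensitive set to true.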

The issue seems to be that Terraform Cloud is unable to authenticate with GCP when fetching the module from GCS. It expects the GOOGLE_APPLICATION_CREDENTIALS variable to be set and to point to the service account credentials file. I do not wish to upload this file directly to Terraform Cloud, as it cannot be marked sensitive and could be viewed by anyone in plain text. I’ve instead provided the contents of the file as a sensitive environment variable, GOOGLE_CREDENTIALS, but this doesn’t seem to work. Is there a better solution?

I am not 100% sure, but from the documentation it looks like you only have three options if you have to use GCS to store your module source code:
1. provide the key file and point to it using GOOGLE_APPLICATION_CREDENTIALS.
2. run terraform from a Google instance that has access to the bucket; you can use Terraform Cloud Agents for this.
3. use gcloud auth on your own laptop.

or, if you don’t have to use GCS to host your modules:

a. use the private module registry in Terraform Cloud to host your modules.

b. use GitHub/GitLab/Azure DevOps/Bitbucket to host your modules.