Best structure for GCP and automation

Hi,

We want to automate the creation of GCP projects when users request them in Jira and we approve the request.

I got it to work by writing a Lambda that receives a webhook payload from Jira, adds the Terraform code to our Git repository, and runs plan/apply in Jenkins.

My current file structure is simple:

departments.tf
main.tf
projects.tf
services.tf
modules/
  gcp_folders
  gcp_organizations
  gcp_projects

departments.tf and services.tf contain the GCP folder definitions. The modules were built so the definitions are created in a standardized way.
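To give an idea, departments.tf is mostly just module calls, something like this (the module interface and the org ID below are simplified placeholders, not our real code):

module "department_finance" {
  source       = "./modules/gcp_folders"
  display_name = "Finance"                     # placeholder department name
  parent       = "organizations/123456789012"  # placeholder org ID
}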

It works fine, except for one thing: it’s slooowww… Why? Because of the state refresh. We enable only the required services per project, like this:

# Enable each requested API on the new project
resource "google_project_service" "services" {
  count   = length(var.services_list)
  project = google_project.new_project.project_id
  service = var.services_list[count.index]

  disable_dependent_services = false
}

Each new project adds about two more minutes to the plan/apply steps :-/

Since we don’t need to refresh the state of the other projects when we create a new one, I was thinking of building a module per project, using -target to run only that module, and giving each project its own tfstate.
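To make it concrete, I imagine each project getting its own small root configuration with its own backend, roughly like this (the bucket name, paths, and module inputs are placeholders, not our real setup):

# projects/my-new-project/main.tf -- hypothetical per-project layout
terraform {
  backend "gcs" {
    bucket = "our-terraform-states"      # placeholder state bucket
    prefix = "projects/my-new-project"   # one state file per project
  }
}

variable "parent_folder" {
  type        = string
  description = "Display name of the parent folder, set by the pipeline with -var"
}

module "project" {
  source        = "../../modules/gcp_projects"  # assumed module path
  project_name  = "my-new-project"
  parent_folder = var.parent_folder
  services_list = ["compute.googleapis.com", "storage.googleapis.com"]
}

With that layout, a plan/apply for a new project would only touch that project’s own state.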

Now the problem is that the project needs its parent, and that parent is created outside the project module. I could look the parent up with a data source and get the name of the parent (GCP folder) as a variable from the pipeline.
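Something like this is what I have in mind, assuming the folder display name comes in from the pipeline (the variable name, org ID, and project names are placeholders):

variable "parent_folder_name" {
  type        = string
  description = "Display name of the parent GCP folder, passed by the pipeline"
}

# Look up the parent folder that was created outside this module
data "google_active_folder" "parent" {
  display_name = var.parent_folder_name
  parent       = "organizations/123456789012"  # placeholder org ID
}

resource "google_project" "new_project" {
  name       = "my-new-project"   # placeholder
  project_id = "my-new-project"   # placeholder
  folder_id  = data.google_active_folder.parent.name  # "folders/{id}" of the parent
}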

Is this the correct way? I’m asking here because I can’t find many resources about using Terraform with pipelines and automation.

Please note that we use the open-source version of Terraform, not Terraform Cloud.

Thanks.