How to create a dynamic map(object)?

I’m in the process of learning Terraform, and the questions I have are based on the code below. This is not all of the code, but what’s shown does function as-is.

module "route53-health-checks" {

  source = "./route53-health-checks"

  for_each      = var.map
  environment   = each.value["environment"]   #var.environment
  domain_name   = each.value["domain_name"]   #var.domain_name
  resource_path = each.value["resource_path"] #var.resource_path
}




variable "map" {
  type = map(object({
    environment   = string
    domain_name   = string
    resource_path = string
  }))
  default = {
    "1" = {
      environment   = "all"
      domain_name   = "mywebsite.com"
      resource_path = "/"
    }
    "2" = {
      environment   = "misc1"
      domain_name   = "myothersite.com"
      resource_path = "/onlineservices/account/login.aspx"
    }
    "3" = {
      environment   = "misc2"
      domain_name   = "stuffhere.com"
      resource_path = "/"
    }
  }
}

So I have a module that I call using for_each = var.map.

The variable “map” default section I had to create by hand… How can I automate that based on some sort of external input, e.g. a file? I will have possibly hundreds of entries, and manual creation is not going to cut it.

I have seen things like leveraging dynamic blocks, but it looks like that is not recommended… Any help would be greatly appreciated.

thanks

I’d say the common patterns here are JSON, YAML, or CSV files, utilising the Terraform functions jsondecode, yamldecode, and csvdecode.
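
For instance, a minimal sketch of the YAML variant, assuming a hypothetical health_checks.yaml file next to the configuration with the same shape as your var.map default:

# health_checks.yaml (hypothetical) would contain entries like:
# "1":
#   environment: all
#   domain_name: mywebsite.com
#   resource_path: /

locals {
  # yamldecode turns the file into the same map(object) shape as var.map
  health_checks = yamldecode(file("${path.module}/health_checks.yaml"))
}

module "route53-health-checks" {
  source   = "./route53-health-checks"
  for_each = local.health_checks

  environment   = each.value.environment
  domain_name   = each.value.domain_name
  resource_path = each.value.resource_path
}

The same idea works with jsondecode for a JSON file; only the decode function changes.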

Thank you… csvdecode is exactly what I was looking for. For anybody else, here is a great link on how to leverage csvdecode:

“DRY Coding with Terraform, CSVs, and ForEach” by Jesse Loudon is a HashiTalk about leveraging the power of Terraform’s csvdecode function to produce a list of maps representing data stored in Comma-Separated Values (CSV) format. By combining csvdecode with Terraform’s for_each meta-argument, we can achieve a DRY, low-code methodology to deploy resources at scale from large datasets.
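
To illustrate that csvdecode + for_each pattern, here is a minimal sketch, assuming a hypothetical health_checks.csv file alongside the configuration:

# health_checks.csv (hypothetical) would look like:
# environment,domain_name,resource_path
# all,mywebsite.com,/
# misc1,myothersite.com,/onlineservices/account/login.aspx

locals {
  # csvdecode returns a list of maps, one map per CSV row
  health_checks = csvdecode(file("${path.module}/health_checks.csv"))
}

module "route53-health-checks" {
  source = "./route53-health-checks"

  # for_each needs a map or set, so key each row by its domain name
  # (this assumes domain names are unique across rows)
  for_each = { for row in local.health_checks : row.domain_name => row }

  environment   = each.value.environment
  domain_name   = each.value.domain_name
  resource_path = each.value.resource_path
}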