How to work with JSON

Hello everyone, hope somebody can help me with my issue. I need some help understanding how to work with a JSON file full of variables that I’d like to use in my Terraform files in order to create my infrastructure. Is this even possible? From what I was able to see, it is possible, but I was not able to find a good example that shows me how to do it.
The JSON that I want to use has simple variables and arrays of arrays (meaning it is not just a flat JSON document). Do I need to declare, in a variables.tf, all the variables that I expect to use from the JSON? Any help will be much appreciated. Thanks in advance.

Hi @JustLeo,

If this JSON file is something you intend to include as part of your Terraform configuration (e.g. checked in to version control alongside the .tf files) then you can load the data structure from it into Terraform using the file function and the jsondecode function.

For the sake of example, I’ll show it loaded into a Local Value, which could then be referenced elsewhere in the configuration:

locals {
  json_data = jsondecode(file("${path.module}/data.json"))
}

The path.module reference here evaluates to the directory containing the .tf file where this expression is written, and the file must exist and contain valid JSON at the time Terraform is initially loading and validating the configuration.

Since you didn’t give any specific example of what this JSON file might contain and what you might want to do with it, it’s hard to show a real-world example of using this, but let’s say that the JSON file contains the following:

{
  "environment_name": "staging"
}

We could use that environment_name property as part of an AWS VPC’s Name tag, like this:

resource "aws_vpc" "example" {
  # (other settings)

  tags = {
    Name = local.json_data.environment_name
  }
}
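
The decoded result can be arbitrarily nested, too. As a quick sketch (the property names here are invented for illustration), if data.json instead contained an array of objects, you could index into it directly:

```hcl
locals {
  json_data = jsondecode(file("${path.module}/data.json"))
}

# Hypothetical: suppose data.json contained
#   {"environments": [{"name": "staging", "cidr": "10.0.0.0/16"}]}
# Nested values can then be reached with attribute and index syntax:
resource "aws_vpc" "example" {
  cidr_block = local.json_data.environments[0].cidr

  tags = {
    Name = local.json_data.environments[0].name
  }
}
```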

Hi @apparentlymart,

Thanks for your response. So, after I posted my question I found a way of doing it, but it’s different from what you showed me. Let me try to explain it:

My JSON is big and it’s something like this:

json_variables.tfvars.json

{
  "KEY_A": "VALUE_A",
  "KEY_B": "VALUE_B",
  ...
  "KEY_C": [{
    "KEY_D": "VALUE_D",
    ...
    "KEY_E": [{
      "KEY_F": "VALUE_F",
      ...
    }]
    ...
  }]
}

I’m calling it like this:

terraform apply -var-file="json_variables.tfvars.json"

and within my main.tf file, I’m able to reference all the values from the JSON like this (imagine that I want to get VALUE_F from the JSON example from above):

${var.KEY_C[0].KEY_E[0].KEY_F}

and like this, I can get any value from the JSON. Is this OK? Does it have any disadvantage, or is it fine and just a different way of doing it?

Thanks.

Hi @JustLeo,

Semantically what you’ve done here is a little different from what I was describing: rather than having your Terraform configuration read some JSON, you’ve instead defined some input variables and then instructed Terraform itself to read the JSON and use the values inside as the values for your variables.

Both of these are valid approaches. As is often the case, there are some tradeoffs to make to decide which one to use:

  • If you use file and jsondecode as I showed then your Terraform configuration is self-contained: you can just run terraform apply and it will automatically find all of the data required, without the person running Terraform needing to know anything about that file. In this sense, the direct loading approach makes the JSON file an implementation detail of the Terraform configuration, rather than part of its interface.

  • If you use variables and a .tfvars.json file then you’ve created an opportunity for those values to vary between runs. You could potentially have multiple .tfvars.json files and choose a different one each time you run terraform apply. These settings are now part of the interface of the configuration, and so must be set by anyone applying that configuration. They have some flexibility in how to do that, though: they can use a .tfvars.json file as you did here, or a .tfvars file in native Terraform syntax, or set values individually on the command line using -var.
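
One caveat with the .tfvars.json approach: each top-level key in that file must correspond to an input variable declared in the configuration, which answers your earlier question about variables.tf. A sketch of what those declarations might look like for your example (the types can be as loose or as precise as you like):

```hcl
# variables.tf -- hypothetical declarations matching the JSON keys above

variable "KEY_A" {
  type = string
}

variable "KEY_B" {
  type = string
}

# "any" accepts whatever nested shape the JSON provides; for stricter
# validation you could instead write out an exact object type.
variable "KEY_C" {
  type = any
}
```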

Where possible it’s nice for a root module to have no variables at all so that it’s easier to use and clearer how it is intended to be used (terraform apply with no arguments), and so I would personally lean towards the first option unless there’s some benefit to these values being settable differently on a per-run basis rather than “hard-coded” as part of the configuration. But if you aren’t sure right now, it’s not a big deal: all of the implications of this decision are localized to just this one configuration, so relatively easy to change later if you decide to take the other approach for some reason.


Thank you so much for your answer, it is crystal clear. I really appreciate it.

I didn’t see a reference to template_file. It is a great way to inject variables into JSON. Here is an example.

data "template_file" "prometheus2_role_default_attributes" {
  template = file("attributes/prometheus.tpl")

  vars = {
    prometheus_url    = var.prometheus_url
    postgres_exporter = "${lower(var.env_type)}-hostname.com"
    pe_kube_exporter  = "${lower(var.env_type)}-hostname.com"
    environment       = lower(var.env_type)
    region            = lower(var.region)
    service_provider  = lower(var.service_provider)
  }
}

Then in the prometheus.tpl file, you just reference the variables with the ${} syntax. Here is a snippet of that JSON. The prometheus.tpl file is just straight JSON, but with a .tpl extension to show it is a template.

            {
              "replacement": "${service_provider}",
              "target_label": "provider"
            },
            {
              "replacement": "${region}",
              "target_label": "region"
            }

Then we send the JSON to our Chef resource. You can probably do anything you want with that JSON at that point, though: jsondecode, etc.

resource "chef_role" "prometheus2_server" {
  name = "${lower(var.env_type)}-${lower(var.application)}-${var.component}"

  run_list = [
      ...
  ]

  override_attributes_json = data.template_file.prometheus2_role_default_attributes.rendered
}

This topic was originally about reading JSON rather than generating JSON, but thanks for sharing that suggestion anyway @fortman :slight_smile: !

As an alternative for Terraform 0.12 and later I’d recommend using jsonencode to generate JSON, because then Terraform can ensure that the result is valid JSON and take care of any necessary escaping for you.

The easiest way to use jsonencode is to just call it inline in your main configuration, but you can also use the templatefile function to render an external template (templatefile has replaced the template_file data source for most uses in Terraform 0.12):

override_attributes_json = templatefile("${path.module}/attributes/prometheus.tpl", {
  prometheus_url    = var.prometheus_url
  postgres_exporter = "${lower(var.env_type)}-hostname.com"
  pe_kube_exporter  = "${lower(var.env_type)}-hostname.com"
  environment       = lower(var.env_type)
  region            = lower(var.region)
  service_provider  = lower(var.service_provider)
})

You can use jsonencode inside the template:

${jsonencode({
  # etc, etc
  something = [
    {
      replacement = service_provider
      target_label = "provider"
    },
    {
      replacement = region
      target_label = "region"
    },
  ]
})}
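
And the inline form mentioned above, skipping the template file entirely, would look something like this (a sketch reusing the names from the earlier chef_role example):

```hcl
resource "chef_role" "prometheus2_server" {
  # (other settings)

  override_attributes_json = jsonencode({
    something = [
      {
        replacement  = lower(var.service_provider)
        target_label = "provider"
      },
      {
        replacement  = lower(var.region)
        target_label = "region"
      },
    ]
  })
}
```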

Rendering JSON via string templates is still fine, of course! Just wanted to share the above as an alternative that might be easier to write and maintain using Terraform 0.12 features. :smiley:
