Best way to organise repo creation using the Terraform GitHub provider

I am really new to Terraform but I am already loving it because of how awesome it is and what it can help achieve. I would like to know how best to organise my Terraform repo so that developers in my organisation can simply open a PR whenever they need a GitHub repository created for them, and once that PR is merged the given repo(s) get created. For now I have a rather dumb way to do this, which is to have a map and let them add the repos they want, like this:

repos = {
  web-poc = {
    repo_name                   = "web-poc"
    repo_description            = "For Plat web testing"
    repo_status_checks          = ["ci/travis", "codecov", "commitlint"]
    repo_approving_review_count = 2
    repo_teams = {
      developers = "pull"
      test-team  = "triage"
      poc-team   = "maintain"
    }
    repo_disable_branch_protection = true
  }
  backend-poc = {
    repo_name        = "backend-poc"
    repo_description = "Wow"
    # repo_approving_review_count = 1
    repo_teams         = { poc-team = "triage" }
    repo_status_checks = ["codecov"]
  }
}
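For context, a map like this would be consumed from a single resource with `for_each` — a simplified sketch (the variable declaration and resource wiring here are illustrative, not my exact config):

```hcl
variable "repos" {
  type = any
}

resource "github_repository" "this" {
  for_each = var.repos

  name        = each.value.repo_name
  description = each.value.repo_description
  # branch protection, teams, and status checks would be wired up
  # from the other repo_* attributes in separate resources
}
```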

However, I would like to have a folder containing one file per repo, and have Terraform build this map automatically from that folder. That way developers would not have to edit the .tfvars file directly.

The solution I have in mind is to create a folder containing either a JSON or a YAML file for each repo, and then read that folder to build the map. However, I have tried this without success.


How about something like this:

locals {
  files = fileset(path.module, "files/*.json")
  repos_list = [
    for file in local.files : jsondecode(file("${path.module}/${file}"))
  ]
}

output "files" {
  value = local.files
}

output "repos_list" {
  value = local.repos_list
}
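For reference, each JSON file under `files/` can be as simple as this (illustrative contents matching the plan output below):

```
# files/foo.json
{
  "name": "foo",
  "key1": "value1"
}
```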
$ terraform plan

Changes to Outputs:
  + files      = [
      + "files/bar.json",
      + "files/foo.json",
    ]
  + repos_list = [
      + {
          + key1 = "value1"
          + name = "bar"
        },
      + {
          + key1 = "value1"
          + name = "foo"
        },
    ]

So I have JSON files for each repo under the `files` folder, read all of the files, and combine them into one large list. You could then iterate through this list from one shared resource using `for_each`, for example.
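Concretely, that iteration could look like this (a sketch — I'm assuming each JSON file carries at least a `name` key, as in the output above):

```hcl
resource "github_repository" "this" {
  # for_each needs a map, so key the list entries by their "name" field
  for_each = { for repo in local.repos_list : repo.name => repo }

  name        = each.value.name
  description = lookup(each.value, "description", null)
}
```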

Thank you so much. I just did the same thing as you said. I thought there might be a better way to accomplish this.


@Tochemey Or, I would say creating a custom module may be a better way of doing this:

So you create a shared module that contains the github_repository resource, and call that module from each file:

module "repo_foo" {
  source = "./path-to-module"

  repo_name        = "foo"
  repo_description = "foo desc"
}

module "repo_bar" {
  source = "./path-to-module"

  repo_name        = "bar"
  repo_description = "bar desc"
}

This way, you can also define default values for attributes, and it's more human-readable/writable than JSON/YAML.
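For example, the module's `variables.tf` might declare defaults like this (variable names here are illustrative, matching the attributes used above):

```hcl
variable "repo_name" {
  type = string
}

variable "repo_description" {
  type    = string
  default = ""
}

variable "repo_approving_review_count" {
  type    = number
  default = 1
}
```

Callers then only need to set `repo_name`; everything else falls back to the module's defaults.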