Nomad Job should obtain access to database automatically with Vault

I am working on setting up a Nomad Job that will be able to run a service which needs to access a database.

I have a Vault cluster already set up and active. I have built a Nomad cluster with Terraform. The Nomad cluster is also integrated with a Consul cluster.

I was able to spin up a database cluster using a Terraform provider. I was also able to connect that database cluster to the Vault cluster with the vault_database_secret_backend_* Terraform resources.
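
Roughly speaking, the Vault side of my Terraform looks something like this (the Postgres connection, names, TTL, and SQL below are placeholders for illustration rather than my exact values):

# Mount the database secrets engine.
resource "vault_mount" "db" {
  path = "database"
  type = "database"
}

# Tell Vault how to reach the database cluster (Postgres assumed here).
resource "vault_database_secret_backend_connection" "postgres" {
  backend       = vault_mount.db.path
  name          = "my-postgres"
  allowed_roles = ["my-role"]

  postgresql {
    connection_url = "postgresql://{{username}}:{{password}}@db.example.com:5432/postgres"
  }
}

# A role that controls how dynamic credentials are created and how long they live.
resource "vault_database_secret_backend_role" "role" {
  backend     = vault_mount.db.path
  name        = "my-role"
  db_name     = vault_database_secret_backend_connection.postgres.name
  default_ttl = 3600

  creation_statements = [
    "CREATE ROLE \"{{name}}\" WITH LOGIN PASSWORD '{{password}}' VALID UNTIL '{{expiration}}';",
    "GRANT SELECT ON ALL TABLES IN SCHEMA public TO \"{{name}}\";",
  ]
}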

Now I just need to figure out what to put in my Nomad job specification file so that the job's task can access the database.

I could certainly manually write a secret to Vault, at some arbitrary location:

vault write my-secrets/database/credentials creds=@creds.json

Then I could have the job read those credentials with a template:

{{with secret "my-secrets/database/credentials"}}{{.Data.creds}}{{end}}

However, it seems like the Nomad Job should be able to lease some credentials automatically from Vault that would allow it to connect to that database so that I would not need to add credentials to the Nomad Job specification.

How would I accomplish this?

However, it seems like the Nomad Job should be able to lease some credentials automatically from Vault that would allow it to connect to that database so that I would not need to add credentials to the Nomad Job specification.

Yes! This is one of the key reasons to make use of Vault with Nomad. If you haven’t already, check out the Nomad+Vault tutorial [1], which has you set up Nomad with Vault to generate dynamic secrets for Postgres using Vault’s database secrets engine.

[1] Vault Integration and Retrieving Dynamic Secrets | Nomad | HashiCorp Developer
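
At a high level, what that tutorial sets up on the Nomad side is the vault stanza in the agent configuration, something along these lines (the address and token role name are just examples):

# Nomad agent configuration (servers and clients).
vault {
  enabled = true
  address = "https://vault.example.com:8200"

  # Servers only: the Vault token role used to mint short-lived tokens for tasks.
  create_from_role = "nomad-cluster"

  # The server's own Vault token is usually supplied out of band
  # (for example via the VAULT_TOKEN environment variable) rather
  # than being written into this file.
}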

@seth.hoenig, I think I have looked at that document. I have looked at so many, they are all blurring together in my mind. :smiley:

Looking at it again though is helpful. I am seeing some parts of it that I probably missed the first time.

One thing that would make the article even more helpful is if it also included doing the whole project with Terraform. That seems to be where I am getting hung up. The Terraform resources take care of many aspects under the hood, or are defined in slightly different ways. So I am trying to understand the process from that perspective.

After digesting that article a bit more and reviewing some other references, I have made some more progress.

I run the job with these blocks inside the task:

      vault {
        policies = ["database-policy"]
      }

      template {
        destination   = "config"
        data          = <<-EOT

          {{ with secret "<vault_mount.path>/creds/<vault_database_secret_backend_role.name>" }}
          user = {{.Data.username}}
          password = {{.Data.password}}
          {{end}}

        EOT
      }
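
Those blocks sit inside a task in the job file. Spelled out in full, it looks roughly like this (the job, group, task, and variable names are placeholders, and env = true is just one convenient way to hand the rendered credentials to the task as environment variables):

job "my-service" {
  datacenters = ["dc1"]

  group "app" {
    task "app" {
      driver = "docker"

      config {
        image = "my-service:latest"
      }

      # Nomad asks Vault for a token carrying this policy on behalf of the task.
      vault {
        policies = ["database-policy"]
      }

      # Render dynamic credentials from the database secrets engine and
      # expose them to the task as environment variables.
      template {
        destination = "secrets/db.env"
        env         = true
        data        = <<-EOT
          {{ with secret "<vault_mount.path>/creds/<vault_database_secret_backend_role.name>" }}
          DB_USER={{ .Data.username }}
          DB_PASSWORD={{ .Data.password }}
          {{ end }}
        EOT
      }
    }
  }
}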

Vault has the policy:

resource "vault_policy" "database-policy" {
  name = "database-policy"

  policy = <<-EOT
    path "<vault_mount.path>/creds/<vault_database_secret_backend_role.name>" {
      capabilities = ["read"]
    }
  EOT
}

From the command line I can get new credentials:

$ vault read <vault_mount.path>/creds/<vault_database_secret_backend_role.name>
Key                Value
---                -----
lease_id           <vault_mount.path>/creds/<vault_database_secret_backend_role.name>/<redacted>
lease_duration     768h
lease_renewable    true
password           <redacted>
username           <redacted>

The Nomad Job fails with this message:

Template	Missing: vault.read(<vault_mount.path>/creds/<vault_database_secret_backend_role.name>)

Apparently I am still missing something… :thinking:

If this is a new Vault cluster, you’re probably using the v2 KV store. In that case you will need to add the data path component to your Vault policy:

resource "vault_policy" "database-policy" {
  name = "database-policy"

  policy = <<-EOT
-   path "<vault_mount.path>/creds/<vault_database_secret_backend_role.name>" {
+   path "<vault_mount.path>/data/creds/<vault_database_secret_backend_role.name>" {
      capabilities = ["read"]
    }
  EOT
}

Then use data to access the secret in your job as well:

      template {
        destination   = "config"
        data          = <<-EOT

-         {{ with secret "<vault_mount.path>/creds/<vault_database_secret_backend_role.name>" }}
-         user = {{.Data.username}}
-         password = {{.Data.password}}
+         {{ with secret "<vault_mount.path>/data/creds/<vault_database_secret_backend_role.name>" }}
+         user = {{.Data.data.username}}
+         password = {{.Data.data.password}}
          {{end}}

        EOT
      }

For more info on Vault KV v2, check the template docs.

Oh, I read about that change. I did not try it because I did not think the vault_database_secret_backend_[connection|role] resources were using KV v2. I tried that change just now at your suggestion. I was hopeful, but the result did not change.

Missing: vault.read(<vault_mount.path>/data/creds/<vault_database_secret_backend_role.name>)

It makes sense that /data/ is not the problem, because from the command line the vault read works without it. When I add /data/ on the command line I see:

No value found at <vault_mount.path>/data/creds/<vault_database_secret_backend_role.name>

The Vault Integration and Retrieving Dynamic Secrets tutorial bears this out as well.

The fact that Nomad seems to be running into a permission problem makes me wonder what credentials Nomad is using when it makes the request to Vault, such that the request gets denied. When I make the request on the command line, I am using an administrative token that has all access. Perhaps the policy that I specify in the job is not granting enough permission.

Oh, you are using a dynamic secrets backend, so yeah, the KV v2 change doesn’t make sense, sorry for that :person_facepalming:

Check if the Nomad server token has the right permission to access this path. You would need to adjust the token policy to include it.
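
If you are using the token-role based integration, that usually means making sure the token role the Nomad servers create tokens from allows the policy the job asks for, e.g. something like this in Terraform (the nomad-cluster role name is just the usual example):

# Token role the Nomad servers use when creating Vault tokens for tasks.
resource "vault_token_auth_backend_role" "nomad_cluster" {
  role_name = "nomad-cluster"

  # Policies that jobs are allowed to request in their vault stanza.
  allowed_policies    = ["database-policy"]
  disallowed_policies = ["root"]

  orphan       = true
  token_period = 259200
}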

Ah hah! It turned out to be something even simpler. I did look up the policies on the server token. The token did not have that policy listed, but that did not seem to be a problem. The problem turned out to be that the Nomad job was specifying a policy name that was different from the actual policy name. :man_facepalming:

Such a small thing. That is why they call them bugs. :bug:

Thank you for the assistance. :smiley:

For future reference:

If you see this general message in Nomad:
Missing: vault.read(<vault-mount-point>/creds/<role>)

These are some possible translations:

  • the mount point is missing (or misspelled)
  • the endpoint is missing (or misspelled)
  • permissions for that path are not defined in the policy for the user/service
  • the policy name referenced in the job does not match an existing Vault policy
  • the backend service for the mount point is not responding properly (a timeout or some other error)