Hi,
we are running a Nomad cluster with multiple jobs on it. Each job is defined in its own file, e.g.
job1.nomad
job2.nomad
The job specification looks like this:
job "job1" {
datacenters = var.datacenters
type = "service"
group "job1" {
network {
mode = "bridge"
}
task "job1" {
driver = "docker"
config {
image = var.docker_image_job1
}
}
}
}
variable "datacenters" {}
variable "docker_image_job1" {}
variable "docker_image_job2" {}
It's a simple Docker job that uses two variables per job (datacenters and docker_image_jobX): job1 uses docker_image_job1, and job2 (defined in another file) uses docker_image_job2.
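For completeness, job2.nomad looks the same apart from the job/group/task names and the image variable, roughly:

  task "job2" {
    driver = "docker"

    config {
      image = var.docker_image_job2
    }
  }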
Normally you would only declare a job's own variables in its job file, but we use a single var file to set the values for all jobs. It looks like this (version.vars):
datacenters = [ "local" ]
docker_image_job1 = "somedocker:1.2.3"
docker_image_job2 = "somedocker:1.2.4"
The problem with version.vars is that when we use it to deploy a job, Nomad expects every variable defined in the file to also be declared in that job file. E.g.:

nomad job plan -var-file="version.vars" job1.nomad

This only works when all variables defined in version.vars are also declared in the Nomad job file. Since we have more than 50 jobs, we don't want to maintain 50 variable declarations (most of them unused) in each job file just to satisfy version.vars.
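In other words, every job file would need stub declarations like these just so the var file is accepted (a sketch; docker_image_job3 etc. stand in for the other jobs):

  # in job1.nomad, unused by job1, declared only so version.vars parses
  variable "docker_image_job2" {}
  variable "docker_image_job3" {}
  # ... one declaration per job, ~50 in total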
I was looking for options such as an include mechanism, i.e. a shared file containing the variable declarations that could be injected into every job file, but I could not find a solution yet.
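The obvious fallback would be to split version.vars into one var file per job, but that just moves the duplication around (hypothetical job1.vars):

  datacenters       = [ "local" ]
  docker_image_job1 = "somedocker:1.2.3"

and then:

  nomad job plan -var-file="job1.vars" job1.nomad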
Is there any way to do this, or is my approach not suitable at all?
Thanks,
bert