Hello,
We are running a few projects with nomad-pack through a GitLab CI/CD pipeline. The steps we use are:
- Pull the pack repo, which lives in a separate repository.
- Vendor the pack's dependencies by loading the helper packs.
- Deploy with the nomad-pack command, passing an HCL variables file per environment. For example, for test:
nomad-pack run "packs/project1" -f "packs/project1/test-variables-project1.hcl"
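To make the last step concrete, here is a minimal, runnable sketch of how our deploy step composes that command per environment (the `deploy_pack` helper and its arguments are illustrative only, not our actual CI job; the script just prints the command rather than invoking nomad-pack, so it can run anywhere):

```shell
#!/usr/bin/env sh
# Hypothetical sketch of the pipeline's deploy step.
set -eu

deploy_pack() {
  pack="$1"   # pack name, e.g. project1
  env="$2"    # environment, e.g. test
  # Variables file follows the naming scheme from the question:
  vars_file="packs/${pack}/${env}-variables-${pack}.hcl"
  # In the real CI job this would be executed; here we only echo it:
  echo "nomad-pack run packs/${pack} -f ${vars_file}"
}

deploy_pack project1 test
```

The same helper is called with `prod`, `staging`, etc. in the other pipeline stages.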
Our Nomad cluster, which consists of 3 servers and 3 agents/clients, is well integrated with Consul and Vault, which hold all our environment variables and secrets. Usually, any change in Vault or Consul automatically refreshes the containers and propagates the change inside them.
But here is the issue: the nomad-pack deployment itself works fine, but as soon as a variable or secret changes and the container is refreshed, the job reverts to a plain Nomad job and loses its pack details. As a result, the next time the pipeline runs, we get the following error:
$ echo "${INATEC_LOCAL_CERT}" > $CI_PROJECT_DIR/server.crt # collapsed multi-line command
! Failed Job Conflict Validation
! Error: job with id "project1" already exists and is not managed by nomad pack
! Context:
! Template Name: project1/templates/project1.nomad.tpl
Error: Failed to deploy nomad pack
Cleaning up project directory and file based variables 00:01
ERROR: Job failed: exit code 1
This is a bit weird and I am unable to troubleshoot it. I understand that nomad-pack tracks some state, but it is still not Terraform, and I am not sure what the solution is.
Does nomad-pack also need some Consul integration? Has anyone come across such an issue? The documentation is too basic to give proper insight into this behavior.
Any help is greatly appreciated.
Kind Regards,
Sumeet