Multiple Tasks in a Single Job with order

That’s true; you would need to do some clean-up. Or maybe some hacky bash?

while [ "{{ env "my_key"}} != "exptected_value"]; do
  sleep 10
done
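
One caveat: template functions like env (or key) are interpolated only once, when the script is rendered, so the loop above ends up comparing two fixed strings. If the loop should actually re-check something at runtime, shelling out to the Consul CLI is closer to the intent. A minimal sketch, assuming the consul binary is on the task's PATH and "my_key" is the key being watched:

#!/usr/bin/env bash
# Poll Consul every 10 seconds until the key holds the expected value.
until [ "$(consul kv get my_key 2>/dev/null)" = "expected_value" ]; do
  sleep 10
done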

If this is a dispatch job, you could use the NOMAD_JOB_ID environment variable as a unique key.

job "example" {
  datacenters = ["dc1"]
  type        = "batch"

  parameterized {}

  group "init" {
    task "work" {
      driver = "raw_exec"

      config {
        command = "sleep"
        args    = ["20"]
      }
    }

    task "unblock" {
      driver = "raw_exec"

      config {
        command = "local/script.sh"
      }

      lifecycle {
        hook = "poststop"
      }

      template {
        data        = <<EOF
#!/usr/bin/env bash
# Signal the gate tasks by writing a Consul key named after this job.
consul kv put "${NOMAD_JOB_ID}" done
EOF
        destination = "local/script.sh"
        perms       = "0755"
      }
    }
  }

  group "main-1" {
    task "gate" {
      driver = "raw_exec"

      config {
        command = "local/script.sh"
      }

      lifecycle {
        hook = "prestart"
      }

      template {
        data        = <<EOF
#!/usr/bin/env bash
# The key function blocks template rendering until the key exists in Consul,
# so this prestart task (and the main task behind it) waits for the init
# group's poststop task to write the key.
echo {{ key (env "NOMAD_JOB_ID") }}
EOF
        destination = "local/script.sh"
        perms       = "0755"
      }
    }

    task "main" {
      driver = "raw_exec"

      config {
        command = "sleep"
        args    = ["10"]
      }
    }
  }

  group "main-2" {
    task "gate" {
      driver = "raw_exec"

      config {
        command = "local/script.sh"
      }

      lifecycle {
        hook = "prestart"
      }

      template {
        data        = <<EOF
#!/usr/bin/env bash
# Same gate as in "main-1": block until the Consul key exists.
echo {{ key (env "NOMAD_JOB_ID") }}
EOF
        destination = "local/script.sh"
        perms       = "0755"
      }
    }

    task "main" {
      driver = "raw_exec"

      config {
        command = "sleep"
        args    = ["10"]
      }
    }
  }
}
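
Since the job is parameterized, it only actually runs when dispatched, and every dispatched instance gets its own NOMAD_JOB_ID, so each run gates on its own Consul key. A rough sketch of how you would exercise it, assuming the spec above is saved as example.nomad.hcl:

# Register the parameterized job, then create a dispatched instance of it.
nomad job run example.nomad.hcl
nomad job dispatch example

# The key written by the poststop task stays in Consul, so the clean-up
# mentioned above would be something like:
consul kv delete "<dispatched-job-id>"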

Kind of a hack as well, but it works :sweat_smile:
