Is there any way to run all jobs in a directory like Terraform or Kubernetes?

Is there any way to do this? How do you handle running a bunch of different job files?

How about a GNUmakefile to automate a bunch of things.

What I miss is that there is no "automatic" way of deriving the job name from the filename, which gets in the way of a stop-all target in the makefile.

I wonder if there could ever be a feature to say nomad job stop -purge myjobfile.nomad instead of requiring the job name.
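In the meantime, the job name can usually be scraped out of the jobspec itself. This is only a sketch: it assumes the HCL file declares the job on a line of the form `job "name" {`, and `job_name_from_file` is a hypothetical helper name, not a Nomad feature.

```shell
# Hypothetical helper: derive the job name from a jobspec file.
# ASSUMPTION: the HCL file contains a line like: job "name" { ... }
job_name_from_file() {
  sed -n 's/^job[[:space:]]*"\([^"]*\)".*/\1/p' "$1" | head -n1
}

# Usage sketch (myjobfile.nomad is a placeholder filename):
#   nomad job stop -purge "$(job_name_from_file myjobfile.nomad)"
```

This keeps the makefile targets driven by filenames only, at the cost of a naive text match that would miss more exotic HCL layouts.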


Sounds like a simple for loop iterating over an array with the corresponding job files.

for job in $( ls -1 "${dir}" ); do nomad run "${dir}/${job}"; done

As I said, starting the jobs is fine … I have targets like start-all, stop-all, etc., which are built from the job names…

Hmm, and I seem to have answered my own question … I will maintain a list of names in the GNUmakefile (I prefer a makefile to a bash script for this) and then I can happily operate on the jobs in this directory! 🙂
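For illustration, here is roughly what such a stop-all target could expand to. The job names below are placeholders I invented for the example, standing in for the hand-maintained list described above; the function echoes the commands instead of invoking nomad.

```shell
# Hand-maintained list of job names, as the makefile would hold
# (these names are illustrative placeholders, not from the thread).
JOBS="web api worker"

stop_all() {
  for name in $JOBS; do
    # Real use would be: nomad job stop -purge "$name"
    echo "would stop: $name"
  done
}
```

A `start-all` target can reuse the same list, so the filename-to-name mapping lives in exactly one place.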

Uh, my reply was for the original question of the topic. 🙂

I also learned to love GNUmake, btw.


I think the following might be more appropriate … 🙂

for F in *.nomad; do nomad run "$F"; done
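One caveat with the glob version: in POSIX sh an unmatched `*.nomad` pattern stays literal, so an empty directory would hand the pattern itself to nomad run. A guarded variant, sketched with echo in place of the real nomad invocation:

```shell
run_all() {
  for F in *.nomad; do
    [ -e "$F" ] || continue   # skip the literal pattern when nothing matches
    echo "would run: $F"      # real use: nomad run "$F"
  done
}
```

In bash, `shopt -s nullglob` achieves the same thing without the per-file test.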

True, I just figured it would be nice to have it built into Nomad, since I'd assume most people regularly start a bunch of job spec files at the same time.