Shell script as "dynamic" data source?

Hi all!

I am just fiddling around with a TF script and Terragrunt to deploy multiple OpenStack environments at once. All environments are mostly identical, with the few differences handled via a tfvars file. So far things run fine, but there's one thing I need help with…

In every environment, a number of VMs get IPs from a Floating IP Pool within OpenStack while being created. Those IPs are not reserved, so they are kinda dynamic. The big problem: I need Terraform to create Security Rules in every environment containing the IPs from all environments. I have a shell script which collects the IPs, and on its own it works fine. But unfortunately I can't use local-exec to read data from the shell script when needed, as local-exec has no output option.

Question: Is there a way to collect the IP addresses from all environments and use them in every environment while running “terraform plan/apply/…”?

Thanks in advance and best regards,

Hi @Olly,

The provider hashicorp/external has a data source called external which runs an external program and parses its stdout as a JSON object with string values.

The requirement that all values be strings makes it slightly awkward for your situation of capturing a list, but if you can make your script produce a JSON string containing multiple addresses separated by spaces, you can then use Terraform’s split function to transform that into a Terraform list.
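As a minimal sketch of how this could be wired up (the script name `./collect_ips.sh` and the `"ips"` result key are assumptions for illustration, not something from your setup):

```hcl
# Assumes ./collect_ips.sh prints a JSON object such as
#   {"ips": "10.0.0.1 10.0.0.2 10.0.0.3"}
# on stdout, as the external data source requires.
data "external" "env_ips" {
  program = ["./collect_ips.sh"]
}

locals {
  # split turns the single space-separated string value
  # back into a Terraform list of strings.
  all_ips = split(" ", data.external.env_ips.result.ips)
}
```

The `local.all_ips` list can then be iterated over (e.g. with `for_each` or `count`) when creating the security rules.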

Hi @apparentlymart ,

looks like that's the right way. My bash script emits the desired JSON, with all the IPs as a single space-separated value, which I can then handle with the split() function.
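For reference, a minimal sketch of such a script (the `ips_*.txt` file naming is a hypothetical stand-in for wherever the collected IPs actually live):

```shell
#!/bin/sh
# Collect one IP per line from files matching ips_*.txt and emit them
# as a single space-separated string inside a JSON object, which is
# the shape the hashicorp/external data source expects.
ips=$(cat ips_*.txt 2>/dev/null | tr '\n' ' ' | sed 's/ *$//')
printf '{"ips": "%s"}\n' "$ips"
```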

Unfortunately, Terraform tries to read all data sources before it starts creating any resources. So when I try to plan or apply my configuration, the data source just blocks further processing: my bash script sits there waiting for outputs that don't exist yet.

Isn't there a way to have a data source collect its data at a specific point in time? Ideally, there would be a way to ensure that a resource and its data source are processed last in line when running apply.