The `external` data source is designed to integrate with external software written specifically for it, which must generate JSON, so any output that isn’t JSON is considered invalid.
However, there is a specific protocol for the external program to signal failure, which then allows you to see an arbitrary failure message (there’s a small sketch after this list):
- Print an error message to stderr
- Exit with a non-zero exit status
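For example, a minimal sketch of a wrapper following that protocol might look like this, where `some-command`, the `result` key, and the use of `jq` are placeholders I’ve assumed rather than anything from your setup:

```bash
#!/usr/bin/env bash

# Hypothetical sketch: "some-command" stands in for whatever actually
# produces the data for the external data source.
if ! result="$(some-command)"; then
  # Anything printed to stderr becomes part of Terraform's error message
  # once the program exits with a non-zero status.
  echo "some-command failed, so no data is available" >&2
  exit 1
fi

# On success, print exactly one JSON object (with string values) to stdout.
jq -n --arg result "$result" '{"result": $result}'
```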
Bash’s default behavior when it encounters an error is to keep running subsequent commands and hope for the best, so unfortunately it isn’t an ideal language for implementing integrations with the `external` provider. However, in our docs example Processing JSON in shell scripts the script intentionally starts with `set -e` to make Bash fail immediately with an unsuccessful status if any intermediate command fails.
In your case I don’t think `set -e` alone would be sufficient, because the command that might fail is in a pipeline, so you’d also need to add `-o pipefail` to get that one to fail in a useful way:
`set -e -o pipefail`
With both of those options enabled, Bash should fail if any of the intermediate commands fails. As long as those commands also print error messages to stderr when they fail, this combination should implement the error-reporting protocol that the `external` data source is expecting.
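As a rough illustration, with `produce-data` and the `jq` filter standing in for whatever your real script runs, a failure anywhere in the pipeline below would now abort the script with that command’s exit status:

```bash
#!/usr/bin/env bash
set -e -o pipefail

# Without pipefail, only jq's exit status would decide whether this
# pipeline "failed"; with it, a failure in produce-data also counts,
# and set -e then aborts the script immediately. Provided produce-data
# writes its error to stderr, that satisfies the failure protocol above.
produce-data | jq '{result: .some_field}'
```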
Your pipeline is also inside a command substitution `$( ... )`, and so I think – but may be misremembering – that Bash will not actually raise an error by default if that pipeline fails. To catch that case it might be necessary to manually test the exit status variable `$?` using an `if` statement, at which point this is of course getting rather messy and it may be better to switch to a different programming language to implement the integration.
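If you do go that route, a sketch of the explicit check might look like the following, again with `produce-data` and `.some_field` as placeholders:

```bash
#!/usr/bin/env bash
set -o pipefail

# Hypothetical sketch: check the exit status by hand instead of relying
# on set -e, since the failure happens inside a command substitution.
result="$(produce-data | jq -r '.some_field')"
status=$?
if [ "$status" -ne 0 ]; then
  echo "produce-data pipeline failed with status $status" >&2
  exit "$status"
fi

# Success: emit the JSON object the external data source expects.
jq -n --arg result "$result" '{"result": $result}'
```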
A long, long time ago, before I worked on Terraform at HashiCorp, I ran Terraform and Consul together in production. Our variation of what you’re doing here was to include in the VM image a ready-to-run script for bootstrapping the Consul agent, which meant the command line to run it was relatively simple and the script could be written in a more robust programming language where it was easier to handle errors.
Note that if `consul acl bootstrap` is actually modifying Consul in some way then this is technically a bit of a misuse of the `external` data source, because data sources are supposed to only read data, not perform side effects. If it’s working for you then I’m not going to tell you not to do it, but please do keep in mind that Terraform will re-run that script on every new plan, because Terraform does not expect a data resource to do anything other than return some data.