Hello @apparentlymart,
Please find below the error output from terraform plan:
zampc816:test zabelesn$ terraform plan
Releasing state lock. This may take a few moments...
Error: Invalid reference from destroy provisioner
on ../../modules/k8s_manifest/main.tf line 29, in resource "null_resource" "kubernetes_manifest":
29: command = templatefile("${path.module}/kubectl_apply.tpl.sh", local.template_input)
Destroy-time provisioners and their connection configurations may only
reference attributes of the related resource, via 'self', 'count.index', or
'each.key'.
References to other resources during the destroy phase can cause dependency
cycles and interact poorly with create_before_destroy.
(The same error block is repeated four more times in the output.)
terraform version
Terraform v0.13.2
+ provider registry.terraform.io/carlpett/sops v0.6.0
+ provider registry.terraform.io/hashicorp/aws v3.33.0
+ provider registry.terraform.io/hashicorp/external v2.1.0
+ provider registry.terraform.io/hashicorp/helm v1.3.0
+ provider registry.terraform.io/hashicorp/kubernetes v1.13.1
+ provider registry.terraform.io/hashicorp/local v2.1.0
+ provider registry.terraform.io/hashicorp/null v3.1.0
+ provider registry.terraform.io/hashicorp/random v3.1.0
I recently upgraded from version 0.12.29 to 0.13.2, and this piece of code started failing during terraform plan.
The code related to it:
locals {
  template_input = {
    url        = var.url
    manifest   = var.data
    validate   = var.validate
    namespace  = var.namespace
    kubeconfig = var.kubeconfig
  }
}

resource "null_resource" "kubernetes_manifest" {
  triggers = {
    manifest_sha1 = sha1(jsonencode(var.trigger == null ? local.template_input : var.trigger))
  }

  provisioner "local-exec" {
    environment = {
      KUBECONFIG = "/tmp/kubeconfig_${uuid()}"
    }

    command = templatefile("${path.module}/kubectl_apply.tpl.sh", local.template_input)
  }

  provisioner "local-exec" {
    when = destroy

    environment = {
      KUBECONFIG = "/tmp/kubeconfig_${uuid()}"
      DESTROY    = true
    }

    command = templatefile("${path.module}/kubectl_apply.tpl.sh", local.template_input)
  }
}
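For completeness, the module is called roughly like this; the values below are placeholders of mine, not the real inputs:

module "k8s_manifest" {
  source = "../../modules/k8s_manifest"

  # Placeholder values for illustration only
  url        = null
  data       = file("manifest.yaml")
  validate   = true
  namespace  = "example"
  kubeconfig = var.kubeconfig
  trigger    = null
}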
Both provisioners render and invoke the shell script template below:
#!/usr/bin/env sh
# This file is rendered by templatefile(); Terraform fills in the template
# placeholders, while the $VAR references are expanded by the shell at run time.
if [ -z "$DESTROY" ]
then
  OPERATOR="apply --validate=${validate}"
else
  OPERATOR="delete"
fi

# Write the kubeconfig passed in from Terraform to the temporary file
# pointed at by $KUBECONFIG.
cat <<EOF > $KUBECONFIG
${kubeconfig}
EOF

# Pipe the manifest to kubectl, either inline or fetched from a URL.
%{if url == null}
cat <<EOF |
${manifest}
EOF
%{else}
curl ${url} |
%{~ endif ~}
kubectl $OPERATOR \
%{if namespace != null}--namespace ${namespace} %{endif}\
-f -

ret_code=$?
rm $KUBECONFIG
exit $ret_code
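Note that templatefile() only substitutes the template placeholders; shell variables such as $OPERATOR and $KUBECONFIG are left for the shell to expand at run time. With url = null and a namespace set, the rendered script comes out roughly like this (kubeconfig, manifest, and namespace contents are illustrative):

#!/usr/bin/env sh
if [ -z "$DESTROY" ]
then
  OPERATOR="apply --validate=true"
else
  OPERATOR="delete"
fi
cat <<EOF > $KUBECONFIG
(kubeconfig contents)
EOF
cat <<EOF |
(manifest contents)
EOF
kubectl $OPERATOR \
--namespace example \
-f -
ret_code=$?
rm $KUBECONFIG
exit $ret_code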
As mentioned earlier, I tried to play around with multiple combinations, such as referencing self., fetching the values from a map, and exporting the values in the null_resource, but nothing worked well, so for now I have rolled back to the previous version where it was working.
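A minimal sketch of the self-based attempt, with the template inputs stashed as strings in triggers so the destroy provisioner can reach them through self (the extra trigger keys are my own naming, and this variant still did not work well for me; create-time provisioner omitted):

resource "null_resource" "kubernetes_manifest" {
  triggers = {
    manifest_sha1 = sha1(jsonencode(var.trigger == null ? local.template_input : var.trigger))

    # Stash every value the destroy provisioner needs; trigger values
    # must be strings, so nulls are mapped to "".
    url        = var.url == null ? "" : var.url
    manifest   = var.data
    validate   = tostring(var.validate)
    namespace  = var.namespace == null ? "" : var.namespace
    kubeconfig = var.kubeconfig
  }

  provisioner "local-exec" {
    when = destroy

    environment = {
      KUBECONFIG = "/tmp/kubeconfig_${uuid()}"
      DESTROY    = true
    }

    # Only 'self' may be referenced here, so the template input is
    # rebuilt from the stashed trigger values.
    command = templatefile("${path.module}/kubectl_apply.tpl.sh", {
      url        = self.triggers.url == "" ? null : self.triggers.url
      manifest   = self.triggers.manifest
      validate   = self.triggers.validate
      namespace  = self.triggers.namespace == "" ? null : self.triggers.namespace
      kubeconfig = self.triggers.kubeconfig
    })
  }
}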
I hope I have included all the information needed to look into the issue; if anything is missing, let me know and I will add it.
Your help is appreciated.
Thanks,
Snehil Belekar