Hello Everyone,
I have been working on this topic for some time now and wanted to ask for your opinions:
The general problem I see when handling AWS IoT certificates in Terraform is that the aws_iot_certificate resource stores the certificate's private key in plaintext in the state file.
Of course one could argue that anyone with access to the state file already has some sort of elevated privileges, but I personally don't think that justifies storing a private key in plaintext in a file when it isn't necessary.
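To make the problem concrete, here is a minimal sketch (resource name is illustrative): when aws_iot_certificate is used without a csr argument, AWS generates the key pair and the provider exposes it as the private_key attribute, which ends up in the state.

    resource "aws_iot_certificate" "example" {
      # No csr is supplied, so AWS generates the key pair and the provider
      # stores the resulting private_key attribute in the Terraform state.
      active = true
    }

    output "private_key" {
      value     = aws_iot_certificate.example.private_key
      sensitive = true # marked sensitive, but still written to the state in plaintext
    }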
An ideal solution would be to run Terraform from a GitLab pipeline, with the certificates exported as pipeline artifacts and no private key stored in the state file.
So far we have tried the following approaches:
1. External Data Resource
Using an external data source to run the AWS CLI command aws iot create-keys-and-certificate and return the ARN and ID works to some extent, but since Terraform re-runs the external program every time it reads the data source, the command is executed again on every plan/apply and the existing certificates are replaced. A simplified sketch of that pattern is shown below.
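Roughly, it looked like this (simplified sketch, the exact wiring on our side differed slightly):

    data "external" "iot_certificate" {
      # The program runs on every refresh, so a brand-new certificate is
      # created each time instead of reusing the existing one.
      program = [
        "bash", "-c",
        "aws iot create-keys-and-certificate --set-as-active | jq '{id: .certificateId, arn: .certificateArn}'",
      ]
    }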
2. null_resource with local-exec provisioner and an external data source
Using a local-exec provisioner to generate the key pair, storing the artifacts on the filesystem and then loading them with an external data source. This works as intended: the certificates are stored in local files and are automatically attached to a device via the ARN stored in an artifact.
resource "null_resource" "generate_cert" {
for_each = var.things
provisioner "local-exec" {
interpreter = ["/bin/bash", "-c"]
command = <<-EOT
NAME="${each.key}"
REGION=${data.aws_region.current.name}
CERT_DIR="./certificates/$NAME"
CERTIFICATE_PATH="$CERT_DIR/certificate.pem"
PRIVATE_KEY_PATH="$CERT_DIR/private.pem"
PUBLIC_KEY_PATH="$CERT_DIR/public.pem"
ARTIFACT_DIR="./artifacts"
ARTIFACT_PATH="$ARTIFACT_DIR/$NAME-tf-artifact.json"
mkdir -p "$CERT_DIR"
mkdir -p "$ARTIFACT_DIR"
aws iot create-keys-and-certificate \
--set-as-active \
--certificate-pem-outfile "$CERTIFICATE_PATH" \
--public-key-outfile "$PUBLIC_KEY_PATH" \
--private-key-outfile "$PRIVATE_KEY_PATH" \
--region "$REGION" | jq '{id: .certificateId, arn: .certificateArn }' \
>> "$ARTIFACT_PATH"
EOT
}
}
data "external" "load_certificates" {
for_each = var.things
program = ["cat", "./artifacts/${each.key}-tf-artifact.json" ]
depends_on = [ null_resource.generate_key_par ]
}
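For completeness, the ARN from the artifact is then consumed along these lines (simplified sketch; the actual thing and attachment resources on our side differ in detail):

    resource "aws_iot_thing" "device" {
      for_each = var.things
      name     = each.key
    }

    resource "aws_iot_thing_principal_attachment" "cert_attachment" {
      for_each = var.things
      thing    = aws_iot_thing.device[each.key].name
      # "arn" comes from the JSON artifact produced by the local-exec provisioner.
      principal = data.external.load_certificates[each.key].result.arn
    }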
This approach does not scale: the local-exec provisioner is executed again each time a new entry is added to the set var.things, so all certificates are replaced as well. The second problem is that it relies heavily on the generated artifacts being present on the filesystem, which cannot be guaranteed since the GitLab runner's filesystem is not persistent.
What are your thoughts on this topic? Is Terraform able to provide such functionality, or do we need to use a different toolchain for certificate provisioning because Terraform is not intended for artifact creation?
I appreciate your help.
Thank you in advance.