Hello everyone,
I’m deploying a Spark cluster (1 master, 2 workers, and a MinIO node) on OpenStack (OVH) using Terraform.
All the instances are in a private network. From my local machine I can already connect to the master node using my public SSH key.
What I want to achieve now is to manage the internal SSH configuration automatically with Terraform:
- Generate an SSH RSA key pair on the master node.
- Distribute the public key to the worker nodes and the MinIO node.
- Allow the master to connect to the workers and MinIO via SSH without manual configuration (a rough sketch of what I have in mind follows this list).
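To make this concrete, here is roughly what I picture for the tls_private_key route. This is only a sketch, not working code: the resource name spark_internal, the worker resource, and the variables var.image_name, var.flavor_name, var.private_network are placeholders, and the key would live in Terraform state rather than being generated on the master itself.

#######################
# Internal key pair (sketch)
#######################
# Generated by Terraform and kept in state, not actually created on the master.
resource "tls_private_key" "spark_internal" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

#######################
# Worker nodes (sketch): authorize the internal public key via cloud-init
#######################
resource "openstack_compute_instance_v2" "worker" {
  count       = 2
  name        = "spark-worker-${count.index}"
  image_name  = var.image_name
  flavor_name = var.flavor_name
  key_pair    = openstack_compute_keypair_v2.spark_key.name

  network {
    name = var.private_network
  }

  # cloud-init adds the Terraform-generated public key to authorized_keys,
  # so the master (holding the private half) could SSH in.
  user_data = <<-EOF
    #cloud-config
    ssh_authorized_keys:
      - ${tls_private_key.spark_internal.public_key_openssh}
  EOF
}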
My question is:
What is the best Terraform approach to implement this?
- Should I use tls_private_key and inject the keys into each instance (roughly as sketched above)?
- Or should I rely on provisioners (remote-exec, file) to copy the generated key from the master to the workers? (A rough sketch of what I mean is below.)
- Are there recommended patterns for handling this securely and automatically with Terraform?
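For the provisioner option, the closest I can picture (again just a sketch that builds on the tls_private_key.spark_internal resource above; the master resource, the ubuntu user, and var.ssh_private_key are placeholders) is Terraform generating the key and pushing the private half onto the master with file / remote-exec, rather than literally generating it on the master:

#######################
# Provisioner route (sketch): push the internal private key onto the master
#######################
resource "null_resource" "master_internal_key" {
  connection {
    type        = "ssh"
    host        = openstack_compute_instance_v2.master.access_ip_v4
    user        = "ubuntu"
    private_key = file(var.ssh_private_key)
  }

  # Copy the Terraform-generated private key to the master's .ssh directory
  provisioner "file" {
    content     = tls_private_key.spark_internal.private_key_pem
    destination = "/home/ubuntu/.ssh/id_rsa"
  }

  # Tighten permissions so the SSH client accepts the key
  provisioner "remote-exec" {
    inline = ["chmod 600 /home/ubuntu/.ssh/id_rsa"]
  }
}

I realize that either way the private key ends up in the Terraform state, which is part of why I'm asking about the secure, recommended pattern.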
I’d really appreciate any advice, examples, or best practices on how to set this up.
My current code:
#######################
# SSH Key
#######################
resource "openstack_compute_keypair_v2" "spark_key" {
name = "spark-cluster-key"
public_key = file(var.ssh_public_key)
}
Thanks in advance for your help!