How to automate SSH key generation and distribution inside a private cluster with Terraform

Hello everyone,

I’m deploying a Spark cluster (1 master, 2 workers, and a MinIO node) on OpenStack (OVH) using Terraform.
All the instances are in a private network. From my local machine I can already connect to the master node using my public SSH key.

What I want to achieve now is to manage the internal SSH configuration automatically with Terraform:

  • Generate an SSH RSA key pair on the master node.

  • Distribute the public key to the worker nodes and the MinIO node.

  • Allow the master to connect to the workers and MinIO via SSH without manual configuration.

My question is:

What is the best Terraform approach to implement this?

  • Should I use tls_private_key and inject the keys into each instance? (See the first sketch below.)

  • Or should I rely on provisioners (remote-exec, file) to copy the generated key from the master to the workers? (See the second sketch, after my current code.)

  • Are there recommended patterns for handling this securely and automatically with Terraform?

I’d really appreciate any advice, examples, or best practices on how to set this up.
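
To make the first option concrete, here is a rough, untested sketch of what I had in mind: the key pair is generated by the hashicorp/tls provider and distributed through cloud-init, so Terraform never has to open an SSH connection itself. The ubuntu user, the paths, and the resource names are just placeholders, and image/flavor/network arguments are omitted.

#######################
# Option 1 sketch (untested): tls_private_key + cloud-init
#######################

# Generate the cluster-internal key pair in Terraform.
# Note: the private key ends up in the Terraform state and in the
# instance metadata (user_data).
resource "tls_private_key" "internal" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

# The master receives the private key so it can SSH to the workers and MinIO.
resource "openstack_compute_instance_v2" "spark_master" {
  name     = "spark-master"
  key_pair = openstack_compute_keypair_v2.spark_key.name
  # image_name, flavor_name, network, security_groups, ... as in the real config

  user_data = <<-CLOUDINIT
    #cloud-config
    write_files:
      - path: /home/ubuntu/.ssh/id_rsa
        owner: ubuntu:ubuntu
        permissions: "0600"
        defer: true   # write after the default user exists (cloud-init >= 21.4)
        content: |
          ${indent(6, tls_private_key.internal.private_key_pem)}
  CLOUDINIT
}

# Workers (and the MinIO node, same pattern) get the matching public key
# added to the default user's authorized_keys.
resource "openstack_compute_instance_v2" "spark_worker" {
  count    = 2
  name     = "spark-worker-${count.index}"
  key_pair = openstack_compute_keypair_v2.spark_key.name
  # image_name, flavor_name, network, security_groups, ... as in the real config

  user_data = <<-CLOUDINIT
    #cloud-config
    ssh_authorized_keys:
      - ${trimspace(tls_private_key.internal.public_key_openssh)}
  CLOUDINIT
}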

My code

#######################
# SSH Key
#######################

resource "openstack_compute_keypair_v2" "spark_key" {
  name       = "spark-cluster-key"
  public_key = file(var.ssh_public_key)
}
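
And this is roughly how I picture the provisioner variant (also untested, building on tls_private_key.internal and the instance resources from the first sketch, but without their user_data). Here Terraform itself pushes the keys over SSH, reaching the workers through the master as a bastion; the ubuntu user and var.ssh_private_key are placeholders for my actual login details.

#######################
# Option 2 sketch (untested): provisioners (file / remote-exec)
#######################

# Copy the generated private key onto the master once it is reachable.
resource "null_resource" "master_key" {
  triggers = {
    master_id = openstack_compute_instance_v2.spark_master.id
  }

  connection {
    type        = "ssh"
    host        = openstack_compute_instance_v2.spark_master.access_ip_v4
    user        = "ubuntu"
    private_key = file(var.ssh_private_key) # my local private key (placeholder variable)
  }

  provisioner "file" {
    content     = tls_private_key.internal.private_key_pem
    destination = "/home/ubuntu/.ssh/id_rsa"
  }

  provisioner "remote-exec" {
    inline = ["chmod 600 /home/ubuntu/.ssh/id_rsa"]
  }
}

# Append the matching public key on each worker (and the MinIO node, same
# pattern), connecting through the master as an SSH bastion.
resource "null_resource" "worker_key" {
  count = 2

  triggers = {
    worker_id = openstack_compute_instance_v2.spark_worker[count.index].id
  }

  connection {
    type         = "ssh"
    host         = openstack_compute_instance_v2.spark_worker[count.index].access_ip_v4
    user         = "ubuntu"
    private_key  = file(var.ssh_private_key)
    bastion_host = openstack_compute_instance_v2.spark_master.access_ip_v4
    bastion_user = "ubuntu"
  }

  provisioner "remote-exec" {
    inline = [
      "echo '${trimspace(tls_private_key.internal.public_key_openssh)}' >> /home/ubuntu/.ssh/authorized_keys",
    ]
  }
}

I'm aware the Terraform documentation describes provisioners as a last resort, which is part of why I'm asking which of the two approaches is preferable here.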

Thanks in advance for your help 🙂

@amineamine7897 while creating that infrastructure, did you try configuring them all with the same key?

Actually, I put the public key of my local machine (outside the private network) into .ssh/authorized_keys on all the cluster VMs in the private network [master, workers, and MinIO (DB)], but I don’t have internal SSH connectivity inside the private network.
