On-instance Vault integration with multiple Kubernetes clusters

Hi Team,

I’m new to Vault and trying to integrate it with Kubernetes clusters.
Vault runs on a single AWS instance with Consul as the storage backend. We are looking to integrate this Vault instance with multiple EKS clusters.
From the docs I can see that to integrate Vault we have to install it in Kubernetes, provide the kubernetes_host for the cluster, and then enable the Kubernetes auth method.
doc link

vault auth enable kubernetes

vault write auth/kubernetes/config \
   token_reviewer_jwt="$(cat /var/run/secrets/kubernetes.io/serviceaccount/token)" \
   kubernetes_host=https://${KUBERNETES_PORT_443_TCP_ADDR}:443 \
   kubernetes_ca_cert=@/var/run/secrets/kubernetes.io/serviceaccount/ca.crt

Is there a way to provide multiple hosts, or to use a single Vault instance with multiple clusters?

Thanks.

This example shows a simple setup for someone who has only one Kubernetes cluster … but you can enable multiple instances of the Kubernetes auth method at different Vault paths, and point each one to a different cluster.
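
For example, something along these lines (just a sketch; the mount paths kubernetes-cluster-a / kubernetes-cluster-b, the API endpoints and the file names are placeholders to replace with your own cluster details):

# Mount one instance of the Kubernetes auth method per cluster
vault auth enable -path=kubernetes-cluster-a kubernetes
vault auth enable -path=kubernetes-cluster-b kubernetes

# Configure each mount with that cluster's API endpoint, CA certificate and
# token-reviewer service account JWT (exported from the cluster beforehand)
vault write auth/kubernetes-cluster-a/config \
   token_reviewer_jwt="$(cat cluster-a-reviewer-token.jwt)" \
   kubernetes_host="https://CLUSTER_A_API_ENDPOINT:443" \
   kubernetes_ca_cert=@cluster-a-ca.crt

vault write auth/kubernetes-cluster-b/config \
   token_reviewer_jwt="$(cat cluster-b-reviewer-token.jwt)" \
   kubernetes_host="https://CLUSTER_B_API_ENDPOINT:443" \
   kubernetes_ca_cert=@cluster-b-ca.crt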


Since you have to initiate the connection from the Kubernetes hosts, you are limited to one at a time. Each cluster would need its own Vault role, since it uses the local service account and JWT token.
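
Roughly like this, continuing the example mount paths from above (the role, service account, namespace and policy names are only placeholders); each cluster's pods then log in against their own mount with their local service account token:

# Define a role on each mount, bound to that cluster's service account
vault write auth/kubernetes-cluster-a/role/my-app \
   bound_service_account_names=my-app \
   bound_service_account_namespaces=default \
   token_policies=my-app-policy \
   token_ttl=1h

# From a pod in cluster A, log in against cluster A's mount
vault write auth/kubernetes-cluster-a/login \
   role=my-app \
   jwt="$(cat /var/run/secrets/kubernetes.io/serviceaccount/token)"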


Thanks @maxb, will try this approach.

Hi @maxb @aram, I found this blog: vault with multicluster. It uses the approach @maxb shared, but the EKS JWT token expires after an hour. Is there a way to handle this natively in Vault, or is writing a script to renew the tokens the only way? Thanks!