Authenticating multiple Kubernetes clusters

Hi, I’ve looked through existing topics but couldn’t find an answer that satisfied me. I know that we can only have a single Kubernetes auth config per path in Vault. My use case: our platform deploys multiple Kubernetes clusters around the world to host latency-sensitive applications, i.e. game servers (so I really need all those clusters).

Those workloads need access to a single Vault, and since they all deploy the same app (the game server), they all need the same secrets. These clusters can be ephemeral and rotated quite often, so Kubernetes auth, where we need to configure a URL per cluster, wouldn’t work anyway.

Would there be another option, maybe using JWT to authenticate those remote clusters, without having to modify the Vault config every time a cluster is added?

I have read about public key chaining in the docs. That could fit my use case, although I would need the public key config to be fetched through a URL instead of written into Vault directly. Is that possible?

I just want to avoid having to duplicate the same secret in multiple paths just so that auth can be configured.

Thanks

You can have more than one Kubernetes auth method by using the path parameter:

vault auth enable -path=cluster1 kubernetes

vault auth enable -path=cluster2 kubernetes


They are just managed separately from Vault’s point of view, so cluster1 needs to authenticate against the cluster1 endpoint, as Vault will verify the token with that cluster specifically.
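For example, each mount gets its own cluster endpoint and role. A minimal sketch; the API server URL, CA file, and the role/policy names are placeholders, not values from this thread:

```shell
# Point the cluster1 mount at that cluster's API server
# (hypothetical URL and CA cert file).
vault write auth/cluster1/config \
    kubernetes_host="https://cluster1.example.com:6443" \
    kubernetes_ca_cert=@cluster1-ca.crt

# Create a role binding the game server's service account
# to a policy (names are assumptions).
vault write auth/cluster1/role/game-server \
    bound_service_account_names=game-server \
    bound_service_account_namespaces=game \
    policies=game-server \
    ttl=1h
```

The same two commands would be repeated against auth/cluster2 with that cluster’s host and CA.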

Both can use the same policies, or you can create policy templates based on the metadata. For us, the metadata available is the Kubernetes namespace and its ID, the service account name and ID, and the pod ID.
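As a sketch of the templating approach: a policy can interpolate alias metadata so one policy serves many service accounts. Note the mount accessor in the template (here `auth_kubernetes_a1b2c3`, a made-up value) is specific to one mount; find yours with `vault auth list`, and the secret path is an assumption:

```shell
# Templated policy: each entity can only read secrets under its own
# service-account namespace (accessor and path are placeholders).
vault policy write game-server - <<'EOF'
path "secret/data/game-servers/{{identity.entity.aliases.auth_kubernetes_a1b2c3.metadata.service_account_namespace}}/*" {
  capabilities = ["read"]
}
EOF
```

Because the accessor is per-mount, a template like this would need one policy per Kubernetes auth mount (or an identity group) if you keep separate mounts per cluster.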

As for the ephemeral nature of the clusters, it depends on how you create or choose them. One idea is to have the process that deploys the app also create the auth mount and roles (the policies could already be provisioned) before deploying the app. You would just give that process its own identity and policies, for example a GitHub Action authenticating to a JWT endpoint.
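For the JWT route the question asks about, a rough sketch might look like the following. The assumption (not stated in this thread) is that the clusters expose their service-account OIDC discovery/JWKS endpoint at a reachable URL; the path, URL, audience, and names are all placeholders:

```shell
# Enable a JWT auth mount (hypothetical path).
vault auth enable -path=k8s-jwt jwt

# Validate tokens against the cluster's published JWKS endpoint
# (placeholder URL; each distinct issuer still needs its own config).
vault write auth/k8s-jwt/config \
    jwks_url="https://cluster1.example.com/openid/v1/jwks"

# Role mapping projected service-account tokens to a policy
# (claim, audience, and policy names are assumptions).
vault write auth/k8s-jwt/role/game-server \
    role_type=jwt \
    user_claim=sub \
    bound_audiences=vault \
    policies=game-server \
    ttl=1h
```

This avoids Vault calling back into each cluster’s TokenReview API, but one JWT mount still validates against one issuer’s keys, so fully rotation-free auth would require the clusters to share an issuer or for the provisioning pipeline to update the config.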