AWS EKS instances fail to join the Kubernetes cluster when using private subnets

I’m having some trouble creating a cluster with Bottlerocket managed nodes in private subnets: when the nodes are created, they always fail to join the cluster.
I’m using the official vpc and eks Terraform modules, and, following the examples, the NAT gateway is enabled.
I also cannot access the nodes with Session Manager, even after adding the AmazonSSMManagedInstanceCore policy.
What should I look at to understand the error?
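For reference, a minimal sketch of the setup, assuming the terraform-aws-modules vpc and eks modules; the names, region, and CIDRs below are placeholders, not my exact config:

```hcl
# Placeholder names/CIDRs; versions omitted for brevity.
module "vpc" {
  source = "terraform-aws-modules/vpc/aws"

  name            = "example-vpc"
  cidr            = "10.0.0.0/16"
  azs             = ["eu-west-1a", "eu-west-1b"]
  private_subnets = ["10.0.1.0/24", "10.0.2.0/24"]
  public_subnets  = ["10.0.101.0/24", "10.0.102.0/24"]

  enable_nat_gateway = true # private-subnet nodes reach the internet via NAT
  single_nat_gateway = true
}

module "eks" {
  source = "terraform-aws-modules/eks/aws"

  cluster_name = "example-cluster"
  vpc_id       = module.vpc.vpc_id
  subnet_ids   = module.vpc.private_subnets # nodes only in private subnets

  eks_managed_node_groups = {
    default = {
      ami_type       = "BOTTLEROCKET_x86_64"
      instance_types = ["t3.medium"]
      desired_size   = 2
    }
  }
}
```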

Hello SilverXXX
We just encountered the same issue. Did you manage to figure it out?
In case you did, it would be great if you could share what you’ve done.

Thank you in advance

I did a few tests; it was something related to VPC endpoints (or some networking rules). After removing all the endpoints, it worked again.
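In case it helps anyone hitting the same thing: a common failure mode is an interface endpoint whose security group doesn’t allow HTTPS from the node subnets, so image pulls and API calls silently hang and the nodes never join. A hedged sketch of what one correctly wired endpoint might look like (names, region, and the endpoint choice are illustrative, not my actual config):

```hcl
# Illustrative only. Private nodes typically also need ecr.dkr, an S3
# gateway endpoint (for ECR layers), ec2, and sts endpoints.
resource "aws_security_group" "vpc_endpoints" {
  name   = "vpc-endpoints"
  vpc_id = module.vpc.vpc_id

  ingress {
    description = "HTTPS from private subnets to the endpoints"
    from_port   = 443
    to_port     = 443
    protocol    = "tcp"
    cidr_blocks = module.vpc.private_subnets_cidr_blocks
  }
}

resource "aws_vpc_endpoint" "ecr_api" {
  vpc_id              = module.vpc.vpc_id
  service_name        = "com.amazonaws.eu-west-1.ecr.api"
  vpc_endpoint_type   = "Interface"
  subnet_ids          = module.vpc.private_subnets
  security_group_ids  = [aws_security_group.vpc_endpoints.id]
  private_dns_enabled = true # so the default ECR hostname resolves to the endpoint
}
```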

Ok, we will try to do the same.
Thank you for the reply!

No problem. Now I’ve got an error with the Terraform helm provider not downloading charts, but that seems to be a completely separate problem…