After hitting a limit with Ansible's encryption workflow, I've been wrapping my head around Vault for the past few months. I intend to share what I hope will eventually be a rapid way of implementing Vault with best practices (not there yet) for production in AWS, with HA and auto-unsealing, launched from a Cloud9 AWS instance (no inbound access) that could also build the AMIs using Packer.
In my case it's for an IaC project to allow VFX artists to do their own cloud rendering, but I think the Vault implementation could be useful for others.
When it comes to authenticating an external system (like an onsite laptop), unless I'm missing some option, I think there's room for improvement.
Currently, I have to copy and paste the remote host's public key into the Cloud9 instance to get Vault to sign it, then copy the signed certificate back to the remote laptop, before it can SSH into the bastion and delegate future Vault requests. Alternatively, we could generate a one-time password from the Cloud9 instance. Still, both options feel clunky to me.
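For reference, the signing step in that copy-and-paste flow is roughly the following (the mount path ssh-client-signer and role my-role are assumptions; adjust to your setup):

```shell
# On the Cloud9 instance, after pasting the laptop's public key
# into laptop_key.pub: ask Vault's SSH CA to sign it.
vault write -field=signed_key ssh-client-signer/sign/my-role \
    public_key=@laptop_key.pub > laptop_key-cert.pub

# Then copy laptop_key-cert.pub back to the laptop (another paste),
# where it lives next to the matching private key.
```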
I’d like to make the process smoother, perhaps you guys have better ideas…
Is something like Vault Cloud going to be able to help with this problem, even if we are running our own Vault?
If possible in the future: I would hope to use Vault Cloud to validate an external user with GitHub / Gmail / AWS IAM (any of these hopefully with MFA), sign their public key, and enable SSH to the bastion host.
Or are there other ways Vault OSS could do this more smoothly I haven’t considered?
Not sure I am totally understanding what you are trying to achieve here, but I will try to help.
You could use Vault and integrate it with an OpenID-Connect provider. This would handle the authentication part of end-users with Vault. Once authenticated, you could use SSH CA-based authentication. You can check the documentation for this, or my own example which you can find here: https://github.com/jeroenjacobs79/homelab-vault-config (check the readme and ssh_* files).
When you use SSH CA-based authentication, the user will need to supply their SSH public key to Vault, and the user will receive a certificate from Vault that you need to store on the user's laptop (don't overwrite the public key; you will only receive a cert from Vault, it will not send the public key back I think).
In order to make this process more straightforward, I would recommend a wrapper script. This script will call the Vault CLI with the -method=oidc parameter, then post the user's public key to Vault and receive the cert back. The script will then store this cert somewhere on the user's laptop and call ssh to connect to the instance (you will need to pass both your key and the cert to the ssh command).
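A rough sketch of such a wrapper script (the Vault address, mount path ssh-client-signer, role my-role, key file, and bastion host are all assumptions to adapt):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Assumed values: adjust to your environment.
export VAULT_ADDR="https://vault.example.com:8200"
KEY="$HOME/.ssh/id_ed25519"

# 1. Authenticate to Vault via OIDC (opens a browser for the provider login).
vault login -method=oidc

# 2. Ask Vault's SSH CA to sign the user's public key, saving the cert
#    next to the key so ssh picks it up by naming convention.
vault write -field=signed_key ssh-client-signer/sign/my-role \
    public_key=@"${KEY}.pub" > "${KEY}-cert.pub"

# 3. Connect to the bastion; ssh finds id_ed25519-cert.pub automatically.
ssh -i "$KEY" user@bastion.example.com
```

If the cert is stored elsewhere, you can point ssh at it explicitly with -o CertificateFile=/path/to/cert.pub instead of relying on the naming convention.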
Is this the information you are looking for? If not, could you provide more details?
Thanks for sharing your example Jeroen. OpenID certainly could play a part, but I don't think it solves the problem. I loved seeing your examples for OpenVPN, that's bound to help me later too. Anyway, here's the challenge:
If Vault is initialised in a private subnet with public-facing bastions, and we want an external host to connect via a bastion for the first time (or VPN, for that matter), we still have to get that external host's public key to Vault for a cert before it can even SSH to the bastion and use Vault in the first place (since the bastion accepts hosts only with certs). So I'm interested to know how we can streamline this. It looks like the options are:
Using something like Cloud9, I could copy and paste the external host's public key into Vault to retrieve a cert that could then be copied back to the external host, which would then be able to SSH into the bastion. This can't easily be automated though, and it's clunky.
Generate a one-time password to SSH into the bastion from the external host. It can authenticate to Vault at that point, and get a cert to use from then on for future SSH sessions to the bastion. This is probably the best solution.
Not sure if it's possible, but could OpenID carry a public key, with the cert given back to the OpenID provider directly from Vault? That way I could hand OpenID the cert from within Cloud9 (or via automation), and then SSH to the bastion from the external host somehow. Just ideation, it's probably not a thing.
If there are any other options for providing a cert to an external host to get access to the bastion before it can authenticate to Vault, I'd love to know other possibilities.
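On the operator side, option 2 above looks roughly like this (the mount path ssh and role name otp-role are assumptions; the role must be created with key_type=otp, and vault-ssh-helper must be configured as the bastion's PAM module to verify the OTP):

```shell
# From the Cloud9/operator side: generate a one-time password the
# external host can use to reach the bastion at 203.0.113.10
# (a placeholder IP; use your bastion's address).
vault write ssh/creds/otp-role ip=203.0.113.10

# The "key" field in the response is the OTP; relay it to the user
# out of band. On the external laptop, the user then runs:
#   ssh user@bastion.example.com
# and enters the OTP when prompted for a password. Once on the
# bastion, they can authenticate to Vault and get a cert for
# future sessions.
```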
Another query I had: if I'm using Vault in HA (3 AWS nodes) with Consul as a storage backend (another 3 AWS nodes; maybe I'll ditch the Consul backend for simplicity down the line and use the integrated Raft option), can all these nodes be shut down and spun back up again without data loss? It looks like the S3 storage backend might be the only option for that, but it says it doesn't support HA. I'm not dependent on the S3 backend for HA; I just want some place to stash the data store if I want to shut everything down and save costs when idle, and then have everything resume and auto-unseal when the day begins again.
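One note on this: with integrated Raft storage, the data lives on each node's local disk, so as long as the volumes persist across stop/start you shouldn't lose data; and for full teardown-and-rebuild cycles you can stash a Raft snapshot in S3 rather than using S3 as the storage backend. A sketch, assuming the bucket name and file paths (run against the active node with a sufficiently privileged token):

```shell
# Take a point-in-time snapshot of the Raft-backed Vault data.
vault operator raft snapshot save vault-backup.snap

# Stash it in S3 before tearing the cluster down
# (bucket name is an assumption).
aws s3 cp vault-backup.snap s3://my-vault-backups/

# On the rebuilt cluster, after init/unseal, restore it with:
#   vault operator raft snapshot restore vault-backup.snap
```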
Are there any OIDC providers that have some sort of MFA ability freely available? I looked at Auth0 but it looks like MFA is a paid feature.
Just wanting to bump this as I still don’t know the answer - are there OIDC providers out there that provide MFA freely?
I’d like to teach people how to use the infra I’m sharing without them having to be familiar with SSH key management through multiple channels, and MFA is probably the only way to do that in a secure way that would make sense.
If there are any great answers to this I’d love to know!