I’ve been using Vault lately to evaluate its feasibility for an upcoming project, and so far I’m quite happy with it.
My main concern today is that there is a good chance we will end up with up to 50 million secrets that need to be stored in the KV secrets engine (possibly across multiple subfolders).
Does anyone have experience with this amount of data inside Vault, and can perhaps recommend a storage backend for this task? The only constraint at the moment is that it must be hosted on-site and can’t be a cloud-based storage backend.
To give this question some background:
- I’m currently running Vault with a Postgres backend, both inside Docker containers on a virtual (Ubuntu) server
- My test script is inserting secrets into Vault and currently sits at approx. 9 million secrets
- The rate of insertion has steadily dropped from approx. 75 secrets/second to now approx. 20 secrets/second
- When I open the UI (while the test is running), it takes 5-7 minutes to show the list of secrets
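For context, the throughput numbers above come from a loop along these lines — a simplified sketch, with the actual Vault write injected as a callable (in my real script that callable is a KV write against the server; the path names here are illustrative):

```python
import time

def measure_insert_rate(write_secret, n_secrets, report_every=1000):
    """Insert n_secrets secrets and return the write rate per batch.

    write_secret(path) is whatever actually performs the KV write;
    it is injected here so the timing harness stays generic.
    """
    last = time.perf_counter()
    rates = []
    for i in range(1, n_secrets + 1):
        write_secret(f"archive/secret-{i}")
        if i % report_every == 0:
            now = time.perf_counter()
            rates.append(report_every / (now - last))  # secrets/second
            last = now
    return rates
```

That per-batch rate is how I can see the drop from ~75/s down to ~20/s as the total count grows.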
The actual use case for the project at hand would look something like this:
- have an ‘archive’ folder/path holding said maximum of 50 million secrets, which basically never need to be accessed. In the event that one is accessed, it may well take a couple of minutes - no problem
- have a ‘work’ directory that at any point holds roughly 100k secrets, which are actively added, modified, retrieved, and finally moved to the archive. Actions in this folder should be “fast”
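For the “finally moved to the archive” step, what I picture is roughly this (just a sketch — the three callables stand in for the real client’s read/write/delete, since as far as I know the KV engine has no server-side move, and the “work/”/“archive/” prefixes are simply the layout I have in mind):

```python
def archive_secret(read, write, delete, name):
    """Move a finished secret from the work path to the archive path.

    With no native move operation, "moving" a secret is a read from
    the work path, a write to the archive path, and a delete of the
    original. read/write/delete are injected stand-ins for the
    actual KV client calls.
    """
    data = read(f"work/{name}")
    write(f"archive/{name}", data)
    delete(f"work/{name}")
```

So every archived secret costs one extra read and delete on top of the write — which is part of why I care about write throughput at scale.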
I’m still very much experimenting to get a feel for Vault and how best to use it to solve my challenges. I appreciate any input you could provide and look forward to an interesting discussion.
Many thanks in advance