Vault Agent on Windows Server Can't Authenticate: context deadline exceeded

I followed the documentation to configure Vault Agent as a Windows service and was able to log in successfully with the AppRole method, just as described. To test further, I created a second AppRole and updated the relevant files (agent-role-id and agent-secret-id) to match, but the agent could no longer authenticate to the Vault server. I reverted the AppRole settings and it still can't authenticate. I haven't been able to reproduce this on other Windows servers, and both AppRoles authenticate fine elsewhere.
Here are the logs from startup:

PS C:\windows\system32> C:\ProgramData\chocolatey\lib\vault\tools\vault.exe agent -config=d:\vault-agent\vault-agent.hcl --log-level=trace
==> Vault agent started! Log data will stream in below:

==> Vault agent configuration:

           Api Address 1: http://127.0.0.1:8100
                     Cgo: disabled
               Log Level: trace
                 Version: Vault v1.8.0
             Version Sha: REDACTED

C:\ProgramData\chocolatey\lib\vault\tools\vault.exe : 2021-08-20T16:10:32.913-0500 [INFO]  sink.file: creating file sink
At line:1 char:1
+ C:\ProgramData\chocolatey\lib\vault\tools\vault.exe agent -config=d:\ ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (2021-08-20T16:1...ating file sink:String) [], RemoteException
    + FullyQualifiedErrorId : NativeCommandError
 
2021-08-20T16:10:32.913-0500 [TRACE] sink.file: enter write_token: path=d:\vault-agent\agent-token
2021-08-20T16:10:32.914-0500 [TRACE] sink.file: exit write_token: path=d:\vault-agent\agent-token
2021-08-20T16:10:32.914-0500 [INFO]  sink.file: file sink configured: path=d:\vault-agent\agent-token mode=-rw-r-----
2021-08-20T16:10:32.914-0500 [DEBUG] cache: auto-auth token is allowed to be used; configuring inmem sink
2021-08-20T16:10:32.914-0500 [INFO]  sink.server: starting sink server
2021-08-20T16:10:32.914-0500 [INFO]  auth.handler: starting auth handler
2021-08-20T16:10:32.914-0500 [INFO]  template.server: starting template server
2021-08-20T16:10:32.915-0500 [INFO]  auth.handler: authenticating
2021-08-20T16:10:32.915-0500 [INFO]  template.server: no templates found
2021-08-20T16:10:32.915-0500 [INFO]  cache: received request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:10:32.915-0500 [DEBUG] cache: using auto auth token: method=PUT path=/v1/auth/approle/login
2021-08-20T16:10:32.916-0500 [DEBUG] cache.leasecache: forwarding request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:10:32.916-0500 [INFO]  cache.apiproxy: forwarding request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:10:32.916-0500 [DEBUG] cache.apiproxy.client: performing request: method=PUT url=http://127.0.0.1:8100/v1/auth/approle/login
2021-08-20T16:10:32.916-0500 [INFO]  cache: received request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:10:32.916-0500 [DEBUG] cache: using auto auth token: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.915-0500 [ERROR] auth.handler: error authenticating: error="context deadline exceeded" backoff=1s
2021-08-20T16:11:32.916-0500 [ERROR] cache.apiproxy.client: request failed: error="Put "http://127.0.0.1:8100/v1/auth/approle/login": context deadline exceeded" method=PUT url=http://127.0.0.1:8100/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [DEBUG] cache.leasecache: forwarding request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [INFO]  cache.apiproxy: forwarding request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [DEBUG] cache.apiproxy.client: performing request: method=PUT url=http://127.0.0.1:8100/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [INFO]  cache: received request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [DEBUG] cache: using auto auth token: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [DEBUG] cache.leasecache: forwarding request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [INFO]  cache.apiproxy: forwarding request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [DEBUG] cache.apiproxy.client: performing request: method=PUT url=http://127.0.0.1:8100/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [INFO]  cache: received request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [DEBUG] cache: using auto auth token: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [DEBUG] cache.leasecache: forwarding request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.916-0500 [INFO]  cache.apiproxy: forwarding request: method=PUT path=/v1/auth/approle/login
2021-08-20T16:11:32.917-0500 [DEBUG] cache.apiproxy.client: performing request: method=PUT url=http://127.0.0.1:8100/v1/auth/approle/login

My Vault server runs on a different machine and other instances of Vault Agent can authenticate successfully.

I even went so far as to uninstall Vault and move it to a new directory, to no avail.

Any ideas?

I compared the output from one agent that is failing with another that is not, and found that the failing agent tries to reach the cache via a PUT against its own localhost listener, like this:

[INFO]  cache: received request: method=PUT path=/v1/auth/approle/login
[DEBUG] cache: using auto auth token: method=PUT path=/v1/auth/approle/login
[DEBUG] cache.leasecache: forwarding request: method=PUT path=/v1/auth/approle/login
[INFO]  cache.apiproxy: forwarding request: method=PUT path=/v1/auth/approle/login
[DEBUG] cache.apiproxy.client: performing request: method=PUT url=http://127.0.0.1:8100/v1/auth/approle/login

The successful agent reaches the cache via a GET against my Vault server, like this:

[INFO]  cache: received request: method=GET path=/v1/auth/token/lookup-self
[DEBUG] cache.leasecache: forwarding request: method=GET path=/v1/auth/token/lookup-self
[INFO]  cache.apiproxy: forwarding request: method=GET path=/v1/auth/token/lookup-self
[DEBUG] cache.apiproxy.client: performing request: method=GET url=https://my-vault-server:8200/v1/auth/token/lookup-self

Is this significant? If so, how can I correct it?

The root cause of this was that I had VAULT_ADDR defined as an environment variable and set to the local agent's listener address rather than my k8s Vault instance, so the agent was forwarding the login request back to itself. Once that was sorted, it started just fine.
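For anyone who hits the same symptom: the Go Vault API client that sits under the agent reads VAULT_ADDR in DefaultConfig, so a stray environment variable can silently point requests at the agent's own listener. A minimal, hypothetical snippet just to illustrate the lookup:

```go
package main

import (
	"fmt"
	"log"

	vault "github.com/hashicorp/vault/api"
)

func main() {
	// DefaultConfig starts from https://127.0.0.1:8200 and then calls
	// ReadEnvironment, which overrides the address with VAULT_ADDR
	// (among other VAULT_* variables) when it is set.
	cfg := vault.DefaultConfig()

	client, err := vault.NewClient(cfg)
	if err != nil {
		log.Fatal(err)
	}

	// With VAULT_ADDR=http://127.0.0.1:8100 lingering in the environment,
	// this prints the agent's own listener instead of the real server.
	fmt.Println("client address:", client.Address())
}
```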


Hi,
I have a question.

I'm working on integrating HashiCorp Vault into our application using Vault Agent for authentication. The initial setup works well: the application reads the Vault token from a file generated by Vault Agent and uses it to authenticate with the Vault server.

However, I'm concerned about how to handle the scenario where the token's max TTL is reached and Vault Agent generates a new token.

Currently, our application reads the token once during initialization and uses it for subsequent operations. If the token expires and a new one is generated, the application wouldn’t automatically know about the new token, which could lead to failed operations.

To address this, I am thinking of implementing a file watcher that monitors the token file for changes: when a new token is written, the watcher reloads it and updates the Vault client (a rough sketch follows below). While this seems workable in theory, I want to make sure we're following best practices and not missing any important considerations.
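For discussion's sake, here is a minimal sketch of that approach in Go, using the official hashicorp/vault/api client and fsnotify. The token path and watched directory are placeholders for wherever your agent's file sink writes, and error handling is reduced to logging:

```go
// Minimal sketch of the watcher approach, assuming the agent's file sink
// writes the token to tokenPath below (placeholder) and the app uses the
// official Go client. Not production-hardened.
package main

import (
	"log"
	"os"
	"strings"

	"github.com/fsnotify/fsnotify"
	vault "github.com/hashicorp/vault/api"
)

const (
	tokenDir  = "/var/run/vault"          // placeholder: wherever the agent's file sink writes
	tokenPath = tokenDir + "/agent-token" // placeholder file name
)

// loadToken re-reads the sink file and swaps the token on the shared client.
func loadToken(client *vault.Client) {
	b, err := os.ReadFile(tokenPath)
	if err != nil {
		log.Printf("read token: %v", err)
		return
	}
	client.SetToken(strings.TrimSpace(string(b)))
}

func main() {
	client, err := vault.NewClient(vault.DefaultConfig())
	if err != nil {
		log.Fatal(err)
	}
	loadToken(client) // initial read at startup

	watcher, err := fsnotify.NewWatcher()
	if err != nil {
		log.Fatal(err)
	}
	defer watcher.Close()

	// Watch the directory, not the file: sinks that replace the file
	// atomically (write a temp file, then rename) would otherwise drop a
	// watch placed on the file itself.
	if err := watcher.Add(tokenDir); err != nil {
		log.Fatal(err)
	}

	for {
		select {
		case ev := <-watcher.Events:
			if ev.Name == tokenPath && ev.Op&(fsnotify.Write|fsnotify.Create) != 0 {
				log.Println("token file changed, reloading")
				loadToken(client)
			}
		case err := <-watcher.Errors:
			log.Printf("watch error: %v", err)
		}
	}
}
```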

Here are the specific questions I have:

  1. Is monitoring the token file for changes and reloading the token dynamically the recommended approach for handling token renewal with Vault Agent?
  2. Are there any potential pitfalls or edge cases I should be aware of when implementing this solution?
  3. Are there more efficient or reliable methods to ensure the application always has access to a valid token, especially in high-availability or production environments?

I’d appreciate any feedback or suggestions on improving this implementation.