Hi all,
I have one Nomad server and one Nomad client installed on two separate VMs, and both are connected to an external Consul server. However, the Consul UI shows the health check failing for both Nomad nodes, while the Serf health check is passing.
My Nomad configs in /etc/nomad.d/nomad.hcl are:
Nomad Server
data_dir  = "/opt/nomad/data"
bind_addr = "0.0.0.0"

server {
  enabled          = true
  bootstrap_expect = 1
}

advertise {
  http = "192.168.40.10:4646"
  rpc  = "192.168.40.10:4647"
  serf = "192.168.40.10:4648"
}

client {
  enabled = false # Disable the client on the server
}

consul {
  address = "192.168.60.10:8500"
}
Nomad Client
data_dir  = "/opt/nomad/data"
bind_addr = "0.0.0.0"

client {
  enabled = true
  servers = ["192.168.40.10:4647"]
}

advertise {
  http = "192.168.40.11:4646"
}

server {
  enabled = false # Disable server functionality on the client node
}

consul {
  address = "192.168.60.10:8500"
}
I think the issue is that Consul registers the health check against 0.0.0.0:4646, which is not a routable address. The check should instead target 192.168.40.10:4646 for the Nomad server and 192.168.40.11:4646 for the Nomad client.
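From reading the docs, I suspect the `consul` stanza's `checks_use_advertise` option might be relevant here, although I'm not certain it applies to my setup. Something like this on each node (shown for the server; the comment reflects my understanding, not confirmed behavior):

```hcl
consul {
  address = "192.168.60.10:8500"

  # If I understand the docs correctly, this makes Nomad register its
  # Consul health checks using the HTTP advertise address rather than
  # the bind address (0.0.0.0), so the check would target
  # 192.168.40.10:4646 instead.
  checks_use_advertise = true
}
```

Would this be the right way to fix it, or should I be setting bind_addr to the node's own IP instead of 0.0.0.0?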
I would sincerely appreciate any advice on how to resolve this issue.
Thank you!