Consul - cross DC awareness

In a cross-DC-aware Consul setup, I am facing an issue while registering a service with the same name.

Consul version: v1.4.0

Details - There is an existing Consul cluster of 3 nodes [ CentOS 7 ] in one region. Later I set up a new cluster in another region and established cross-DC awareness using the WAN gossip protocol. I came across a scenario where I was not able to register a service with the same name in the new environment. Any suggestions?
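For reference, WAN federation between two regions is typically established along these lines (a minimal sketch; the address is a placeholder, not taken from the actual setup):

```shell
# On a server in the new (secondary) region, join the WAN pool of the
# existing region. 10.0.0.10 is a placeholder address of an existing server.
consul join -wan 10.0.0.10

# Verify that servers from both datacenters appear in the WAN member list.
consul members -wan
```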

Most obvious question: any specific reason you are using an older version of Consul, and not even the latest in the 1.4.x series?
In my opinion, unless there is an overwhelming reason not to, you should use the latest GA (1.6.2 at this time).

If you must use the 1.4 series, try 1.4.5 (which seems to be the latest of the 1.4 series).

ref: https://www.consul.io/downloads.html
also ref: https://releases.hashicorp.com/consul/

All said and done, this is just my opinion; I’ll let someone from HashiCorp weigh in with more info! :slight_smile:

Hi @sant84,

Is there a specific error you’re receiving when trying to register the service in the second DC?

Any additional detail you can provide would be helpful.

Hi Blake, the error is specific to the node ID. In the new setup it was throwing an error about a duplicate node ID.

Scenario - In the new setup in the new region, I was trying to deploy a service in a VM and register it with the same name [ as used in the existing setup ].

Hi @sant84, could you show us exactly what you are trying to do, and what exactly the error is that you are seeing? Otherwise it is almost impossible for us to reproduce. I just tried registering a service with the same name in two datacenters, and that works just fine.
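For context, registering a same-named service in each datacenter can be done with an ordinary service definition; service names are scoped per datacenter, so the same name in two DCs does not conflict by itself. A minimal sketch (the port is an assumption, not from the original post):

```shell
# Hypothetical service definition for the service named in the logs;
# port 2181 is an assumption.
cat > ss-zk-carbon-dev.json <<'EOF'
{
  "service": {
    "name": "ss-zk-carbon-dev",
    "port": 2181
  }
}
EOF

# Register it with the local agent (the `consul services register`
# subcommand is available since Consul 1.3).
consul services register ss-zk-carbon-dev.json
```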

Thanks,
Hans

As I said above, I am setting up cross-regional Consul servers [ mutually known to each other ].

While registering a node/service in the new setup, I am facing the below issue -

10:02:56 sszookeepercentral1dev-central1test-v000-56p4 consul: 2019/11/25 10:02:56 [WARN] agent: Syncing service "ss-zk-carbon-dev" failed. rpc error making call: failed inserting node: node ID "a3297a42-5d64-0b7e-2349-f6a751b14a0f" for node "sszookeepercentral1dev-central1test-v000-56p4" aliases existing node "sszookeepercentral1dev-central1test-v000-s5z0"

10:02:56 sszookeepercentral1dev-central1test-v000-56p4 consul: 2019/11/25 10:02:56 [ERR] agent: failed to sync remote state: rpc error making call: failed inserting node: node ID "a3297a42-5d64-0b7e-2349-f6a751b14a0f" for node "sszookeepercentral1dev-central1test-v000-56p4" aliases existing node "sszookeepercentral1dev-central1test-v000-s5z0"

10:27:21 sszookeepercentral1dev-central1test-v000-cbxk consul: * Failed to join 10.149.1.217: Member 'sszookeepercentral1dev-central1test-v000-s5z0' has conflicting node ID 'a3297a42-5d64-0b7e-2349-f6a751b14a0f' with this agent's ID
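For what it's worth, this error points at two agents holding the same node ID rather than at the service name itself. A common cause is cloning VMs from an image that already contains a Consul data directory, so the persisted node-id file is carried over to the clone. A sketch of one possible fix on the affected node (the data-dir path is a placeholder; adjust to your configuration):

```shell
# Stop the agent on the node reporting the conflict.
systemctl stop consul

# Remove the persisted node ID so a fresh one is generated on restart.
# /opt/consul is a placeholder; use your configured -data-dir.
rm /opt/consul/node-id

# Restart the agent; it will generate and persist a new unique node ID.
systemctl start consul
```

Alternatively, a unique ID can be set explicitly per node via the `-node-id` agent flag or the `node_id` configuration option.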