Cannot establish connection between my database and backend with Nomad and Consul

Hi there,
I am new to Nomad and trying out some practical exercises. I have a simple application: a Go HTTP server that connects to a Postgres database and handles the ‘/’ route. It takes all of its DB configuration as ENV variables.
I configured most of my Nomad file from a demo app at the github.com/schmichael/django-nomad repo, but the application cannot connect to the database!

My real question is: what should the host be? Is it localhost or the name of the service (in my case postgres-con)? PS: I tried both and am unable to connect. With postgres-con it says the host is not found, and with localhost/127.0.0.1 it says it is unable to connect to the DB because port 5432 is not open…

I am pasting my Nomad config below:

job "deploy2" {
    datacenters = ["dc1"]

    group "server" {
        count = 1

        network {
            mode = "bridge"
            port "http" {
                static = 8090
            }
        }

        service {
            name = "server"
            port = "http"

            connect {
                sidecar_service {
                    proxy {
                        upstreams {
                            destination_name = "postgres-con"
                            local_bind_port = 5432
                        }
                    }
                }
            }
        }
        task "backendserver" {
            driver = "docker"
            
            

            config {
                image = "charithreddyv/go-http-server"
                ports = ["http"]
            }


            env {
                DB_USER = "postgres"
                DB_HOST = "127.0.0.1"
                DN_PASS = "postgres"
                DB_NAME = "postgres"
                DB_PORT = 5432
            }
        }
    }

    group "database" {
        count = 1

        network {
            mode = "bridge"
            port "db" {
                static = 5432
            }
        }

        service {
            name = "postgres-con"
            port  = "db" 
            connect {
                sidecar_service {}
            } 
        }

        task "postgres" {
            driver = "docker"

            config {
                image = "postgres"
                ports = ["db"]
            }

            env {
                POSTGRES_PASSWORD = "postgres"
            }
        }
    } 
}

I am running Nomad and Consul as the root user with agent -dev.
Thanks in advance :smiley:

Hi there,
A little digging turned up an example app, and I used the DB host as "${NOMAD_UPSTREAM_IP_postgres_con}" in my env. But now my application says connection reset by peer. Any idea what might be happening?

user=postgres host=127.0.0.1 password=postgres dbname=postgres port=5432 sslmode=disable
db ping failed read tcp 127.0.0.1:46274->127.0.0.1:5432: read: connection reset by peer
running on 8090

handle /
error in query read tcp 127.0.0.1:46292->127.0.0.1:5432: read: connection reset by peer
db ping failed read tcp 127.0.0.1:46294->127.0.0.1:5432: read: connection reset by peer

Thanks in advance :smiley:

@charithreddyv :wave:

Did you start Nomad with the -dev-connect flag? The -dev flag binds Nomad to localhost, which prevents Consul service mesh jobs from working properly.

I ran consul agent -dev and sudo nomad agent -dev-connect and was able to run the job specification that you included in the first post with no errors. When I connected to the HTTP port defined on the server group, I received a pong message and the logs looked clean.

As to your question about hostnames: for anything using Consul service mesh, I would suggest 127.0.0.1, since the upstream proxies listen on local_bind_addr:local_bind_port and connecting there avoids an unnecessary DNS lookup. Using the NOMAD_UPSTREAM_ variables instead of hard-coding the address will also reduce the number of places that need to be updated if you decide to change the upstream configuration.
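
For example, the env stanza of your backendserver task could be written roughly like this (a sketch, assuming your image reads DB_PASS; note that hyphens in the upstream name become underscores in the variable names):

            env {
                DB_USER = "postgres"
                DB_PASS = "postgres"
                DB_NAME = "postgres"
                # Interpolated from the connect upstream named postgres-con;
                # the sidecar proxy listens on this address inside the
                # group's bridge network namespace.
                DB_HOST = "${NOMAD_UPSTREAM_IP_postgres_con}"
                DB_PORT = "${NOMAD_UPSTREAM_PORT_postgres_con}"
            }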

Hopefully this helps. If not, the next step is to verify that your intention configuration allows these two services to talk to one another. After that, you'd need to look at the logs for any errors that might help you understand where the failure is coming from.
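
If intentions do turn out to be blocking traffic, something like the following service-intentions config entry (a sketch, assuming Consul 1.9+ and default namespaces) should allow the backend to reach the database; it can be applied with consul config write:

Kind = "service-intentions"
Name = "postgres-con"
Sources = [
  {
    Name   = "server"
    Action = "allow"
  }
]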

Regards,
Charlie

@angrycub
Thanks for the help :pray: It works now when running Nomad with agent -dev-connect; earlier I was running it with agent -dev -bind 0.0.0.0.
Can you please point me to the documentation about these, if possible, so that I can read more and understand them?