Job failed - consul connect

Hello,

My nomad job:

job "traccar.domain.com" {
  region = "global"
  datacenters = ["dc1"]
  type = "service"
  update {
    max_parallel     = 1
    canary           = 1
    min_healthy_time = "10s"
    healthy_deadline = "5m"
    auto_revert      = true
    auto_promote     = true
    health_check     = "checks"
    stagger          = "30s"
  }

  group "database" {
    network {
      mode = "bridge"
    }

    service {
      name = "database"
      port = "3006"

      connect {
        sidecar_service {}
      }
    }

    task "mariadb" {
      driver = "docker"

      config {
        image = "ghcr.io/linuxserver/mariadb"
        volumes = [
          "/data/traccar.domain.com/mysql:/config"
        ]
      }

      env {
        MYSQL_USER          = "traccar"
        MYSQL_PASSWORD      = "xxx"
        MYSQL_ROOT_PASSWORD = "xxx"
        MYSQL_DATABASE      = "traccar"
      }

      resources {
        memory = 300
      }
    }
  }

  group "app" {
    network {
      mode = "bridge"

      port "webinterface" {
        to = 8082
      }
    }

    restart {
      attempts = 10
      interval = "5m"
      delay = "10s"
      mode = "delay"
    }

    ephemeral_disk {
      size    = 300
      sticky  = true
      migrate = true
    }
    
    service {
      name = "traccar"
      port = "webinterface"

      tags = [
        "traefik.enable=true",
        "traefik.http.routers.traccardomaincom.tls=true",
        "traefik.http.routers.traccardomaincom.tls.certresolver=myresolver",
        "traefik.http.routers.traccardomaincom.tls.options=mintls12@file",
        "traefik.http.routers.traccardomaincom.entrypoints=https",
        "traefik.http.routers.traccardomaincom.rule=Host(`traccar.domain.com`)",
        "traefik.http.routers.traccardomaincom.middlewares=traccardomaincom-headers@consulcatalog",

        "traefik.http.middlewares.traccardomaincom-headers.headers.customResponseHeaders.Strict-Transport-Security=max-age=63072000",
        "traefik.http.middlewares.traccardomaincom-headers.headers.frameDeny=true",
        "traefik.http.middlewares.traccardomaincom-headers.headers.browserXssFilter=true",
        "traefik.http.middlewares.traccardomaincom-headers.headers.contentTypeNosniff=true",
        "traefik.http.middlewares.traccardomaincom-headers.headers.stsIncludeSubdomains=true",
        "traefik.http.middlewares.traccardomaincom-headers.headers.stsPreload=true",
        "traefik.http.middlewares.traccardomaincom-headers.headers.stsSeconds=31536000",
        "traefik.http.middlewares.traccardomaincom-headers.headers.forceSTSHeader=true",
        "traefik.http.middlewares.traccardomaincom-headers.headers.accessControlMaxAge=15552000",
        "traefik.http.middlewares.traccardomaincom-headers.headers.customFrameOptionsValue=SAMEORIGIN",
        "traefik.http.middlewares.traccardomaincom-headers.headers.sslHost=traccar.domain.com",
        "traefik.http.middlewares.traccardomaincom-headers.headers.sslForceHost=true",
      ]

      connect {
        sidecar_service {
          proxy {
            upstreams {
              destination_name = "database"
              local_bind_port  = 3306
            }
          }
        }
      }
    }

    task "traccar" {
      driver = "docker"

      env {
        
      }

      config {
        image = "traccar/traccar"

        ports = ["webinterface"]
        volumes = [
          "local/traccar.xml:/opt/traccar/conf/traccar.xml",
          "/data/traccar.domain.com/logs:/opt/traccar/logs"
        ]
      }

      resources {
        cpu    = 512
        memory = 256
      }

      template {
        change_mode = "noop"
        destination = "local/traccar.xml"
        data = <<EOH
<?xml version='1.0' encoding='UTF-8'?>

<!DOCTYPE properties SYSTEM 'http://java.sun.com/dtd/properties.dtd'>

<properties>

    <entry key='config.default'>./conf/default.xml</entry>

    <!--

    This is the main configuration file. All your configuration parameters should be placed in this file.

    Default configuration parameters are located in the "default.xml" file. You should not modify it to avoid issues
    with upgrading to a new version. Parameters in the main config file override values in the default file. Do not
    remove "config.default" parameter from this file unless you know what you are doing.

    For list of available parameters see following page: https://www.traccar.org/configuration-file/



    <entry key='database.driver'>org.h2.Driver</entry>
    <entry key='database.url'>jdbc:h2:./data/database</entry>
    <entry key='database.user'>sa</entry>
    <entry key='database.password'></entry>
    -->

    <entry key='database.driver'>com.mysql.cj.jdbc.Driver</entry>
    <entry key='database.url'>jdbc:mysql://127.0.0.1:3306/traccar?serverTimezone=UTC&amp;useSSL=false&amp;allowMultiQueries=true&amp;autoReconnect=true&amp;useUnicode=yes&amp;characterEncoding=UTF-8&amp;sessionVariables=sql_mode=''</entry>
    <entry key='database.user'>traccar</entry>
    <entry key='database.password'>xxx</entry>
</properties>

EOH
      }
    }


  }
}

But it didn’t work… :cry:

The database group works, but the app group doesn’t.

connect-proxy-traccar stderr:

[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:330] initializing epoch 0 (base id=0, hot restart version=disabled)
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:332] statically linked extensions:
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.access_loggers: envoy.access_loggers.file, envoy.access_loggers.http_grpc, envoy.access_loggers.open_telemetry, envoy.access_loggers.stderr, envoy.access_loggers.stdout, envoy.access_loggers.tcp_grpc, envoy.access_loggers.wasm, envoy.file_access_log, envoy.http_grpc_access_log, envoy.open_telemetry_access_log, envoy.stderr_access_log, envoy.stdout_access_log, envoy.tcp_grpc_access_log, envoy.wasm_access_log
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.compression.decompressor: envoy.compression.brotli.decompressor, envoy.compression.gzip.decompressor
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.resource_monitors: envoy.resource_monitors.fixed_heap, envoy.resource_monitors.injected_resource
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.thrift_proxy.transports: auto, framed, header, unframed
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.http.cache: envoy.extensions.http.cache.simple
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.rate_limit_descriptors: envoy.rate_limit_descriptors.expr
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.tls.cert_validator: envoy.tls.cert_validator.default, envoy.tls.cert_validator.spiffe
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.http.stateful_header_formatters: preserve_case
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.upstream_options: envoy.extensions.upstreams.http.v3.HttpProtocolOptions, envoy.upstreams.http.http_protocol_options
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.resolvers: envoy.ip
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.health_checkers: envoy.health_checkers.redis
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.matching.http.input: request-headers, request-trailers, response-headers, response-trailers
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.grpc_credentials: envoy.grpc_credentials.aws_iam, envoy.grpc_credentials.default, envoy.grpc_credentials.file_based_metadata
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.dubbo_proxy.filters: envoy.filters.dubbo.router
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.tracers: envoy.dynamic.ot, envoy.lightstep, envoy.tracers.datadog, envoy.tracers.dynamic_ot, envoy.tracers.lightstep, envoy.tracers.opencensus, envoy.tracers.skywalking, envoy.tracers.xray, envoy.tracers.zipkin, envoy.zipkin
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.matching.common_inputs: envoy.matching.common_inputs.environment_variable
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.filters.listener: envoy.filters.listener.http_inspector, envoy.filters.listener.original_dst, envoy.filters.listener.original_src, envoy.filters.listener.proxy_protocol, envoy.filters.listener.tls_inspector, envoy.listener.http_inspector, envoy.listener.original_dst, envoy.listener.original_src, envoy.listener.proxy_protocol, envoy.listener.tls_inspector
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.thrift_proxy.protocols: auto, binary, binary/non-strict, compact, twitter
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.thrift_proxy.filters: envoy.filters.thrift.rate_limit, envoy.filters.thrift.router
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.dubbo_proxy.route_matchers: default
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.filters.network: envoy.client_ssl_auth, envoy.echo, envoy.ext_authz, envoy.filters.network.client_ssl_auth, envoy.filters.network.direct_response, envoy.filters.network.dubbo_proxy, envoy.filters.network.echo, envoy.filters.network.ext_authz, envoy.filters.network.http_connection_manager, envoy.filters.network.kafka_broker, envoy.filters.network.local_ratelimit, envoy.filters.network.mongo_proxy, envoy.filters.network.mysql_proxy, envoy.filters.network.postgres_proxy, envoy.filters.network.ratelimit, envoy.filters.network.rbac, envoy.filters.network.redis_proxy, envoy.filters.network.rocketmq_proxy, envoy.filters.network.sni_cluster, envoy.filters.network.sni_dynamic_forward_proxy, envoy.filters.network.tcp_proxy, envoy.filters.network.thrift_proxy, envoy.filters.network.wasm, envoy.filters.network.zookeeper_proxy, envoy.http_connection_manager, envoy.mongo_proxy, envoy.ratelimit, envoy.redis_proxy, envoy.tcp_proxy
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.transport_sockets.upstream: envoy.transport_sockets.alts, envoy.transport_sockets.quic, envoy.transport_sockets.raw_buffer, envoy.transport_sockets.tap, envoy.transport_sockets.tls, envoy.transport_sockets.upstream_proxy_protocol, raw_buffer, tls
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.transport_sockets.downstream: envoy.transport_sockets.alts, envoy.transport_sockets.quic, envoy.transport_sockets.raw_buffer, envoy.transport_sockets.starttls, envoy.transport_sockets.tap, envoy.transport_sockets.tls, raw_buffer, starttls, tls
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.retry_host_predicates: envoy.retry_host_predicates.omit_canary_hosts, envoy.retry_host_predicates.omit_host_metadata, envoy.retry_host_predicates.previous_hosts
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.stats_sinks: envoy.dog_statsd, envoy.metrics_service, envoy.stat_sinks.dog_statsd, envoy.stat_sinks.hystrix, envoy.stat_sinks.metrics_service, envoy.stat_sinks.statsd, envoy.stat_sinks.wasm, envoy.statsd
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.filters.udp_listener: envoy.filters.udp.dns_filter, envoy.filters.udp_listener.udp_proxy
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.internal_redirect_predicates: envoy.internal_redirect_predicates.allow_listed_routes, envoy.internal_redirect_predicates.previous_routes, envoy.internal_redirect_predicates.safe_cross_scheme
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.dubbo_proxy.serializers: dubbo.hessian2
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.bootstrap: envoy.bootstrap.wasm, envoy.extensions.network.socket_interface.default_socket_interface
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.matching.action: composite-action, skip
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.dubbo_proxy.protocols: dubbo
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.guarddog_actions: envoy.watchdog.abort_action, envoy.watchdog.profile_action
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.clusters: envoy.cluster.eds, envoy.cluster.logical_dns, envoy.cluster.original_dst, envoy.cluster.static, envoy.cluster.strict_dns, envoy.clusters.aggregate, envoy.clusters.dynamic_forward_proxy, envoy.clusters.redis
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.compression.compressor: envoy.compression.brotli.compressor, envoy.compression.gzip.compressor
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.retry_priorities: envoy.retry_priorities.previous_priorities
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.request_id: envoy.request_id.uuid
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.matching.input_matchers: envoy.matching.matchers.consistent_hashing
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.upstreams: envoy.filters.connection_pools.tcp.generic
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.filters.http: envoy.buffer, envoy.cors, envoy.csrf, envoy.ext_authz, envoy.ext_proc, envoy.fault, envoy.filters.http.adaptive_concurrency, envoy.filters.http.admission_control, envoy.filters.http.aws_lambda, envoy.filters.http.aws_request_signing, envoy.filters.http.buffer, envoy.filters.http.cache, envoy.filters.http.cdn_loop, envoy.filters.http.composite, envoy.filters.http.compressor, envoy.filters.http.cors, envoy.filters.http.csrf, envoy.filters.http.decompressor, envoy.filters.http.dynamic_forward_proxy, envoy.filters.http.dynamo, envoy.filters.http.ext_authz, envoy.filters.http.ext_proc, envoy.filters.http.fault, envoy.filters.http.grpc_http1_bridge, envoy.filters.http.grpc_http1_reverse_bridge, envoy.filters.http.grpc_json_transcoder, envoy.filters.http.grpc_stats, envoy.filters.http.grpc_web, envoy.filters.http.gzip, envoy.filters.http.header_to_metadata, envoy.filters.http.health_check, envoy.filters.http.ip_tagging, envoy.filters.http.jwt_authn, envoy.filters.http.local_ratelimit, envoy.filters.http.lua, envoy.filters.http.oauth2, envoy.filters.http.on_demand, envoy.filters.http.original_src, envoy.filters.http.ratelimit, envoy.filters.http.rbac, envoy.filters.http.router, envoy.filters.http.squash, envoy.filters.http.tap, envoy.filters.http.wasm, envoy.grpc_http1_bridge, envoy.grpc_json_transcoder, envoy.grpc_web, envoy.gzip, envoy.health_check, envoy.http_dynamo_filter, envoy.ip_tagging, envoy.local_rate_limit, envoy.lua, envoy.rate_limit, envoy.router, envoy.squash, match-wrapper
[2021-10-26 16:07:08.196][1][info][main] [source/server/server.cc:334]   envoy.wasm.runtime: envoy.wasm.runtime.null, envoy.wasm.runtime.v8
[2021-10-26 16:07:08.212][1][warning][misc] [source/common/protobuf/message_validator_impl.cc:21] Deprecated field: type envoy.config.cluster.v3.Cluster Using deprecated option 'envoy.config.cluster.v3.Cluster.http2_protocol_options' from file cluster.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/version_history/version_history for details. If continued use of this field is absolutely necessary, see https://www.envoyproxy.io/docs/envoy/latest/configuration/operations/runtime#using-runtime-overrides-for-deprecated-features for how to apply a temporary and highly discouraged override.
[2021-10-26 16:07:08.212][1][warning][misc] [source/common/protobuf/message_validator_impl.cc:21] Deprecated field: type envoy.config.bootstrap.v3.Admin Using deprecated option 'envoy.config.bootstrap.v3.Admin.access_log_path' from file bootstrap.proto. This configuration will be removed from Envoy soon. Please see https://www.envoyproxy.io/docs/envoy/latest/version_history/version_history for details. If continued use of this field is absolutely necessary, see https://www.envoyproxy.io/docs/envoy/latest/configuration/operations/runtime#using-runtime-overrides-for-deprecated-features for how to apply a temporary and highly discouraged override.
[2021-10-26 16:07:08.212][1][info][main] [source/server/server.cc:350] HTTP header map info:
[2021-10-26 16:07:08.217][1][info][main] [source/server/server.cc:353]   request header map: 632 bytes: :authority,:method,:path,:protocol,:scheme,accept,accept-encoding,access-control-request-method,authentication,authorization,cache-control,cdn-loop,connection,content-encoding,content-length,content-type,expect,grpc-accept-encoding,grpc-timeout,if-match,if-modified-since,if-none-match,if-range,if-unmodified-since,keep-alive,origin,pragma,proxy-connection,referer,te,transfer-encoding,upgrade,user-agent,via,x-client-trace-id,x-envoy-attempt-count,x-envoy-decorator-operation,x-envoy-downstream-service-cluster,x-envoy-downstream-service-node,x-envoy-expected-rq-timeout-ms,x-envoy-external-address,x-envoy-force-trace,x-envoy-hedge-on-per-try-timeout,x-envoy-internal,x-envoy-ip-tags,x-envoy-max-retries,x-envoy-original-path,x-envoy-original-url,x-envoy-retriable-header-names,x-envoy-retriable-status-codes,x-envoy-retry-grpc-on,x-envoy-retry-on,x-envoy-upstream-alt-stat-name,x-envoy-upstream-rq-per-try-timeout-ms,x-envoy-upstream-rq-timeout-alt-response,x-envoy-upstream-rq-timeout-ms,x-forwarded-client-cert,x-forwarded-for,x-forwarded-proto,x-ot-span-context,x-request-id
[2021-10-26 16:07:08.223][1][info][main] [source/server/server.cc:353]   request trailer map: 144 bytes: 
[2021-10-26 16:07:08.223][1][info][main] [source/server/server.cc:353]   response header map: 440 bytes: :status,access-control-allow-credentials,access-control-allow-headers,access-control-allow-methods,access-control-allow-origin,access-control-expose-headers,access-control-max-age,age,cache-control,connection,content-encoding,content-length,content-type,date,etag,expires,grpc-message,grpc-status,keep-alive,last-modified,location,proxy-connection,server,transfer-encoding,upgrade,vary,via,x-envoy-attempt-count,x-envoy-decorator-operation,x-envoy-degraded,x-envoy-immediate-health-check-fail,x-envoy-ratelimited,x-envoy-upstream-canary,x-envoy-upstream-healthchecked-cluster,x-envoy-upstream-service-time,x-request-id
[2021-10-26 16:07:08.223][1][info][main] [source/server/server.cc:353]   response trailer map: 168 bytes: grpc-message,grpc-status
[2021-10-26 16:07:08.227][1][info][main] [source/server/server.cc:500] admin address: 127.0.0.2:19001
[2021-10-26 16:07:08.228][1][info][main] [source/server/server.cc:667] runtime: layers:
  - name: base
    static_layer:
      {}
  - name: admin
    admin_layer:
      {}
[2021-10-26 16:07:08.228][1][info][config] [source/server/configuration_impl.cc:128] loading tracing configuration
[2021-10-26 16:07:08.228][1][info][config] [source/server/configuration_impl.cc:88] loading 0 static secret(s)
[2021-10-26 16:07:08.228][1][info][config] [source/server/configuration_impl.cc:94] loading 1 cluster(s)
[2021-10-26 16:07:08.338][1][info][config] [source/server/configuration_impl.cc:98] loading 0 listener(s)
[2021-10-26 16:07:08.338][1][info][config] [source/server/configuration_impl.cc:110] loading stats configuration
[2021-10-26 16:07:08.339][1][info][runtime] [source/common/runtime/runtime_impl.cc:428] RTDS has finished initialization
[2021-10-26 16:07:08.339][1][info][upstream] [source/common/upstream/cluster_manager_impl.cc:188] cm init: initializing cds
[2021-10-26 16:07:08.341][1][warning][main] [source/server/server.cc:642] there is no configured limit to the number of allowed active connections. Set a limit via the runtime key overload.global_downstream_max_connections
[2021-10-26 16:07:08.342][1][info][main] [source/server/server.cc:764] starting main dispatch loop
[2021-10-26 16:07:08.350][1][info][upstream] [source/common/upstream/cds_api_helper.cc:28] cds: add 2 cluster(s), remove 0 cluster(s)
[2021-10-26 16:07:08.543][1][info][upstream] [source/common/upstream/cds_api_helper.cc:65] cds: added/updated 2 cluster(s), skipped 0 unmodified cluster(s)
[2021-10-26 16:07:08.543][1][info][upstream] [source/common/upstream/cluster_manager_impl.cc:168] cm init: initializing secondary clusters
[2021-10-26 16:07:08.548][1][info][upstream] [source/common/upstream/cluster_manager_impl.cc:192] cm init: all clusters initialized
[2021-10-26 16:07:08.548][1][info][main] [source/server/server.cc:745] all clusters initialized. initializing init manager
[2021-10-26 16:07:08.555][1][info][upstream] [source/server/lds_api.cc:78] lds: add/update listener 'database:127.0.0.1:3306'
[2021-10-26 16:07:08.558][1][info][upstream] [source/server/lds_api.cc:78] lds: add/update listener 'public_listener:0.0.0.0:22159'
[2021-10-26 16:07:08.558][1][info][config] [source/server/listener_manager_impl.cc:888] all dependencies initialized. starting workers

traccar stderr:

Exception in thread "main" java.lang.RuntimeException: com.zaxxer.hikari.pool.HikariPool$PoolInitializationException: Failed to initialize pool: Could not create connection to database server. Attempted reconnect 3 times. Giving up.
	at org.traccar.Main.run(Main.java:147)
	at org.traccar.Main.main(Main.java:106)
Caused by: com.zaxxer.hikari.pool.HikariPool$PoolInitializationException: Failed to initialize pool: Could not create connection to database server. Attempted reconnect 3 times. Giving up.
	at com.zaxxer.hikari.pool.HikariPool.throwPoolInitializationException(HikariPool.java:596)
	at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:582)
	at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:100)
	at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:81)
	at org.traccar.database.DataManager.initDatabase(DataManager.java:130)
	at org.traccar.database.DataManager.<init>(DataManager.java:89)
	at org.traccar.Context.init(Context.java:290)
	at org.traccar.Main.run(Main.java:120)
	... 1 more
Caused by: java.sql.SQLNonTransientConnectionException: Could not create connection to database server. Attempted reconnect 3 times. Giving up.
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:110)
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:89)
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:63)
	at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:73)
	at com.mysql.cj.jdbc.ConnectionImpl.connectWithRetries(ConnectionImpl.java:898)
	at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:823)
	at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:448)
	at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:241)
	at com.mysql.cj.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:198)
	at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:138)
	at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:359)
	at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:201)
	at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:470)
	at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:561)
	... 7 more
Caused by: com.mysql.cj.exceptions.CJCommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
	at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:61)
	at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:105)
	at com.mysql.cj.exceptions.ExceptionFactory.createException(ExceptionFactory.java:151)
	at com.mysql.cj.exceptions.ExceptionFactory.createCommunicationsException(ExceptionFactory.java:167)
	at com.mysql.cj.protocol.a.NativeProtocol.readMessage(NativeProtocol.java:519)
	at com.mysql.cj.protocol.a.NativeProtocol.readServerCapabilities(NativeProtocol.java:475)
	at com.mysql.cj.protocol.a.NativeProtocol.beforeHandshake(NativeProtocol.java:362)
	at com.mysql.cj.protocol.a.NativeProtocol.connect(NativeProtocol.java:1350)
	at com.mysql.cj.NativeSession.connect(NativeSession.java:132)
	at com.mysql.cj.jdbc.ConnectionImpl.connectWithRetries(ConnectionImpl.java:842)
	... 16 more
Caused by: java.io.EOFException: Can not read response from server. Expected to read 4 bytes, read 0 bytes before connection was unexpectedly lost.
	at com.mysql.cj.protocol.FullReadInputStream.readFully(FullReadInputStream.java:67)
	at com.mysql.cj.protocol.a.SimplePacketReader.readHeader(SimplePacketReader.java:63)
	at com.mysql.cj.protocol.a.SimplePacketReader.readHeader(SimplePacketReader.java:45)
	at com.mysql.cj.protocol.a.NativeProtocol.readMessage(NativeProtocol.java:513)
	... 21 more

The two tasks in the app group restarted until they hit the restart limit.

When I run netstat -lnp in the two containers of the app group, I can see a listener on 127.0.0.1:3306, but the containers restart before I have enough time to try a mysql connection.

In the Consul UI, I can see the link through the mesh proxy.

I don’t know, sorry…

Thanks for your help and/or advice.

Just viewing on mobile, but I think you registered the wrong port for your service:

```
service {
  name = "database"
  port = "3006"
```

…unless you set a custom port, it should be 3306 for MySQL.
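For reference, the corrected registration for the database group would look like this (assuming MariaDB is listening on its default port, which the linuxserver image does by default):

```
    service {
      name = "database"
      port = "3306"  # was "3006" — must match the port MariaDB actually listens on

      connect {
        sidecar_service {}
      }
    }
```

The sidecar forwards the app's upstream (bound locally at 127.0.0.1:3306) to whatever port is registered here, so a typo in this value makes every proxied connection fail even though the listener itself comes up.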

I came here to say the same thing. That was what caught my eye too, and it lines up with the "Could not create connection to database server" error messages.

Shame on me… :confounded:

Spent a day for that…

Thanks!