Weirdest issue with MySQL 8 and Consul Connect

Hi folks. I have a MySQL server running, and I've added a definition for a Consul Connect proxy in a separate config file, as follows:

{
        "service": {
                "name": "mysql-proxy",
                "kind": "connect-proxy",
                "port": 21330,
                "proxy": {
                        "destination_service_name": "mysql",
                        "local_service_address": "10.0.0.7",
                        "local_service_port": 3306
                }
        }
}
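For context, the client side of this (not shown above, and the service name and ports here are just placeholders for illustration) is registered with an upstream pointing at the mysql service, roughly like:

{
        "service": {
                "name": "app-sidecar",
                "kind": "connect-proxy",
                "port": 21000,
                "proxy": {
                        "destination_service_name": "app",
                        "upstreams": [
                                {
                                        "destination_name": "mysql",
                                        "local_bind_port": 13306
                                }
                        ]
                }
        }
}

So clients talk to 127.0.0.1:13306 locally and the sidecars handle the mTLS hop.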

This same definition (with a few changes) lives on our PostgreSQL server as well.

Anyway, moving on. Envoy is started from the command line (well, a systemd service, but you get the idea) as follows:

consul connect envoy -proxy-id=mysql-proxy -ca-file /consul/ca.pem -client-cert /consul/consul.crt -client-key /consul/consul.key -no-central-config -grpc-addr https://127.0.0.1:8502

Now, the issue I’m having is that ~80% of the clients using Connect to reach MySQL in fact can’t. Either they get an EOF, or they get malformed packets. Oddly enough, a few clients work fine, but they eventually lose their connection due to… bad packets.

I’ve checked and the Consul intentions are set to basically allow any-to-any (* => *).
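For the record, this is how I verified the intentions (the source service name here is a placeholder, substitute whatever your client registers as):

consul intention check some-client-service mysql
consul intention get '*' '*'

The check comes back allowed, so I don’t think it’s an authorization problem.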

The irritating thing is that this setup works 100% fine for our PostgreSQL database, it works fine for InfluxDB, and it works fine for pretty much everything else that has its proxy configured manually in this manner.

If I spin up an upstream proxy on a machine by hand and connect to it with the mysql client, it works fine.
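Concretely, that manual test is something like the following (the service name and local port are arbitrary, and the MySQL user is a placeholder):

consul connect proxy -service test-client -upstream mysql:13306
mysql -h 127.0.0.1 -P 13306 -u someuser -p

Done that way, queries go through without a hitch, which is what makes the Envoy-sidecar failures so confusing.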

Anyone have any idea, at all, what on earth could be going on? One of the things on my to-do list is to move the MySQL server into Nomad land, and if Connect doesn’t work, that’s going to put a slightly hilarious crimp on those plans.