HashiCorp Boundary target for Databricks Warehouse is throwing an SSL handshake error

I am trying to connect to a Databricks warehouse via a HashiCorp Boundary session.

I have created a target in HashiCorp Boundary for the Databricks warehouse host and started a session to connect to it.

When I use the session URL in DataGrip to connect to this warehouse, I get the following error:

DBMS: SparkSQL (ver. 3.1.1)
Case sensitivity: plain=mixed, delimited=exact
Driver: DatabricksJDBC (ver. 02.06.36.1062, JDBC4.2)
[Databricks][JDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: javax.net.ssl.SSLHandshakeException: Remote host terminated the handshake.

Here is the URL I am trying to connect with:

jdbc:databricks://127.0.0.1:63771/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/a1b2c3d4e5f6g7h8;

while the original URL works:

jdbc:databricks://dbc-123abcde-45fg.cloud.databricks.com:443/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/be408cb2375afc4e;

That is, I have replaced dbc-123abcde-45fg.cloud.databricks.com:443 with 127.0.0.1:63771.

This is most likely because the TLS certificate presented by the warehouse does not include 127.0.0.1 as a host name, so the client terminates the handshake when host name verification fails. Assuming you cannot change the TLS certificate provisioned in Databricks, there are a couple of options:

  • Add a host alias on your system that maps dbc-123abcde-45fg.cloud.databricks.com to 127.0.0.1, so you can keep the original hostname in the JDBC URL. Note that this alias will interfere if you later want to connect to the warehouse directly, without Boundary.
  • Check whether your JDBC client has an option to set the TLS SNI/host name explicitly, rather than deriving it from the host in the connection URL.
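As a concrete sketch of the host-alias option, assuming a Unix-like system: add a line like the following to /etc/hosts so the original hostname resolves to the local Boundary proxy:

127.0.0.1  dbc-123abcde-45fg.cloud.databricks.com

Then keep the original hostname in the JDBC URL and change only the port to the one the Boundary session is listening on, for example:

jdbc:databricks://dbc-123abcde-45fg.cloud.databricks.com:63771/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/a1b2c3d4e5f6g7h8;

With the alias in place, the client's host name verification and SNI are performed against the name on the certificate while the TCP connection still lands on the proxy. One caveat: boundary connect picks a random local listen port per session by default, so the port in your URL goes stale each session; the connect command's -listen-port flag can pin it to a fixed value.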

An upcoming feature may also help here, but that's all I can say about it for now.