Restricting Databricks spark_version

We are looking to restrict the Spark version returned by the databricks_spark_version data source. We tried the following, which matches the runtime version our deployments currently use:

data "databricks_spark_version" "latest" {
  spark_version = "13.3.x-scala2.12"
  depends_on = [
    # …
  ]
}
However, when running this, we get the following error:

Error: spark versions query returned no results. Please change your search criteria and try again

We’ve also tried spark_version = "13.3" with the same result. Does anyone have a working example of how to restrict the version that’s used?

We are running this in an Azure DevOps YAML pipeline.
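One thing worth checking: in the databricks/databricks provider, the spark_version argument of this data source filters on the Apache Spark version (e.g. "3.4"), not on the full Databricks Runtime string like "13.3.x-scala2.12", which would explain the empty result. A minimal sketch, assuming DBR 13.3 LTS (which ships Spark 3.4) is the target; the cluster name and node type below are hypothetical placeholders:

```hcl
# Filter on the Apache Spark version, not the runtime string.
data "databricks_spark_version" "latest_lts" {
  long_term_support = true
  spark_version     = "3.4"
}

resource "databricks_cluster" "this" {
  cluster_name  = "example-cluster"   # hypothetical name
  spark_version = data.databricks_spark_version.latest_lts.id
  node_type_id  = "Standard_DS3_v2"   # hypothetical Azure node type
  num_workers   = 1
}
```

Alternatively, if you always want exactly one runtime, you can skip the data source entirely and hardcode spark_version = "13.3.x-scala2.12" on the cluster resource.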


Hey Simon, I am getting this same error. Did you find any solution?