Hello all,
I am trying to deploy a serverless job and task via Terraform on Databricks (Azure).
The documentation says: “If no job_cluster_key, existing_cluster_id, or new_cluster is specified in the task definition, the task will be executed using serverless compute.”
and it also states that we need the environment block:
environment Configuration Block
This block describes an Environment that is used to specify libraries used by the tasks running on serverless compute. This block contains the following attributes:
environment_key - a unique identifier of the Environment. It will be referenced from the environment_key attribute of the corresponding task.
spec - block describing the Environment. Consists of the following attributes:
client - (Required, string) client version used by the environment.
dependencies - (list of strings) List of pip dependencies, as supported by the version of pip in this environment. Each dependency is a pip requirement file line. See API docs for more information.
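If I read this correctly, the Environment is declared once on the job and each serverless task points at it via environment_key, so I would expect a layout roughly like this (resource name, task key, and notebook path are just placeholders, not my real config):

resource "databricks_job" "example" {
  name = "serverless-example"

  # Environment declared at the job level...
  environment {
    environment_key = "Default"
    spec {
      client       = "1"
      dependencies = ["foo==0.0.1"]
    }
  }

  # ...and referenced from the task via environment_key.
  task {
    task_key        = "main"
    environment_key = "Default"

    # no job_cluster_key / existing_cluster_id / new_cluster here,
    # so per the quote above the task should run on serverless compute
    notebook_task {
      notebook_path = "/Shared/example"
    }
  }
}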
Here is my environment block, under the task configuration:
environment {
  environment_key = "Default"
  spec {
    dependencies = ["foo==0.0.1"]
    client       = "1"
  }
}
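For completeness, this is roughly where that block sits in my resource; line 139 from the error below is the environment line (the task key and the remaining arguments are trimmed/renamed here):

resource "databricks_job" "job_dbt_run_all_serverless" {
  name = "dbt-run-all-serverless"   # placeholder name

  task {
    task_key = "dbt_run_all"        # placeholder key

    # the block terraform plan rejects:
    environment {
      environment_key = "Default"
      spec {
        dependencies = ["foo==0.0.1"]
        client       = "1"
      }
    }

    # dbt_task and other task arguments omitted
  }
}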
But when I deploy it, terraform plan fails:
│ on databricks_job_dbt_serverless.tf line 139, in resource "databricks_job" "job_dbt_run_all_serverless":
│  139: environment {
│
│ Blocks of type "environment" are not expected here.
Documentation link: Terraform Registry (databricks_job resource page)
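One thing I could not rule out is the provider version: "Blocks of type X are not expected here" usually means the provider schema simply does not know that block yet. A minimal pin, where the minimum version is only my guess at roughly when environment support landed:

terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
      # guessing at the first release whose databricks_job schema
      # includes the environment block
      version = ">= 1.48.0"
    }
  }
}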
How is this possible? The documentation says I can use the environment block, yet terraform plan rejects it. What do you think?