Dataproc Job creation fails for Spark SQL job with query_file_uri set

Hi everyone,

I am trying to create a Dataproc job using the Terraform resource google_dataproc_job. As part of my configuration, I am setting up a Spark SQL job whose query_file_uri points to a file in a Google Cloud Storage bucket.

Unfortunately, Terraform fails with the error below:

│ Error: googleapi: Error 400: Invalid value at 'job.spark_sql_job' (oneof), oneof field 'queries' is already set. Cannot set 'queryList'
│ Details:
│ [
│   {
│     "@type": "type.googleapis.com/google.rpc.BadRequest",
│     "fieldViolations": [
│       {
│         "description": "Invalid value at 'job.spark_sql_job' (oneof), oneof field 'queries' is already set. Cannot set 'queryList'",
│         "field": "job.spark_sql_job"
│       }
│     ]
│   }
│ ]
│ , invalid

Please note that I have not set the query_list attribute in the sparksql_config block; per the Terraform docs, it conflicts with query_file_uri.
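For reference, here is a sketch of the mutually exclusive query_list form that I am explicitly not using (the inline statement is just a hypothetical placeholder):

  sparksql_config {
    # query_list is the other member of the API's 'queries' oneof;
    # the docs say it conflicts with query_file_uri.
    query_list = ["SHOW TABLES"]  # hypothetical placeholder statement
  }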

Any pointers to get unblocked would be much appreciated.

Here is the Terraform resource definition we are running:

resource "google_dataproc_job" "spark-sql-tables" {

  region       = var.region
  project      = var.project

  placement {
    cluster_name = module.dataproc-cluster[0].dataproc_cluster_name
  }

  sparksql_config {
    query_file_uri = "gs://${var.artifacts_bucket}/sql/tables.sql"
    jar_file_uris  = local.jar_file_uris_list
  }
}
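
If it helps narrow things down, the workaround I am considering is to avoid query_file_uri entirely and inline the statements through query_list instead. This is an untested sketch and assumes a copy of the same tables.sql file is also checked in next to the module:

  sparksql_config {
    # Untested workaround sketch: read the SQL from a local copy of the
    # file and pass it inline, so query_file_uri is never populated.
    query_list    = [file("${path.module}/sql/tables.sql")]
    jar_file_uris = local.jar_file_uris_list
  }

I would still prefer query_file_uri, though, since the file is already staged in the artifacts bucket.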