Avoid destroy and re-create of Azure Stream Analytics inputs and outputs

I have Terraform code that creates a Stream Analytics job, along with an input and an output for the job.
Below is my Terraform code:

provider "azurerm" {
  version = "=1.44"
}
resource "azurerm_stream_analytics_job" "test_saj" {
  name                                     = "test-stj"
  resource_group_name                      = "myrgname"
  location                                 = "Southeast Asia"
  compatibility_level                      = "1.1"
  data_locale                              = "en-US"
  events_late_arrival_max_delay_in_seconds = 60
  events_out_of_order_max_delay_in_seconds = 50
  events_out_of_order_policy               = "Adjust"
  output_error_policy                      = "Drop"
  streaming_units                          = 3

  tags = {
    environment = "test"
  }
  transformation_query = var.query

}
resource "azurerm_stream_analytics_output_blob" "mpl_saj_op_jk_blob" {
  name                      = var.saj_jk_blob_output_name
  stream_analytics_job_name = "test-stj"
  resource_group_name       = "myrgname"
  storage_account_name      = "mystaname"
  storage_account_key       = "mystakey"
  storage_container_name    = "testupload"
  path_pattern              = "myfolder/{day}"
  date_format               = "yyyy-MM-dd"
  time_format               = "HH"

  depends_on = [azurerm_stream_analytics_job.test_saj]

  serialization {
    type     = "Json"
    encoding = "UTF8"
    format   = "LineSeparated"
  }
}
resource "azurerm_stream_analytics_stream_input_eventhub" "mpl_saj_ip_eh" {
  name                         = var.saj_joker_event_hub_name
  stream_analytics_job_name    = "test-stj"
  resource_group_name          = "myrgname"
  eventhub_name                = "myehname"
  eventhub_consumer_group_name = "myehcgname"
  servicebus_namespace         = "myehnamespacename"
  shared_access_policy_name    = "RootManageSharedAccessKey"
  shared_access_policy_key     = "ehnamespacekey"

  serialization {
    type     = "Json"
    encoding = "UTF8"
  }
  depends_on = [azurerm_stream_analytics_job.test_saj]
}

Following is my tfvars input file:

    query = <<EOT
    myqueryhere
    EOT
    saj_jk_blob_output_name  = "outputtoblob01"
    saj_joker_event_hub_name = "inputventhub01"

I have no problem with the creation. My problem starts when I want to create a new input and a new output for the same Stream Analytics job: I changed only the name values in the tfvars file and ran terraform apply again (in the same directory as the first apply, against the same state file).
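For example, the tfvars for the second apply looked something like this (the exact new names here are made up for illustration; only the two name values changed, the query stayed the same):

    saj_jk_blob_output_name  = "outputtoblob02"
    saj_joker_event_hub_name = "inputventhub02"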

Terraform replaces the existing input and output with the new ones, which is not what I want: I want to keep both the old and the new resources. This use case did work when I imported the existing Stream Analytics job with terraform import in a completely different folder and used the same code there. But is there a way to do this without terraform import? Can it be done within a single state file, for example with something like the for_each sketch below?
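For reference, the import in the separate folder was along these lines (the subscription ID is a placeholder):

    terraform import azurerm_stream_analytics_job.test_saj /subscriptions/<subscription-id>/resourceGroups/myrgname/providers/Microsoft.StreamAnalytics/streamingjobs/test-stj

And this is a rough sketch of what I am imagining instead: one resource block that fans out over a map, so that adding a map entry adds a second output next to the first one rather than replacing it. This assumes Terraform 0.12.6 or later (where for_each on resources became available); the variable name and map keys are hypothetical, and the event hub input could follow the same pattern:

    variable "blob_output_names" {
      type = map(string)
      default = {
        first  = "outputtoblob01"
        second = "outputtoblob02"
      }
    }

    resource "azurerm_stream_analytics_output_blob" "mpl_saj_op_jk_blob" {
      # One instance per map entry; adding an entry creates a new output
      # instead of replacing the existing one.
      for_each                  = var.blob_output_names
      name                      = each.value
      # Referencing the job resource directly makes depends_on unnecessary.
      stream_analytics_job_name = azurerm_stream_analytics_job.test_saj.name
      resource_group_name       = "myrgname"
      storage_account_name      = "mystaname"
      storage_account_key       = "mystakey"
      storage_container_name    = "testupload"
      path_pattern              = "myfolder/{day}"
      date_format               = "yyyy-MM-dd"
      time_format               = "HH"

      serialization {
        type     = "Json"
        encoding = "UTF8"
        format   = "LineSeparated"
      }
    }

If I went this way, I assume the already-created output would first need a terraform state mv into the new indexed address (e.g. azurerm_stream_analytics_output_blob.mpl_saj_op_jk_blob["first"]) so that the first apply with for_each does not replace it. Is this the right approach, or is there another way to keep both resources in a single state file?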