I'm writing Terraform code to create Azure Function Apps (using the azurerm_linux_function_app resource), and I noticed that with this provider it isn't possible to upload the package to the server and then, in a next step, install the Python dependencies (from the requirements.txt file) there. So I came up with the idea of installing the dependencies locally before uploading the zip file to the storage blob.
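For context, the zip is then uploaded to a storage blob that the Function App runs from, roughly like this (simplified; the azurerm_storage_account.example and azurerm_storage_container.example references are placeholders for my real resources):

# Simplified sketch of the upload step; the "example" names are placeholders.
resource "azurerm_storage_blob" "af_package" {
  for_each = local.app_files

  name                   = "${each.key}.zip"
  storage_account_name   = azurerm_storage_account.example.name
  storage_container_name = azurerm_storage_container.example.name
  type                   = "Block"
  source                 = data.archive_file.af-source-code[each.key].output_path
  # Re-upload the blob whenever the zip content changes.
  content_md5            = data.archive_file.af-source-code[each.key].output_md5
}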
However, since the code runs from the repo through an Azure Pipeline, every pipeline run (even without any change to this Function App) re-evaluates the dir_sha1 trigger. Because the hash is unchanged, the null_resource is skipped and pip never runs on that agent, yet the archive_file data source is read on every plan regardless, so a new zip without the dependencies (from the requirements.txt file) is created and swapped on the server.
So my idea is to skip creating the zip file when the dir_sha1 of the Function App folder hasn't changed, but I'm not able to implement this. Please help. My current code is below, followed by a sketch of the direction I have in mind.
resource "null_resource" "dependencies" {
for_each = local.app_files
triggers = {
dir_sha1 = sha1(join("", [for f in fileset(".", "${path.module}/../../src/azure_functions/${each.value["app_service_plan"]}/${each.key}/scripts/**"): filesha1(f)]))
}
provisioner "local-exec" {
command = <<EOT
pip3 install --target="${path.module}/../../src/azure_functions/${each.value["app_service_plan"]}/${each.key}/scripts/.python_packages/lib/site-packages" -r "${path.module}/../../src/azure_functions/${each.value["app_service_plan"]}/${each.key}/scripts/requirements.txt"
EOT
}
}
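Side note: since I need the same hash in more than one place, I was planning to compute it once in a locals block (app_dir_sha1 is my own shorthand; I haven't refactored the code above to use it yet):

locals {
  # One content hash per function app folder, shared by the install and zip steps.
  app_dir_sha1 = {
    for k, v in local.app_files :
    k => sha1(join("", [
      for f in fileset(".", "${path.module}/../../src/azure_functions/${v["app_service_plan"]}/${k}/scripts/**") : filesha1(f)
    ]))
  }
}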
data "archive_file" "af-source-code" {
for_each = local.app_files
type = "zip"
output_path = "${path.module}/../../dist/azure_functions/${each.value["app_service_plan"]}/${each.key}.zip"
source_dir = "${path.module}/../../src/azure_functions/${each.value["app_service_plan"]}/${each.key}/scripts"
}
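What I'm aiming for is to drop the archive_file data source and build the zip inside a second null_resource guarded by the same trigger, so that nothing is re-zipped when the hash is unchanged. This is an untested sketch of that direction (it assumes a bash shell and the zip CLI on the build agent, and uses the app_dir_sha1 local from above):

resource "null_resource" "package" {
  for_each = local.app_files

  # Same content hash as the dependency install, so the zip is only
  # rebuilt when the source folder actually changes.
  triggers = {
    dir_sha1 = local.app_dir_sha1[each.key]
  }

  provisioner "local-exec" {
    interpreter = ["bash", "-c"]
    command     = <<-EOT
      set -e
      SRC="${abspath("${path.module}/../../src/azure_functions/${each.value["app_service_plan"]}/${each.key}/scripts")}"
      OUT="${abspath("${path.module}/../../dist/azure_functions/${each.value["app_service_plan"]}/${each.key}.zip")}"
      mkdir -p "$(dirname "$OUT")"
      cd "$SRC" && zip -r "$OUT" .
    EOT
  }

  # Make sure pip has run before zipping.
  depends_on = [null_resource.dependencies]
}

What I can't figure out is the fresh-agent case: on a new pipeline run neither .python_packages nor the dist folder exists, yet the hash is unchanged, so neither provisioner fires and there is no zip to upload. That's the part I'm stuck on.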