What is the idiomatic way of importing data files outside terraform root into terraform?

Let's say I have ~/.kube/config, and I want to use it to initialize the k8s provider in Terraform. What is the right way to do so if I am applying the code on Terraform Cloud? Do I need a custom shell script to copy that file into the Terraform root directory?

I tried to use local_sensitive_file like this:

    resource "local_sensitive_file" "k8s-config" {
      source   = "/Users/xxx/.kube/k8s-config"
      filename = "${path.module}/k8s-config"
    }

But I get an error like this when applying on the cloud:

│ Error: open /Users/xxx/.kube/k8s-config: no such file or directory
│  with local_sensitive_file.k8s-config,
│  on index.tf line 31, in resource "local_sensitive_file" "k8s-config":
│  31: resource "local_sensitive_file" "k8s-config" {

Do I need to use a relative path, like ${path.root}/../../.kube/config, to reference that config file? What is the best practice for handling a sensitive file that cannot be stored in the repository?

Terraform Cloud supports various workflows, but the usual one is VCS-based, in which case the repository contains all the files the plan sees.

There is no way to pass extra files from your workstation, because your workstation isn't even necessarily involved in running a plan/apply at all. Someone else may push to the Git repo, for example.

Unless you’re using Terraform Cloud in an alternate workflow, I’d say the idiomatic thing to do would be to upload credentials as Terraform workspace variables, and reference them via provider config: https://registry.terraform.io/providers/hashicorp/kubernetes/latest/docs#credentials-config
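As a rough sketch, assuming you define workspace variables holding the cluster endpoint, CA certificate, and a service account token (the variable names here are illustrative, not prescribed, and the sensitive ones should be marked Sensitive in the workspace), the provider block could look something like:

    variable "k8s_host" {
      type = string
    }

    # Base64-encoded CA certificate, as found in a kubeconfig's
    # certificate-authority-data field.
    variable "k8s_cluster_ca_cert" {
      type      = string
      sensitive = true
    }

    variable "k8s_token" {
      type      = string
      sensitive = true
    }

    provider "kubernetes" {
      host                   = var.k8s_host
      cluster_ca_certificate = base64decode(var.k8s_cluster_ca_cert)
      token                  = var.k8s_token
    }

This keeps the credentials out of the repository entirely: they live only in the workspace's variable store, and no kubeconfig file needs to exist on the filesystem where the plan runs.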

I am using the CLI-driven workflow right now. Would using workspace variables still be the best option in your opinion?