Issue with AWS CLI on Terraform Cloud

When I run this locally everything is fine, and it used to work well on Terraform Cloud too. However, something has changed recently that prevents it from running as expected.

I have the following configuration:
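The original snippet did not survive in this post. As a rough sketch of the kind of configuration involved (the data source names `data.aws_ecr_repository.this` and `data.external.this` and the `for_each` key `"mako"` come from the trace output below; `local.services` and the specific `aws` CLI invocation are assumptions):

```hcl
# Hypothetical reconstruction, not the poster's actual code.
# One ECR repository lookup and one external-program lookup per service.
data "aws_ecr_repository" "this" {
  for_each = local.services # assumed map/set of service names, e.g. including "mako"
  name     = each.key
}

# The external data source shells out to the AWS CLI; Terraform blocks
# until the program writes a JSON object to stdout and exits.
data "external" "this" {
  for_each = local.services
  program = [
    "aws", "ecr", "describe-images",
    "--repository-name", each.key,
  ]
}
```

If the `aws` process never exits (for example, because it is prompting for credentials or paging output on the remote worker), `terraform plan` hangs exactly as described below.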

However, it hangs indefinitely, and when I set TF_LOG to JSON, the trace output shows it constantly waiting on the output of these aws commands:

{"@level":"trace","@message":"dag/walk: vertex \"local.notifications_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:48.260545Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.exberry_image (expand)\" is waiting for \"data.aws_ecr_repository.this (expand)\"","@timestamp":"2022-05-17T00:49:48.260751Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.auth_service_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:48.260914Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.hltv_scraper_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:48.261171Z"}
{"@level":"trace","@message":"dag/walk: vertex \"provider[\\\"Terraform Registry\\\"] (close)\" is waiting for \"data.aws_eks_cluster.cluster (expand)\"","@timestamp":"2022-05-17T00:49:48.261262Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.slack_image (expand)\" is waiting for \"data.aws_ecr_repository.this (expand)\"","@timestamp":"2022-05-17T00:49:48.261519Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.halo_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:48.261788Z"}
{"@level":"trace","@message":"dag/walk: vertex \"aws_iam_role.dns_record_creator (expand)\" is waiting for \"data.aws_iam_policy_document.dns_record_creator (expand)\"","@timestamp":"2022-05-17T00:49:48.261856Z"}
{"@level":"trace","@message":"dag/walk: vertex \"provider[\\\"Terraform Registry\\\"] (close)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:48.261931Z"}
{"@level":"trace","@message":"dag/walk: vertex \"root\" is waiting for \"data.external.this[\\\"mako\\\"]\"","@timestamp":"2022-05-17T00:49:48.526180Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.newsfeed_image (expand)\" is waiting for \"data.aws_ecr_repository.this (expand)\"","@timestamp":"2022-05-17T00:49:53.256874Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.frontend_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.259184Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.payments_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.259308Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.missions_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.259357Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.user_service_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.259428Z"}
{"@level":"trace","@message":"dag/walk: vertex \"root\" is waiting for \"local.hltv_scraper_image (expand)\"","@timestamp":"2022-05-17T00:49:53.259471Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.notifications_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.260908Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.admin_image (expand)\" is waiting for \"data.aws_ecr_repository.this (expand)\"","@timestamp":"2022-05-17T00:49:53.261240Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.ipo_image (expand)\" is waiting for \"data.aws_ecr_repository.this (expand)\"","@timestamp":"2022-05-17T00:49:53.261501Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.mako_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.261746Z"}
{"@level":"trace","@message":"dag/walk: vertex \"aws_iam_role.dns_record_creator (expand)\" is waiting for \"data.aws_iam_policy_document.dns_record_creator (expand)\"","@timestamp":"2022-05-17T00:49:53.262090Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.slack_image (expand)\" is waiting for \"data.aws_ecr_repository.this (expand)\"","@timestamp":"2022-05-17T00:49:53.263071Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.backend_image (expand)\" is waiting for \"data.aws_ecr_repository.this (expand)\"","@timestamp":"2022-05-17T00:49:53.263184Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.hltv_scraper_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.263611Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.auth_service_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.263648Z"}
{"@level":"trace","@message":"dag/walk: vertex \"provider[\\\"Terraform Registry\\\"] (close)\" is waiting for \"data.aws_eks_cluster.cluster (expand)\"","@timestamp":"2022-05-17T00:49:53.264128Z"}
{"@level":"trace","@message":"dag/walk: vertex \"provider[\\\"Terraform Registry\\\"] (close)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.264532Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.exberry_image (expand)\" is waiting for \"data.aws_ecr_repository.this (expand)\"","@timestamp":"2022-05-17T00:49:53.264749Z"}
{"@level":"trace","@message":"dag/walk: vertex \"local.halo_image (expand)\" is waiting for \"data.external.this (expand)\"","@timestamp":"2022-05-17T00:49:53.264846Z"}
{"@level":"trace","@message":"dag/walk: vertex \"root\" is waiting for \"data.external.this[\\\"mako\\\"]\"","@timestamp":"2022-05-17T00:49:53.526650Z"}

These messages repeat in a loop constantly; I left it running for two hours.

Any help would be much appreciated, as we can't fit this into our CD pipeline if it has to be run locally.

Note that if I run it locally it completes in less than a second, whereas on Terraform Cloud it takes a very long time. And if I try a VCS-driven workflow, the run just sits at "waiting for resources…".