Nomad ECS (with EC2) plugin sample

Hello, I am Lee.

I am in the process of deploying a container service onto an ECS cluster using Nomad.

The deployment succeeds when the ECS cluster uses the Fargate launch type, but it currently fails with the EC2 launch type.

My ECS (Fargate) job is shown below.
I thought that changing the value of launch_type to EC2 and running it would work, but it didn't.
Is there an ECS (EC2) job I can refer to? Or do I need to change the ECS task definition settings?

```hcl
# My deployed ECS job
job "nomad-ecs-demo" {
  datacenters = ["dc1"]
  namespace   = "${namespace}"

  group "ecs-remote-task-demo" {
    count = 1

    scaling {
      enabled = true
      min     = 0
      max     = 5
    }

    restart {
      attempts = 0
      mode     = "fail"
    }

    reschedule {
      delay = "5s"
    }

    task "http-server" {
      driver       = "ecs"
      kill_timeout = "1m" // increased from default to accommodate ECS

      config {
        task {
          launch_type     = "FARGATE"
          task_definition = "${task_definition}"
          network_configuration {
            aws_vpc_configuration {
              assign_public_ip = "ENABLED"
              security_groups  = ["${security_group_id}"]
              subnets          = ["${subnet_id1}","${subnet_id2}"]
            }
          }
        }
      }

      resources {
        cpu    = 20
        memory = 10
      }
    }
  }
}
```

The errors reported by Nomad are as follows.

I solved this problem!

When creating the ecs_task_definition with Terraform, I changed an option value and then updated the Nomad job template to match!
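For anyone hitting the same wall, here is a rough sketch of the changes that usually matter when moving a task definition from Fargate to the EC2 launch type. The resource names and values below are illustrative assumptions, not the exact config from this thread. Two ECS rules are the usual culprits: `network_configuration` is only honored when the task definition uses the `awsvpc` network mode, and `assign_public_ip = "ENABLED"` is only supported on Fargate, so EC2 tasks must set it to `DISABLED`.

```hcl
# Hypothetical Terraform task definition registered for the EC2 launch type.
# Family, image, and sizing values are placeholders.
resource "aws_ecs_task_definition" "demo" {
  family                   = "nomad-ecs-demo"
  requires_compatibilities = ["EC2"]    # was ["FARGATE"]
  network_mode             = "awsvpc"   # required to keep network_configuration in the Nomad job

  container_definitions = jsonencode([{
    name      = "http-server"
    image     = "nginx:latest"
    cpu       = 256
    memory    = 512
    essential = true
  }])
}
```

On the Nomad side, the task's `config` block would then change along these lines (again, a sketch with placeholder IDs):

```hcl
config {
  task {
    launch_type     = "EC2"
    task_definition = "nomad-ecs-demo"
    network_configuration {
      aws_vpc_configuration {
        # "ENABLED" is Fargate-only; EC2 tasks cannot auto-assign public IPs
        assign_public_ip = "DISABLED"
        security_groups  = ["sg-example"]
        subnets          = ["subnet-a", "subnet-b"]
      }
    }
  }
}
```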


Hi @swbs90,

Glad you managed to find a solution.

Thanks,
jrasell and the Nomad team