AWS EKS with additional EBS volumes

Dear community,

I’m deploying an AWS EKS cluster using the terraform-aws-eks module. Now I want to attach additional EBS volumes to the worker nodes in the cluster, but I have no clue how to do that.

I know there is an input called additional_ebs_volumes, which takes a list of objects that each must have a “block_device_name” attribute. From the module:

additional_ebs_volumes            = []                          # A list of additional volumes to be attached to the instances on this Auto Scaling group. Each volume should be an object with the following: block_device_name (required), volume_size, volume_type, iops, encrypted, kms_key_id (only on launch-template), delete_on_termination. Optional values are grabbed from root volume or from defaults

But it is not clear what resource type this refers to. An aws_ebs_volume does not have the required attribute, so passing one fails. Given the lack of an example of this configuration, I’m not sure exactly how to achieve what I’m looking for.

BTW, I’m new to Terraform modules and EKS managed by Terraform, so I don’t know if I’m missing something obvious.

Any advice?

Thank you

I was finally able to add EBS volumes to the worker nodes. In my case, I just defined a local variable with the required attributes:

locals {
  ebs_block_device = {
    block_device_name = "/dev/sdc"
    volume_type       = "gp2"
    volume_size       = "100"
  }
}
Then in the worker_groups section I can just reference it:

worker_groups = [
  {
    name                          = "wg-on-demand-medium"
    instance_type                 = "t2.medium"
    additional_security_group_ids = []
    asg_desired_capacity          = 1
    kubelet_extra_args            = ""
    additional_ebs_volumes        = [local.ebs_block_device]
  }
]
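
For completeness, here is a minimal sketch of how the two pieces fit into the module call. The cluster name, Kubernetes version, and VPC/subnet variables are placeholders from my setup, not values the module requires you to name this way:

```hcl
locals {
  ebs_block_device = {
    block_device_name = "/dev/sdc"
    volume_type       = "gp2"
    volume_size       = "100"
  }
}

module "eks" {
  source = "terraform-aws-modules/eks/aws"

  # Placeholders -- adjust to your environment.
  cluster_name = "my-cluster"
  subnets      = var.subnet_ids
  vpc_id       = var.vpc_id

  worker_groups = [
    {
      name                   = "wg-on-demand-medium"
      instance_type          = "t2.medium"
      asg_desired_capacity   = 1
      # Each object in this list becomes an extra block device
      # on the worker group's launch configuration.
      additional_ebs_volumes = [local.ebs_block_device]
    }
  ]
}
```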