How to load a Kerberos keytab in a Terraform script

Hi Experts,

We are trying to load a Kerberos keytab through Terraform, but without success. Using the same keytab and krb5 file via the AWS Console, everything works fine. Via Terraform we tried the filebase64 and file functions, but nothing worked. Does anyone know the correct way to load the keytab file?

Describing the scenario in more detail…

We are trying to authenticate an AWS DataSync HDFS agent. Creating the task with this agent and location via the AWS Console works fine, but when trying to create it via Terraform, it seems that the keytab file format is not correct when compared with the keytab in CloudTrail that was uploaded via the AWS Console.

Kerberos keytabs are binary files. The Terraform file function doesn’t support binary data: file - Functions - Configuration Language | Terraform | HashiCorp Developer

You’ve clipped your screenshot to the point where the resource type you are working with isn’t visible, so I can’t help more than that.

Hi Maxb,

Sorry for the delay and thanks for your help.

Can the screenshot below help? The resource is “aws_datasync_location_hdfs”.

I had expected the documentation to explain the format it required for this option. Sadly, it does not: Terraform Registry

Checking the code, it is just casting the data from a string: terraform-provider-aws/location_hdfs.go at 8a8a78fe0f63d949fb957f0ef71c79f2b906551b · hashicorp/terraform-provider-aws · GitHub

This means it’s effectively broken, as the Terraform docs are quite clear that Terraform strings are pure Unicode: file - Functions - Configuration Language | Terraform | HashiCorp Developer

Perhaps no-one has ever actually used this functionality? Or perhaps everyone who has used it just happened to have keytab content that was, by chance, interpretable as UTF-8 sequences?

At this point you will probably need to raise a bug with the provider developer to have them fix it.

In the upstream datasync:CreateLocationHdfs documentation it says that KerberosKeytab is a “Base64-encoded binary data object”, and so I would expect that filebase64 would be the correct function to use in this case.
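
For illustration, the intended usage would presumably look something like this (a hypothetical sketch based on the resource’s documented arguments; the paths, hostname, and principal are made up):

```hcl
resource "aws_datasync_location_hdfs" "example" {
  agent_arns          = [aws_datasync_agent.example.arn]
  authentication_type = "KERBEROS"
  kerberos_principal  = "user@EXAMPLE.COM"

  # filebase64 reads the binary file from disk and returns its contents
  # base64-encoded, matching the "Base64-encoded binary data object"
  # that the CreateLocationHdfs API documents.
  kerberos_keytab    = filebase64("${path.module}/user.keytab")
  kerberos_krb5_conf = filebase64("${path.module}/krb5.conf")

  name_node {
    hostname = "namenode.example.com"
    port     = 8020
  }
}
```

Whether the provider then handles that base64 string correctly is another question, though.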

I think what’s happened here is some layering confusion. The underlying API is expecting a base64-encoded string, but the AWS SDK for Go is expecting a byte array and so I assume the SDK is designed to automatically apply the base64 encoding to the bytes.

The AWS provider logic is, however, just passing the UTF-8 bytes of the base64 string into the SDK and so presumably it’s ending up double-base64-encoded and therefore invalid by the time it reaches the underlying API.

If I’m reading this right then I agree this seems like a bug in the hashicorp/aws provider, and so I’d suggest reporting it in the provider’s issue tracker. I think the fix here would be for the provider code to first base64-decode the given argument and send the resulting bytes to the AWS SDK, and then the SDK will presumably re-encode it with base64 again to get the right result. A bit silly and redundant, but correct at least.
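
To make that concrete, here’s a runnable toy sketch of the decode step I have in mind (this is not the actual provider code, and decodeKeytab is a made-up helper name):

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// decodeKeytab base64-decodes the string the user wrote in Terraform
// (e.g. the output of filebase64), so that the AWS SDK receives raw
// keytab bytes and applies the base64 encoding exactly once on the wire.
func decodeKeytab(encoded string) ([]byte, error) {
	return base64.StdEncoding.DecodeString(encoded)
}

func main() {
	// "AAECAw==" stands in for a real filebase64() result.
	raw, err := decodeKeytab("AAECAw==")
	if err != nil {
		panic(err)
	}
	fmt.Printf("%d raw bytes ready for CreateLocationHdfsInput.KerberosKeytab\n", len(raw))
}
```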

Hi Maxb, Apparentlymart

Now it makes sense why the formats in CloudTrail were different when using the Console and when using Terraform.

I will open an issue there.

Thanks for your reply!

Best regards,

Hi @le2004,

I’m hitting the same issue, so I’m apparently only the second person to try to do this. The DataSync task runs fine when creating the HDFS location using the AWS Console, but not with Terraform.

I’m guessing there’s no workaround so far, but if you see my message and have some good news, let me know.

Hi @bdewasmes,

So far, no news. In the coming weeks we will have to implement a workaround for this point. We are considering adding a step via the AWS CLI to upload the keytab file.

Sorry for that :frowning:

Thanks. I ended up creating the locations with Terraform and then updating them with the AWS CLI. I was afraid Terraform would change them back, but it does not. Good enough.
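
For anyone who wants to reproduce this, the update step was essentially the following (the location ARN and file paths are placeholders; the fileb:// prefix makes the AWS CLI read the files as raw binary, so they get base64-encoded exactly once):

```sh
aws datasync update-location-hdfs \
  --location-arn arn:aws:datasync:eu-west-1:123456789012:location/loc-EXAMPLE \
  --kerberos-keytab fileb:///path/to/user.keytab \
  --kerberos-krb5-conf fileb:///path/to/krb5.conf
```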

Hi.
Is there any update on this topic?
I am facing the same problem with a DataSync HDFS location and passing the Kerberos keytab as a Terraform parameter.
My idea was to store the keytab value in a Secrets Manager secret, but it’s in binary format, so it’s not being passed as a string correctly (probably because of the double base64 encoding described above).
Is the workaround described by @bdewasmes, updating the location with the AWS CLI after it’s created, the only option here?
Second thing - where to store the keytab then? An S3 bucket, downloaded every time? A GitHub repository is not the best place for that :wink:

Many thanks for any suggestions

Hi,

Let me confirm with our team, but the solution was developed in Go (HashiCorp’s native language), and it is the best solution we found.

best regards,

Leandro Silva

Thank you @le2004
I would be grateful if you could check this with your team.

Hello Lukas.

In order to solve this problem, we developed a custom AWS provider exclusively to handle the DataSync provisioning part (useful links that we used to understand how to do this are below). Everything else is still being provisioned using terraform-provider-aws.

Briefly explaining the solution: in the function responsible for creating the resource, the custom provider takes the Kerberos keytab and krb5 file paths, reads the files with the os.ReadFile() function, and uses the returned bytes (along with the other required variables) to populate a &datasync.CreateLocationHdfsInput (just like in the location_hdfs.go file from the provider source code). It then creates the resource by passing that input to the CreateLocationHdfsWithContext() function. The logic of the function responsible for updating the resource is very similar, and the other functions follow the same idea as the source code.
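
A condensed sketch of that create function, assuming the AWS SDK for Go v1 (the ARN, paths, and hostname are placeholders, and error handling is simplified):

```go
package main

import (
	"context"
	"log"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/datasync"
)

func main() {
	// Read the binary keytab and krb5.conf straight from disk; there is no
	// base64 round-trip in between, so the SDK encodes them exactly once.
	keytab, err := os.ReadFile("/path/to/user.keytab")
	if err != nil {
		log.Fatal(err)
	}
	krb5conf, err := os.ReadFile("/path/to/krb5.conf")
	if err != nil {
		log.Fatal(err)
	}

	svc := datasync.New(session.Must(session.NewSession()))

	input := &datasync.CreateLocationHdfsInput{
		AgentArns:          []*string{aws.String("arn:aws:datasync:eu-west-1:123456789012:agent/agent-EXAMPLE")},
		AuthenticationType: aws.String("KERBEROS"),
		KerberosPrincipal:  aws.String("user@EXAMPLE.COM"),
		KerberosKeytab:     keytab,   // raw bytes, exactly what the SDK expects
		KerberosKrb5Conf:   krb5conf, // raw bytes, exactly what the SDK expects
		NameNodes: []*datasync.HdfsNameNode{
			{Hostname: aws.String("namenode.example.com"), Port: aws.Int64(8020)},
		},
	}

	out, err := svc.CreateLocationHdfsWithContext(context.Background(), input)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("created location: %s", aws.StringValue(out.LocationArn))
}
```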

We chose this workaround because it’s a solution developed in Go that works just the way terraform-provider-aws should work in this case.

Useful links: