resource.CreateRequest.Plan.Get() throws error on slice of struct in resource model

According to the documentation at Implement resource create and read | Terraform | HashiCorp Developer, it is possible to specify slices of nested structs within resource data models and map those to schema.SetNestedAttribute. In reality, however, with TF_LOG=DEBUG one can confirm that resource.CreateRequest.Plan.Get() throws the following error (and *resource.CreateResponse.Diagnostics.HasError() returns true):

```
Received unknown value, however the target type cannot handle unknown values. Use the corresponding types package type or a custom type that handles unknown values.
| Path: nestedStructs
| Target Type: nestedStruct
| Suggested Type: basetypes.SetValue
```

I also note that Why is req.Plan.Get() in my Create() method throwing an error? aligns with the error message and contradicts the documentation (albeit at a “beta” version of the plugin framework).

What is the actual correct documentation for resource data model specifications that declare a field member as a slice of nested structs? Where should the issue be raised to correct it?

As an interesting side note, pointers to nested structs DO actually seem to map correctly to schema.SingleNestedAttribute, which feels weirdly inconsistent to me and suggests that maybe there is a WIP for slices of structs?

Hi again, @mschuchard :wave:

Upfront: the tutorials should be using types.Set (basetypes.SetValue) there. While the tutorial repository is not public, the provider codebase for it is, so raising the issue there is likely the best place: Issues · hashicorp/terraform-provider-hashicups · GitHub

The canonical website documentation for data handling with the framework can be found here: Plugin Development - Framework: Handling Data - Terraform Concepts | Terraform | HashiCorp Developer

Even though the framework can support Go built-ins when accessing data, such as a slice, that is only possible in certain scenarios, since Go built-ins cannot conceptually store additional information from Terraform’s type system, such as whether a value is unknown (or, in the future, whether a value is partially unknown or marked as sensitive). It is difficult to explain to developers in the documentation all the scenarios where built-ins can potentially be used, so over time the website documentation has moved to always recommending the types package types over built-ins. It likely could (and maybe should) be more explicit in this regard. More information about this topic can be found on this page: Plugin Development - Framework: Types | Terraform | HashiCorp Developer
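As a framework-free illustration of that point (the wrapper type here is hypothetical, not framework API, but loosely mimics what a types.String stores): a plain Go string has no room for unknown-ness, while a small wrapper does:

```go
package main

import "fmt"

// attrString loosely mimics what a framework types.String stores: a value plus
// null/unknown state that a plain Go string simply has no room for.
type attrString struct {
	value   string
	null    bool
	unknown bool
}

func main() {
	planned := attrString{unknown: true} // a "(known after apply)" value in a plan

	var builtin string = planned.value // assigning to a built-in silently loses the unknown-ness
	fmt.Println(builtin == "")         // true: indistinguishable from a genuinely empty string

	fmt.Println(planned.unknown) // the wrapper keeps the distinction, so code can refuse or defer
}
```

This is why a Get() into a built-in target has to error when the source value is unknown: there is simply nowhere to put that information.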

A future version of the framework may also consider preventing the usage of Go built-ins for accessing data, which would certainly be a breaking change, but would ultimately remove the data handling ambiguities that exist today: Consider Always Returning Errors with Get/GetAttribute for Target Types Without Unknown Support · Issue #498 · hashicorp/terraform-plugin-framework · GitHub

Long story short, though: the tutorials were written earlier in the framework’s development, when the recommendation to always use the types package types was much hazier. We weren’t sure at that time whether we wanted the framework’s design to leave provider developers additional options when accessing data, to work around the quirks of Go built-ins not supporting all of the Terraform type system concepts. An example of that eventually-not-chosen design path can still be seen in the lingering (basetypes.ObjectValue).As() options.

This will work as long as the object itself is not unknown in the Terraform configuration. A pointer (and a slice too, since a slice can be nil) can store whether a value is null, which is why it can work in certain scenarios. There is not anything else the framework could additionally do with slice handling given the current design.

When accessing data for a single nested attribute, the recommendation is types.Object instead of a struct or struct pointer. When accessing data for a list/set nested attribute, the recommendation is types.List/types.Set instead of a slice, then accessing each of the elements as a types.Object, not as a struct or struct pointer, since types.Object can handle unknown values. It is verbose, but the most accurate/safe data representation since provider developers can choose the appropriate logic at each layer of the data. The framework website documentation will provide more verbose code examples and recommendations of handling data for each attribute type as part of Make value handling methods more discoverable in the documentation · Issue #695 · hashicorp/terraform-plugin-framework · GitHub.

Thanks for the detailed explanation. Issue raised in repo. I am actually super glad the solution is to use a tftype as opposed to a custom type.

When accessing data for a single nested attribute, the recommendation is types.Object instead of a struct or struct pointer.

So this opened up an enormous can of worms. As you may notice in the issue you linked at the end, I was eventually able to work out the usage of ElementsAs and MapValueFrom. However, ObjectValueFrom has very cryptic usage. My best initial guess is along the lines of:

```go
type TopModel struct {
	MyModel types.Object `tfsdk:"my"`
}

type myModel struct {
	Foo types.String `tfsdk:"foo"`
	Bar types.Bool   `tfsdk:"bar"`
}

var myModelTF map[string]attr.Type = map[string]attr.Type{
	"foo": types.String,
	"bar": types.Bool,
}

// assume that sdk.My is the analogous deserialized struct from the response body from my corresponding SDK Go bindings
func MyModelGoToTerraform(ctx context.Context, my sdk.My) (types.Object, diag.Diagnostics) {
	return types.ObjectValueFrom(ctx, myModelTF, myModel{
		Foo: types.StringValue(my.Foo),
		Bar: types.BoolValue(my.Bar),
	})
}

// invoked in Create like
var objectConvertDiags diag.Diagnostics
state.My, objectConvertDiags = util.MyModelGoToTerraform(ctx,
```

but throws compilation error:

types.String (type) is not an expression

It is unclear from attr package - Go Packages what magical values are required in myModelTF. I could modify it to use attr.Type.TerraformType, but that violates its intended usage. I also tried values of attr.Type{TerraformType: types.String}, but no joy. The documentation claims:

// TerraformType returns the tftypes.Type that should be used to
// represent this type.

but that does not seem to match my experience. Also, Plugin Development - Framework: Types | Terraform | HashiCorp Developer honestly seems to contradict your information and the code, if I understand it correctly (my assumption being that the documentation is incorrect).

As another side note: most of my code for this plugin now seems to consist of schema, models, and converters among the SDK structs, the TF Go type models, and the TF tftype models, i.e. “utility code”. Is there any possibility of some helper functions in the plugin framework to facilitate this kind of functionality? Packer plugins (and possibly, in the future, Vault plugins) have mapstructure for HCL serialization and deserialization, but I have no idea what kind of schema serialization and deserialization (other than, of course, the tfsdk tags) could be available here.

I found a solution to this in the AWS provider (and unsurprisingly it uses generics), but even with the generics it is still a gigantic amount of code for handling all of these conversions, SDK models, TF models, and schema. What is unclear to me is:

  1. Why is there no support for generics for non-primitives in tftypes since this would also benefit TF in addition to the plugin framework?
  2. Why are these conversion functions not standardized, so that each plugin is not developing its own solution amounting to an entire package or more? It does not even need to be part of the plugin framework (to preemptively clarify: I have no objection whatsoever to a software-architecture argument for not containing it within the plugin framework); I just think it should exist.

I also realized my dumbness and updated the model to:

```go
var myModelTF map[string]attr.Type = map[string]attr.Type{
	"foo": types.StringType,
	"bar": types.BoolType,
}
```

and now the compiler is much happier.

However, I did also notice that there seems to be no equivalent of append([]struct, struct) for the framework types, i.e. no append(types.Set, types.Object), which seems like a showstopper for SetNestedAttribute type conversion.