Framework - Error: Provider produced inconsistent result after apply


I had a quick attempt at converting some resources over to the new plugin-framework. I believe I have run into "Unset Primitive Types on Optional Attributes with (State).Set() Generate Errors" (hashicorp/terraform-plugin-framework#201 on GitHub) and have tried a few different workarounds.

I have a simple optional field for which, when I ran my tests, a value is provided by the server (note: this shouldn't be optional+computed):

"description": {
	Optional: true,
	Type:     types.StringType,

The error is as documented in the issue:

│ Error: Provider produced inconsistent result after apply
│ When applying changes to pingfederate_oauth_authentication_policy_contract_mapping.example, provider "provider[\"\"]" produced an unexpected new value: .jdbc_attribute_sources[0].description: was null, but now cty.StringVal("JDBC").
│ This is a bug in the provider, which should be reported in the provider's own issue tracker.
if in.Description != nil {
	result.Description = types.String{Value: *in.Description}
} else {
	result.Description = types.String{Null: true}
}
This occurs because the API returns a description even if we don't provide one (so it's technically drift).

I'm just checking that the above issue link is the one I need to track for this, or if anyone has any other suggestions :+1:

Hi @iwarapter :wave: this attribute would likely benefit from two implementation details:

  • First, it should be marked as Computed: true as this will signal to Terraform CLI (the source of the error) that the attribute value can come from the provider, not necessarily the configuration
  • Second, adding an attribute plan modifier can help the Terraform CLI plan output show any expected default value, rather than “(known after apply)”, and also allows any configuration references to the planned value to get the expected default. The framework will likely provide some helper plan modifiers in the future for exactly this use case.
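Concretely, the first suggestion is just one extra flag on the schema attribute shown earlier in the thread (a sketch of that snippet with the change applied; the plan modifier from the second suggestion is omitted here, since the framework's helper modifiers were still to come):

```go
"description": {
	Type:     types.StringType,
	Optional: true,
	// Computed signals to Terraform CLI that the provider/API may
	// supply this value when the configuration does not, so a
	// server-populated description is no longer an inconsistency.
	Computed: true,
},
```

With Optional+Computed, an API-returned value for an unconfigured attribute is a legal plan outcome rather than a "provider produced inconsistent result after apply" error.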

Hope this helps.

Hi @bflad, so this is quite a difference in behaviour from the sdkv2.

In v2, you would be able to successfully create the resource, but on a subsequent plan or apply it would flag the delta:

terraform plan                 
pingfederate_authentication_policy_contract.demo: Refreshing state... [id=gXHZH2JlHNBxOZSk]
pingfederate_oauth_authentication_policy_contract_mapping.demo: Refreshing state... [id=gXHZH2JlHNBxOZSk]

Terraform used the selected providers to generate the following execution plan. Resource actions are indicated with the following symbols:
  ~ update in-place

Terraform will perform the following actions:

  # pingfederate_oauth_authentication_policy_contract_mapping.demo will be updated in-place
  ~ resource "pingfederate_oauth_authentication_policy_contract_mapping" "demo" {
        id = "gXHZH2JlHNBxOZSk"

      ~ jdbc_attribute_source {
          - description = "JDBC" -> null
            # (3 unchanged attributes hidden)

            # (1 unchanged block hidden)
        }

        # (3 unchanged blocks hidden)
    }
Plan: 0 to add, 1 to change, 0 to destroy.

In my case I didn't originally set a default value, as the API didn't enforce one (a few versions back). But now, if there is an upstream API change, then unless the field is marked as computed, Terraform won't be able to gracefully handle it and the provider will fail as originally described.

I feel this change makes the new framework more brittle to upstream API changes, which many community providers would certainly be vulnerable to.