I am using the `terraform test` command released in Terraform 1.6.
Is there a way to check whether there is a diff against the state file after running `terraform apply`? I tried running apply, but the test command simply returns a pass whenever the apply succeeds. It doesn’t recognize that leaving these diffs behind is wrong.
Does the current command support checking for diffs? Or will it be supported later?
Having a diff after apply is a concern with the provider, rather than the Terraform configuration, so it’s harder to deal with in a Terraform test situation.
Checking for a diff statically is not very useful: since you just applied that configuration, the state will match what was applied (any actual diff is a Terraform or provider bug, not something in your configuration). A diff in a plan after applying usually means the provider refreshed the resource into a state that does not match what was previously applied, which is a problem with the provider.
This puts such changes somewhat outside the scope of testing your configuration; however, I can see how it might be useful for validating the configuration. I’m not sure offhand if there’s a good way to access that. @liamcervante may have some insight on a better way to run a plan against the tested state.
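For what it’s worth, outside of the `terraform test` framework one manual way to detect a post-apply diff is `terraform plan -detailed-exitcode`, which exits with status 2 when the plan contains changes. A rough sketch (not part of the test framework itself):

```shell
#!/bin/sh
# Apply, then run a follow-up plan and fail if anything still wants to change.
# -detailed-exitcode: 0 = no changes, 1 = error, 2 = changes present.
terraform apply -auto-approve
terraform plan -detailed-exitcode
status=$?
if [ "$status" -eq 2 ]; then
  echo "Drift detected after apply" >&2
  exit 1
elif [ "$status" -ne 0 ]; then
  echo "Plan failed" >&2
  exit "$status"
fi
```

This is a CI-style workaround rather than something integrated with test assertions, but it does catch exactly the “diff after apply” case being discussed.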
Hi @shanye997! I’ve reworked the issue you filed yesterday into a feature request and added some extra context in there: terraform test: add support for validating plan diffs / attribute changes · Issue #34500 · hashicorp/terraform · GitHub
The summary is that this isn’t currently supported, and it’s not something we have concrete plans to work on. But, enough interest from the community and in your feature request could change that.
If you can capture the attribute that is drifting in an output block, then you might be able to use the testing framework to validate that it isn’t changing in a follow-up plan action. I’ve also added an explanation of this in the feature request to make it discoverable by others reading through the issue.
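That follow-up-plan idea might look something like the sketch below. This is only an illustration: the output name `example_attribute` and the run block names are hypothetical, and you’d need an `output "example_attribute"` block in the configuration under test exposing the attribute that drifts.

```hcl
# example.tftest.hcl (hypothetical file name)

run "initial_apply" {
  command = apply
}

run "verify_no_drift" {
  command = plan

  assert {
    # Compare the planned value against the value recorded by the
    # earlier apply run; a mismatch indicates drift after apply.
    condition     = output.example_attribute == run.initial_apply.example_attribute
    error_message = "Attribute changed between apply and the follow-up plan."
  }
}
```

Note this only covers attributes you have explicitly surfaced as outputs; it is not a general “plan is a no-op” check.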
FWIW, there is a similar mechanism in the provider testing harness which tries to create an additional plan and checks whether it’s a no-op.
I share James’ thought that this should not typically be necessary if providers are correct, but there are some situations that can be created directly by a module author without encountering any provider bugs, which includes any situation where the declarations in the configuration are self-contradictory.
For example, if someone incorrectly writes a module that uses a data resource to test whether something exists, and then uses a conditional resource block to declare it only if it doesn’t already exist, that creates a contradiction: “this exists if it doesn’t exist”. Terraform will therefore typically fail to converge, because the side effects of applying a change also change the predicate used to decide the desired state.
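The anti-pattern described above might be sketched like this, using a deliberately fictional provider and attribute names for illustration:

```hcl
# Anti-pattern sketch (fictional "example" provider):
# "create the thing only if the lookup says it doesn't exist".

data "example_thing" "lookup" {
  name = "shared-resource"
}

resource "example_thing" "maybe" {
  # On the first plan the lookup reports the thing as absent, so count = 1.
  # After apply, the same lookup reports it as present, so the next plan
  # wants count = 0 and proposes destroying it. The configuration never
  # converges, even with a fully correct provider.
  count = data.example_thing.lookup.exists ? 0 : 1
  name  = "shared-resource"
}
```

A convergence check (plan-after-apply) would catch this kind of self-contradiction even though no provider bug is involved.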
This is a tricky tradeoff though, because planning can be pretty expensive (in time, at least) for some modules, so I think it would be too disruptive to check for convergence by default. It could potentially be opt-in somehow, but then we’d be asking test authors to guess whether someone is likely to accidentally make the module contradictory in future and to endure an additional time cost on each test run just in case. I’m not sure that a typical test author has enough information to make that decision ahead of time.
Thank you for your reply!
I do agree that if a diff is generated it is probably a bug in the provider, which is exactly why we want to check for diffs. We would like to implement a double check on the use of the provider.
Hi @liamcervante!
I think the Terraform test mechanism needs to guarantee that there is no diff after apply, in addition to guaranteeing that the test succeeds. Whether the bug is in the Terraform configuration or in the provider, the existence of a diff means that subsequent work on the template won’t proceed smoothly, and to some extent the Terraform configuration becomes unusable. Additionally, timely detection of diffs can also drive improvements in the provider.
The future solution you mentioned in the issue will solve my current problem, and I’m very much looking forward to the release of the new feature!
Hi, @apparentlymart !
Time consumption is indeed a factor we consider as well. It would be nice to have this feature as an optional check for the plan command in Terraform test.