For what it’s worth, I’ll share my experience with this; perhaps it will help inform your decision.
We maintain a “common” image upstream that all workloads are built off. This is where we apply automatic updates, through a pipeline with associated tests. It is true that applying automatic updates can sometimes result in flaky bakes - either the process takes too long and subsequent tasks time out, or a broken package breaks everything. Discovering this during the deploy pipeline of an actual workload is inadvisable, so we (the SREs) like to discover it before the developers do.
The Packer templates which are shared with people, as you want to do, use these base images and do not need to run any updates.
Building that common image involves taking the base AMI we want to build off (usually from the marketplace) and applying our desired configuration to it via a well-maintained, semantically versioned, peer-reviewed Ansible role.
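To make that concrete, here is a minimal sketch of what such a base-image build can look like in Packer HCL; the region, AMI filter, naming scheme and playbook path are placeholders rather than our actual configuration:

```hcl
packer {
  required_plugins {
    amazon = {
      source  = "github.com/hashicorp/amazon"
      version = ">= 1.0.0"
    }
    ansible = {
      source  = "github.com/hashicorp/ansible"
      version = ">= 1.0.0"
    }
  }
}

locals {
  # Gives every bake a unique, sortable name
  timestamp = regex_replace(timestamp(), "[- TZ:]", "")
}

source "amazon-ebs" "common" {
  region        = "eu-west-1"
  instance_type = "t3.small"
  ssh_username  = "ec2-user"
  ami_name      = "common-base-${local.timestamp}"

  # Start from the marketplace/vendor image we want to build off
  source_ami_filter {
    filters = {
      name                = "amzn2-ami-hvm-*-x86_64-gp2"
      root-device-type    = "ebs"
      virtualization-type = "hvm"
    }
    owners      = ["amazon"]
    most_recent = true
  }
}

build {
  sources = ["source.amazon-ebs.common"]

  # The versioned, peer-reviewed role that holds the desired configuration
  provisioner "ansible" {
    playbook_file = "./playbooks/common.yml"
  }
}
```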
The role itself has an Inspec profile associated with it; the logic is “if you apply this role to any image, the resulting state should be the following”, and we then make assertions about the state of the image. One of those assertions, for example, is “there are no packages with vulnerabilities present”. It could just as well be “there are no updates available for the packages on this image”.
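As an illustration of the “no updates available” style of assertion, a control along these lines works (assuming a yum-based distribution; the control name and command are illustrative, not our actual profile):

```ruby
# Illustrative Inspec control, not our real profile
control 'no-pending-updates' do
  impact 1.0
  title 'No updates are available for the packages on this image'

  # yum check-update exits 0 when nothing is pending and 100 when updates exist
  describe command('yum -q check-update') do
    its('exit_status') { should eq 0 }
  end
end
```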
The only thing which typically changes for a given version of that role is the initial state (base AMI) and the “layer” of package changes. If anything breaks in the update task, the Packer build fails and results in a failed job in our CI system, giving us something to go in and check on, address, or re-run once the upstream culprit has been resolved. In all of these cases, the people we serve are not impacted. When the build passes again, the result is a new image with the same name, which people can automatically consume using `"latest": "true"` in their Packer templates.
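For the consumers, the template ends up looking something like the sketch below; in HCL2 syntax the same effect is usually achieved with `most_recent = true` on a `source_ami_filter`, and the names and owner account ID here are placeholders:

```hcl
# Downstream workload template: build off the newest published common image
# instead of a marketplace AMI. Names and the owner account ID are placeholders.
source "amazon-ebs" "workload" {
  region        = "eu-west-1"
  instance_type = "t3.small"
  ssh_username  = "ec2-user"
  ami_name      = "my-workload-${formatdate("YYYYMMDDhhmm", timestamp())}"

  source_ami_filter {
    filters = {
      name = "common-base-*"
    }
    owners      = ["123456789012"] # the account that publishes the common image
    most_recent = true             # always pick the latest known-good build
  }
}

build {
  sources = ["source.amazon-ebs.workload"]

  # Workload-specific provisioning only; no OS updates needed here
  provisioner "shell" {
    inline = ["echo 'install the application here'"]
  }
}
```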
So yes - you should deliver clean, secure images, but it’s better to do it behind the scenes in a separate process that transparently delivers known-good states to the people who will use them.