Ansible failed to connect but vagrant ssh works


When I use Vagrant with Ansible, I only need the playbook to run once, so I made this Vagrantfile, which works like a charm on Debian / Ubuntu boxes:

Vagrant.configure("2") do |config|
  # Primary configuration
  #config.ssh.private_key_path = "~/.ssh/id_rsa"
  config.ssh.forward_agent = true
  config.vm.synced_folder "/share/bl/prod//pgbackups", "/save/bdd",
    type: "nfs",
    nfs_version: 4,
    nfs_udp: false

  # pbd1
  config.vm.define "pbd1" do |pbd1|
    pbd1.vm.box = "generic/rocky8"
    pbd1.vm.hostname = "pbd1"
    pbd1.vm.network :private_network, ip: ""
    pbd1.vm.provider :libvirt do |v|
      #v.gui = true
      v.cpus = 2
      v.memory = 1024
    end
  end

  # pbd2
  config.vm.define "pbd2" do |pbd2|
    pbd2.vm.box = "generic/rocky8"
    pbd2.vm.hostname = "pbd2"
    pbd2.vm.network :private_network, ip: ""
    pbd2.vm.provider :libvirt do |v|
      #v.gui = true
      v.cpus = 2
      v.memory = 1024
    end

    # Provision from the last machine so the playbook runs only once
    pbd2.vm.provision "ansible" do |ansible|
      ansible.limit = "pbd1,pbd2"
      ansible.playbook = "provisionning/install_postgres.yml"
      ansible.extra_vars = { ansible_python_interpreter: "/usr/bin/python3" }
      ansible.inventory_path = "provisionning/inventory"
      ansible.compatibility_mode = "2.0"
    end
  end
end # Vagrant.configure("2") do |config|

But when switching to CentOS / RedHat / Rocky, Ansible does not know which SSH keys to use and I need to specify ansible_ssh_private_key_file in the inventory:

pbd1 ansible_host= ansible_user=vagrant ansible_ssh_private_key_file=.vagrant/machines/pbd1/libvirt/private_key
pbd2 ansible_host= ansible_user=vagrant ansible_ssh_private_key_file=.vagrant/machines/pbd2/libvirt/private_key

With Ubuntu boxes, it works and there is no need to specify ansible_ssh_private_key_file.

I know it is because the provisioning is “run” at the pbd2 level. Well… it seems logical.

Do you know any workaround to run the playbook only once?
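One lead I have not tested yet: if `ansible.inventory_path` is dropped, Vagrant falls back to its auto-generated inventory, which already sets ansible_ssh_private_key_file for every machine, so the static inventory file would no longer be needed:

```ruby
# Untested sketch: same provisioner block, but without ansible.inventory_path.
# Vagrant then generates
#   .vagrant/provisioners/ansible/inventory/vagrant_ansible_inventory
# with ansible_ssh_private_key_file set per machine.
pbd2.vm.provision "ansible" do |ansible|
  ansible.limit = "pbd1,pbd2"
  ansible.playbook = "provisionning/install_postgres.yml"
  ansible.extra_vars = { ansible_python_interpreter: "/usr/bin/python3" }
  ansible.compatibility_mode = "2.0"
end
```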