
Running Ansible using slurm user: how can I fix ansible.legacy.setup failed to execute error?


TASK [Gathering Facts] *********************************************************
task path: /opt/playbook/site.yml:1
Using module file /usr/local/lib/python3.10/dist-packages/ansible/modules/setup.py
Pipelining is enabled.
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: slurm
<localhost> EXEC /bin/sh -c 'sudo -H -S -n  -u root /bin/sh -c '"'"'echo BECOME-SUCCESS-mvloemssulwwmnnhtatxivyevcbshjsb ; /usr/bin/python3'"'"' && sleep 0'
fatal: [localhost]: FAILED! => {
    "ansible_facts": {},
    "changed": false,
    "failed_modules": {
        "ansible.legacy.setup": {
            "failed": true,
            "module_stderr": "sudo: a password is required\n",
            "module_stdout": "",
            "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
            "rc": 1
        }
    },
    "msg": "The following modules failed to execute: ansible.legacy.setup\n"
}

A playbook is executed by the slurm user on node startup. However, it fails while gathering facts and I am unsure what the issue is. The error suggests that sudo refuses to escalate to root without a password. I am looking for ways to debug this more efficiently.

The playbook runs without issues under the regular ubuntu user.
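One way to narrow this down (a debugging sketch, not part of the original setup) is to replay the become command from the EXEC line in the log outside of Ansible. This assumes your current shell can sudo to the slurm user:

# Mimic Ansible's become step as the slurm user
sudo -u slurm -H /bin/sh -c 'sudo -H -S -n -u root /bin/true; echo "rc=$?"'

# Optionally list what slurm may run via sudo (run as root)
sudo -l -U slurm

If the first command prints "sudo: a password is required", the failure is reproduced independently of Ansible and confirms that slurm has no passwordless sudo to root.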

Simplified hosts file

vpn:
  children:
    master:
      hosts:
        localhost:
          ansible_connection: local
          ansible_python_interpreter: /usr/bin/python3
          ansible_user: ubuntu
          ip: localhost

Solution

  • I was using ansible_connection: local in my hosts file, so Ansible ran every task in the playbook on the master as the slurm user that started it.

    By changing local to ssh, Ansible now connects to the master as the ubuntu user instead and therefore no longer hits the privilege issue later on (see the sketch below).
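A minimal sketch of the adjusted inventory, assuming the master accepts SSH logins for ubuntu from the slurm account (host name and variables kept from the original file):

vpn:
  children:
    master:
      hosts:
        localhost:
          # ssh instead of local: tasks run as ansible_user on the target,
          # not as the user that launched ansible-playbook
          ansible_connection: ssh
          ansible_python_interpreter: /usr/bin/python3
          ansible_user: ubuntu   # SSH login user on the master
          ip: localhost

This only works if the slurm user has an SSH key that ubuntu@localhost accepts; otherwise the connection itself fails before facts are gathered.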