Tags: ssh, ansible, ssh-keys, authorized-keys

Unable to add public key to target host using ansible authorized_key module


  1. I have complete access to ServerA (a single server) [IP: 142.5.5.55], where I have my public key /app/serverA/mw.pub. Ansible is installed on this server, and it is where I run my automation from.

  2. I have passwordless SSH connectivity from the Ansible ServerA to 3 servers [IP: 11.1.1.220, 11.1.1.221, 11.1.1.222], which we call jump_nodes.

  3. Lastly, I have target hosts [IP: 192.0.0.200, 192.0.0.201, 192.0.0.202], which we have named dest_nodes and where we wish to inject our public key mw.pub. Only the jump_nodes have connectivity to dest_nodes.

Thus:

ServerA(ansible) ---------------------> jump_nodes --------------------> dest_nodes 
                  copy to ~/mw.pub               inject ~/mw.pub

I'm able to copy my public key mw.pub to all jump_nodes as ~/mw.pub using the playbook below.

All good, but here is what is failing:

I now wish to inject ~/mw.pub from the jump servers into the target hosts, i.e. the dest_nodes, via whichever jump server has connectivity.

My playbook:

---

- name: "Play 1"
  hosts: localhost
  gather_facts: false
  tags: always
  tasks:
    - name: Add host
      debug:
        msg: " hello "
    - set_fact:
        jump_server_list: "{{ JUMP_SERVER | trim }}"
    - set_fact:
        target_server_list: "{{ TARGET_SERVER | trim }}"

    - add_host:
        hostname: "{{ item }}"
        groups: jump_nodes
      with_items: "{{ jump_server_list.split('\n') }}"

    - add_host:
        hostname: "{{ item }}"
        groups: dest_nodes
      with_items: "{{ target_server_list.split('\n') }}"

- name: "Play 2"
  hosts: dest_nodes
  user: root
  ignore_unreachable: yes
  vars:
    ansible_ssh_extra_args: -o StrictHostKeyChecking=no
    ansible_ssh_private_key_file: /app/id_rsa

  gather_facts: true

  tasks:
    - name: Copy ssh public key to a file on jump servers
      raw: "echo {{ TARGET_KEY }}>~/mw.pub"
      run_once: True
      delegate_to: "{{ item }}"
      with_items: "{{ groups['jump_nodes'] }}"

    - name: Set authorized key taken from file
      ignore_errors: yes
      authorized_key:
        user: "{{ TARGET_USER }}"
        state: present
        key: "{{ lookup('file', '~/mw.pub') }}"
      register: keystatus
      delegate_to: "{{ item }}"
      with_items: "{{ groups['jump_nodes'] }}"

    - debug:
        msg: "CHECK STATUS {{ keystatus }}"
      ignore_errors: yes

Output:

TASK [Copy ssh public key to a file on jump servers] *******************************************************************************************************************
task path: /app/injectkeys/injectsshkeys.yml:40
<11.1.1.220> ESTABLISH SSH CONNECTION FOR USER: root
<11.1.1.220> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/app/id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o ControlPath=/home/axmwapp/.ansible/cp/42c5d2e05f -tt 11.1.1.220 'echo ssh-rsa PPPPB3NzaC1yc2EPPPPDAQABAAABAQDQGeUOA0vJK1AXSp3UKK1KF4VnFzmcrCoM4Ha7jx49DGPkGuNgS4ZKYYGiAl7FDhwtysvUF6JSl1l3Gxrki3nLDmGYUHbzNCU0qghOw85gbr++W+b+VfEZEnzTE8VPjAgR/JvQItLd2F8PGGlZBwDUXOIvuw8Acqft0nErDkPkKApJcn302qHtOc9R1mFff/GuD6WL6gjPF0gZsEkxHq+FObdsuUzndon0SR3SPeoF/oKA2CVy15+ea6wZnAYqCCppbdgZYR9uSZlMnvwMGT2g3Au+kL2dls3aRYQm6ZH0IrOpfn8M+BaPCcpWppE64XSPZlkU+3mIe2riG4IyIE75 [email protected]>~/mw.pub'
<11.1.1.220> (0, '', 'Shared connection to 11.1.1.220 closed.\r\n')
changed: [192.0.0.200 -> 11.1.1.220] => (item=11.1.1.220) => {
    "ansible_loop_var": "item",
    "changed": true,
    "item": "11.1.1.220",
    "rc": 0,
    "stderr": "Shared connection to 11.1.1.220 closed.\r\n",
    "stderr_lines": [
        "Shared connection to 11.1.1.220 closed."
    ],
    "stdout": "",
    "stdout_lines": []
}

TASK [Set authorized key taken from file] ******************************************************************************************************************************
task path: /app/injectkeys/injectsshkeys.yml:46
[WARNING]: Unable to find '~/mw.pub' in expected paths (use -vvvvv to see paths)

fatal: [192.0.0.200]: FAILED! => {
    "msg": "An unhandled exception occurred while running the lookup plugin 'file'. Error was a <class 'ansible.errors.AnsibleError'>, original message: could not locate file in lookup: ~/mw.pub"
}
...ignoring

TASK [debug] ***********************************************************************************************************************************************************
task path: /app/injectkeys/injectsshkeys.yml:57
ok: [192.0.0.200] => {
    "msg": "CHECK STATUS {'msg': u\"An unhandled exception occurred while running the lookup plugin 'file'. Error was a <class 'ansible.errors.AnsibleError'>, original message: could not locate file in lookup: ~/mw.pub\", 'failed': True}"
}
META: ran handlers
META: ran handlers

Below is how I run the playbook:

ansible-playbook /app/injectkeys/injectsshkeys.yml -f 5 -e JUMP_SERVER='11.1.1.220' -e TARGET_SERVER='192.0.0.200' -e TARGET_USER='root' -e TARGET_KEY="'ssh-rsa PPPPB3NzaC1yc2EPPPPDAQABAAABAQDQGeUOA0vJK1AXSp3UKK1KF4VnFzmcrCoM4Ha7jx49DGPkGuNgS4ZKYYGiAl7FDhwtysvUF6JSl1l3Gxrki3nLDmGYUHbzNCU0qghOw85gbr++W+b+VfEZEnzTE8VPjAgR/JvQItLd2F8PGGlZBwDUXOIvuw8Acqft0nErDkPkKApJcn302qHtOc9R1mFff/GuD6WL6gjPF0gZsEkxHq+FObdsuUzndon0SR3SPeoF/oKA2CVy15+ea6wZnAYqCCppbdgZYR9uSZlMnvwMGT2g3Au+kL2dls3aRYQm6ZH0IrOpfn8M+BaPCcpWppE64XSPZlkU+3mIe2riG4IyIE75 [email protected]'" -vvv

As you can see from the output, the lookup searches for ~/mw.pub on the dest_nodes despite delegate_to pointing at the jump_nodes, i.e. it looks for ~/mw.pub on dest_node 192.0.0.200 when it should have looked on 11.1.1.220, where the file is present, and then injected the key on 192.0.0.200.

Can you please suggest how I can fix this?


Solution

  • From the documentation on lookup plugins:

    Like all templating, these plugins are evaluated on the Ansible control machine, not on the target/remote.

    So the lookup actually does not run on the target host but on the controller.

    If you need to get a file from a remote host, you will have to fetch it first and then look up the local copy, or slurp its content into a variable, as in the sketch below.
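
    A minimal sketch of the slurp variant, meant to replace the failing lookup task in Play 2 of the question. It assumes ~/mw.pub already exists on the first jump_nodes entry (which the earlier raw task in the question ensures) and that that jump host is reachable; TARGET_USER is the extra variable already passed on the command line.

    - name: Read the public key from the jump host into a variable
      # slurp runs on the delegated host and returns the file content base64-encoded
      slurp:
        src: ~/mw.pub
      register: jump_key
      delegate_to: "{{ groups['jump_nodes'][0] }}"

    - name: Set authorized key from the slurped content
      authorized_key:
        user: "{{ TARGET_USER }}"
        state: present
        # decode the base64 payload returned by slurp back into the plain key text
        key: "{{ jump_key.content | b64decode }}"

    Because slurp returns the file content base64-encoded in its content field, the b64decode filter is needed before handing the key to authorized_key. The fetch alternative works the same way in principle: copy the file from the jump host to the controller first, then let the file lookup read that local copy.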