I zipped up an Ansible playbook and a configuration file, pushed the .zip file to S3, and I'm triggering the Ansible playbook from AWS SSM.
I'm getting an AnsibleFileNotFound error:

AnsibleFileNotFound: Could not find or access '/path/to/my_file.txt' on the Ansible Controller.
Here is my playbook:
- name: Copies a configuration file to a machine.
  hosts: 127.0.0.1
  connection: local
  tasks:
    - name: Copy the configuration file.
      copy:
        src: /path/to/my_file.txt
        dest: /etc/my_file.txt
        owner: root
        group: root
        mode: '0644'
      become: true
my_file.txt exists in the .zip file that I uploaded to S3, and I've verified that it's being extracted (via the AWS SSM output). Why wouldn't I be able to copy that file over? What do I need to do to get Ansible to save this file to /etc/ on the target machine?
EDIT: The solution here is a bit horrendous: remote_src must be true. Yeah, it's your file that you've uploaded to S3, but Ansible isn't really smart enough to know that in this context. remote_src: true makes sense because the .zip file is presumably unpacked by AWS SSM to somewhere on the target machine. The problem is that it's unpacked to a random temp directory, so a hardcoded src path isn't found anyway. Using src: "{{ playbook_dir | dirname }}/path/to/my_file.txt" to resolve the path relative to wherever the playbook was extracted solved the problem in this case.
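For reference, here's roughly what the working task looks like after those changes (a sketch; the exact relative path depends on where my_file.txt sits inside your .zip):

- name: Copy the configuration file.
  copy:
    # The .zip is unpacked by SSM into a temp directory on the target,
    # so the file is a "remote" file from Ansible's point of view.
    remote_src: true
    # Resolve the path relative to the extracted playbook's parent
    # directory, since the temp directory name isn't predictable.
    src: "{{ playbook_dir | dirname }}/path/to/my_file.txt"
    dest: /etc/my_file.txt
    owner: root
    group: root
    mode: '0644'
  become: true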
Note that this approach should not be used if configuration files contain secrets, but I'm not sure what approach AWS SSM offers for that type of scenario when you are using it in conjunction with Ansible.
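One option I haven't verified with this SSM setup, so treat it as a sketch: keep secrets out of the .zip entirely and fetch them at runtime from AWS SSM Parameter Store via the amazon.aws.aws_ssm lookup (which requires the amazon.aws collection, boto3/botocore, and IAM permission to read the parameter). The parameter name below is hypothetical:

- name: Write a secret fetched from SSM Parameter Store.
  copy:
    # '/my/app/secret' is a hypothetical SecureString parameter name;
    # the lookup runs on the controller, so the value never lands in the .zip.
    content: "{{ lookup('amazon.aws.aws_ssm', '/my/app/secret') }}"
    dest: /etc/my_secret.txt
    owner: root
    group: root
    mode: '0600'
  become: true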