I have a lot of AWS EC2 instances and I need to execute a Python script on all of them at the same time.
I've been trying to execute the script from my PC by sending the required commands via SSH. For this, I've created another Python script that opens a cmd terminal and then executes some commands (the ones I need to execute the Python script on each instance). Since I need all of these cmd terminals to be opened at the same time, I've used a ThreadPoolExecutor that (with my PC's characteristics) grants me 60 runs in parallel. This is the code:
import os
from concurrent.futures import ThreadPoolExecutor

ipAddressesList = list(open("hosts.txt").read().splitlines())

def functionMain(threadID):
    os.system(r'start cmd ssh -o StrictHostKeyChecking=no -i mysshkey.pem ec2-user@' + ipAddressesList[threadID] + ' "cd scripts && python3.7 script.py"')

functionMainList = list(range(0, len(ipAddressesList)))

with ThreadPoolExecutor() as executor:
    results = executor.map(functionMain, functionMainList)
The problem with this is that the command that executes script.py blocks the terminal until the process ends, so functionMain stays waiting for the result. I would like to find a way so that, after sending the command python3.7 script.py, the function returns but the script keeps executing on the instance, and the pool executor can continue with the remaining threads.
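One workaround is to detach the remote process so ssh returns as soon as the script has been launched. Below is a minimal sketch, assuming the instances have nohup available and reusing the mysshkey.pem key and hosts.txt addresses from the question; it wraps the remote command in nohup ... & and starts ssh with subprocess.Popen, which does not wait for the process to finish:

```python
import subprocess

def build_ssh_command(ip):
    # "nohup ... &" detaches script.py on the remote host, so ssh exits
    # immediately instead of blocking until the script finishes.
    remote = 'cd scripts && nohup python3.7 script.py > /dev/null 2>&1 &'
    return ['ssh', '-o', 'StrictHostKeyChecking=no',
            '-i', 'mysshkey.pem', 'ec2-user@' + ip, remote]

def functionMain(ip):
    # Popen returns as soon as the local ssh process is spawned,
    # so no cmd window is needed and the thread frees up right away.
    subprocess.Popen(build_ssh_command(ip))
```

With this, the thread pool only has to pay the cost of establishing each SSH connection, not of running the script itself.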
The AWS Systems Manager Run Command can be used to run scripts on multiple Amazon EC2 instances (and even on-premises computers if they have the Systems Manager agent installed).
Run Command can also return the results of the commands run on each instance.
This is definitely preferable to connecting to the instances via SSH to run commands.
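As a sketch of what this looks like with boto3 (assuming credentials are configured and the instance IDs are known; the ID below is a placeholder), a single send_command call fans the script out to every instance:

```python
def build_run_command_request(instance_ids):
    # AWS-RunShellScript is the managed SSM document for shell commands.
    return {
        'InstanceIds': instance_ids,
        'DocumentName': 'AWS-RunShellScript',
        'Parameters': {'commands': ['cd scripts && python3.7 script.py']},
    }

def run_on_instances(instance_ids):
    import boto3  # imported here so the builder above works without boto3 installed
    ssm = boto3.client('ssm')
    # One API call runs the command on all instances in parallel; per-instance
    # output can be fetched later with ssm.get_command_invocation().
    response = ssm.send_command(**build_run_command_request(instance_ids))
    return response['Command']['CommandId']
```

The call is asynchronous: it returns a CommandId immediately, and the SSM agent on each instance executes the script independently, which sidesteps the blocking problem entirely.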