I am trying to write a script built around an existing script provided to me by a different team. I'm trying to drive it with subprocess.
Due to how the provided script is constructed, it is meant to be run from the command line, with arguments specifying what you want to collect. The CLI command looks like this:
./applicationAPI.py -target_ip -username -password metric1 metric2 metric3 -u 60
The -u 60 argument sets the gather interval: collect metrics every 60 seconds.
So in my script, I'm executing that through subprocess, like so:
import subprocess
import csv
import json
from io import StringIO

api_call = subprocess.run(
    ['./applicationAPI.py', '-target_ip', '-username', '-password',
     'metric1', 'metric2', 'metric3', '-u', '60'],
    capture_output=True, text=True, check=True)

# Parse the captured stdout as CSV.
api_response = api_call.stdout
f = StringIO(api_response)
reader = csv.reader(f)
for line in reader:
    print(line)
I tried to do it this way, but the cursor just hangs after execution until I hit ^C.
I think that because the applicationAPI.py process keeps actively running over its established TCP connection, streaming data every -u interval, it never 'finishes' executing, which is what subprocess.run (and therefore api_call.stdout) is waiting on.
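One way to sanity-check that theory (a sketch I put together, reusing the same placeholder arguments as above; not part of the provided script): give subprocess.run a timeout, which raises subprocess.TimeoutExpired instead of hanging if the child never exits.

import subprocess

try:
    subprocess.run(['./applicationAPI.py', '-target_ip', '-username', '-password',
                    'metric1', 'metric2', 'metric3', '-u', '60'],
                   capture_output=True, text=True, check=True, timeout=10)
except subprocess.TimeoutExpired:
    # Still streaming after 10 seconds: the process never 'finishes'.
    print("applicationAPI.py did not exit; it streams continuously")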
That said, if I run the provided script from the CLI normally, with the proper args, it does print the application metrics to the terminal as CSV.
Normally I would just use that output directly, but I'm building this script to parse all the CSV results into JSON for a different use case.
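The CSV-to-JSON step itself is the easy part once the rows are in hand; here's a rough sketch of what I mean, assuming the first row is a header (the sample fields are made up):

import csv
import json
from io import StringIO

def csv_text_to_json(csv_text):
    # Assumes the first CSV row is a header naming each column.
    reader = csv.reader(StringIO(csv_text))
    header = next(reader)
    return json.dumps([dict(zip(header, row)) for row in reader])

# Made-up example input:
print(csv_text_to_json("metric,value\ncpu_load,0.42\n"))
# -> [{"metric": "cpu_load", "value": "0.42"}]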
So I guess, TL;DR: how can subprocess handle streamed data within a Python script? Still looking for a solution (Popen?), but no luck yet.
Any help is appreciated.
I figured it out. The script that was given to me, applicationAPI.py, already formats the data as CSV. So by using subprocess.Popen with stdout=subprocess.PIPE, I can iterate through the rows with my own CSV reader as they arrive.
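For anyone who lands here later, here's a minimal sketch of that approach, using the same placeholder arguments as above; the JSON part assumes the first streamed CSV row is a header:

import csv
import json
import subprocess

# Start the long-running collector without waiting for it to exit.
proc = subprocess.Popen(['./applicationAPI.py', '-target_ip', '-username', '-password',
                         'metric1', 'metric2', 'metric3', '-u', '60'],
                        stdout=subprocess.PIPE, text=True)

# proc.stdout is a file-like object, so csv.reader consumes rows
# as the child streams them, one -u interval at a time.
reader = csv.reader(proc.stdout)
header = next(reader)  # assumes the first streamed row is a header

for row in reader:
    print(json.dumps(dict(zip(header, row))))  # one JSON object per row

One thing to keep in mind: since the child never exits on its own, call proc.terminate() (for example in a finally block) when you're done consuming rows.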