I know there are some issues with passing more complicated data structures, such as a list of lists, to a Python script via the CLI. I was wondering whether running a Python script from Node code has any of the same issues.
Basically, say I have the following code in a Node app:
const spawn = require("child_process").spawn;
const pythonProcess = spawn('python',["path/to/script.py", arg1, arg2, arg3]);
(The code above is taken from another question.)
Suppose arg1 and arg2 are lists of lists in the Node app, and arg3 is a double.
The corresponding code in my script.py file, which is meant to receive these arguments and parse them into variables, looks like this:
import sys

if __name__ == '__main__':
    oc = sys.argv[1]
    nc = sys.argv[2]
    r = sys.argv[3]
Will oc and nc here be lists of lists in Python? Or does something else need to be done to get this working?
No: every entry in sys.argv is a plain string, so the lists will not arrive as Python lists on their own. The easiest way to pass complex structures is to serialize them first into some common data format, such as JSON:
const { spawn } = require("child_process");

const myList = ["foo", "bar", "baz"];
// The whole list travels as a single string argument
const python = spawn('python', ["script.py", JSON.stringify(myList)]);
And deserialize it on the callee side:
import sys, json

if __name__ == '__main__':
    my_list = json.loads(sys.argv[1])
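Applied to the question's arg1, arg2, and arg3, a minimal sketch (the sample values and the script path are assumptions) could look like this on the Node side:

const { spawn } = require("child_process");

const arg1 = [[1, 2], [3, 4]];  // list of lists
const arg2 = [[5, 6], [7, 8]];  // list of lists
const arg3 = 0.5;               // double

// Each complex argument becomes one JSON string on the command line
const pythonProcess = spawn('python', [
    "path/to/script.py",
    JSON.stringify(arg1),
    JSON.stringify(arg2),
    String(arg3),
]);

And in script.py:

import sys, json

if __name__ == '__main__':
    oc = json.loads(sys.argv[1])  # a list of lists again
    nc = json.loads(sys.argv[2])  # a list of lists again
    r = float(sys.argv[3])        # a plain float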
But instead of passing serialized params as command-line arguments, it is better to use the stdout and stdin streams to exchange data larger than a few hundred bytes:
const { spawn } = require("child_process");

const python = spawn('python', ["script.py"]);

// Collect the child's stdout until the stream ends
const buffers = [];
python.stdout.on('data', (chunk) => buffers.push(chunk));
python.stdout.on('end', () => {
    const result = JSON.parse(Buffer.concat(buffers));
    console.log('Python process exited, result:', result);
});

// Send the serialized payload to the child's stdin
python.stdin.write(JSON.stringify(["foo", "bar", "baz"]));
python.stdin.end();
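In practice you may also want to surface failures from the child process; a small optional addition (not part of the original snippet) is to forward stderr and check the exit code:

// Forward the child's stderr and report non-zero exit codes
python.stderr.on('data', (chunk) => process.stderr.write(chunk));
python.on('close', (code) => {
    if (code !== 0) {
        console.error(`script.py exited with code ${code}`);
    }
});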
On the Python side, accept the payload from sys.stdin via json.load, which takes a stream instead of a string:
import sys, json

if __name__ == '__main__':
    my_list = json.load(sys.stdin)  # read and parse everything from stdin
    json.dump(my_list, sys.stdout)  # echo it back as JSON on stdout
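To send the question's two lists of lists and the double over the same stream, one option (a sketch; the key names are assumptions) is to bundle everything into a single JSON object. On the Node side, reusing the python process from above:

python.stdin.write(JSON.stringify({ oc: arg1, nc: arg2, r: arg3 }));
python.stdin.end();

And unpack it in script.py:

import sys, json

if __name__ == '__main__':
    payload = json.load(sys.stdin)
    oc = payload['oc']  # list of lists
    nc = payload['nc']  # list of lists
    r = payload['r']    # float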