I have a Python script which takes one input at a time and processes it. I want to run the script in parallel with different inputs at the same time, say 50 or 100 instances, each with a different input fed from a text file.
I execute the script like this:
python script.py -d url1
then
python script.py -d url2
then
python script.py -d url3
Instead of passing one argument at a time, I want to feed those URLs from a text file and process them in parallel.
I tried running this from a bash shell using GNU Parallel, but the bash script does not launch the Python interpreter and so it errors out. The code is as follows:
#!/usr/bin/env bash
doit() {
    host="$1"
    ~/script/python script1.py -d $host
}
export -f doit
cat "$1" | parallel -j50 -k doit
Contents of the text file:
url1.com
url2.com
url3.com
--------
url1000.com
url_any.com
With GNU Parallel, like this:
parallel --dry-run -a arguments.txt python script.py
which assumes your arguments are one per line in "arguments.txt".
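The same pattern does the real run once --dry-run is removed. Since script.py expects each URL after a -d flag, and the original attempt ran 50 jobs at a time, a minimal sketch (flag values taken from the question) is:

parallel -j50 -a arguments.txt python script.py -d

GNU Parallel appends each input line as the last argument of the command, so this runs python script.py -d url1.com, python script.py -d url2.com, and so on, with at most 50 jobs running at once.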
Use parallel -k ... to keep outputs in order, if required.
Use parallel --bar ... to get a progress bar.
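The function-based attempt from the question can also be made to work. The line ~/script/python script1.py -d $host asks the shell to execute a program named python inside the ~/script directory, rather than running the Python interpreter on the script, which is the likely cause of the error. A corrected sketch, assuming the script actually lives at ~/script/script1.py (adjust the path to your layout):

#!/usr/bin/env bash
doit() {
    host="$1"
    # run the interpreter on the script, quoting the URL
    python ~/script/script1.py -d "$host"
}
export -f doit
# one URL per line from the file given as the first argument,
# 50 jobs at a time, output kept in input order
cat "$1" | parallel -j50 -k doit

Saved as, say, run.sh (the name is arbitrary), it would be invoked as bash run.sh urls.txt.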