Tags: bash, shell, gnu-parallel

Bash: sending multiple curl requests using GNU parallel


I have a list of URLs (5000+) and I need to send 25 requests in parallel to an internal service. I know how to send a single request:

curl -s http://192.168.150.113:9999/app.boxx.com 

And I tried using GNU parallel,

while true;do parallel -j25 curl -s http://192.168.150.101:9999/'{}' < list;done

Is it good to use GNU parallel? It works, but the responses feel quite slow, about the same as sending a single API request at a time.

Instead, can we put an ampersand (&) at the end of each curl command and send the requests in parallel?
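
Something like this rough sketch, assuming bash: start each curl in the background with & and wait after every batch of 25:

count=0
while read -r path; do
    curl -s "http://192.168.150.113:9999/$path" &
    # after every 25 background requests, wait for the batch to finish
    (( ++count % 25 == 0 )) && wait
done < list
wait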


Solution

  • I'm not sure you are using GNU parallel to its full potential. For it to work well, you need to define a small job (the smallest unit of work you can break the task down into) and let parallel run it as many times as needed.

    Assuming the part http://192.168.150.113:9999/ is a fixed string and the rest of each URL comes from a file, define a function that fetches one URL:

    oneShot() {
        # $1 is the part of the URL read from the list file
        url="http://192.168.150.113:9999/"
        finalURL="$url$1"
        curl -s "$finalURL"
    }
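
    To sanity-check it, call it once with the path from the question:

    oneShot app.boxx.com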
    

    Export the function so it is available in the child shells that parallel spawns:

    export -f oneShot
    

    Now run 25 jobs in parallel:

    parallel -j25 oneShot < list
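
    Putting it all together, a complete script might look like the sketch below (the base URL and the list filename are carried over from the question; --joblog is optional and records the runtime and exit status of every request, which makes it easy to confirm that 25 jobs really run at a time):

        #!/usr/bin/env bash
        # send every entry in "list" to the internal service, 25 requests at a time

        oneShot() {
            url="http://192.168.150.113:9999/"
            finalURL="$url$1"
            curl -s "$finalURL"
        }
        export -f oneShot

        # --joblog writes one line per job with its runtime and exit code
        parallel -j25 --joblog joblog.txt oneShot < list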