
Use I/O redirection between two scripts without waiting for the first to finish


I have two scripts, let's say long.sh and simple.sh: one is very time consuming, the other is very simple. The output of the first script should be used as the input of the second one.

As an example, long.sh could look like this:

#!/bin/sh
for line in `cat LONGIFLE.dat` do;
    # read line;
    # do some complicated processing (time consuming);
    echo $line
done;

And the simple one is:

#!/bin/sh
while read a; do
    # simple processing;
    echo $a + "other stuff"
done;

I want to pipe the two scripts together like this:

sh long.sh | sh simple.sh

Using a pipeline, simple.sh has to wait for the end of the long script before it can start.

I would like to know whether, in the bash shell, it is possible to see the output of simple.sh line by line, so that I can see at runtime which line is being processed at any moment.

I would prefer not to merge the two scripts together, nor to call the simple.sh inside long.sh. Thank you very much.


Solution

  • stdout is normally buffered. You want line-buffered. Try

    stdbuf -oL sh long.sh | sh simple.sh
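To see the effect, here is a minimal sketch, assuming GNU coreutils `stdbuf` is available; the inline producer and consumer are stand-ins for long.sh and simple.sh:

```shell
# The producer emits one line every 0.2 s. With -oL each line is flushed
# immediately instead of sitting in a larger block buffer until the pipe
# fills or the producer exits.
stdbuf -oL sh -c 'for i in 1 2 3; do echo "line $i"; sleep 0.2; done' |
while IFS= read -r l; do
    echo "got: $l"    # appears as soon as the producer prints it
done
```

Without `stdbuf -oL`, the same pipeline would typically print nothing until the producer finished.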
    

    Note that this loop

    for line in `cat LONGIFLE.dat`; do   # see where I put the semi-colon?
    

    reads words from the file. If you only have one word per line, you're OK. Otherwise, to read by lines, use while IFS= read -r line; do ...; done < LONGFILE.dat
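Put together, a line-oriented version of long.sh might look like this (keeping the LONGIFLE.dat filename from the question):

```shell
#!/bin/sh
# Read the file one line at a time: IFS= preserves leading/trailing
# whitespace, and -r stops backslashes from being treated as escapes.
while IFS= read -r line; do
    # time-consuming processing would go here
    echo "$line"
done < LONGIFLE.dat
```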

    Always quote your variables (echo "$line") unless you know specifically when not to.
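A quick illustration of why the quotes matter:

```shell
line='several   spaced    words'
echo $line     # unquoted: the shell word-splits, echo re-joins -> several spaced words
echo "$line"   # quoted: the value passes through intact
```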