I've got the following issue: I am parsing a file in a bash script and preparing output in a format like this:
echo "Evaluation of data from $date" > "$outPut"
printf '\n%s\t\t\t%s\t\t%s\n' "column1" "column2" "column3" >> "$outPut"
printf '%*s\n' "${COLUMNS:-$(tput cols)}" '' | tr ' ' - >> "$outPut"
while IFS= read -r i
do
...
printf "..."
done < "$file" >> "$outPut"
It works fine when $outPut is a file, but depending on a program parameter $outPut can be /dev/stdout.
If I pipe the output to less
bash someprogram.bash --tostdout | less
less starts immediately with no output. After a while I see everything, but if I press G to jump to the end, less just stops responding and I can only kill it with Ctrl-C. Without the pipe, the output looks fine.
What I want is for the script to collect all its output and write everything at once, so that the pipe waits for my program to finish.
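One way to get the "write everything at once" behavior without a temp file is to capture the whole report in a variable and print it in a single step. This is just a sketch with placeholder data (the date and columns here are not from my real script):

```shell
# Sketch: build the whole report in a shell variable, then emit it once.
# "2024-01-01" and the column names are placeholders.
report=$(
    printf 'Evaluation of data from %s\n\n' "2024-01-01"
    printf '%s\t%s\t%s\n' "column1" "column2" "column3"
)
# A single write: a pager on the other end of a pipe receives it in one chunk.
printf '%s\n' "$report"
```

Note that command substitution strips trailing newlines, so the final `printf` adds one back.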
I found a workaround: write to a temporary file first, then cat it to stdout afterwards.
tmpFile=$(mktemp -p /tmp)
echo "Evaluation of data from $date" > "$tmpFile"
printf '\n%s\t\t\t%s\t\t%s\n' "column1" "column2" "column3" >> "$tmpFile"
printf '%*s\n' "${COLUMNS:-$(tput cols)}" '' | tr ' ' - >> "$tmpFile"
while IFS= read -r i
do
...
printf "..."
done < "$file" >> "$tmpFile"
cat "$tmpFile"
rm "$tmpFile"
The advantage is that less receives the full output of the script at once instead of chunk after chunk.
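The temp-file workaround can be hardened a bit: a trap removes the file even if the script is interrupted, and a command group redirects all the writes in one place. Again a sketch with placeholder content, not my actual parsing loop:

```shell
# Variant of the temp-file workaround with automatic cleanup.
tmpFile=$(mktemp)
trap 'rm -f "$tmpFile"' EXIT    # removed even on Ctrl-C or early exit

{
    echo "Evaluation of data from 2024-01-01"            # placeholder date
    printf '%s\t%s\t%s\n' "column1" "column2" "column3"
} > "$tmpFile"                   # one redirection for the whole group

cat "$tmpFile"                   # the pager gets the finished report at once
```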