I would like to export the `stdout` and `stderr` of a Bash command to the same text file. The Bash command is a single-line command that calls `python3`, followed by the name of the module and function, followed by three arguments (each passed with `--`). It runs on an HPC cluster as part of a Slurm job:
```
python3 -m module.function --Arg1 Val1 --Arg2 Val2 --Arg3 Val3
```
I tried the following, but it failed: `2>` and `>` are treated as extra values for the last argument instead of as redirections.

```
python3 -m module.function --Arg1 Val1 --Arg2 Val2 --Arg3 Val3 2> Output.txt
python3 -m module.function --Arg1 Val1 --Arg2 Val2 --Arg3 Val3 > Output.txt
```
How can I export the output to a file without making large changes to the syntax? (I need to keep using a single-line command to call the function from the module.)
Thanks
The problem does not lie in the Bash syntax but in the way you run the command:
> Then, I use the following bash command to run each line of these commands as a separate job in parallel: `$(head -n $SLURM_ARRAY_TASK_ID $File | tail -n 1)` [submitted as a Slurm array job]
This indeed `exec`s the command, which is interpreted as-is, without any involvement of Bash; the redirection operators are therefore passed to the program as literal arguments.
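You can reproduce the same effect in a plain shell; a minimal illustration (the command string here is made up):

```
cmd='echo hello > out.txt'
$cmd
# prints: hello > out.txt
# no out.txt is created: '>' and 'out.txt' are just arguments to echo,
# because the expansion result is word-split but never re-parsed for redirections
```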
Try this instead:
```
head -n $SLURM_ARRAY_TASK_ID $File | tail -n 1 | bash
```
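For context, a minimal array submission script using this fix could look like the following sketch (the array range and the `commands.txt` name are assumptions):

```
#!/bin/bash
#SBATCH --array=1-10        # one task per line of the commands file (range is an assumption)

File=commands.txt           # hypothetical file holding one command per line

# Select the line matching this array task and hand it to Bash,
# so redirections like '> Output.txt' are interpreted as redirections.
head -n "$SLURM_ARRAY_TASK_ID" "$File" | tail -n 1 | bash
```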
Now beware that with `> Output.txt` for every line in `$File`, each job will overwrite the `Output.txt` file, leaving you with only the output of the last job in the array. Using `>> Output.txt` will not overwrite, but the result with parallel jobs is not guaranteed. Best is to give each job its own output file, as sketched below.
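Since the lines are now interpreted by Bash, one way to get a distinct file per job is to put the redirection in each line of the commands file; a sketch (the output file name is an assumption):

```
# a line in the commands file; Bash expands $SLURM_ARRAY_TASK_ID at run time,
# so each array task writes to its own file
python3 -m module.function --Arg1 Val1 --Arg2 Val2 --Arg3 Val3 >& "Output_${SLURM_ARRAY_TASK_ID}.txt"
```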
And if you want both `stderr` and `stdout` to be merged, you need `>& Output.txt`.
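In Bash, `>&` is shorthand for redirecting both streams at once; the second form below is the POSIX-portable equivalent:

```
python3 -m module.function --Arg1 Val1 --Arg2 Val2 --Arg3 Val3 >& Output.txt
python3 -m module.function --Arg1 Val1 --Arg2 Val2 --Arg3 Val3 > Output.txt 2>&1
```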
But if that is the only line in the submission script, you might prefer simply using the Slurm `--output="slurm-%A_%a.out"` parameter. If you need everything in a single file afterwards, use the `cat` command.
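For instance, once all tasks of the array have finished (the job ID 12345 is a placeholder):

```
# concatenate the per-task logs into a single file
cat slurm-12345_*.out > Output.txt
```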