I am trying to use eval to run a command passed into a function through $@.
Here is my code:
run_command() {
    : some logic not relevant to this question
    eval "$@"
}
I am running it as:
run_command "ls" "|" "wc -l" # works, runs "ls | wc -l"
run_command "ls | wc -l" # works as above
Now, I try to list a file that has a space in it:
> "file with space"
run_command "ls" "-l" "file with space"
This time, I get these errors:
ls: file: No such file or directory
ls: space: No such file or directory
ls: with: No such file or directory
So, it is clear that "$@" is resulting in word splitting. Is there a way to prevent this issue so that the run_command function is immune to whitespace, globs, and any other special characters?
eval joins all of its arguments into a single space-separated string and evaluates that string as code. Thus, eval "ls" "-l" "file with space" is exactly the same as eval ls -l file with space or eval "ls -l file with space".
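One way to see what eval will actually parse is to print the arguments joined the same way eval joins them (a quick sketch, not part of the proposed fix):

printf '%s ' "ls" "-l" "file with space"; echo
# prints: ls -l file with space
# by the time eval parses this string, the quotes around the filename are long gone,
# so "file", "with", and "space" become three separate words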
As given in BashFAQ #50, the following runs its exact argument list as the argument vector of a simple command:
run_command() {
    "$@"
}
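With this definition, the whitespace case from the question works as intended (a minimal usage sketch, using the file created earlier):

run_command ls -l "file with space"   # ls receives "file with space" as a single argument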
This offers a number of guarantees; for example, an argument of >foo will be passed through literally, rather than causing a file named foo to be created.

If you need to wrap a command with a pipeline, this can be done by encapsulating that pipeline in a function:
run_pipeline() { foo "$@" | bar; }
run_command run_pipeline "argument one" "argument two"
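Here foo and bar are placeholders for whatever commands make up your pipeline; the second line above ends up running foo "argument one" "argument two" | bar, with both arguments preserved intact.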
To be clear: I do not advise using this code. By exempting | from the usual protections that best practices provide, it weakens the security those practices offer. However, it does do what you ask.
run_command() {
    local cmd_str='' arg arg_q
    for arg; do
        if [[ $arg = "|" ]]; then
            # a bare | argument becomes pipeline syntax in the generated string
            cmd_str+=" | "
        else
            # anything else is quoted, so whitespace, globs, etc. stay literal
            printf -v arg_q '%q' "$arg"
            cmd_str+=" $arg_q"
        fi
    done
    eval "$cmd_str"
}
In this form, an argument of | causes the generated string to contain a pipeline, split into simple commands at the location of that argument.
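For example, a sketch using the file created in the question (note that, unlike the original invocation, wc and -l must now be separate arguments, since any single quoted argument stays one word):

run_command ls -l "file with space" '|' wc -l
# builds and evals:  ls -l file\ with\ space | wc -l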
Now, why is trying to allow syntax elements to be processed a Bad Idea in this context? Consider the following:
echo '<hello>'
Here, the < and > in the string <hello> have been quoted, and thus no longer act as redirection operators. However, once you've assigned these values to an array or an argument list, as in
args=( 'echo' '<hello>' )
...metadata no longer exists about which characters were quoted or were not. Thus,
echo hello '|' world
becomes entirely indistinguishable from
echo hello | world
even though as separate commands these would have had very different behaviors.
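To make this concrete, here is a sketch of what the pipe-allowing version above does with a | that the caller meant as literal data:

run_command echo hello '|' world
# run_command receives the four words:  echo  hello  |  world
# the quoting that marked | as data is already gone, so the generated string is
#   echo hello | world
# and echo's output is piped to a command named world, rather than "hello | world" being printed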
Consider the following:
run_command rm -rf -- "$tempdir" "$pidfile"
In the "best practices" example, this is guaranteed to treat both the contents of tempdir
and pidfile
as filenames passed to rm
, no matter what those values are.
However, with the "allowing pipes" example, the above could instead invoke rm -rf -- | arbitrary-command-here, should tempdir='|' and pidfile=arbitrary-command-here.
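Spelled out as a sketch, with the hypothetical attacker-controlled values from the scenario above:

tempdir='|'
pidfile='arbitrary-command-here'
run_command rm -rf -- "$tempdir" "$pidfile"
# the pipe-allowing version builds and evals:  rm -rf -- | arbitrary-command-here
# so an attacker-chosen command runs instead of two paths being removed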
As shell variables are initialized from the set of environment variables present, and environment variables are often externally controllable (as demonstrated by the existence of remote exploits for Shellshock), this is not a purely theoretical or idle concern.