I want to process files listed in a text file that contains single-quoted file names, like
'new'$'\n''line'
'tab'$'\t''ulator'
Copy & paste for manually processing these files works fine:
test -f 'tab'$'\t''ulator'
Now, reading from the file with the bash read builtin,
while IFS="" read -r myfile; do
    line=$myfile
    ...
done < text.txt
gives strings containing escaped single quotes, like
'\''new'\''$'\''\n'\'''\''line'\'''
'\''tab'\''$'\''\t'\'''\''ulator'\'''
However, processing these file names in a bash script does not work:
test -f "$myfile"
test -f ${myfile}
How can I undo the single-quote escaping and process the raw file names within bash?
eval
Many people quite reasonably regard eval as a misspelling of "evil". So I would regard this solution as a last choice, to be used only if all else fails.
Let's take this sample file:
$ cat badformat
'new'$'\n''line'
'tab'$'\t''ulator'
We can read and interpret these file names as in the following example:
while read -r f; do
    eval "f=$f"        # interprets the quoting; also executes any code embedded in the line
    [ -f "$f" ] || echo "file not found"
done <badformat
Apart from the path separator /, the only character that cannot appear in a Unix file name is NUL (hex 00). Consequently, many Unix tools are designed to handle NUL-separated lists.
Thus, when creating the file, replace:
stat -c %N * >badformat
with:
printf '%s\0' * >safeformat
This latter file can be read into shell scripts via a while-read loop. For example:
while IFS= read -r -d $'\0' f; do
    [ -f "$f" ] || echo "file not found"
done <safeformat
In addition to shell while-read loops, note that grep, find, sort, xargs, as well as GNU sed and GNU awk, all have native support for NUL-separated lists (for example grep -z, find -print0, sort -z, xargs -0, sed -z, and gawk with RS='\0'). Thus, the NUL-separated list approach is both safe and well-supported.
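As a sketch of that tool support (assuming GNU coreutils/findutils for the -z, -0 and -print0 options), the same NUL-separated list can be produced, counted and consumed without ever splitting on the awkward characters. The scratch directory and file names here are illustrative only:

```shell
# Create two awkward file names in a scratch directory (illustration only).
cd "$(mktemp -d)"
touch $'new\nline' $'tab\tulator'

printf '%s\0' * > safeformat       # NUL-separated list, as above

grep -cz . safeformat              # counts entries, not text lines: prints 2

# sort -z and xargs -0 pass the names through intact:
sort -z safeformat | xargs -0 -I{} test -f {} && echo "all files present"

# find can emit the same NUL-separated format directly:
find . -maxdepth 1 -type f ! -name safeformat -print0 | grep -cz .
```

Note that grep -cz counts NUL-terminated records, so the file name with the embedded newline is correctly counted as one entry.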