I'm writing a script that truncates tables. The problem is that the script goes into an infinite loop.
The ./n_input file contains environment variables that are used in the script.
#!/bin/ksh
INPUT_FILE=./n_input
if [ ! -e $INPUT_FILE ];
then
echo "Error: Input file require"
exit 1
fi
source $INPUT_FILE
while read Table
do
TABLE="$DSTN_DATABASE.dbo.$Table"
echo "Table : $TABLE"
QUERY="truncate table $TABLE"
echo $QUERY > ./tmp_file
sqlcmd -m 1 -U $DSTN_USER -P $DSTN_PASSWORD -D -S $DSTN_SERVER -m1 -i ./tmp_file
RET_VALUE=$?
if [ $RET_VALUE -ne 0 ]
then
echo "Error $TABLE"
fi
done < $TABLE_LIST
exit 0
How do I break the loop? I tried removing sqlcmd from the script and verified that the loop then terminates as expected. I observed the same behavior with the sqlcmd -Q option.
The $TABLE_LIST file contains only one table name.
You don't need to write the query into a temporary file. Use the -q or -Q option instead:
q="truncate table ${TABLE};"
sqlcmd -m 1 -U "$DSTN_USER" -P "$DSTN_PASSWORD" -S "$DSTN_SERVER" -q "$q"
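For context, the whole loop could then look like this; a minimal sketch that keeps the variables from your script and assumes they are set by ./n_input:

while read Table
do
    TABLE="$DSTN_DATABASE.dbo.$Table"
    echo "Table : $TABLE"
    # -Q runs the query and exits sqlcmd, so no temporary file is needed
    sqlcmd -m 1 -U "$DSTN_USER" -P "$DSTN_PASSWORD" -S "$DSTN_SERVER" -Q "truncate table ${TABLE};"
    if [ $? -ne 0 ]
    then
        echo "Error $TABLE"
    fi
done < "$TABLE_LIST"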
Note the ; at the end of the query. That is probably the reason why the script "stalls", which may look like an infinitely running loop.
Also note the use of double quotes. You should wrap variables in double quotes to prevent reinterpretation of the special characters.
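For example, here is a quick way to see what unquoted expansion does to a value containing a space (the password value is hypothetical):

DSTN_PASSWORD='secret word'
printf '<%s> ' $DSTN_PASSWORD; echo      # unquoted: <secret> <word>  -- two arguments
printf '<%s> ' "$DSTN_PASSWORD"; echo    # quoted:   <secret word>    -- one argument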
By the way, you can locate the exact command that is causing the issue by adding set -x at the beginning of the script. set -x turns on debugging mode. With debugging mode on, you see the commands being executed.
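For instance (the + trace lines below are illustrative of what ksh prints to stderr):

#!/bin/ksh
set -x                      # trace: print each command before executing it
INPUT_FILE=./n_input
source $INPUT_FILE
# The shell now echoes every command as it runs, e.g.:
# + INPUT_FILE=./n_input
# + source ./n_input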
It's very unlikely that the content of the $TABLE_LIST file is causing such behavior, unless the file is enormously big. The loop construct is correct, and the number of iterations should match the number of lines in the file.
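You can verify that with a self-contained test; a minimal sketch using a throwaway list file:

printf 'one\ntwo\nthree\n' > ./tmp_list
while read Table
do
    echo "Table : $Table"
done < ./tmp_list           # prints exactly three lines, one per line of input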