I'm having trouble getting bash namerefs to work properly when I'm also piping into my function.
I have a function below which takes a JSON blob and converts it into a bash associative array. Since bash cannot pass associative arrays by value, I wrote the function to receive the name of the return variable, already declared as an associative array. It then sets up a nameref to it and writes into it.
This first example works:
jq_to_aa () {
    # takes a json hash as $2, and the name of a var in $1. Adds all entries from $2 into $1, via jq.
    local -n j2a_result=$1
    input="${@:2}"
    input=$(echo "$input" | jq -rM ". | to_entries | .[] | @text \"\(.key) \(.value)\"")
    while read -r key val; do
        j2a_result["$key"]="$val"
    done < <(echo "${input}")
}
data='{"a": 0.96}'
declare -A assoc=( )
jq_to_aa assoc "${data}"
echo expect 0.96: ${assoc[@]}
$ bash script1.sh
expect 0.96: 0.96
OK, so far so good. But I would like the function to receive the data via a pipe. The obvious change is to call it as echo "${data}" | jq_to_aa assoc, with the function below:
jq_to_aa () {
    # takes a json hash on stdin, and the name of a var in $1. Adds all entries from stdin into $1.
    input=$(jq -rM ". | to_entries | .[] | @text \"\(.key) \(.value)\"")
    local -n j2a_result=$1
    while read -r key val; do
        j2a_result["$key"]="$val"
    done < <(echo "${input}")
}
data='{"a": 0.96}'
declare -A assoc=( )
echo "${data}" | jq_to_aa assoc
echo expect 0.96: ${assoc[@]}
$ bash script2.sh
expect 0.96:
This does not appear to work, and I'm struggling to see why. My suspicion is that piping the value in causes problems, perhaps because it creates a subshell (does it? I don't know) and then the values get separated.
Yes, the pipe is the problem: every command in a pipeline runs in its own subshell, so the whole body of jq_to_aa, including the writes through the nameref, executes in a child process. The entries go into that subshell's copy of assoc, which is discarded when the pipeline finishes, so the parent shell's assoc is never touched. Feed the data to the function through a redirection instead of a pipe, for example a here-string:

jq_to_aa assoc <<<"$data"
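
If you want to see the subshell for yourself, a quick probe (hypothetical, not part of your script) is to compare $BASHPID inside and outside a function; it differs only when the function sits on the right-hand side of a pipe:

probe () {
    # $BASHPID is the PID of the shell process currently executing this code
    echo "inside:  $BASHPID"
}

echo "outside: $BASHPID"
echo hi | probe       # prints a different PID: the function ran in a subshell
probe <<<"hi"         # prints the same PID: a redirection stays in the current shell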
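
If the data really has to arrive through a pipe (for example because it is produced by another command), one alternative, assuming bash 4.2 or later, is the lastpipe option, which runs the last command of a pipeline in the current shell instead of a subshell. It only takes effect when job control is inactive, which is the default in a non-interactive script. A minimal sketch, using your second function unchanged:

shopt -s lastpipe                # run the last pipeline command in this shell (needs job control off)

declare -A assoc=( )
echo "${data}" | jq_to_aa assoc  # jq_to_aa now runs in the current shell, so assoc is populated
echo expect 0.96: ${assoc[@]}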