I have a simple function, decode, that takes two arrays as input; the second array is used to decode the first.
The initial input (i.e., at the top-level call, not when the function recurses) always has the following format:
Example input:
([4],[0,2,6])
For some reason, my code always returns undefined when I try to return the decoded array. In fact, I can't return anything other than undefined, even if I change the return statement to something like "return false". The log statements show that the correct values are being captured for both arrays, which leaves me very confused.
Here's my code:
var decode = function(A, B) {
    console.log("A: " + A + " B: " + B);
    console.log(B.length);
    if (B.length === 0) {
        return A;
    }
    var newA = [];
    var newB = [];
    var act = 0;
    for (let i = 0; i < A.length; i++) {
        newA[act] = A[i] - (B[i] / 2);
        newA[act + 1] = A[i] + (B[i] / 2);
        act += 2;
        newB = B.slice(i + 1);
    }
    decode(newA, newB);
}
console.log("Answer is" + decode([4], [0,2,6]));
This always returns undefined, no matter what I put in the return statement. console.log(A), on the other hand, prints the correct value for what I want to return.
Thank you very much for the help! Greatly appreciated.
The problem is that when B.length !== 0, the function has no return statement at all, and a JavaScript function that finishes without an explicit return implicitly returns undefined. You need to propagate the result of the recursive call back up to the original caller. Change
decode(newA, newB);
to
return decode(newA, newB);
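For completeness, here is a minimal sketch of your function with that one change applied (the debug logging is removed and the comments are mine); the expected output assumes your sample input:

var decode = function(A, B) {
    // Base case: B is exhausted, so A holds the fully decoded array.
    if (B.length === 0) {
        return A;
    }
    var newA = [];
    var newB = [];
    var act = 0;
    for (let i = 0; i < A.length; i++) {
        // Split each element of A into two values offset by half of B[i].
        newA[act] = A[i] - (B[i] / 2);
        newA[act + 1] = A[i] + (B[i] / 2);
        act += 2;
        // Keep the part of B that has not been consumed yet.
        newB = B.slice(i + 1);
    }
    // The fix: return the recursive result so it reaches the original caller.
    return decode(newA, newB);
};

console.log("Answer is " + decode([4], [0, 2, 6])); // logs "Answer is 3,5,1,7"

Each recursive call doubles the length of A and drops the consumed prefix of B, so the recursion bottoms out once B is empty and the decoded array propagates back through the chain of return statements.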