I'm curious why I get two different results in my loop condition with arr.length-- versus arr.length - 1. I would have thought these functioned the same. For example:
const numbers = [10, 20, 40, 30, 50]

function solution(numbers) {
  for (let i = 0; i < numbers.length - 1; i++) {
    if (numbers[i] < numbers[i + 1]) {
      console.log('TRUE: ' + numbers[i] + ' is increasing to ' + numbers[i + 1])
    } else {
      return console.log('FALSE: ' + numbers[i] + ' is decreasing to ' + numbers[i + 1])
    }
  }
}

solution(numbers)
logs out as:
"TRUE: 10 is increasing to 20"
"TRUE: 20 is increasing to 40"
"FALSE: 40 is decreasing to 30"
but when I swap in a decrement:
for (let i = 0; i < numbers.length--; i++)
the else branch logs undefined values:
"TRUE: 10 is increasing to 20"
"TRUE: 20 is increasing to 40"
"FALSE: undefined is decreasing to undefined"
Try this inside the loop:

console.log(numbers)

You'll see that on every iteration, the numbers array loses its last value. In JavaScript, array lengths are mutable: you can essentially delete every element of an array by setting array.length = 0.
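Here is a minimal sketch of that shrinking; the console.log inside the body is added purely for illustration:

const numbers = [10, 20, 40, 30, 50]
for (let i = 0; i < numbers.length--; i++) {
  // by the time the body runs, the condition has already truncated the array
  console.log(numbers)
}
// logs [10, 20, 40, 30], then [10, 20, 40], then [10, 20]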
The -- is the decrement operator. Unlike numbers.length - 1, which merely computes a new number, numbers.length-- also assigns numbers.length - 1 back to numbers.length. (As a postfix operator, the expression itself still evaluates to the old length.)
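A quick sketch of that assignment side effect:

const arr = [1, 2, 3]
console.log(arr.length--) // 3: the postfix expression evaluates to the old length
console.log(arr.length)   // 2: but the assignment has already happened
console.log(arr)          // [ 1, 2 ]: the last element is gone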
To control the for-loop correctly, always use i < numbers.length - 1; otherwise, every evaluation of the loop condition deletes the last element of the array.
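If you also want to avoid re-reading the length on every iteration, one common alternative (the variable name stop is just illustrative) is to cache it once before the loop:

function solution(numbers) {
  const stop = numbers.length - 1 // computed once; never mutates the array
  for (let i = 0; i < stop; i++) {
    if (numbers[i] < numbers[i + 1]) {
      console.log('TRUE: ' + numbers[i] + ' is increasing to ' + numbers[i + 1])
    } else {
      return console.log('FALSE: ' + numbers[i] + ' is decreasing to ' + numbers[i + 1])
    }
  }
}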