console.log("#1", "a12312a".match(/^\d+/)?.[0].length);
console.log("#2", ("a12312a".match(/^\d+/)?.[0]).length);
I was writing some code and stumbled upon something I don’t understand. In Chrome 89.0.4389.128 (Official Build) (64-bit), the code above gives this:
#1 undefined
Uncaught TypeError: Cannot read property 'length' of undefined
Both lines look the same to me: "a12312a".match(/^\d+/)?.[0] evaluates to undefined, and each then tries to read the property length of that undefined, which should throw a TypeError. But the first line did not, while the second did.
…Why? I'm confused. Am I missing something very basic?
The .match call returns null here, since the pattern doesn't match. So the comparison is really between
null?.[0].length
and
(null?.[0]).length
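A quick check of that first step, using the same string and pattern as above:

```javascript
// The regex is anchored at the start of the string, and "a12312a"
// begins with a letter, so there is nothing for ^\d+ to match.
console.log("a12312a".match(/^\d+/)); // null: no leading digits

// For contrast, a string that does start with digits:
console.log("12312a".match(/^\d+/)?.[0]); // "12312"
```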
This should make the process clearer. A chain of . and ?. accesses evaluates left-to-right; once a ?. finds null or undefined on its left, the rest of the chain is skipped and the whole expression evaluates to undefined.
But if you break the chain by wrapping part of it in parentheses, the parenthesized part evaluates on its own, and what remains is a plain property access:
(undefined).length
without the special short-circuiting of the optional chain.
Optional chaining only functions along a contiguous sequence of property accesses and function calls. Any other operator in between (such as grouping parentheses) will break the chain.