I am looking at JavaScript's number type system. I'm using Chrome. When I evaluate `15--` on a number literal I get a ReferenceError, since it makes no sense to decrement a constant. When I evaluate `var x = 10; x--;`, everything works as expected. Likewise, `var a = Infinity; a--` evaluates to `Infinity`. This all makes sense and is in accordance with the JavaScript language spec.
However, to my surprise, `Infinity--` and `Infinity++` evaluate to `Infinity`, unlike other literals. The same happens for `Number.POSITIVE_INFINITY`, which is the same value.
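
For reference, here is roughly what I'm seeing in the console (each line evaluated separately, in non-strict mode; the exact error wording may differ between Chrome versions):

```js
// 15--;           // throws: a number literal is not a valid target for --

var x = 10;
x--;               // returns 10, x becomes 9 -- works as expected

var a = Infinity;
a--;               // Infinity: Infinity - 1 is still Infinity

Infinity--;        // Infinity, not an error
Infinity++;        // Infinity as well
Number.POSITIVE_INFINITY === Infinity;  // true -- same value
```
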
tl;dr: Why does `Infinity--` yield `Infinity` as a result, when `15--` and `(new Number(15))--` yield a ReferenceError?
`Infinity` as used in your example is not actually a value but refers to the `Infinity` property of the global object:
15.1 The Global Object
[...]
15.1.1 Value Properties of the Global Object
[...]
15.1.1.2 Infinity
The value of `Infinity` is +∞ (see 8.5). This property has the attributes { [[Writable]]: false, [[Enumerable]]: false, [[Configurable]]: false }.
So, `Infinity--` is the same as `window.Infinity--`, which is perfectly valid.
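
A minimal sketch of how that plays out, assuming a browser in non-strict mode where the global object is `window`:

```js
// Infinity is just a (non-writable) property of the global object.
Infinity === window.Infinity;   // true

Object.getOwnPropertyDescriptor(window, 'Infinity');
// { value: Infinity, writable: false, enumerable: false, configurable: false }

// Postfix -- needs a reference as its operand; window.Infinity is one, so the
// expression is valid. The write to the non-writable property silently fails
// (in non-strict mode), and Infinity - 1 is still Infinity anyway.
Infinity--;                     // Infinity

// A literal like 15 is a plain value, not a reference, so 15-- is rejected.
```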