I just ran into a bit of confusion today: `"string".indexOf('')` always returns 0, but I would expect -1 (i.e. "not found"); conversely, `"string".lastIndexOf('')` always returns 6.

The `lastIndexOf` result is easier to understand, since "string" is 6 characters long (`"string".length` is 6, so the last zero-indexed position is 5), but I don't see anywhere in the ECMAScript spec (5.1 or 6.0) that describes why `''` would be treated like a word/character boundary.

What, exactly, is going on here?
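For reference, the behavior in question can be reproduced in any JavaScript engine (the `mid` case with an explicit start position is an extra probe, not part of the original observation):

```javascript
// Searching for the empty string never fails:
const s = "string";

const first = s.indexOf('');      // 0: matches immediately at the start
const last  = s.lastIndexOf(''); // 6: matches at the very end (s.length)
const mid   = s.indexOf('', 3);   // 3: matches at whatever start position you give
```

So `''` "matches" at every position from 0 through `s.length` inclusive, and each method simply reports the first such position it considers.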
The spec says:

> Return the smallest possible integer k not smaller than start such that k+searchLen is not greater than len, and for all nonnegative integers j less than searchLen, the character at position k+j of S is the same as the character at position j of searchStr; but if there is no such integer k, then return the value -1.
That condition is fulfilled at position 0 because of vacuous truth: since you are searching for the empty string, any statement about its characters holds for every one of them, because it has no characters. More formally, for any predicate `P`, if `S = ∅`, then `P(x)` holds `∀ x ∈ S`.
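A direct transcription of the spec's search condition makes the vacuous case concrete (a sketch; `specIndexOf` is a made-up helper name, not a standard API): when `searchStr` is empty, `searchLen` is 0, so the inner loop runs zero times and the match condition is immediately true at the first candidate position `k`.

```javascript
// Sketch of the spec's indexOf search, quantifiers spelled out as loops.
function specIndexOf(S, searchStr, start = 0) {
  const len = S.length;
  const searchLen = searchStr.length;
  // Try each k not smaller than start, with k + searchLen not greater than len.
  for (let k = Math.min(Math.max(start, 0), len); k + searchLen <= len; k++) {
    let matches = true;
    // "for all nonnegative integers j less than searchLen" —
    // when searchLen === 0, this loop body never executes,
    // so `matches` stays true: the condition holds vacuously.
    for (let j = 0; j < searchLen; j++) {
      if (S[k + j] !== searchStr[j]) { matches = false; break; }
    }
    if (matches) return k;
  }
  return -1; // no such k
}
```

With `searchStr === ''`, the very first `k` (the clamped start position) is returned, which is why `indexOf('')` yields 0 and `lastIndexOf('')` yields `len`.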