Talking about this method defined in Arrays:
public static int binarySearch(int[] a, int key)
I couldn't understand why it returns (-(insertion point) - 1)
instead of -(insertion point)
when no match is found in the array.
https://docs.oracle.com/javase/7/docs/api/java/util/Arrays.html#binarySearch(int[],%20int)
Could the reason be that Math.abs(-(insertion point) - 1)
can equal the size of the array?
In case you are wondering why I am asking this at all: to find the insertion point I basically have to undo that subtraction.
int returnedVal = Arrays.binarySearch(arr, needle);
if (returnedVal < 0)
    insertionPoint = Math.abs(returnedVal) - 1;
Because 0 is a legitimate insertion point, and -(0) is still 0. If the method simply returned -(insertion point), a return value of 0 would be ambiguous: it could mean "found at index 0" or "not found, insert at index 0". Subtracting 1 keeps every not-found return value strictly negative.
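A minimal sketch of that ambiguity (the array and key values below are just made-up examples):

import java.util.Arrays;

public class InsertionPointDemo {
    public static void main(String[] args) {
        int[] arr = {10, 20, 30};

        // Key present at index 0: binarySearch returns that index, 0.
        System.out.println(Arrays.binarySearch(arr, 10)); // prints 0

        // Key absent, would be inserted at index 0:
        // binarySearch returns -(0) - 1 = -1, not 0.
        int returnedVal = Arrays.binarySearch(arr, 5);
        System.out.println(returnedVal);      // prints -1
        System.out.println(-returnedVal - 1); // prints 0, the insertion point

        // Without the "- 1", both calls would return 0 and be indistinguishable.
    }
}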
Edit: note that the choice of subtracting 1 rather than some other number is not arbitrary. -a - 1
equals ~a
(the bitwise negation of a). That mapping is bijective over the entire int range, for positive and negative values alike, because it just flips the bits: ~Integer.MIN_VALUE == Integer.MAX_VALUE
and ~(-1) == 0
. Also note that ~(~a) == a
.
So you can recover the insertion point simply with ~returnedVal
. Integers in Java always use two's complement, so the equivalence is valid.
Using any subtrahend greater than 1 would underflow (wrap around) the int range for insertion points near Integer.MAX_VALUE. For more information about the ~ operator see Explanation of Bitwise NOT Operator.
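A small sketch of the equivalence, again with made-up array values; the assertions just restate the identities above:

import java.util.Arrays;

public class BitwiseNotDemo {
    public static void main(String[] args) {
        int[] arr = {10, 20, 30};

        int returnedVal = Arrays.binarySearch(arr, 25); // not found
        int insertionPoint = ~returnedVal;              // same as -returnedVal - 1

        System.out.println(returnedVal);    // prints -3, i.e. -(2) - 1
        System.out.println(insertionPoint); // prints 2

        // Identities from the explanation above (run with "java -ea" to enable assertions):
        assert ~returnedVal == -returnedVal - 1;
        assert ~(~returnedVal) == returnedVal;
        assert ~Integer.MIN_VALUE == Integer.MAX_VALUE;
        assert ~(-1) == 0;
    }
}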