Tags: java, performance, equals, instanceof, microbenchmark

What does Java do with my "equals" implementations here?


Today, I stumbled over the following:

Consider two classes, NewClass and NewClass1, which have the following equals methods:

NewClass:

@Override
public boolean equals(Object obj) {
    return false;
}

public boolean equals(NewClass obj) {
    return value == obj.getValue();
}

NewClass1:

@Override
public boolean equals(Object obj) {
    if(!(obj instanceof NewClass1)) {
        return false;
    }
    return equals((NewClass1) obj);
}

public boolean equals(NewClass1 obj) {
    return value == obj.getValue();
}

What I find weird is that the equals in NewClass1 is dramatically slower than the one in NewClass (for 10,000,000 calls, 14 ms against 3000 ms). At first, I thought this was related to the "instanceof" check, but if I replace "return equals((NewClass1) obj);" with "return false;" in NewClass1, it suddenly runs more or less equally fast. I don't really understand what is happening here, because in my opinion the delegating return statement in equals(Object) should never actually be reached. What am I getting wrong here?
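
For reference, the modified NewClass1.equals(Object) I am describing (the variant that suddenly runs fast) looks like this:

@Override
public boolean equals(Object obj) {
    if(!(obj instanceof NewClass1)) {
        return false;
    }
    return false; // delegating call replaced just for this test
}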

The following is my "benchmarking code", in case I made some mistake there:

public static void main(String[] args) {
    // TODO code application logic here

    NewClass i1 = new NewClass(1);
    NewClass i2 = new NewClass(1);
    NewClass i3 = new NewClass(5);

    NewClass1 j1 = new NewClass1(1);
    NewClass1 j2 = new NewClass1(1);
    NewClass1 j3 = new NewClass1(5);

    Object o1 = new Object();
    Object o2 = new Object();


    assert(i1.equals(i1));
    assert(i1.equals(i2));
    assert(i1.equals(i3) == false);
    assert(i1.equals(o1) == false);

    assert(j1.equals(j1));
    assert(j1.equals(j2));
    assert(j1.equals(j3) == false);
    assert(j1.equals(o1) == false);


    long start = System.currentTimeMillis();

    for(int i=0; i<1000000000; i++) {
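        // i1.equals(i1) and i1.equals(i2) bind to the equals(NewClass) overload
        // at compile time (the arguments' static type is NewClass); i1.equals(o1)
        // and i1.equals(o2) bind to the overriding equals(Object).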
        i1.equals(i1);
        i1.equals(i2);
        i1.equals(o1);
        i1.equals(o2);
    }

    long end = System.currentTimeMillis();

    System.out.println("Execution time was "+(end-start)+" ms.");



    start = System.currentTimeMillis();

    for(int i=0; i<1000000000; i++) {
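        // Same pattern for NewClass1: j1.equals(j1) and j1.equals(j2) bind to the
        // equals(NewClass1) overload, while j1.equals(o1) and j1.equals(o2) go
        // through the overridden equals(Object) and its instanceof check.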
        j1.equals(j1);
        j1.equals(j2);
        j1.equals(o1);
        j1.equals(o2);
    }

    end = System.currentTimeMillis();

    System.out.println("Execution time was "+(end-start)+" ms.");
}

Solution

  • I would guess that it is the instanceof test that is consuming the time. When you change the final return in that method to always return false, the compiler probably eliminates the conditional, since the result will be the same (return false) regardless of its evaluation. This would also explain why changing the final return has any effect at all, since, as you say, it should never actually be reached on the executed code path.

    To put it more generally, a code change can impact performance even if it is not on the executed code path, by changing how the compiler optimizes the code.
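
    As a rough sketch of what that optimization amounts to (assuming the JIT really does fold the branch; I have not inspected the generated machine code), the "always return false" variant can effectively be reduced to:

    @Override
    public boolean equals(Object obj) {
        // both branches returned false, so the instanceof check and the branch
        // can be dropped entirely
        return false;
    }

    The original variant cannot be reduced like this, because its two branches return different results, so the type check has to be performed on every call.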