Tags: java, programming-languages, integer, theory

Size of value affecting computation time?


Say I have a Java program such:

//case1
Long first = 1L;
Long second = 1L;
Long third = first - second;

//case2
Long first = Long.MAX_VALUE;
Long second = 100000L;
Long third = first - second;

Those two cases should have exactly the same execution time and overhead, shouldn't they? The subtraction is performed on every bit of the Long regardless of the value it holds, right?

If my assumption is true, is there any language where this is NOT the case?

EDIT: The case that prompted this is a 16-bit PIC we are using at work (C code) that calculates averages over a variable amount of time. After the answers below from M S and Thom, I now understand that it IS possible to introduce a bug this way, since the PIC computes mission-critical information on a time-sensitive basis.

Thank you all very much.


Solution

  • What you say is true in Java: the operations execute in a time that is independent of the values. In some languages (such as Lisp), if a value exceeds the maximum legal value for the fixed-size integer type, execution automatically switches to a "big integer" package, which slows execution down considerably; see the sketch at the end of this answer.

    EDIT: There is a slight difference between the first and second cases: the value 1 is special (as is 0). The bytecode for

    Long first = 1L;
    

    is:

    lconst_1
    invokestatic    #2; //Method java/lang/Long.valueOf:(J)Ljava/lang/Long;
    astore_1
    

    whereas if the constant is (say) 2L, one gets this bytecode:

    ldc2_w  #3; //long 2l
    invokestatic    #2; //Method java/lang/Long.valueOf:(J)Ljava/lang/Long;
    astore_2
    

    Since lconst_1 runs faster than ldc2_w, there is a slight time difference between cases 1 and 2.
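
    If you want to check this on your own JDK, compile the snippet and disassemble it with javap; the listings above were obtained that way (the class name Cases is just a placeholder for whichever file holds the code):

        javac Cases.java
        javap -c Cases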
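
    To make the contrast with arbitrary-precision arithmetic concrete, here is a rough, unscientific sketch of my own (the class name ValueSizeDemo and the timing approach are illustration only, not anything from the question): subtracting two longs costs the same whatever they hold, while BigInteger operations do more work as the operands grow. A proper measurement would need a harness such as JMH; this only shows the order-of-magnitude difference.

        import java.math.BigInteger;

        public class ValueSizeDemo {
            public static void main(String[] args) {
                // Fixed-width primitives: all 64 bits are processed either way,
                // so the values make no difference to the cost of the subtraction.
                long a = 1L - 1L;
                long b = Long.MAX_VALUE - 100000L;
                System.out.println(a + " " + b);

                // Arbitrary-precision values: the representation grows with the
                // magnitude, so an operation on a huge number does more work.
                BigInteger tiny = BigInteger.ONE;
                BigInteger huge = BigInteger.valueOf(2).pow(1_000_000); // ~1,000,000 bits

                long t0 = System.nanoTime();
                tiny.subtract(BigInteger.ONE);
                long t1 = System.nanoTime();
                huge.subtract(BigInteger.ONE);
                long t2 = System.nanoTime();

                System.out.println("small operands: " + (t1 - t0) + " ns");
                System.out.println("large operands: " + (t2 - t1) + " ns");
            }
        }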