Tags: java, performance, dynamic-memory-allocation, dynamic-arrays, memory-pool

Which is faster: an ArrayList, or looping through all data combinations?


I'm programming something in Java; for context, see this question: Markov Model decision process in Java

I have two options:

byte[][] mypatterns = new byte[MAX][4];

or

ArrayList<byte[]> mypatterns = new ArrayList<>();

I can use a Java ArrayList and append new arrays whenever I create them, or use a statically allocated array that holds all possible data combinations and loop through it to see which indexes are 'on' or 'off'.

Essentially, I'm wondering whether I should allocate one large block that may contain uninitialized values, or use the dynamic array.
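
To make the trade-off concrete, here is a minimal sketch of how each option might be used per frame. This is illustration only; MAX, the flag array, and the method names are hypothetical:

import java.util.ArrayList;
import java.util.List;

class PatternStore {
    static final int MAX = 200; // assumed upper bound on the number of patterns

    // Option 1: preallocated block. Every possible pattern has a fixed slot;
    // a parallel flag array tracks which slots are 'on'.
    byte[][] fixed = new byte[MAX][4];
    boolean[] activeFlags = new boolean[MAX];

    // Option 2: dynamic list. Only live patterns are stored, appended as created.
    List<byte[]> dynamic = new ArrayList<>();

    // Per-frame scan of the fixed block: always visits all MAX slots.
    int sumFixed() {
        int sum = 0;
        for (int i = 0; i < MAX; i++) {
            if (activeFlags[i]) sum += fixed[i][0];
        }
        return sum;
    }

    // Per-frame scan of the list: visits only the patterns that exist.
    int sumDynamic() {
        int sum = 0;
        for (byte[] p : dynamic) sum += p[0];
        return sum;
    }
}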

This code runs once per frame, so looping through 200 elements every frame could be very slow, especially because I will have multiple instances of this loop.

Based on theory and what I have heard, dynamic arrays are very inefficient.

My question is: Would looping through an array of, say, 200 elements be faster than appending an object to a dynamic array?

Edit>>>

More information:

  • I will know the maximum length of the array if it is static.
  • The items in the array will change frequently, but their sizes are constant, so I can easily change them in place.
  • Allocating it statically would effectively make it a memory pool (see the sketch after this list).
  • Some instances may have more or less of the data initialized than others.
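
Here is what that memory-pool likeness might look like: a minimal sketch assuming a capacity known up front and constant 4-byte items that are overwritten in place (the acquire/release names are hypothetical):

class PatternPool {
    private final byte[][] slots;
    private final boolean[] inUse;

    PatternPool(int capacity) {
        slots = new byte[capacity][4]; // item size is constant, so slots never resize
        inUse = new boolean[capacity];
    }

    // Claim a free slot and copy the pattern into it; returns the slot index, or -1 if full.
    int acquire(byte[] pattern) {
        for (int i = 0; i < slots.length; i++) {
            if (!inUse[i]) {
                System.arraycopy(pattern, 0, slots[i], 0, 4);
                inUse[i] = true;
                return i;
            }
        }
        return -1;
    }

    // Mark a slot free; nothing is deallocated, the memory is simply reused later.
    void release(int index) {
        inUse[index] = false;
    }
}

Nothing is allocated after construction, so there is no garbage-collection pressure during the frame loop; the cost is that acquire scans for a free slot and every consumer must check the inUse flags.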

Solution

  • You're right, really; I should use a profiler first, but I'm also just curious about the question 'in theory'.

    The "theory" is too complicated. There are too many alternatives (different ways to implement this) to analyse. On top of that, the actual performance for each alternative will depend on the the hardware, JIT compiler, the dimensions of the data structure, and the access and update patterns in your (real) application on (real) inputs.

    And the chances are that it really doesn't matter.

    In short, nobody can give you an answer that is well founded in theory. The best we can offer are recommendations based on intuition about performance and/or on software engineering common sense:

    • simpler code is easier to write and to maintain,

    • a compiler is a more consistent¹ optimizer than a human being,

    • time spent on optimizing code that doesn't need to be optimized is wasted time.


    ¹ Certainly over a large code-base. Given enough time and patience, a human can do a better job for some problems, but that is not sustainable over a large code-base, and it ignores the facts that 1) compilers are always being improved, 2) optimal code can depend on things that a human cannot take into account, and 3) a compiler doesn't get tired and make mistakes.
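
Since the practical advice boils down to "measure, don't guess", here is a minimal sketch of how the two approaches could be compared with JMH (this assumes the org.openjdk.jmh dependency; the sizes and the half-full workload are made up for illustration):

import java.util.ArrayList;
import java.util.List;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;

@State(Scope.Thread)
public class PatternBenchmark {
    static final int MAX = 200;

    byte[][] fixed;
    boolean[] active;
    List<byte[]> dynamic;

    @Setup
    public void setup() {
        fixed = new byte[MAX][4];
        active = new boolean[MAX];
        dynamic = new ArrayList<>();
        for (int i = 0; i < MAX; i += 2) { // mark half the slots 'on'
            active[i] = true;
            dynamic.add(new byte[4]);
        }
    }

    // Scans every slot, skipping the inactive ones.
    @Benchmark
    public int scanFixed() {
        int sum = 0;
        for (int i = 0; i < MAX; i++) {
            if (active[i]) sum += fixed[i][0];
        }
        return sum; // returning the result stops the JIT from eliminating the loop
    }

    // Iterates only the live patterns.
    @Benchmark
    public int scanDynamic() {
        int sum = 0;
        for (byte[] p : dynamic) sum += p[0];
        return sum;
    }
}

Run it on your real data shapes, not just these made-up ones; with only 200 tiny elements, the difference may well be noise next to the rest of the frame.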