Intro
I am planning to write an algorithm in Java (so that I can learn the language) whose run time should be around a few seconds. This small program will be called about 10,000 times from bash.
Questions
Is it correct that Java will perform its runtime optimizations anew every time the process is started, even when it is called 10,000 times via a bash for loop?
My code will probably be about 2,000 lines long and run for about 5 seconds. Do you think the time spent on optimization will be negligible compared to the running time?
Is there a way to perform the optimization once and reuse it (the way ahead-of-time compilation works in C++)?
Is Java a poor fit for short processes that are called many times?
It's hard to predict exactly which optimizations will occur at runtime, but if the process is started fresh for each call and only runs for a few seconds, they will hardly make any difference. Compile-time optimizations are unaffected, but runtime (JIT) optimizations are lost with each new process. In any case, the startup time of the JVM itself will overwhelm any benefit.
Furthermore, if the program itself is small and the algorithm is correct, there may be almost nothing for the runtime optimizer to do.
Micro-optimization at this stage is not fruitful. Do you have any measurements at all of how long each part of the process takes? If not, how would you know whether any of your efforts help, or even where the bottlenecks are?
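A first pass at measurement needs nothing more than System.nanoTime() around the candidate phases. A minimal sketch (the phase names below are placeholders, not your actual code):

```java
// Minimal timing sketch: wrap each major phase with System.nanoTime()
// to see where the seconds actually go. loadInput() and runAlgorithm()
// are hypothetical stand-ins for the real phases of the program.
public class Timing {
    public static void main(String[] args) {
        long t0 = System.nanoTime();
        loadInput();                      // hypothetical phase
        long t1 = System.nanoTime();
        runAlgorithm();                   // hypothetical phase
        long t2 = System.nanoTime();

        System.out.printf("load:      %d ms%n", (t1 - t0) / 1_000_000);
        System.out.printf("algorithm: %d ms%n", (t2 - t1) / 1_000_000);
    }

    private static void loadInput()    { /* ... */ }
    private static void runAlgorithm() { /* ... */ }
}
```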
In this case your best bet is to eliminate the JVM startup time as a factor. Set up your Java program as a long-running server, for example, and have the shell script send it requests instead of launching a new process each time. The slowest part of the process goes away, and the JIT can accumulate profiling statistics across calls.
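As a rough illustration of the server approach, a plain ServerSocket loop is enough. Everything here is an assumption for the sketch: the class name, port 9000, and runAlgorithm(), which stands in for your actual entry point.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Sketch of a long-running worker: the JVM starts once, then handles
// one request per connection. Port and protocol are arbitrary choices.
public class AlgorithmServer {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(9000)) {
            while (true) {
                try (Socket client = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    String request = in.readLine();          // one line of input per call
                    out.println(runAlgorithm(request));      // hypothetical algorithm entry point
                }
            }
        }
    }

    private static String runAlgorithm(String input) {
        return "result for " + input;                        // placeholder
    }
}
```

From the bash side, each of the 10,000 invocations then becomes a cheap connection rather than a JVM start, for instance by piping the input through a tool such as nc to localhost:9000 and reading the reply from stdout.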