Suppose we have the following code snippet in some abstract programming language:
map<string, int> user_settings;
// somewhere at run time, user_settings is populated
for (i = 0; i < 1000000000000; i++) {
    if (user_settings["do_stuff"] == 1) { do some stuff... }
    else { do other stuff... }
}
The problem here is that, at runtime, we do a lot of loads inside the loop of something that is effectively a "runtime constant" (namely, the value of a user setting). There is the potential to supply that value as an immediate, or even to optimize away the useless branch entirely and keep only the code of whichever then- or else-branch is actually taken.
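To make the hoped-for transformation concrete (this is only an illustration of what such an optimizer could effectively produce, not a suggestion to do it by hand; see the P.S. below), here is a minimal C++-style sketch: the setting is read once, the branch is resolved, and each loop version contains only the code it actually needs.

int do_stuff = user_settings["do_stuff"];  // read once; ideally folded to an immediate
if (do_stuff == 1) {
    for (long long i = 0; i < 1000000000000LL; i++) { /* do some stuff... */ }
} else {
    for (long long i = 0; i < 1000000000000LL; i++) { /* do other stuff... */ }
}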
Are there systems, programming languages, or JIT compilers that perform such optimizations? The only related work I could find was this MIT thesis: http://groups.csail.mit.edu/cag/rio/josh-meng-thesis.pdf
P.S. I am not asking how to optimize this code. I know the value can be loaded beforehand into a local variable, which will end up in a register. There are cases, however, where you cannot perform such an "optimization" by hand.
One neat approach, used for example in dynamic linkers, might be useful here:
Make your compiler generate the dynamic code so that it knows who called it. That way, if the dynamic code finds that the data it receives can never change, it can just rewrite the code at the call site. So instead of a CALL get_user_setting "do_stuff", it will replace the instruction with a PUSH of the constant 1, or whatever, and maybe a few NOPs to pad out to the size of the original CALL instruction.
Dynamic linkers often do this. The first time you try to call a dynamically linked function, the call actually jumps into the loader. The loader loads the real function, then rewrites the CALL instruction so that it calls the loaded function instead of the loading routine. Future calls then don't have to repeat the lookup. (I blogged about this here)
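To make the mechanism concrete, here is a small, self-contained C++ simulation of that lazy-binding idea (all names here are made up for illustration). Instead of patching machine code, it patches a function pointer: the first call goes through a resolver that does the map lookup and then redirects the pointer to a specialized routine, so every later call skips the lookup entirely.

#include <map>
#include <string>

std::map<std::string, int> user_settings;    // populated at run time

int resolve_do_stuff();                      // forward declaration of the resolver

int return_one()  { return 1; }              // specialized "patched" targets
int return_zero() { return 0; }

// The "call site" goes through this pointer; it starts out pointing at the resolver.
int (*get_do_stuff)() = &resolve_do_stuff;

int resolve_do_stuff() {
    int value = user_settings["do_stuff"];   // the expensive lookup, done only once
    // "Rewrite the call site": later calls bind directly to a specialized
    // function and never reach this resolver again.
    get_do_stuff = (value == 1) ? &return_one : &return_zero;
    return value;
}

int main() {
    user_settings["do_stuff"] = 1;
    long long acc = 0;
    for (long long i = 0; i < 1000000; i++) {
        if (get_do_stuff() == 1) { acc++; }  // lookup happens only on the first iteration
        else                     { acc--; }
    }
    return acc == 1000000 ? 0 : 1;
}

A real dynamic linker does the analogous thing at the machine-code level: the first call goes through a stub, the loader resolves the symbol, and the stub's jump target is overwritten so that later calls bind directly to the resolved function.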