Is the difference between integer multiplication and bit shifting (temporarily forgetting about division) still in favor of shifting, and if so, how big is the difference?
It simply seems like such a low-level optimization; even if you wanted it, shouldn't the (C#/Java) bytecode compiler or the JIT catch it in most cases?
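For concreteness, this is the kind of rewrite in question (the method names are mine, purely illustrative):

    // Multiplying by a power of two versus the equivalent shift.
    // The question is whether the compiler or the JIT turns the
    // first form into the second automatically.
    static int TimesEight(int x) { return x * 8; }  // plain multiply
    static int ShiftThree(int x) { return x << 3; } // same result, since 8 == 2^3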
Note: I tested the compiled output for C# (with gmcs, the Mono C# compiler, version 2.6.7.0), and the multiply examples didn't use a shift even when multiplying by a power of 2.
C# source: http://csharp.pastebin.com/hcrRnPrb
CIL output: http://csharp.pastebin.com/0js9F2c1
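The pastebin contents aren't reproduced here, but for a method like TimesEight above, the CIL that gmcs emits looks roughly like this (a hand-written reconstruction, not the linked paste):

    .method private static int32 TimesEight(int32 x) cil managed
    {
        ldarg.0    // load x
        ldc.i4.8   // load the constant 8
        mul        // plain integer multiply, no shl; strength
                   // reduction, if any, is left to the JIT
        ret
    }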
P.S. I forgot how it might be somewhat useful on bytes, but I'm still having trouble seeing its use for ordinary numbers.
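The byte case I mean is something like the following (a minimal sketch, assuming big-endian byte order), where shifting is about placing bits rather than replacing a multiplication:

    // Assembling a 32-bit value from four bytes, e.g. when parsing a
    // binary stream: each shift positions a byte within the result.
    static int ReadInt32BigEndian(byte[] b, int offset)
    {
        return (b[offset]     << 24)
             | (b[offset + 1] << 16)
             | (b[offset + 2] <<  8)
             |  b[offset + 3];
    }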
You are right: if shift operators are used only as an alternative to multiplication, it should be left to the compiler.
I suppose you overlooked applications like binary protocol parsing, compression, and hashing; these and many more need bit-twiddling for efficient implementation without native code.
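To illustrate with one such case (my own example, not any specific library's API): unpacking the channels of a packed 32-bit ARGB pixel is pure bit-twiddling, and no multiply-to-shift substitution is involved at all:

    // Extracting colour channels from a packed 0xAARRGGBB pixel:
    // the shift moves a channel into the low byte, the mask isolates it.
    static byte Alpha(uint pixel) { return (byte)((pixel >> 24) & 0xFF); }
    static byte Red(uint pixel)   { return (byte)((pixel >> 16) & 0xFF); }
    static byte Green(uint pixel) { return (byte)((pixel >>  8) & 0xFF); }
    static byte Blue(uint pixel)  { return (byte)( pixel        & 0xFF); }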