My question is basically about how the C# compiler handles memory allocation of small datatypes. I do know that, for example, operators like add are defined on `int` and not on `short`, and thus computations are executed as if the `short` operands were `int`s.
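For example, this snippet (a minimal sketch) shows that adding two `short`s yields an `int`, so a cast back is required:

```csharp
class ShortArithmeticDemo
{
    static void Main()
    {
        short a = 1, b = 2;
        // short c = a + b;       // does not compile: a + b has type int
        short c = (short)(a + b); // an explicit narrowing cast is required
        int d = a + b;            // fine: both operands are widened to int
        System.Console.WriteLine(c + " " + d);
    }
}
```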
Assuming the following:
Does using the `short` datatype wherever possible reduce the memory footprint of my application, and is it advisable to do so? Or is using `short` and the like not worth the effort because the compiler allocates the full memory amount of an `Int32` anyway and adds additional casts when doing arithmetic?
Any links on the supposed runtime performance impact would be greatly appreciated.
From a memory-only perspective, using `short` instead of `int` will be better. The simple reason is that a `short` variable needs only half the size of an `int` variable in memory. The CLR does not expand `short` to `int` in memory.
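You can verify the sizes yourself (a quick sketch; `sizeof` works on the built-in numeric types without an `unsafe` context). Note that for fields inside a class or struct the CLR may insert alignment padding, so the savings are most reliable for arrays:

```csharp
class SizeDemo
{
    static void Main()
    {
        System.Console.WriteLine(sizeof(short)); // prints 2
        System.Console.WriteLine(sizeof(int));   // prints 4
    }
}
```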
Nevertheless, this reduced memory consumption might, and probably will, decrease the runtime performance of your application significantly. All modern CPUs perform much better with 32-bit numbers than with 16-bit numbers. Additionally, in many cases the CLR will have to convert between `short` and `int`, e.g. when calling methods that take `int` arguments. There are many other performance considerations to take into account before going this way.
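To illustrate the conversion, here is a small sketch (`PrintSquare` is a hypothetical method that takes an `int`):

```csharp
class WideningDemo
{
    // Hypothetical method with an int parameter
    static void PrintSquare(int x)
    {
        System.Console.WriteLine(x * x);
    }

    static void Main()
    {
        short s = 100;
        PrintSquare(s); // the short is implicitly widened to int at the call site
    }
}
```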
I would only change this in a few dedicated locations and modules of your application, and only if you really encounter measurable memory shortages.
In some cases you can of course switch from `int` to `short` easily without hurting performance. One example is a giant array of `int`s, all of which also fit into a `short`.
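As a rough sketch (the element count and values are made up for illustration), such an array halves the memory used for the element data:

```csharp
class ArrayDemo
{
    static void Main()
    {
        const int n = 10000000;
        short[] compact = new short[n]; // ~20 MB of element data
        int[] wide = new int[n];        // ~40 MB of element data
        for (int i = 0; i < n; i++)
        {
            compact[i] = (short)(i % 1000); // every value 0..999 fits in a short
            wide[i] = i % 1000;
        }
    }
}
```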