I've been using `(ulong)1` to represent a single bit in a 64-bit data type. I wonder whether `(ulong)1` performs some sort of type conversion and therefore takes more time than `1ul`.

Just curious whether this makes any difference. I thought both were exactly the same in performance. Is that correct, or is `(ulong)1` actually slower?
No, no runtime type conversion happens when you cast a compile-time constant using a built-in numeric conversion (i.e. `(ulong)1`): the compiler evaluates the cast at compile time, so both approaches should result in the same IL/ASM. For example:
var x = 1ul;
var y = (ulong)1;
Console.WriteLine(x);
Console.WriteLine(y);
This produces the same IL for both locals:
.maxstack 1
.entrypoint
.locals init (
[0] uint64 x,
[1] uint64 y
)
IL_0000: ldc.i4.1
IL_0001: conv.i8
IL_0002: stloc.0
IL_0003: ldc.i4.1
IL_0004: conv.i8
IL_0005: stloc.1
IL_0006: ldloc.0
IL_0007: call void [System.Console]System.Console::WriteLine(uint64)
IL_000c: nop
IL_000d: ldloc.1
IL_000e: call void [System.Console]System.Console::WriteLine(uint64)
IL_0013: nop
IL_0014: ret
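One way to see that the cast is folded at compile time is that `(ulong)1` is accepted in a `const` declaration, which only admits constant expressions. A minimal sketch (the class and field names here are illustrative, not from the original post):

```csharp
using System;

class Demo
{
    // Both initializers are constant expressions, so both are legal
    // in a const declaration -- the cast is folded by the compiler.
    public const ulong A = 1ul;
    public const ulong B = (ulong)1;

    static void Main()
    {
        Console.WriteLine(A == B); // True
    }
}
```

If `(ulong)1` required a runtime conversion, the `B` declaration would not compile.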
From the language specification - 12.23 Constant expressions:
Only the following constructs are permitted in constant expressions:
- Literals (including the
null
literal).- ...
- Cast expressions.
- ...
So in this case `(ulong)1` is treated as a constant, just like the integer literal `1ul`.
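For the original use case of representing one bit in a 64-bit value, the same constant folding applies when the shift amount is also a constant; both spellings produce the identical value. A small sketch (the `Bit` helper is hypothetical, added here just for illustration):

```csharp
using System;

class BitDemo
{
    // Hypothetical helper: a 64-bit mask with only bit n set.
    public static ulong Bit(int n) => 1ul << n;

    static void Main()
    {
        // Both forms yield the same shifted value.
        ulong a = (ulong)1 << 40;
        ulong b = 1ul << 40;
        Console.WriteLine(a == b);  // True
        Console.WriteLine(Bit(40)); // 1099511627776
    }
}
```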
Also see the Integer literals section of the docs.