I'm at a complete loss! Why does my code not work as expected? I have the following code:
    UINT64 tmpA = 0;
    UINT64 tmpB = 0;
    UINT64 alarmed_lans = 0;
    int foprtmsk[2] = {0};

    switch_fo_prtmsk_getptr(foprtmsk);
    tmpA = foprtmsk[1];
    tmpB = foprtmsk[0];

    gDbgLog("tmpA <%016llx>", tmpA);
    gDbgLog("tmpB <%016llx>", tmpB);
    gDbgLog("alarmed_lans <%016llx>", alarmed_lans);

    alarmed_lans &= ((tmpA << 32) | tmpB);

    gDbgLog("alarmed_lans <%016llx>", alarmed_lans);
and the log produced looks like:
    |0x1f604|7857[us]|fpga-alarm|fpga_faultlocalizer|tmpA <ffffffffeffeffff>
    |0x1f6cb|7861[us]|fpga-alarm|fpga_faultlocalizer|tmpB <ffffffffffffffff>
    |0x1f741|7863[us]|fpga-alarm|fpga_faultlocalizer|alarmed_lans <3003000000000000>
    |0x1f7b8|7865[us]|fpga-alarm|fpga_faultlocalizer|alarmed_lans <3003000000000000>
Now I'm wondering: why does the bitmask not get applied properly? I'd expect to see

    |0x1f7b8|7865[us]|fpga-alarm|fpga_faultlocalizer|alarmed_lans <2002000000000000>
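That is, I expect the mask to be built from the raw 32-bit halves that appear in the log, along these lines:

    /* expected mask, built from the raw 32-bit halves 0xeffeffff and 0xffffffff */
    UINT64 mask = ((UINT64)0xeffeffffu << 32) | 0xffffffffu;
    /* mask == 0xeffeffffffffffff, and
       0x3003000000000000 & 0xeffeffffffffffff == 0x2002000000000000 */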
What's going on here?
CPU: PPC85XXe500
Compiler: Diab
OS: VxWorks
The culprit is this line:

    alarmed_lans &= ((tmpA << 32) | tmpB);

First, your snippet and your log don't agree: the code shown initialises alarmed_lans to 0, yet the log already reads 3003000000000000 before the AND. If alarmed_lans was never initialised in the code that actually ran, using it results in undefined behaviour.

Second, look at what the mask evaluates to. foprtmsk is an array of signed int, so assigning its elements to a UINT64 sign-extends them; that is why your log shows tmpB as ffffffffffffffff rather than 00000000ffffffff. The sign-extended upper bits of tmpA are harmlessly discarded by << 32, but tmpB's survive, so the mask becomes 0xeffeffff00000000 | 0xffffffffffffffff = 0xffffffffffffffff. You AND x with 0xffffffffffffffff and you get x, which is exactly why both alarmed_lans lines in your log show the same value.
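Here is a minimal host-side sketch of that widening effect (plain C, nothing VxWorks-specific; unsigned long long stands in for UINT64, and the two 32-bit values are the ones implied by your log):

    #include <stdio.h>

    int main(void)
    {
        /* the 32-bit halves implied by the logged tmpA/tmpB */
        int halves[2] = { (int)0xffffffff, (int)0xeffeffff };

        unsigned long long tmpA = halves[1]; /* sign-extends to ffffffffeffeffff */
        unsigned long long tmpB = halves[0]; /* sign-extends to ffffffffffffffff */
        unsigned long long mask = (tmpA << 32) | tmpB;

        printf("mask <%016llx>\n", mask);    /* prints ffffffffffffffff */
        return 0;
    }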
In theory, then, you should simply get back whatever alarmed_lans held before, uninitialised or not; in practice, once undefined behaviour is in play, the compiler is allowed to do anything, including the invocation of nasal demons.
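The cure on the mask side is to widen through an unsigned 32-bit type so that no sign extension takes place. A sketch, assuming UINT32 is your platform's unsigned 32-bit typedef and that switch_fo_prtmsk_getptr really does fill two 32-bit words:

    tmpA = (UINT32)foprtmsk[1];              /* zero-extends: 00000000effeffff */
    tmpB = (UINT32)foprtmsk[0];              /* zero-extends: 00000000ffffffff */

    alarmed_lans &= ((tmpA << 32) | tmpB);   /* mask: effeffffffffffff */

With alarmed_lans at 3003000000000000 this yields the 2002000000000000 you expected. Alternatively, declare foprtmsk as an array of UINT32 if the prototype of switch_fo_prtmsk_getptr allows it, and make sure alarmed_lans actually holds the intended value before the &=.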