Wow, I thought I knew my C++, but this is weird.
This function returns an unsigned int, so I thought that meant I would never get a negative number back, right?
The function determines how many hours ahead of or behind UTC you are. I'm in Sydney, Australia, which is GMT+10, so UTC = LocalTime + (-10). GetTimeZoneInformation therefore correctly determines that my offset is -10.
BUT my function returns an unsigned int, so shouldn't it return 10, not -10?
#include <windows.h>   // GetTimeZoneInformation, TIME_ZONE_INFORMATION
#include <climits>     // INT_MAX

unsigned int getTimeZoneBias()
{
    TIME_ZONE_INFORMATION tzInfo;
    DWORD res = GetTimeZoneInformation( &tzInfo );
    if ( res == TIME_ZONE_ID_INVALID )
    {
        return (INT_MAX/2);
    }
    return static_cast<unsigned int>(tzInfo.Bias / 60); // convert from minutes to hours
}
TCHAR ch[200];
_stprintf( ch, _T("A: %d\n"), getTimeZoneBias()); // this prints out A: -10
debugLog += _T("Bias: ") + tstring(ch) + _T("\r\n");
Here's what I think is happening:
The value of tzInfo.Bias / 60 is actually -10, i.e. 0xFFFFFFF6 as a 32-bit value (tzInfo.Bias itself is -600, since it is expressed in minutes).
On most systems, casting a signed integer to an unsigned integer of the same size does nothing to the representation, so the function still returns 0xFFFFFFF6.
But when you print it out, you're printing it back as a signed integer (%d), so it prints -10. If you printed it as an unsigned integer (%u), you'd probably get 4294967286.
What you're probably trying to do is get the absolute value of the time difference, i.e. turn this -10 into a 10. In that case you should return abs(tzInfo.Bias / 60).
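As a sketch, keeping your INT_MAX/2 error sentinel, the fixed function might look like this (using std::abs from <cstdlib>):

#include <windows.h>
#include <climits>
#include <cstdlib>   // std::abs

// Returns the magnitude of the UTC offset in whole hours,
// or INT_MAX/2 if the time zone cannot be determined.
unsigned int getTimeZoneBias()
{
    TIME_ZONE_INFORMATION tzInfo;
    if ( GetTimeZoneInformation( &tzInfo ) == TIME_ZONE_ID_INVALID )
    {
        return (INT_MAX/2);
    }
    return static_cast<unsigned int>( std::abs(tzInfo.Bias / 60) );
}

Note that this throws away the sign, so a +10 zone and a -10 zone look identical; if the direction matters to you, return a plain int instead and keep the sign.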