I have a call to SetSystemTime from my C# application. However, if the Windows time zone is set to a non-zero offset from UTC, it sometimes adjusts the system clock as if the time I provided were UTC (i.e. it converts to local time), and other times it does not; it just sets the clock directly to the date parameter.
// Maps to the Win32 SYSTEMTIME structure.
[StructLayout(LayoutKind.Sequential)]
internal struct SystemTime
{
    public short Year;
    public short Month;
    public short DayOfWeek;
    public short Day;
    public short Hour;
    public short Minute;
    public short Second;
    public short Milliseconds;
}

[DllImport("kernel32.dll", SetLastError = true)]
internal static extern bool SetSystemTime(ref SystemTime st);

public static bool AdjustSystemClock(DateTime date)
{
    // DayOfWeek and Milliseconds are left at zero; SetSystemTime ignores wDayOfWeek.
    SystemTime systemTime = new SystemTime();
    systemTime.Year = (short)date.Year;
    systemTime.Month = (short)date.Month;
    systemTime.Day = (short)date.Day;
    systemTime.Hour = (short)date.Hour;
    systemTime.Minute = (short)date.Minute;
    systemTime.Second = (short)date.Second;
    return SetSystemTime(ref systemTime);
}
The difference seems to be this: when I set the time zone through Windows and then start the application, calling SetSystemTime() adjusts the time I provide as if it were UTC. But when I set the time zone with the SetDynamicTimeZoneInformation() function, restart the application and then call SetSystemTime(), it sets the clock directly to the time I provide, regardless of the time zone.
Is this the expected behaviour? How can I get consistency between the two methods for setting the time zone?
I believe I found the problem. It turns out the person who wrote the bit of code that calls SetDynamicTimeZoneInformation() neglected to set the Bias field. As a result, the time zone information being set had a UTC offset of zero, so no adjustment took place.
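In case it helps anyone, here is a rough sketch of the kind of declaration and call involved; the struct name, the helper method and the UTC+10 bias value are illustrative, not the actual project code. The key point is that Bias must be set, because Windows computes UTC = local time + Bias (in minutes).

[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
internal struct DynamicTimeZoneInformation
{
    public int Bias;   // UTC = local time + Bias, in minutes
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)]
    public string StandardName;
    public SystemTime StandardDate;
    public int StandardBias;
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 32)]
    public string DaylightName;
    public SystemTime DaylightDate;
    public int DaylightBias;
    [MarshalAs(UnmanagedType.ByValTStr, SizeConst = 128)]
    public string TimeZoneKeyName;
    [MarshalAs(UnmanagedType.I1)]
    public bool DynamicDaylightTimeDisabled;
}

[DllImport("kernel32.dll", SetLastError = true)]
internal static extern bool SetDynamicTimeZoneInformation(ref DynamicTimeZoneInformation dtzi);

public static bool ApplyUtcPlus10TimeZone()
{
    // Example: UTC+10 with no daylight saving (transition dates left zeroed).
    // Leaving Bias at its default of 0 is what made SetSystemTime behave
    // as if the machine were on UTC.
    var tz = new DynamicTimeZoneInformation
    {
        Bias = -600,
        StandardName = "Custom Standard Time",
        DaylightName = "Custom Daylight Time",
        TimeZoneKeyName = "Custom Standard Time",
        DynamicDaylightTimeDisabled = true
    };
    return SetDynamicTimeZoneInformation(ref tz);
}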