
Differences between how Linux and Windows perform date calculations


Is there a major difference between how Linux and Windows perform date calculations? I have an application at work that converts a DateTime from the local time zone to UTC before storing it in the database. When I test on my local Windows dev machine it works with no problem; however, in every higher environment (all of which run Linux) it appears to apply the time zone conversion twice. For example, starting with a DateTime of "12/5/2022 08:00 AM" in the Arizona time zone, the UTC time should be "12/05/2022 15:00", but instead I'm seeing "12/6/2022 02:00" written to the database. Has anyone run into this situation before? If so, how did you handle it?


    // Capture the current wall-clock time (Kind = Local).
    DateTime currentDate = DateTime.Now;
    Console.WriteLine("The current DateTime is: " + currentDate.ToString());

    // Strip the Kind so the value carries no time zone information.
    currentDate = DateTime.SpecifyKind(currentDate, DateTimeKind.Unspecified);
    Console.WriteLine("Unspecified-kind DateTime is: " + currentDate.ToString());

    // Interpret the unspecified value as local time and convert it to UTC.
    currentDate = TimeZoneInfo.ConvertTimeToUtc(currentDate, TimeZoneInfo.Local);
    Console.WriteLine("UTC DateTime is: " + currentDate.ToString());
    Console.ReadKey();

Above are the pertinent date calculations I'm performing. I'm using .NET Core 3.1.
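
One cross-platform wrinkle worth noting: TimeZoneInfo.Local simply reflects the host's configured time zone, which on Linux servers and containers is frequently UTC, and Windows and Linux also use different time zone IDs (Windows IDs vs. IANA IDs). A minimal sketch that pins the zone explicitly, assuming Arizona as the zone and trying the IANA ID first with the Windows ID as a fallback, could look like this:

    using System;

    class ZoneDemo
    {
        static void Main()
        {
            TimeZoneInfo arizona;
            try
            {
                // IANA ID, recognized on Linux and macOS.
                arizona = TimeZoneInfo.FindSystemTimeZoneById("America/Phoenix");
            }
            catch (TimeZoneNotFoundException)
            {
                // Windows ID for the same zone.
                arizona = TimeZoneInfo.FindSystemTimeZoneById("US Mountain Standard Time");
            }

            // An Unspecified DateTime is treated as wall-clock time in the zone
            // passed to ConvertTimeToUtc, so the result no longer depends on the
            // machine's own time zone setting.
            var local = new DateTime(2022, 12, 5, 8, 0, 0, DateTimeKind.Unspecified);
            DateTime utc = TimeZoneInfo.ConvertTimeToUtc(local, arizona);

            // Prints 12/5/2022 15:00 on any OS (exact formatting is culture-dependent).
            Console.WriteLine(utc);
        }
    }

With the zone resolved explicitly, the conversion gives the same result on a Windows dev machine and a Linux server, regardless of how the host's own clock is configured.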


Solution

  • The problem was that in one place we were sending the date as a DateTime type. When you do that, the UI automatically adjusts the time to match the time zone on the server, which is why the issue couldn't be reproduced locally and only appeared after deployment (see the sketch below).
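
As a sketch of what avoiding the raw DateTime can look like, one option is to send a DateTimeOffset (or an ISO 8601 string with an explicit offset) so the receiving side cannot re-interpret the value in its own time zone. The Payload type and ScheduledAt property below are hypothetical names for illustration:

    using System;
    using System.Text.Json;

    class Payload
    {
        // DateTimeOffset carries its own offset, so the value identifies an
        // exact instant and survives serialization across time zones.
        public DateTimeOffset ScheduledAt { get; set; }
    }

    class Program
    {
        static void Main()
        {
            var payload = new Payload
            {
                // 12/5/2022 08:00 AM at UTC-7 (Arizona).
                ScheduledAt = new DateTimeOffset(2022, 12, 5, 8, 0, 0, TimeSpan.FromHours(-7))
            };

            // Serializes as "2022-12-05T08:00:00-07:00": the offset travels with the value.
            Console.WriteLine(JsonSerializer.Serialize(payload));

            // Converting to UTC on either end yields the same instant: 2022-12-05 15:00.
            Console.WriteLine(payload.ScheduledAt.UtcDateTime);
        }
    }

Because the offset is part of the payload, it no longer matters whether the server's local time zone is UTC, Arizona, or anything else.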