I am trying to make a timing system in C#, and I am having trouble calculating delta time.
Here is my code:
private static long lastTime = System.Environment.TickCount;
private static int fps = 1;
private static int frames;
private static float deltaTime = 0.005f;
public static void Update()
{
if(System.Environment.TickCount - lastTime >= 1000)
{
fps = frames;
frames = 0;
lastTime = System.Environment.TickCount;
}
frames++;
deltaTime = System.Environment.TickCount - lastTime;
}
public static int getFPS()
{
return fps;
}
public static float getDeltaTime()
{
return (deltaTime / 1000.0f);
}
The FPS counter works correctly, but the delta time runs faster than it should.
The value of System.Environment.TickCount changes during the execution of your function, which causes deltaTime to advance faster than you expect. Read TickCount once at the start of Update and reuse that value:
private static long lastTime = System.Environment.TickCount;
private static int fps = 1;
private static int frames;
private static float deltaTime = 0.005f;
public static void Update()
{
var currentTick = System.Environment.TickCount;
if(currentTick - lastTime >= 1000)
{
fps = frames;
frames = 0;
lastTime = currentTick;
}
frames++;
deltaTime = currentTick - lastTime;
}
public static int getFPS()
{
return fps;
}
public static float getDeltaTime()
{
return (deltaTime / 1000.0f);
}
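One more thing to be aware of: Environment.TickCount only advances at the resolution of the system timer (typically around 15 ms), and the deltaTime above measures time since the last one-second reset rather than time between frames. If what you ultimately want is a per-frame delta to multiply movement by, you could track the previous frame's timestamp with System.Diagnostics.Stopwatch instead. A minimal sketch, assuming a static Time class of my own naming (not from your code):

```csharp
using System.Diagnostics;

public static class Time
{
    // Stopwatch is a high-resolution monotonic clock, unlike
    // Environment.TickCount (~15 ms granularity, wraps at Int32.MaxValue).
    private static readonly Stopwatch clock = Stopwatch.StartNew();
    private static long lastFrameTicks = clock.ElapsedTicks;

    public static float DeltaTime { get; private set; }

    public static void Update()
    {
        long now = clock.ElapsedTicks;
        // Seconds elapsed since the previous Update call.
        DeltaTime = (now - lastFrameTicks) / (float)Stopwatch.Frequency;
        lastFrameTicks = now;
    }
}
```

Call Time.Update() once per frame and use Time.DeltaTime directly (it is already in seconds, so no division by 1000). Your FPS counter can stay exactly as it is alongside this.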