I am currently trying to calculate the delta time of a loop in C++. Currently I use this code:
#include <chrono>
#include <cstdio>

int main()
{
    typedef std::chrono::high_resolution_clock Clock;
    typedef std::chrono::duration<float> fsec;

    auto lastTime = Clock::now();
    auto currentTime = Clock::now();
    fsec passedTime = currentTime - lastTime;

    while (true)
    {
        lastTime = currentTime;

        // stand-in for the work done each tick
        printf("");
        printf("");
        printf("");
        printf("");
        printf("");
        printf("");
        printf("");
        printf("");
        printf("");
        printf("");

        currentTime = Clock::now();
        passedTime = currentTime - lastTime;
        printf("%i\n", passedTime.count());
    }
    return 0;
}
This is not the actual code I use in the program, but the delta-time calculation and the output are exactly the same as in this sample.
It used to work fine and output a number of about ~0.0167 when locked to 60 ticks/s by GLFW, and a much lower number at about 6000 ticks/s.
My problem is that I moved everything out of the main function into a method, and now I get this result.
It seems like I changed something by mistake, but I can't figure out what I've changed.
Your printf is wrong: the data is a float, but %i is the format specifier for an int.
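As a minimal sketch of the fix, keeping the same clock and duration types as in the question, the value from count() can be printed with %f (printf promotes the float to double through the variadic call):

#include <chrono>
#include <cstdio>

int main()
{
    typedef std::chrono::high_resolution_clock Clock;
    typedef std::chrono::duration<float> fsec;

    auto lastTime = Clock::now();
    // ... work being timed ...
    auto currentTime = Clock::now();

    fsec passedTime = currentTime - lastTime;

    // %f expects a double; the float returned by count() is promoted
    // automatically, so this prints the elapsed time in seconds.
    printf("%f\n", passedTime.count());
    return 0;
}

Alternatively, std::cout << passedTime.count() avoids format specifiers entirely, since the correct overload is chosen from the type.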