I have been writing a game, which I have put on GitHub for others to play around with, but I have a question about hardware when running it.
When I run the game on a dedicated GPU, the player moves perfectly, but if I run it on a laptop with integrated graphics, the player seems to move very slowly. However, I added a player speed button, and if I increase the player speed, she moves perfectly again. What I can't understand is why the player moves slowly on the laptop, yet moves as she should once I change her speed. Is there a way to detect whether a dedicated GPU is installed, so I can set the speed slightly higher when there isn't one? Thank you.
Since there's no code posted, I can't help that much, but I'll tell you what I suspect the issue is.
Are you using proper delta timesteps? If your game loop looks like this:
while (not quit) {
    readInput();
    runSimulation();
    render();
    sleep(16.666); // Or whatever fixed delay you want here
}
then the game will run differently depending on the power of the hardware. If your game takes 1 ms to render on a good GPU but 10 ms on a weak one, everything will naturally run slower, because each pass through the game loop takes longer on the slower machine.
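In case it helps, "proper delta timesteps" means measuring how long the last frame actually took and scaling movement by that amount, so the player covers the same distance per real second no matter how fast the machine renders. A minimal sketch in C++ (playerX, playerSpeed and the quit flag are placeholder names, not taken from your project):

#include <chrono>

int main() {
    // Placeholder player state -- these names are assumptions, not from the asker's code.
    float playerX = 0.0f;
    float playerSpeed = 200.0f; // world units per second

    bool quit = false;
    auto previous = std::chrono::steady_clock::now();

    while (!quit) {
        auto now = std::chrono::steady_clock::now();
        // Seconds the last frame actually took, however fast or slow the machine is.
        float dt = std::chrono::duration<float>(now - previous).count();
        previous = now;

        // readInput(); runSimulation(); render();

        // Scale movement by dt: a slow frame moves the player further in one step,
        // a fast frame moves her less, so distance per real second stays the same.
        playerX += playerSpeed * dt;
    }
    return 0;
}

With this, the slow laptop simply produces fewer, larger steps per second instead of a slower-moving player.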
A simpler quick fix is to measure how long everything in the loop took, and then delay by whatever is left over so that each frame adds up to your target frame time.
For example, you time how long readInput, runSimulation, and render take together, assign that to timeTaken, and if you are targeting 60 FPS you sleep for 16.666 - timeTaken milliseconds. That way, every frame takes roughly the same total time regardless of the computer.
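As a rough sketch of that idea, using std::chrono and std::this_thread (readInput, runSimulation and render stand in for whatever your loop actually calls):

#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    // Target ~16.666 ms per frame for 60 FPS.
    const auto targetFrameTime = std::chrono::duration<double, std::milli>(1000.0 / 60.0);

    bool quit = false;
    while (!quit) {
        auto frameStart = clock::now();

        // readInput();
        // runSimulation();
        // render();

        auto timeTaken = clock::now() - frameStart;
        // Sleep only for whatever is left of the frame budget (skip it if we are already late).
        if (timeTaken < targetFrameTime) {
            std::this_thread::sleep_for(targetFrameTime - timeTaken);
        }
    }
    return 0;
}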
This is not the best approach either (it has problems I can already see), and there are much better ways, so please do your own research; this is just a quick answer before your question gets closed. See this for more info, or search for "timesteps" on the Gamedev Stack Exchange.