I am having an issue where gravity varies severely with the frame rate. When I run at 160 fps, my player jumps a few meters into the air and then falls, but at about 10 fps the player jumps only half a meter before falling. My gravity code is as follows:
public void fall(long delta) {
    float increase = acceleration * (delta / 1000000000f); // converts delta time (nanoseconds) to seconds
    if (player.y + velocity + increase < -1.15f) {
        if (velocity + increase < terminal_velocity) {
            velocity += increase;
        }
        player.y += velocity;
    } else {
        player.y = -1.15f;
        velocity = 0;
    }
}
And where I call it:
while (!close_request) {
    now = getTime();
    int delta = getDelta(now);
    player.fall(delta);
    ........other functions.........
}
I thought using the delta would keep the velocity from changing too fast or too slow, but it actually made things a bit worse. I think this is because, as the time between frames increases, so does the increase in velocity, which makes the player fall abnormally fast; conversely, as the FPS increases, the player jumps much, much higher. Any ideas?
Your problem is in this line:
player.y += velocity;
which fails to take into account that velocity is "distance divided by time".
You're correctly modelling acceleration:

v = u + a * t // v = current velocity, u = initial velocity, a = acceleration, t = time

but not distance, which for a small enough delta is:

delta_s = v * delta_t // delta_s = change in position over the frame

You need to multiply velocity by delta (converted to seconds) before adding it to the position. As written, player.y += velocity adds a whole velocity's worth of distance every frame, so the total movement per second is proportional to the frame rate, which is exactly the behaviour you're seeing.
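Applied to your method, the fix might look something like the sketch below. It keeps your field names (acceleration, velocity, terminal_velocity, player) and your ground check at -1.15f exactly as they appear in your snippet; the only substantive change is that the velocity is scaled by the frame time in seconds before it touches the position:

public void fall(long delta) {
    float dt = delta / 1000000000f;                 // frame time in seconds
    float increase = acceleration * dt;             // v = u + a * t

    // project the next position using velocity scaled by dt, not raw velocity
    if (player.y + (velocity + increase) * dt < -1.15f) {
        if (velocity + increase < terminal_velocity) {
            velocity += increase;
        }
        player.y += velocity * dt;                  // delta_s = v * delta_t
    } else {
        player.y = -1.15f;                          // snap to the ground plane
        velocity = 0;
    }
}

With the position update scaled by dt, a frame at 10 fps moves the player ten times as far as a frame at 100 fps, but the distance covered per second stays roughly the same, so jump height no longer depends on the frame rate. For even tighter consistency you could go further and run the physics on a fixed timestep, but the delta scaling alone should remove the behaviour you're describing.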