I am making a climbing system in a Unity project where your speed changes depending on the direction you move: in my case, a speed of 1 for going up, 1.5 for going sideways, and 2 for going down.
This is how I calculate angles right now:
float angle = Mathf.Atan2(
    this.characterController.GetAxisControlValue(CharacterAxisControl.Vertical),
    this.characterController.GetAxisControlValue(CharacterAxisControl.Horizontal)) * Mathf.Rad2Deg;

// normalize to the 0..360 range
angle %= 360.0f;
if (angle < 0.0f)
{
    angle += 360.0f;
}
GetAxisControlValue returns a value between -1 and 1. Now I need to find out how to get the speed for directions in between those points, like this: Example
I'm searching for a formula that can solve this problem. Could anyone assist me, please?
If you want the speed to be proportional to the angle (measuring the angle from straight up, so 0° = up, 180° = down), then it's easy:

var a = angle % 360;
if (a > 180) a = 360 - a; // fold, so that e.g. 200° gives the same speed as its mirror 160°
var speed = a / 180 + 1;
This will give you:

0 deg   → 1
90 deg  → 1.5
180 deg → 2
270 deg → 1.5
45 deg  → 1.25
150 deg → 1.83  // this is your picture example
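As a quick numeric check of that mapping (a Python sketch, since it's plain arithmetic; `proportional_speed` is a made-up name, and the angle is folded into 0..180 so mirrored directions such as 200° and 160° give the same speed):

```python
def proportional_speed(angle):
    # fold the angle into 0..180 so mirrored directions match
    a = angle % 360.0
    if a > 180.0:
        a = 360.0 - a
    return a / 180.0 + 1.0

for deg in (0, 90, 180, 270, 45, 150):
    print(deg, round(proportional_speed(deg), 2))
```

This reproduces the table above: 1, 1.5, 2, 1.5, 1.25, 1.83.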
If you want arbitrary speeds, you can use linear interpolation. Let's say you want speed Vu for going up, Vs for going sideways, and Vd for going down.
var a = angle % 360;
if (a > 180) a = 360 - a;             // fold: we only care about the vertical direction
var t = a / 90;                       // 0 = up, 1 = sideways, 2 = down
var speed = t < 1
    ? Vu * (1 - t) + Vs * t           // interpolate between up and sideways (0°..90°)
    : Vs * (2 - t) + Vd * (t - 1);    // interpolate between sideways and down (90°..180°)
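Here is the interpolation written out as a testable sketch (Python for brevity; `climb_speed` and its parameter names are mine). With Vu = 1, Vs = 1.5, Vd = 2 it reproduces the same numbers as the proportional formula:

```python
def climb_speed(angle, vu, vs, vd):
    a = angle % 360.0
    if a > 180.0:
        a = 360.0 - a          # fold: 200 deg behaves like 160 deg
    t = a / 90.0               # 0 = up, 1 = sideways, 2 = down
    if t < 1.0:
        return vu * (1.0 - t) + vs * t          # blend up -> sideways
    return vs * (2.0 - t) + vd * (t - 1.0)      # blend sideways -> down

print(climb_speed(150, 1, 1.5, 2))  # the picture example, ~1.83
```

Any speed triple works, e.g. climb_speed(45, 0.5, 3, 10) blends halfway between 0.5 and 3.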