I can quite easily calculate the point of intersection given two lines. If I start with two vertices:
(x1,y1)
(x2,y2)
I can calculate the slope with (y1-y2)/(x1-x2), and then the intercept with y1 - slope * x1.
Then I do that again for the second line, so I have two sets of slope and intercept, and then just do:
x = (intercept2 - intercept1) / (slope1 - slope2)
y = slope1 * x + intercept1
(disclaimer: this might not even work, but I've gotten something very close to it to work, and it illustrates my general technique)
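For reference, here's roughly what I mean in code. It uses floating point (which, as noted below, I can't actually use), and the coordinate values are just made up for illustration:

#include <stdio.h>

int main(void) {
    // Two endpoints per line; the values are placeholders.
    double x1 = 0, y1 = 1, x2 = 10, y2 = 2;   // line 1
    double x3 = 0, y3 = 5, x4 = 10, y4 = 0;   // line 2

    double slope1 = (y1 - y2) / (x1 - x2);
    double intercept1 = y1 - slope1 * x1;
    double slope2 = (y3 - y4) / (x3 - x4);
    double intercept2 = y3 - slope2 * x3;

    // Intersection (assumes the lines are neither vertical nor parallel).
    double x = (intercept2 - intercept1) / (slope1 - slope2);
    double y = slope1 * x + intercept1;
    printf("intersection at (%f, %f)\n", x, y);
    return 0;
}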
BUT that only works with data types that have decimals, i.e. non-integral types. Say the vertices are:
(0,1)
(10,2)
Calculating the slope would result in (1-2)/(0-10), which is -1/-10. With integer division that is not 1/10, it is 0.
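In code, the truncation looks like this (plain ints here, just to show the point):

#include <stdio.h>

int main(void) {
    int x1 = 0, y1 = 1, x2 = 10, y2 = 2;
    int slope = (y1 - y2) / (x1 - x2);   // -1 / -10 truncates to 0
    printf("%d\n", slope);               // prints 0
    return 0;
}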
How can I get code that yields a valid result using only integers?
Edit: I can't use floats AT ALL! No casting, no nothing. Also, values are capped at 65535, and everything is unsigned.
In high school, when subtracting fractions, our teachers taught us to find a common denominator.
So 1/4 - 1/6 = 3/12 - 2/12 = 1/12
So do the same with your slopes.
// Represent slope1 as the fraction n1/d1 and slope2 as n2/d2.
// Don't actually divide them -- that integer division is what loses the information.
// All divisions below have a remainder of 0.
int g = gcd( d1, d2 );               // gcd( 4, 6 ) = 2
int d = d1 / g * d2;                 // common denominator (12 above); divide first to avoid overflow
int n = (d/d1) * n1 - (d/d2) * n2;   // numerator (1 in 1/12 above)
// n1/d1 - n2/d2 == n/d
I hope I got that right.
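To double-check, here's a small stand-alone sketch of the same idea. gcd is just Euclid's algorithm, and I've made everything unsigned since the question says so (which also means the subtraction order matters, or it wraps around):

#include <stdio.h>

// Euclid's algorithm for the greatest common divisor.
unsigned gcd(unsigned a, unsigned b) {
    while (b != 0) {
        unsigned t = a % b;
        a = b;
        b = t;
    }
    return a;
}

int main(void) {
    // Slopes kept as numerator/denominator pairs: 1/4 and 1/6.
    unsigned n1 = 1, d1 = 4;
    unsigned n2 = 1, d2 = 6;

    unsigned g = gcd(d1, d2);                     // gcd(4, 6) = 2
    unsigned d = d1 / g * d2;                     // common denominator: 12
    unsigned n = (d / d1) * n1 - (d / d2) * n2;   // 3 - 2 = 1

    printf("%u/%u - %u/%u = %u/%u\n", n1, d1, n2, d2, n, d);   // 1/4 - 1/6 = 1/12
    return 0;
}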