I'm trying to write a function that returns true if a ray intersects a sphere, and the code I'm referencing goes something like this:
    // given Sphere and Ray as arguments
    invert the Sphere matrix
    make new Ray object
    origin of this object = old Ray origin * inverted Sphere matrix
    direction = old Ray direction * inverted Sphere matrix
    a = |new direction|^2
    b = dot product of new origin and new direction
    c = |new origin|^2 - 1
    det = b*b - a*c
    if det > 0 there is an intersection
I'm stuck at understanding why we need to invert the Sphere matrix first and then multiply the Ray's origin and direction by it. I'm also confused about how to derive the quadratic coefficients a, b, and c at the end. I know I have to combine the parametric equation for a ray (p + t*d) with the implicit equation for a unit sphere (x · x - 1 = 0), but I can't figure out how to do so.
You need to invert the sphere's matrix so that the ray is expressed in the sphere's local coordinate frame, where the sphere is just the unit sphere at the origin. If the sphere is not scaled, this is the same as simply setting new_origin = origin - sphere_center and keeping the original direction.
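Here is a minimal sketch of that step, assuming the sphere matrix is a pure translation so its inverse amounts to subtracting the center (Vec3, Ray, and toSphereSpace are placeholder names, not from your code). In the general case you would multiply by the full inverse matrix instead, treating the origin as a point (w = 1, so the translation applies) and the direction as a vector (w = 0, so the translation is ignored):

    struct Vec3 { double x, y, z; };
    struct Ray  { Vec3 origin, dir; };

    Vec3 sub(const Vec3 &a, const Vec3 &b) {
        return { a.x - b.x, a.y - b.y, a.z - b.z };
    }

    // Re-express the ray in the sphere's local frame.  With a
    // translation-only sphere matrix, inverting it boils down to
    // subtracting the sphere's center from the ray origin.
    Ray toSphereSpace(const Ray &ray, const Vec3 &sphereCenter) {
        Ray local;
        local.origin = sub(ray.origin, sphereCenter);  // point: translation applies
        local.dir    = ray.dir;                        // vector: translation ignored
        return local;
    }

Once the ray is in this local frame, the intersection equation is formed from the formula: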
    |new_dir * t + new_origin|^2 = r^2    (presumably r = 1)
If you expand it, you get:
    |new_dir|^2 * t^2 + 2*(new_origin · new_dir) * t + |new_origin|^2 - r^2 = 0

Note that the b in the code is only (new_origin · new_dir), i.e. half of the linear coefficient, so the usual discriminant B^2 - 4*A*C comes out as 4*(b*b - a*c); since only the sign matters for deciding whether there is a hit, the code simply tests b*b - a*c.
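Putting it together, here is a sketch of the whole test in C++, assuming the ray has already been brought into the sphere's local frame (the names are placeholders, not your renderer's API):

    struct Vec3 { double x, y, z; };   // same Vec3 as above, repeated so this stands alone

    double dot(const Vec3 &a, const Vec3 &b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Boolean ray / unit-sphere test in the sphere's local frame:
    // 'origin' and 'dir' are the already transformed ray origin and direction.
    bool intersectsUnitSphere(const Vec3 &origin, const Vec3 &dir) {
        double a   = dot(dir, dir);              // |new_dir|^2
        double b   = dot(origin, dir);           // half of the linear coefficient
        double c   = dot(origin, origin) - 1.0;  // |new_origin|^2 - r^2, with r = 1
        double det = b * b - a * c;              // reduced discriminant
        return det > 0.0;                        // two real roots: the line crosses the sphere
    }

Strictly, this only tells you that the infinite line through the ray hits the sphere. A full intersector would also solve for t = (-b ± sqrt(det)) / a and check that at least one root is non-negative, so the hit point lies in front of the ray origin.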