Tags: c++, raycasting, raytracing

How to calculate a ray-plane intersection


How do I calculate the intersection between a ray and a plane?

The following code produces incorrect results.

float denom = normal.dot(ray.direction);

if (denom > 0)
{
    float t = -((center - ray.origin).dot(normal)) / denom;

    if (t >= 0)
    {
        rec.tHit = t;
        rec.anyHit = true;
        computeSurfaceHitFields(ray, rec);
        return true;
    }
}

The input variables are described as follows:

  • ray represents the ray object.
  • ray.direction is the ray's direction vector.
  • ray.origin is the ray's origin point.
  • rec represents the result object.
  • rec.tHit is the ray parameter t at the hit point.
  • rec.anyHit is a boolean set to true when an intersection is found.

My function has access to the plane:

  • center (a point on the plane) and normal (the plane's normal vector) define the plane
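
A minimal sketch of the types these member names imply, assuming a simple Vec3 with a dot product (the actual classes in the project may differ):

struct Vec3
{
    float x, y, z;

    Vec3 operator-(const Vec3& v) const { return {x - v.x, y - v.y, z - v.z}; }
    float dot(const Vec3& v) const { return x * v.x + y * v.y + z * v.z; }
};

struct Ray
{
    Vec3 origin;    // ray.origin
    Vec3 direction; // ray.direction
};

struct HitRecord
{
    float tHit = 0.0f;   // ray parameter t at the hit point
    bool anyHit = false; // set to true when an intersection is found
};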

Solution

  • As wonce commented, you also want to allow the denominator to be negative; otherwise you will miss intersections with the front face of your plane. However, you still need a test to avoid a division by zero, which would indicate that the ray is parallel to the plane. You also have a superfluous negation in your computation of t. Overall, it should look like this:

    float denom = normal.dot(ray.direction);
    if (std::abs(denom) > 0.0001f) // your favorite epsilon; std::abs needs <cmath>
    {
        float t = (center - ray.origin).dot(normal) / denom;
        if (t >= 0) return true; // you might want to allow an epsilon here too
    }
    return false;
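
  • For completeness, here is a sketch of the whole routine with the fix applied and the hit record filled in as in the original code. The function name intersectPlane is arbitrary, the Vec3/Ray/HitRecord types are the assumptions sketched above, and computeSurfaceHitFields is taken from the question. The formula follows from substituting the ray p = origin + t * direction into the plane equation (p - center).dot(normal) = 0 and solving for t.

    #include <cmath>

    // Defined elsewhere in the question's code.
    void computeSurfaceHitFields(const Ray& ray, HitRecord& rec);

    bool intersectPlane(const Ray& ray, HitRecord& rec,
                        const Vec3& center, const Vec3& normal)
    {
        float denom = normal.dot(ray.direction);
        if (std::abs(denom) > 0.0001f) // reject near-parallel rays
        {
            float t = (center - ray.origin).dot(normal) / denom;
            if (t >= 0) // hit is in front of the ray origin
            {
                rec.tHit = t;
                rec.anyHit = true;
                computeSurfaceHitFields(ray, rec);
                return true;
            }
        }
        return false;
    }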