Tags: c++, raytracing

Ray Tracing: Sphere distortion due to Camera Movement


I am building a ray tracer from scratch. My question is: when I change the camera coordinates, the sphere is distorted into an ellipse, and I don't understand why this happens.

Here are some images to show the artifacts:

Sphere: 1 1 -1 1.0 (center, radius)
Camera: 0 0 5 0 0 0 0 1 0 45.0 1.0 (eye position, lookat, up, fovy, aspect)

[Image: sphere rendered correctly with camera 0 0 5 0 0 0 0 1 0 45.0 1.0]

But when I change the camera coordinates, the sphere looks distorted, as shown below:

Camera: -2 -2 2 0 0 0 0 1 0 45.0 1.0

[Image: sphere distorted into an ellipse]

I don't understand what is wrong. If someone can help that would be great!

I set my imagePlane as follows:

    // Computing the camera's orthonormal u, v, w axes as follows:
    {
        Vector a = Normalize(eye - lookat);  // from the look-at point toward the eye
        Vector b = up;                       // camera up vector
        m_w = a;
        m_u = b.cross(m_w);                  // right = up x w
        m_u.normalize();
        m_v = m_w.cross(m_u);                // true up = w x u
    }
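For reference, the basis construction above can be checked in isolation. The sketch below uses a stand-in `Vec` type and helper functions (`sub`, `dot`, `cross`, `normalize`, `makeBasis` are all hypothetical names, since the asker's `Vector` class isn't shown), but builds u, v, w with the same cross products:

```cpp
#include <cassert>
#include <cmath>

// Stand-in vector type and helpers (hypothetical; the asker's Vector class
// is not shown).
struct Vec {
    double x, y, z;
};

Vec sub(Vec a, Vec b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

Vec cross(Vec a, Vec b) {
    return {a.y * b.z - a.z * b.y,
            a.z * b.x - a.x * b.z,
            a.x * b.y - a.y * b.x};
}

Vec normalize(Vec a) {
    double len = std::sqrt(dot(a, a));
    return {a.x / len, a.y / len, a.z / len};
}

// Same construction as above: w points from the look-at point toward the eye,
// u = up x w is "right", v = w x u is the true "up".
void makeBasis(Vec eye, Vec lookat, Vec up, Vec& u, Vec& v, Vec& w) {
    w = normalize(sub(eye, lookat));
    u = normalize(cross(up, w));
    v = cross(w, u);  // already unit length, since w and u are orthonormal
}
```

For the second camera in the question (eye -2 -2 2, lookat at the origin, up 0 1 0) the three axes come out mutually orthogonal and unit length, so the distortion is not caused by a broken basis.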

After that, I compute a direction for each pixel from the camera position (eye), as shown below:

    // Then computing the directions as follows:
    int half_w = m_width * 0.5;
    int half_h = m_height * 0.5;

    double half_fy = fovy() * 0.5;
    double angle   = tan((M_PI * half_fy) / 180.0);  // fovy is in degrees

    for (size_t k = 0; k < pixels.size(); k++) {
        double j = pixels[k].x();   // column (width)
        double i = pixels[k].y();   // row (height)

        double XX = aspect() * angle * ((j - half_w) / (double)half_w);
        double YY =            angle * ((half_h - i) / (double)half_h);

        Vector dir = (m_u * XX + m_v * YY) - m_w;

        directions.push_back(dir);
    }

After that:

    for each dir:
        Ray ray(eye, dir);
        int depth = 0;
        t_color += Trace(g_primitive, ray, depth);
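For completeness, `Trace` will at some point intersect the ray with the sphere. A minimal ray–sphere intersection of that kind can be sketched as follows (`Vec3`, `dot3`, and `intersectSphere` are stand-in names; the actual `Trace` and sphere classes aren't shown):

```cpp
#include <cassert>
#include <cmath>
#include <optional>

struct Vec3 { double x, y, z; };
double dot3(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Smallest positive t with |o + t*d - c| = r, or nullopt on a miss.
std::optional<double> intersectSphere(Vec3 o, Vec3 d, Vec3 c, double r) {
    Vec3 oc = {o.x - c.x, o.y - c.y, o.z - c.z};
    double a = dot3(d, d);               // d need not be normalized
    double b = 2.0 * dot3(oc, d);
    double k = dot3(oc, oc) - r * r;
    double disc = b * b - 4.0 * a * k;
    if (disc < 0.0) return std::nullopt; // ray misses the sphere
    double s = std::sqrt(disc);
    double t = (-b - s) / (2.0 * a);     // nearer root first
    if (t > 1e-9) return t;
    t = (-b + s) / (2.0 * a);            // eye may be inside the sphere
    if (t > 1e-9) return t;
    return std::nullopt;                 // both roots behind the ray origin
}
```

Note that the quadratic is written for an unnormalized direction `d`, which matches the code above (dir is never normalized before being pushed).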

Solution

  • After a lot of experimenting, and with the help of everyone's comments, I was able to get my ray tracer working properly. Sorry for answering late, but I would like to close this thread with a few remarks.

    So, the code above is perfectly correct. Based on my own assumptions (as mentioned in the comments above), I decided to set my camera parameters that way.

    The distortion I described is normal behaviour of a perspective camera: objects away from the view axis get stretched by the perspective projection (as also mentioned above in the comments).

    I am getting good results now, but there are a few things to check while coding a ray tracer:

    1. Always take care of the radians-to-degrees (or vice versa) conversion while computing the FOV and aspect ratio. I did it as follows:
         double angle = tan((M_PI * 0.5 * fovy) / 180.0);  // fovy is in degrees
         double y = angle;
         double x = aspect * angle;
    

    2. While computing triangle intersections, make sure to implement the cross product properly.

    3. While intersecting a ray with several objects, make sure to keep the intersection at the minimum positive distance from the camera.
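Remark 2 in practice: below is a Möller–Trumbore-style triangle test in which the term order of the cross product matters; swapping any pair of terms flips the normal and breaks the test. `V3`, `sub3`, `dot3t`, `cross3`, and `hitTriangle` are stand-in names, not code from the original:

```cpp
#include <cassert>
#include <cmath>
#include <optional>

struct V3 { double x, y, z; };
V3 sub3(V3 a, V3 b)     { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot3t(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

V3 cross3(V3 a, V3 b) {
    return {a.y * b.z - a.z * b.y,   // note the cyclic order of the terms:
            a.z * b.x - a.x * b.z,   // swapping any pair flips the normal
            a.x * b.y - a.y * b.x};  // and breaks the intersection test
}

// Möller–Trumbore: distance t along the ray (o, d) to triangle (a, b, c),
// or nullopt on a miss.
std::optional<double> hitTriangle(V3 o, V3 d, V3 a, V3 b, V3 c) {
    V3 e1 = sub3(b, a), e2 = sub3(c, a);
    V3 p = cross3(d, e2);
    double det = dot3t(e1, p);
    if (std::fabs(det) < 1e-12) return std::nullopt;  // ray parallel to plane
    double inv = 1.0 / det;
    V3 s = sub3(o, a);
    double u = dot3t(s, p) * inv;                     // barycentric u
    if (u < 0.0 || u > 1.0) return std::nullopt;
    V3 q = cross3(s, e1);
    double v = dot3t(d, q) * inv;                     // barycentric v
    if (v < 0.0 || u + v > 1.0) return std::nullopt;
    double t = dot3t(e2, q) * inv;
    return (t > 1e-9) ? std::optional<double>(t) : std::nullopt;
}
```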
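Remark 3 as a sketch: reduce each object's intersection test to a hit distance and keep only the smallest positive one. This is illustrative code, not from the original; `closestHit` takes the per-object distances (negative meaning "missed") and returns the index of the nearest object:

```cpp
#include <cassert>
#include <cmath>
#include <limits>
#include <vector>

// Each entry of `ts` is the hit distance returned by one object's
// intersection test (negative = miss). Keep the smallest positive t.
int closestHit(const std::vector<double>& ts, double& bestT) {
    bestT = std::numeric_limits<double>::infinity();
    int bestIndex = -1;
    for (int i = 0; i < (int)ts.size(); ++i) {
        if (ts[i] > 1e-9 && ts[i] < bestT) {  // ignore misses and hits behind the eye
            bestT = ts[i];
            bestIndex = i;
        }
    }
    return bestIndex;  // -1 when every object was missed
}
```

Shading the first hit found instead of the nearest one is a classic bug: distant objects bleed through nearer ones.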

    Here's the result I got: [Image: ray-traced render of the final scene]

    Above is a very simple model (courtesy of UC Berkeley), which I ray traced.