Tags: c++, opengl, 3d, projection, qglwidget

Coordinate transformations and projection issues


I've been trying for a while to get my mouse coordinates converted into 3D space coordinates in an OpenGL scene.

Currently, my projections are a little bit of a mess (I think), and it doesn't seem to fully take my "camera" into account when I move around a scene. To check this, I draw a line.

My resize function:

    void oglWidget::resizeGL(int width, int height)
    {
        if (height == 0) {
            height = 1;
        }
        pMatrix.setToIdentity();
        pMatrix.perspective(fov, (float) width / (float) height, -1, 1);
        glViewport(0, 0, width, height);
    }
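(As an aside, `QMatrix4x4::perspective` builds the standard OpenGL perspective matrix, which assumes positive near- and far-plane distances. A hand-rolled, column-major sketch of that matrix, with an illustrative function name, for reference:)

```cpp
#include <cassert>
#include <cmath>

// Standard OpenGL-style perspective matrix, column-major, as built by
// QMatrix4x4::perspective or gluPerspective. Both nearPlane and farPlane
// are expected to be positive distances in front of the camera.
void makePerspective(double fovyDegrees, double aspect,
                     double nearPlane, double farPlane, double m[16])
{
    const double pi = 3.14159265358979323846;
    const double f = 1.0 / std::tan(fovyDegrees * pi / 180.0 / 2.0);
    for (int i = 0; i < 16; ++i) m[i] = 0.0;
    m[0]  = f / aspect;                                        // x scale
    m[5]  = f;                                                 // y scale
    m[10] = (farPlane + nearPlane) / (nearPlane - farPlane);   // z remap
    m[14] = 2.0 * farPlane * nearPlane / (nearPlane - farPlane);
    m[11] = -1.0;                                              // w = -z
}
```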

My rendering function (paintGL()) goes as follows:

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glPolygonMode( GL_FRONT_AND_BACK, GL_LINE );
    QMatrix4x4 mMatrix;
    QMatrix4x4 vMatrix;

    QMatrix4x4 cameraTransformation;
    cameraTransformation.rotate(alpha, 0, 1, 0);
    cameraTransformation.rotate(beta, 1, 0, 0);

    QVector3D cameraPosition = cameraTransformation * QVector3D(camX, camY, distance);
    QVector3D cameraUpDirection = cameraTransformation * QVector3D(0, 1, 0);
    vMatrix.lookAt(cameraPosition, QVector3D(camX, camY, 0), cameraUpDirection);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(cameraPosition.x(), cameraPosition.y(), cameraPosition.z(), camX, camY, 0, cameraUpDirection.x(), cameraUpDirection.y(), cameraUpDirection.z());

    shaderProgram.bind();
    shaderProgram.setUniformValue("mvpMatrix", pMatrix * vMatrix * mMatrix);
    shaderProgram.setUniformValue("texture", 0);

    for (int x = 0; x < tileCount; x++)
    {
        shaderProgram.setAttributeArray("vertex", tiles[x]->vertices.constData());
        shaderProgram.enableAttributeArray("vertex");
        shaderProgram.setAttributeArray("textureCoordinate", textureCoordinates.constData());
        shaderProgram.enableAttributeArray("textureCoordinate");
        glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, tiles[x]->image.width(), tiles[x]->image.height(), 0, GL_RGBA, GL_UNSIGNED_BYTE, tiles[x]->image.bits());
        glDrawArrays(GL_TRIANGLES, 0, tiles[x]->vertices.size());
    }
    shaderProgram.release();

And to create my ray:

    GLdouble modelViewMatrix[16];
    GLdouble projectionMatrix[16];
    GLint viewport[4];
    GLfloat winX, winY, winZ;
    glGetDoublev(GL_MODELVIEW_MATRIX, modelViewMatrix);
    glGetDoublev(GL_PROJECTION_MATRIX, projectionMatrix);
    glGetIntegerv(GL_VIEWPORT, viewport);

    winX = (float)x;
    winY = (float)viewport[3] - (float)y;
    glReadPixels( winX, winY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &winZ );

    GLdouble nearPlaneLocation[3];
    gluUnProject(winX, winY, 0, modelViewMatrix, projectionMatrix,
                 viewport, &nearPlaneLocation[0], &nearPlaneLocation[1],
                 &nearPlaneLocation[2]);

    GLdouble farPlaneLocation[3];
    gluUnProject(winX, winY, 1, modelViewMatrix, projectionMatrix,
                 viewport, &farPlaneLocation[0], &farPlaneLocation[1],
                 &farPlaneLocation[2]);

    QVector3D nearP = QVector3D(nearPlaneLocation[0], nearPlaneLocation[1],
                nearPlaneLocation[2]);
    QVector3D farP = QVector3D(farPlaneLocation[0], farPlaneLocation[1],
                farPlaneLocation[2]);

I feel like I'm using conflicting systems or something.

Should I be using different variables to manage my camera? I see talk of projection matrices, model-view matrices, and so on, but I don't see how to use those and still use the shader program. I'm still a novice when it comes to OpenGL.

So to clarify: I'm attempting to convert my mouse coordinates into 3D space coordinates. So far it semi-works; it just doesn't take camera rotation into account. I've confirmed that my problem lies in either the ray creation or the unprojection of coordinates, not in my actual ray-picking logic.
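(For reference, the first step of any such conversion is mapping the window-space mouse position into normalized device coordinates. A minimal sketch, with illustrative names; note that window y grows downward while NDC y grows upward:)

```cpp
#include <cassert>
#include <cmath>

// Normalized device coordinates: both axes range from -1 to +1.
struct Ndc { float x, y; };

// Map a window-space mouse position to NDC.
Ndc mouseToNdc(int mouseX, int mouseY, int width, int height)
{
    Ndc n;
    n.x = 2.0f * mouseX / width - 1.0f;  // left edge -> -1, right edge -> +1
    n.y = 1.0f - 2.0f * mouseY / height; // top edge -> +1, bottom edge -> -1
    return n;
}
```

For example, the center of an 800x600 viewport maps to (0, 0) and the top-left corner to (-1, +1).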


Solution

  • Here, I think I had a series of problems.

    1. I should have used -1 for my near plane, not 0: in NDC space the near plane sits at z = -1, while the 0-to-1 range applies to window-space depth (as used by gluUnProject).
    2. I had differing data types for my matrices (Qt ones, specifically), which meant I couldn't plug them directly into the unproject function.

    I solved these by doing the following:

    1. I installed the GLM library to normalize my data types
    2. I performed the matrix calculations myself
    • So I pulled the matrices into my ray-creation function and computed the inverse of the combined projection * view * model matrix (equivalently, inverse model × inverse view × inverse projection).
    • That inverse was then multiplied by two vectors built from the screen coordinate (which had to be converted to NDC space): one with a z of -1 and one with a z of +1, corresponding to the near and far planes of window space. Both vectors also needed a w value of 1, so that each resulting matrix * vector product could be divided by its own w component (the perspective divide).
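The manual unprojection described above can be sketched as follows. The helper name and matrix layout are illustrative; it assumes the caller has already computed the inverse of projection * view * model (e.g. with glm::inverse) in OpenGL's column-major layout:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Unproject one NDC-space point through a precomputed inverse MVP matrix
// (column-major, 16 doubles). The near- and far-plane points of the pick
// ray come from calling this with ndcZ = -1 and ndcZ = +1 respectively.
Vec3 unproject(const double invMvp[16], double ndcX, double ndcY, double ndcZ)
{
    // Homogeneous input vector with w = 1, multiplied as matrix * vector.
    double v[4] = { ndcX, ndcY, ndcZ, 1.0 };
    double out[4];
    for (int row = 0; row < 4; ++row) {
        out[row] = invMvp[0 * 4 + row] * v[0]
                 + invMvp[1 * 4 + row] * v[1]
                 + invMvp[2 * 4 + row] * v[2]
                 + invMvp[3 * 4 + row] * v[3];
    }
    // Perspective divide: divide by the resulting w component.
    return Vec3{ out[0] / out[3], out[1] / out[3], out[2] / out[3] };
}
```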

    I had a series of questions open here, because my initial ray-picking problem led to revealing a whole series of errors in my code. If anyone has come across the problem posted here, it may be worth checking out my other questions, as all of my problems stem from projection issues:

    • Here, I learned that ray creation actually requires some explicit calculation on my part.
    • Here, I learned that unProject wasn't working because I was trying to pull the model and view matrices using OpenGL functions, but I had never actually set them in the first place, since I built my matrices by hand. I solved that problem in two ways: I did the math manually, and I made all the matrices the same data type (they were mixed data types earlier, which led to issues as well).
    • And lastly, here, I learned that my order of operations was slightly off (a matrix must multiply the vector, not the reverse), that my near plane needs to be -1, not 0, and that the w value of the vector being multiplied with the matrices needed to be 1.

    Since my problem stretched over quite a length of time and took up several questions, I owe credit to a few individuals who helped me along the way: