I've written a small 2D engine in OpenGL while making a game. I'm using OpenGL ES 2, and the code compiles and runs on iOS and Mac OS X.
Now I'm extending it to support 3D, and I'm having a problem setting up the camera.
I've checked the code a hundred times and I can't find where the problem is, so maybe someone with experience in this can give me an idea.
This is the code I have. I'm posting the part where I think the problem might be, but if anything else is needed, just ask.
Matrix4 _getFrustumMatrix(float left, float right, float bottom, float top, float near, float far){
    Matrix4 res = Matrix4(2.0 * near / (right - left), 0, 0, 0,
                          0, 2.0 * near / (top - bottom), 0, 0,
                          (right + left) / (right - left), (top + bottom) / (top - bottom), -(far + near) / (far - near), -1.0,
                          0, 0, -2.0 * far * near / (far - near), 0);
    return res;
}
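For reference, this is the matrix the glFrustum documentation specifies (n = near, f = far); the constructor call above only reproduces it if Matrix4 reads its sixteen arguments column by column:

    2n/(r-l)   0           (r+l)/(r-l)    0
    0          2n/(t-b)    (t+b)/(t-b)    0
    0          0          -(f+n)/(f-n)   -2fn/(f-n)
    0          0          -1              0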
Matrix4 _getPerspectiveMatrix(float near, float far, float angleOfView){
    static float aspectRatio = float(SCREENW) / float(SCREENH);
    float top = near * tan(angleOfView * 3.1415927 / 360.0);
    float bottom = -top;
    float left = bottom * aspectRatio;
    float right = top * aspectRatio;
    return _getFrustumMatrix(left, right, bottom, top, near, far);
}
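As a quick sanity check of the field-of-view conversion (a standalone sketch, not part of the engine; the aspect ratio is hard-coded in place of SCREENW/SCREENH):

#include <cmath>
#include <cstdio>

int main() {
    // Same math as _getPerspectiveMatrix: angleOfView is the full vertical
    // FOV in degrees, so the half-angle in radians is angleOfView * pi / 360.
    float near = 1.0f, angleOfView = 45.0f;
    float aspectRatio = 4.0f / 3.0f;   // stand-in for SCREENW / SCREENH
    float top = near * std::tan(angleOfView * 3.1415927f / 360.0f);
    std::printf("top   = %f (expect ~0.414214 = tan(22.5 deg))\n", top);
    std::printf("right = %f\n", top * aspectRatio);
    return 0;
}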
Matrix4 _getLookAtMatrix(Vector3 eye, Vector3 at, Vector3 up){
    Vector3 forward, side;
    forward = at - eye;
    forward.normalize();
    side = forward ^ up;   // ^ is the cross product
    side.normalize();
    up = side ^ forward;
    Matrix4 res = Matrix4(side.x, up.x, -forward.x, 0,
                          side.y, up.y, -forward.y, 0,
                          side.z, up.z, -forward.z, 0,
                          0, 0, 0, 1);
    res.translate(Vector3(0 - eye));   // translate by -eye
    return res;
}
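For reference, gluLookAt is documented to build this matrix (f = normalized at - eye, s = f × up normalized, u = s × f) and then translate by -eye, which is what the function above attempts to reproduce:

     s.x   s.y   s.z  0
     u.x   u.y   u.z  0
    -f.x  -f.y  -f.z  0
     0     0     0    1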
void Scene3D::_deepRender(){
    cameraEye = Vector3(10, 0, 40);
    cameraAt = Vector3(0, 0, 0);
    cameraUp = Vector3(0, 1, 0);

    MatrixStack::push();
    Matrix4 projection = _getPerspectiveMatrix(1, 100, 45);
    Matrix4 view = _getLookAtMatrix(cameraEye, cameraAt, cameraUp);
    MatrixStack::set(projection * view);
    Space3D::_deepRender();
    MatrixStack::pop();
}
The drawn object is a representation of the axes, where x = red, y = green, z = blue, and it's located at (0,0,0).
If I put the eye at (0,0,40), everything looks as expected.
If I put the eye at (10,0,40), then the object is not drawn in the middle of the screen as it should be.
This is the Matrix4::translate method:
void Matrix4::translate(const Vector3& v) {
    a14 += a11 * v.x + a12 * v.y + a13 * v.z;
    a24 += a21 * v.x + a22 * v.y + a23 * v.z;
    a34 += a31 * v.x + a32 * v.y + a33 * v.z;
    a44 += a41 * v.x + a42 * v.y + a43 * v.z;
}
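In other words, the method computes M ← M · T(v), post-multiplying by the canonical translation matrix T(v), which carries v in its fourth column:

    1  0  0  v.x
    0  1  0  v.y
    0  0  1  v.z
    0  0  0  1

With a column-major layout and column vectors, the result of that product differs from M only in the fourth column, which is exactly the set of elements (a14, a24, a34, a44) the method touches.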
EDIT: To add some information:
Using _getLookAtMatrix() with these parameters:
cameraEye = Vector3(40,40,40);
cameraAt = Vector3(0,0,0);
cameraUp = Vector3(0,1,0);
Shouldn't that give me a matrix equivalent to this one?
Matrix4 view;
view.setIdentity();
view.translate(Vector3(0,0,-69.2820323)); // 69.2820323 is the length of Vector3(40,40,40)
view.rotate(45, Vector3(1,0,0));
view.rotate(-45, Vector3(0,1,0));
At least those transformations make sense to me, and the resulting image looks like what I'd expect. But this matrix and the one I get from _getLookAtMatrix() are very different:
view:
0.707106769, -0.49999997, 0.49999997, 0,
0, 0.707106769, 0.707106769, 0,
-0.707106769, -0.49999997, 0.49999997, 0,
0, 0, -69.2820358, 1
_getLookAtMatrix(cameraEye, cameraAt, cameraUp):
0.707106769, 0, -0.707106769, 0,
-0.408248276, 0.816496551, -0.408248276, 0,
0.577350259, 0.577350259, 0.577350259, 0,
-35.0483475, -55.7538719, 21.520195, 1
You seem to have some serious ordering inconsistencies in your matrix class.
For example, I assumed your Matrix4 constructor takes its arguments (the matrix elements) in column-major order; otherwise your functions wouldn't match the reference implementations of glFrustum and gluLookAt, and you would get completely screwed-up results.
And the code of your translate function also looks correct, since it has to modify the last column of the matrix, which consists of the elements a14, a24, a34 and a44.
But your printout of the view matrix suggests that translate actually modifies the last row, unless you print the matrix in column-major format and therefore transposed. But in that case the printout of _getLookAtMatrix suggests that the Matrix4 constructor takes its arguments in row-major order, which in turn invalidates other things.
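A quick way to pin down which part is the odd one out (a minimal diagnostic sketch, assuming only the Matrix4/Vector3 interface shown in the question) is to build a pure translation both ways and print both results:

// If the constructor really is column-major, these sixteen arguments put
// the translation (5, 6, 7) into the fourth COLUMN of the matrix.
Matrix4 a(1, 0, 0, 0,    // column 0
          0, 1, 0, 0,    // column 1
          0, 0, 1, 0,    // column 2
          5, 6, 7, 1);   // column 3: the translation

// translate() adds into a14/a24/a34/a44, i.e. also the fourth column.
Matrix4 b;
b.setIdentity();
b.translate(Vector3(5, 6, 7));

// Print a and b with whatever routine produced the dumps above: if the two
// prints differ, the constructor and translate() disagree about the layout.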
Of course all this also depends on how you send the matrices to OpenGL and how you use them in the vertex shader (I assume ES 2.0, since otherwise there would be no need for your own matrix library). If you are indeed using ES 1, then you still need to send the matrix elements to OpenGL in column-major order, but the translation has to sit in the last column and not the last row.
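For completeness, this is roughly what the ES 2.0 side has to look like (a sketch with hypothetical names: u_mvp, program and mvp.data() are assumptions, but the GL calls themselves are standard). Note that on ES 2.0 the transpose parameter of glUniformMatrix4fv must be GL_FALSE, so the sixteen floats you pass must already be laid out column-major:

// Vertex shader (GLSL ES): with column-major data the multiplication must
// be matrix * vector; writing a_position * u_mvp would transpose the effect.
const char* vsrc =
    "attribute vec4 a_position;\n"
    "uniform mat4 u_mvp;\n"
    "void main() {\n"
    "    gl_Position = u_mvp * a_position;\n"
    "}\n";

// C++ side: upload the 16 floats in column-major order.
GLint loc = glGetUniformLocation(program, "u_mvp");
glUniformMatrix4fv(loc, 1, GL_FALSE, mvp.data()); // data(): hypothetical accessor returning the raw floats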
But no matter which convention you use, there is definitely a severe inconsistency inside your matrix code. Without seeing the whole Matrix4 class, the vertex shader, and the code where you upload the matrices to OpenGL, it is hard to tell exactly where this inconsistency is.