I've got a problem with moving and rotating objects in OpenGL. I'm using C# and OpenTK (Mono), but I suspect the problem is my understanding of the OpenGL side rather than the language, so you might be able to help even if you don't know anything about C# / OpenTK.
I'm reading the OpenGL SuperBible (latest edition) and I tried to rewrite the GLFrame class in C#. Here is the part I've already rewritten:
public class GameObject
{
    protected Vector3 vLocation;
    public Vector3 vUp;
    protected Vector3 vForward;

    public GameObject(float x, float y, float z)
    {
        vLocation = new Vector3(x, y, z);
        vUp = Vector3.UnitY;
        vForward = Vector3.UnitZ;
    }

    public Matrix4 GetMatrix(bool rotationOnly = false)
    {
        Matrix4 matrix;
        Vector3 vXAxis;
        Vector3.Cross(ref vUp, ref vForward, out vXAxis);

        matrix = new Matrix4();
        matrix.Row0 = new Vector4(vXAxis.X, vUp.X, vForward.X, vLocation.X);
        matrix.Row1 = new Vector4(vXAxis.Y, vUp.Y, vForward.Y, vLocation.Y);
        matrix.Row2 = new Vector4(vXAxis.Z, vUp.Z, vForward.Z, vLocation.Z);
        matrix.Row3 = new Vector4(0.0f, 0.0f, 0.0f, 1.0f);

        return matrix;
    }

    public void Move(float x, float y, float z)
    {
        vLocation = new Vector3(x, y, z);
    }

    public void RotateLocalZ(float angle)
    {
        // Create a rotation matrix around the forward (local Z) axis
        Matrix4 rotMat = Matrix4.CreateFromAxisAngle(vForward, angle);

        // Rotate the up vector by it (inlined 3x3 transform)
        Vector3 newVect;
        newVect.X = rotMat.M11 * vUp.X + rotMat.M12 * vUp.Y + rotMat.M13 * vUp.Z;
        newVect.Y = rotMat.M21 * vUp.X + rotMat.M22 * vUp.Y + rotMat.M23 * vUp.Z;
        newVect.Z = rotMat.M31 * vUp.X + rotMat.M32 * vUp.Y + rotMat.M33 * vUp.Z;
        vUp = newVect;
    }
}
So I create a new GameObject (GLFrame) at some arbitrary coordinates with GameObject go = new GameObject(0, 0, 5); and rotate it a bit with go.RotateLocalZ(rotZ);. Then I get the matrix using Matrix4 matrix = go.GetMatrix(); and render the frame (first I load the viewing matrix, then multiply it by the modeling matrix):
protected override void OnRenderFrame(FrameEventArgs e)
{
    base.OnRenderFrame(e);
    this.Title = "FPS: " + (1 / e.Time).ToString("0.0");

    GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);

    GL.MatrixMode(MatrixMode.Modelview);
    GL.LoadIdentity();

    Matrix4 modelmatrix = go.GetMatrix();
    Matrix4 viewmatrix = Matrix4.LookAt(new Vector3(5, 5, -10), Vector3.Zero, Vector3.UnitY);

    GL.LoadMatrix(ref viewmatrix);
    GL.MultMatrix(ref modelmatrix);

    DrawCube(new float[] { 0.5f, 0.4f, 0.5f, 0.8f });

    SwapBuffers();
}
DrawCube(float[] color) is my own method for drawing a cube.
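Its exact implementation shouldn't matter for the question; any immediate-mode cube works, for example something along the lines of this simplified sketch (not my exact code):

private void DrawCube(float[] color)
{
    // Illustrative stand-in: a unit cube drawn with immediate-mode quads.
    GL.Color4(color[0], color[1], color[2], color[3]);
    GL.Begin(PrimitiveType.Quads); // BeginMode.Quads in older OpenTK versions
    // Front (z = +0.5) and back (z = -0.5)
    GL.Vertex3(-0.5f, -0.5f,  0.5f); GL.Vertex3( 0.5f, -0.5f,  0.5f);
    GL.Vertex3( 0.5f,  0.5f,  0.5f); GL.Vertex3(-0.5f,  0.5f,  0.5f);
    GL.Vertex3(-0.5f, -0.5f, -0.5f); GL.Vertex3(-0.5f,  0.5f, -0.5f);
    GL.Vertex3( 0.5f,  0.5f, -0.5f); GL.Vertex3( 0.5f, -0.5f, -0.5f);
    // Top (y = +0.5) and bottom (y = -0.5)
    GL.Vertex3(-0.5f,  0.5f, -0.5f); GL.Vertex3(-0.5f,  0.5f,  0.5f);
    GL.Vertex3( 0.5f,  0.5f,  0.5f); GL.Vertex3( 0.5f,  0.5f, -0.5f);
    GL.Vertex3(-0.5f, -0.5f, -0.5f); GL.Vertex3( 0.5f, -0.5f, -0.5f);
    GL.Vertex3( 0.5f, -0.5f,  0.5f); GL.Vertex3(-0.5f, -0.5f,  0.5f);
    // Right (x = +0.5) and left (x = -0.5)
    GL.Vertex3( 0.5f, -0.5f, -0.5f); GL.Vertex3( 0.5f,  0.5f, -0.5f);
    GL.Vertex3( 0.5f,  0.5f,  0.5f); GL.Vertex3( 0.5f, -0.5f,  0.5f);
    GL.Vertex3(-0.5f, -0.5f, -0.5f); GL.Vertex3(-0.5f, -0.5f,  0.5f);
    GL.Vertex3(-0.5f,  0.5f,  0.5f); GL.Vertex3(-0.5f,  0.5f, -0.5f);
    GL.End();
}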
Now the most important part: if I render the frame without the GL.MultMatrix(ref matrix); call and instead use GL.Translate() and GL.Rotate(), it works (second screenshot). However, if I skip those two calls and pass the modeling matrix directly to OpenGL with GL.MultMatrix(), it draws something strange (first screenshot).
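Roughly, the working variant amounts to something like this (simplified sketch; rotZdegrees is just a hypothetical field holding the angle in degrees, since GL.Rotate expects degrees):

GL.LoadMatrix(ref viewmatrix);
GL.Translate(0.0f, 0.0f, 5.0f);           // same position as new GameObject(0, 0, 5)
GL.Rotate(rotZdegrees, 0.0f, 0.0f, 1.0f); // rotate around the local Z (forward) axis
DrawCube(new float[] { 0.5f, 0.4f, 0.5f, 0.8f });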
Can you help me and explain where the problem is? Why does it work with the translate and rotate methods, but not when multiplying the view matrix by the modeling matrix?
OpenGL transformation matrices are ordered column-wise (column-major), so the matrix you are passing is interpreted as the transpose of what you intend. You should use the transpose of the matrix you are using.
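For example, a minimal sketch of the fix in OnRenderFrame, keeping everything else as posted:

Matrix4 modelmatrix = Matrix4.Transpose(go.GetMatrix()); // flip the row/column ordering
Matrix4 viewmatrix = Matrix4.LookAt(new Vector3(5, 5, -10), Vector3.Zero, Vector3.UnitY);

GL.LoadMatrix(ref viewmatrix);
GL.MultMatrix(ref modelmatrix);

Alternatively, build the matrix transposed in GetMatrix itself: put the axis vectors into Row0-Row2 and the location into Row3, which is the convention OpenTK's own helpers like Matrix4.LookAt follow.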