I'm trying to transform 3D point coordinates to 2D screen coordinates.
The problem is that when I implemented it and ran the program, changing the camera position caused no change in the output. The output was also generally outside the screen coordinate range, even though I could see the 3D model completely (so no point or triangle should have been off-screen).
The method that converts the 3D coordinates to 2D ones:
public Vector2f get2DFrom3D(float x, float y, float z)
{
    FloatBuffer screen_coords = BufferUtils.createFloatBuffer(4);
    IntBuffer viewport = BufferUtils.createIntBuffer(16);
    FloatBuffer modelview = BufferUtils.createFloatBuffer(16);
    FloatBuffer projection = BufferUtils.createFloatBuffer(16);

    GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, modelview);
    System.out.println("modelview:");
    displayFloatBuffer(modelview);

    GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, projection);
    System.out.println("projection:");
    displayFloatBuffer(projection);

    GL11.glGetInteger(GL11.GL_VIEWPORT, viewport);
    System.out.println("viewport:");
    displayIntBuffer(viewport);

    boolean result = GLU.gluProject(x, y, z, modelview, projection, viewport, screen_coords);
    if (result)
    {
        System.out.printf("Convert [ %6.2f %6.2f %6.2f ] -> Screen [ %4d %4d ]\n", x, y, z, (int)screen_coords.get(0), (int)(screen_coords.get(3)-screen_coords.get(1)));
        return new Vector2f((int)screen_coords.get(0), (int)(screen_coords.get(3)-screen_coords.get(1)));
    }
    else
    {
        return null;
    }
}
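(displayFloatBuffer and displayIntBuffer are just debug print helpers; a minimal sketch of what the float version looks like, matching the 4×4 output shown further down:)

// Debug helper: prints a 16-element buffer as a 4x4 block, using absolute
// indexing so the buffer's position is left untouched.
private void displayFloatBuffer(FloatBuffer buffer)
{
    for (int i = 0; i < 16; i++)
    {
        System.out.print(buffer.get(i) + ((i % 4 == 3) ? "\n" : " "));
    }
}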
The projection matrix is created via this method and loaded directly into the vertex shader:
private void createProjectionMatrix(){
    float aspectRatio = (float) Display.getWidth() / (float) Display.getHeight();
    float y_scale = (float) ((1f / Math.tan(Math.toRadians(FOV / 2f))) * aspectRatio);
    float x_scale = y_scale / aspectRatio;
    float frustum_length = FAR_PLANE - NEAR_PLANE;

    projectionMatrix = new Matrix4f();
    projectionMatrix.m00 = x_scale;
    projectionMatrix.m11 = y_scale;
    projectionMatrix.m22 = -((FAR_PLANE + NEAR_PLANE) / frustum_length);
    projectionMatrix.m23 = -1;
    projectionMatrix.m32 = -((2 * NEAR_PLANE * FAR_PLANE) / frustum_length);
    projectionMatrix.m33 = 0;
}
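("Loaded into the vertex shader" means the matrix is uploaded as a uniform. A minimal sketch of that upload in LWJGL 2, where programID and the location lookup are illustrative and the shader program is assumed to be bound via GL20.glUseProgram:)

// Look up the uniform once after linking the shader program.
int location = GL20.glGetUniformLocation(programID, "projectionMatrix");

// Upload the matrix: store() writes it in column-major (OpenGL) order.
FloatBuffer matrixBuffer = BufferUtils.createFloatBuffer(16);
projectionMatrix.store(matrixBuffer);
matrixBuffer.flip();
GL20.glUniformMatrix4(location, false, matrixBuffer);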
Vertex shader:
#version 400 core

in vec3 position;

uniform mat4 transformationMatrix;
uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;

void main(void) {
    vec4 worldPosition = transformationMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * viewMatrix * worldPosition;
}
Renderer:
public void render(Entity entity, int displayMode) {
    RawModel model = entity.getModel();
    shader.start();
    GL30.glBindVertexArray(model.getVaoID());
    GL20.glEnableVertexAttribArray(0);
    GL20.glEnableVertexAttribArray(1);
    Matrix4f transformationMatrix = Maths.createTransformationMatrix(entity.getPosition(), entity.getRotX(),
            entity.getRotY(), entity.getRotZ(), entity.getScale());
    shader.loadTransformationMatrix(transformationMatrix);
    GL11.glDrawElements(GL11.GL_TRIANGLES, model.getVertexAmount(), GL11.GL_UNSIGNED_INT, 0);
    GL20.glDisableVertexAttribArray(0);
    GL20.glDisableVertexAttribArray(1);
    GL30.glBindVertexArray(0);
    shader.stop();
}
Then I debugged the code and saw that these lines:
GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, modelView);
GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, projection);
are simply not picking up the real matrix values I defined; they return 4×4 identity matrices. I then searched for glMatrixMode to set those matrices, or for setting gl_ModelViewMatrix, but it turns out the OpenGL version I'm using no longer supports them.
So I think the problem is somehow related to those matrices, and I somehow need to set them. Last but not least, here is the output of the get2DFrom3D method:
modelview:
1.0 0.0 0.0 0.0
0.0 1.0 0.0 0.0
0.0 0.0 1.0 0.0
0.0 0.0 0.0 1.0
projection:
1.0 0.0 0.0 0.0
0.0 1.0 0.0 0.0
0.0 0.0 1.0 0.0
0.0 0.0 0.0 1.0
viewport:
0 0 1209 891
0 0 0 0
0 0 0 0
0 0 0 0
I think only the viewport is correct; the modelview and projection matrices look like they were never loaded with the values calculated for them.
Note: I'm currently using LWJGL 2.9.3.
I found a solution to my problem and it works fine. Here is my new get2DFrom3D method:
public Vector2f get2DFrom3D(float x, float y, float z)
{
    FloatBuffer screen_coords = BufferUtils.createFloatBuffer(4);
    IntBuffer viewport = BufferUtils.createIntBuffer(16);
    FloatBuffer modelview = BufferUtils.createFloatBuffer(16);
    FloatBuffer projection = BufferUtils.createFloatBuffer(16);

    // Build the modelview matrix on the CPU from the transformation and view
    // matrices. The transformation matrix is left as identity here, so the
    // result is just the view matrix, i.e. (x, y, z) is a world-space point.
    Matrix4f modelviewMatrix = new Matrix4f();
    Matrix4f transformationMatrix = new Matrix4f();
    Matrix4f.mul(transformationMatrix, Maths.createViewMatrix(camera), modelviewMatrix);

    // Store the matrices into buffers (column-major) and rewind them so
    // gluProject reads them from the beginning.
    modelviewMatrix.store(modelview);
    modelview.rewind();
    projectionMatrix.store(projection);
    projection.rewind();

    GL11.glGetInteger(GL11.GL_VIEWPORT, viewport);

    boolean result = GLU.gluProject(x, y, z, modelview, projection, viewport, screen_coords);
    if (result)
    {
        // gluProject returns OpenGL window coordinates, with the origin at
        // the bottom-left of the viewport.
        Vector2f vector = new Vector2f((int)screen_coords.get(0), (int)screen_coords.get(1));
        return vector;
    }
    else
    {
        return null;
    }
}
Instead of using glGetFloat to obtain the modelview and projection matrices, I used the matrices I had already created in my other classes. I store those matrices into buffers so gluProject can read them; after rewinding the buffers, I finally get the correct screen coordinates of a point from gluProject.
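For completeness, Maths.createViewMatrix isn't shown above. A minimal sketch of the kind of helper I mean, assuming a Camera with pitch, yaw, and position getters (the exact Camera API here is illustrative):

public static Matrix4f createViewMatrix(Camera camera) {
    Matrix4f viewMatrix = new Matrix4f();
    viewMatrix.setIdentity();
    // Rotate the world opposite to the camera's orientation.
    Matrix4f.rotate((float) Math.toRadians(camera.getPitch()), new Vector3f(1, 0, 0), viewMatrix, viewMatrix);
    Matrix4f.rotate((float) Math.toRadians(camera.getYaw()), new Vector3f(0, 1, 0), viewMatrix, viewMatrix);
    // Translate the world opposite to the camera's position.
    Vector3f cameraPos = camera.getPosition();
    Matrix4f.translate(new Vector3f(-cameraPos.x, -cameraPos.y, -cameraPos.z), viewMatrix, viewMatrix);
    return viewMatrix;
}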
Even though the problem is solved, I still don't know why these lines didn't work:
GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, modelview);
System.out.println("modelview:");
displayFloatBuffer(modelview);
GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, projection);
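My best guess, based on the glMatrixMode findings above: these glGet calls read the fixed-function matrix stack, which a shader-based pipeline never touches, so the stack still holds its default identity matrices. If you ever need to inspect the matrices the shader actually uses, you can read the uniforms back instead; a minimal sketch, where programID is illustrative:

// Read back the current value of the projectionMatrix uniform from the
// shader program itself, rather than the unused fixed-function stack.
FloatBuffer uniformValue = BufferUtils.createFloatBuffer(16);
int location = GL20.glGetUniformLocation(programID, "projectionMatrix");
GL20.glGetUniform(programID, location, uniformValue);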