I'm trying to get a working OpenGL renderer, but I'm stuck at a weird spot. I'm generating Model, View and Projection matrices that seem correct. GLIntercept gives me the following log:
glClear(GL_COLOR_BUFFER_BIT)
glEnableVertexAttribArray(0)
glUniformMatrix4fv(0,1,false,[... see below ...])
glBindBuffer(GL_ARRAY_BUFFER,1)
glVertexAttribPointer(0,3,GL_FLOAT,false,0,0000000000000000)
glDrawArrays(GL_POLYGON,0,3) GLSL=3
glDisableVertexAttribArray(0)
wglSwapBuffers(FFFFFFFF9D011AB3)=true
My MVP matrix is as follows:
[
0.050000 0.000000 0.000000 0.500000
0.000000 0.050000 0.000000 0.500000
0.000000 0.000000 5.000000 0.000000
0.000000 0.000000 -50.000000 1.000000
]
Here is the vertex shader I'm using:
#version 330
layout (location = 0) in vec3 vertex;
uniform mat4 mvp;
out vec4 color;
void main() {
gl_Position = mvp * vec4(vertex, 0.0);
color = vec4(1, 1, 1, 1);
}
and the fragment shader:
#version 330
in vec4 color;
out vec4 fragColor;
void main() {
fragColor = color;
}
This is how I calculate the projection matrix, and the coordinates are as simple as possible, a flattened cube:
[
new Vector3f(-1, -1, 0),
new Vector3f(1, -1, 0),
new Vector3f(1, 1, 0),
new Vector3f(-1, 1, 0)
]
Yet that renders to something weird on screen.
If I multiply the matrix by the same coordinates in-app, however, I get the expected results:
(0.45 0.45 0.00 1.00)
(0.55 0.45 0.00 1.00)
(0.55 0.55 0.00 1.00)
(0.45 0.55 0.00 1.00)
Could someone point me in the right direction to solve this?
The main problem is in your vertex shader:
gl_Position = mvp * vec4(vertex, 0.0);
When expanding regular 3D coordinates to homogeneous coordinates, you need to set the w
component to 1.0 for positions; with w set to 0.0 the vector is treated as a direction, so the translation part of the matrix is ignored. The statement should look like this:
gl_Position = mvp * vec4(vertex, 1.0);
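You can check this with your own numbers. Treating the rows of the MVP matrix exactly as printed above and multiplying by the first vertex with both values of w gives:
mvp * (-1, -1, 0, 1) = ( 0.45  0.45  0.00  1.00)
mvp * (-1, -1, 0, 0) = (-0.05 -0.05  0.00  0.00)
With w = 1.0 you get exactly the value you computed in-app. With w = 0.0 the translation column never contributes and the resulting w is 0, which breaks the perspective divide and clipping, hence the strange output.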
It's also slightly odd that you have 4 vertices but pass 3 as the vertex count (last argument) to the draw call, so only three of the four vertices are drawn:
glDrawArrays(GL_POLYGON, 0, 3);
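If the intent is to draw the whole quad, pass all four vertices. Below is a minimal sketch of the draw sequence in LWJGL 3 style Java; mvpLocation, mvpBuffer and vbo are placeholder names for your uniform location, a FloatBuffer holding the matrix, and the buffer object from your own setup code, and GL_TRIANGLE_FAN is used because GL_POLYGON is not available in a 3.x core profile (a triangle fan draws the same convex quad):
import java.nio.FloatBuffer;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL15.*;
import static org.lwjgl.opengl.GL20.*;

// Placeholder names: mvpLocation, mvpBuffer and vbo come from your own setup code.
void render(int mvpLocation, FloatBuffer mvpBuffer, int vbo) {
    glEnableVertexAttribArray(0);
    glUniformMatrix4fv(mvpLocation, false, mvpBuffer); // transpose = false
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0L);
    glDrawArrays(GL_TRIANGLE_FAN, 0, 4);               // draw all four vertices of the quad
    glDisableVertexAttribArray(0);
}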