I am trying to learn GLSL with Java and LWJGL, but I am currently having problems with a basic vertex shader.
Vertex Shader:
#version 120
void main() {
    gl_Position = gl_Vertex * 0.5;
}
Fragment Shader:
#version 120
void main() {
    gl_FragColor = vec4(1.0, 0.4, 0.4, 1.0);
}
Rendering code:
public void draw() {
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexPointer(3, GL_FLOAT, 0, 0L);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glDrawElements(GL_TRIANGLES, size, GL_UNSIGNED_INT, 0);
    glDisableClientState(GL_VERTEX_ARRAY);
}
The compilation yields no errors and the fragment shader works fine, tinting a simple triangle I draw. The only problem is that I don't see the triangle downscaled by the 0.5 factor as expected; its vertices remain unaltered. What am I doing wrong?
The problem is that gl_Vertex is a four-dimensional homogeneous vector, so multiplying it directly by a scalar does not produce the scaling you intended. When converting the homogeneous gl_Position to Cartesian coordinates, OpenGL divides the x, y, and z components by the w component: (0.5 * x) / (0.5 * w) = x / w, so the net result is no scaling at all.
gl_Position = gl_Vertex * 0.5;
This multiplies the x, y, z, and w components by 0.5. Change the code to scale only x, y, and z:
gl_Position = vec4(gl_Vertex.xyz * 0.5, gl_Vertex.w);
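To see why the divide cancels the scale, here is a minimal sketch in plain Java (no LWJGL needed; the class and method names are made up for illustration) of the homogeneous divide OpenGL performs after the vertex shader, (x, y, z, w) -> (x/w, y/w, z/w):

```java
public class HomogeneousDivide {

    // Convert a homogeneous 4D position to Cartesian coordinates,
    // mirroring the perspective divide OpenGL applies to gl_Position.
    static double[] toCartesian(double x, double y, double z, double w) {
        return new double[] { x / w, y / w, z / w };
    }

    public static void main(String[] args) {
        // A typical gl_Vertex with w = 1.
        double x = 0.8, y = 0.6, z = 0.0, w = 1.0;

        // gl_Position = gl_Vertex * 0.5 scales ALL four components,
        // so the divide by w cancels the scale: back to (0.8, 0.6, 0.0).
        double[] allScaled = toCartesian(x * 0.5, y * 0.5, z * 0.5, w * 0.5);

        // Scaling only x, y, z leaves w = 1, so the scale survives
        // the divide: (0.4, 0.3, 0.0).
        double[] xyzScaled = toCartesian(x * 0.5, y * 0.5, z * 0.5, w);

        System.out.printf("all four scaled: (%.2f, %.2f, %.2f)%n",
                allScaled[0], allScaled[1], allScaled[2]);
        System.out.printf("xyz only scaled: (%.2f, %.2f, %.2f)%n",
                xyzScaled[0], xyzScaled[1], xyzScaled[2]);
    }
}
```

Running it shows the first point unchanged and the second one halved, which is exactly the difference between the original shader and the corrected one.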