Tags: opengl, 3d, glsl, normals, geometry-shader

Faulty geometry shaders to visualise normals


Something is wrong with visualising my normals.

As you can see in this video, the normals are faulty and seem to move. The duck and sphere are .dae files loaded with Assimp, and I hard-coded the cube myself.

Here are my shaders:

Vertex shader:

#version 430 
layout(location = 0) in vec3 inPos;
layout(location = 1) in vec3 inNormal;
out vec4 vertex_normal;
void main(void) {
    gl_Position = vec4(inPos, 1.0);
    vertex_normal = vec4(inNormal, 1.0f);
}

Geometry Shader:

#version 430 
layout(triangles) in;
layout(line_strip, max_vertices = 6) out;

uniform mat4 projectionMatrix;
uniform mat4 viewMatrix;
uniform mat4 modelMatrix;
uniform mat4 normalMatrix;

in vec4 vertex_normal[3];
out vec4 outColor;

void main(void)
{   
    //The "NormalViewProjection" matrix for the normals
    mat4 NVP = projectionMatrix * viewMatrix * normalMatrix;
    //The ModelViewProjection matrix for the vertices
    mat4 MVP = projectionMatrix * viewMatrix * modelMatrix;

    //Normals transformed to screen space
    const vec4 normals[] = {
        normalize(NVP * vertex_normal[0]),
        normalize(NVP * vertex_normal[1]),
        normalize(NVP * vertex_normal[2])
    };

    const float normalLength = 1.2f;

    //gl_in.length() = 3, since we are working with triangles           
    for(int i = 0; i < gl_in.length(); i++)
    {
        outColor = normals[i];
        const vec4 position = MVP * gl_in[i].gl_Position;
        //First vertex
        gl_Position = position;
        EmitVertex();
        //Second vertex
        gl_Position = position + normals[i] * normalLength;
        EmitVertex();       
        //Send the line to the fragment shader
        EndPrimitive();
    }
}

Fragment Shader:

#version 430 
layout(location = 0) out vec4 fragColor;
in vec4 outColor;
void main(void)
{    
    fragColor = vec4(outColor.xyz, 1.0f);
}

I have tried multiple tutorials (tutorial1, tutorial2, and a few more) without results.


Solution

  • One has to distinguish between two different types of vectors here:

    • Position vectors, which describe a position in 3D space, for example vertices, and
    • Direction vectors, which describe a direction but not a position. Normals, for example, are such direction vectors.

    A projection only gives you correct results with position vectors, but not with direction vectors. And this is exactly where the problem in your code lies: you are treating normals (directions) as positions.
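
    A quick way to see the difference is the homogeneous w component: multiplied by a 4x4 matrix, a vector with w = 1.0 picks up the matrix's translation column (a position), while a vector with w = 0.0 ignores it (a direction). A minimal sketch, reusing the modelMatrix from your shader:

    vec3 n = vec3(0.0, 1.0, 0.0);
    vec4 asPosition  = modelMatrix * vec4(n, 1.0); //translation applied - wrong for a normal
    vec4 asDirection = modelMatrix * vec4(n, 0.0); //rotation/scale only - still a direction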

    In my opinion, the correct code should look something like this:

    ...
    for(int i = 0; i < gl_in.length(); i++)
    {
        const vec4 position = MVP * gl_in[i].gl_Position;
        //First vertex
        gl_Position = position;
        EmitVertex();
        //Second vertex
        const vec4 position2 = MVP * vec4(gl_in[i].gl_Position.xyz + vertex_normal[i].xyz, 1.0);
        gl_Position = position2;
        EmitVertex();
        //Close the line strip so each normal is drawn as a separate segment
        EndPrimitive();
    }
    ...
    

    Here the position of the tip of the normal is first calculated in model space (gl_in[i].gl_Position.xyz + vertex_normal[i].xyz) and then projected.
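
    Since the fragment shader still reads outColor, you also want to write it before each EmitVertex(), otherwise the line color is undefined. One possible choice (purely an example) is the absolute value of the model-space normal:

    outColor = vec4(abs(vertex_normal[i].xyz), 1.0);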

    Edit: If you use scaling in the model matrix and want all normals to end up with the same length, you might have to transform normals and positions to world space before adding them:

    mat4 VP = projectionMatrix * viewMatrix;
    
    ...
    for(int i = 0; i < gl_in.length(); i++)
    {
        const vec4 position_ws = modelMatrix * gl_in[i].gl_Position;
        //First vertex
        gl_Position = VP * position_ws;
        EmitVertex();
        //Second vertex
        //w = 0.0 so the normal is treated as a direction and stays unit length after normalize()
        const vec4 normal_ws = normalize(normalMatrix * vec4(vertex_normal[i].xyz, 0.0));
        gl_Position = VP * vec4(position_ws.xyz + normal_ws.xyz, 1.0);
        EmitVertex();
        EndPrimitive();
    }
    ...
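
    For completeness: normalMatrix is usually the transpose of the inverse of the model matrix, so that non-uniform scaling does not skew the normals. If you do not already compute it on the CPU, a quick (though not cheap, since inverse() runs per primitive) sketch directly in the geometry shader could be:

    //Upper-left 3x3 of the model matrix, inverted and transposed
    mat3 normalMat3 = transpose(inverse(mat3(modelMatrix)));
    vec3 normal_ws  = normalize(normalMat3 * vertex_normal[i].xyz);

    In practice you would upload this matrix once per draw call as a uniform instead of inverting it in the shader.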