Tags: c#, opengl, glsl

GLSL: Uniforms disappear when modifying the fragment shader


I want to render a huge number of points at once, so I pass all of the point data to a shader program via a vertex buffer object (VBO). Everything works fine so far.

[Image: my plane made out of points]

The buffer contains not only the position information but also some integers that represent properties of each point.

public struct DataVertex
{
    public Vector3 position;
    public int tileType;
    public int terrainType;

    public DataVertex(Vector3 position, int tileType, int terrainType)
    {
        this.position = position;
        this.tileType = tileType;
        this.terrainType = terrainType;
    }
}
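For context, here is a minimal sketch of how such an interleaved struct might be bound to the attribute locations, assuming OpenTK's OpenGL4 bindings (the question doesn't show this part, so the setup below is an assumption, not the project's actual code). The detail worth noting is that the two int attributes need GL.VertexAttribIPointer, since GL.VertexAttribPointer would hand them to the shader as floats.

// Sketch only (assumes using OpenTK.Graphics.OpenGL4; and System.Runtime.InteropServices;
// a VAO and VBO are already bound). Names and offsets are illustrative.
int stride = Marshal.SizeOf<DataVertex>();   // 3 * 4 bytes (Vector3) + 4 + 4 = 20 bytes

// location 0: float position -> VertexAttribPointer
GL.EnableVertexAttribArray(0);
GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, stride, 0);

// locations 1 and 2: int attributes -> VertexAttribIPointer, so they stay integers
GL.EnableVertexAttribArray(1);
GL.VertexAttribIPointer(1, 1, VertexAttribIntegerType.Int, stride, (IntPtr)12);

GL.EnableVertexAttribArray(2);
GL.VertexAttribIPointer(2, 1, VertexAttribIntegerType.Int, stride, (IntPtr)16);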

To test this, I want to vary the color of each point depending on these integers. To do that, I pass the integer data from the vertex shader through the geometry shader to the fragment shader, where I want to set the colors. But as soon as I check those values in the fragment shader (via if statements), the uniforms in the vertex shader are gone (optimized away?). Where is the problem?

Vertex shader:

#version 450 core

layout (location = 0) in vec3 aPos;
layout (location = 1) in int aTileType;
layout (location = 2) in int aTerrainType;

out DATA
{
    int tileType;
    int terrainType;
    mat4 projection;
} data_out;

uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;

void main()
{   
    gl_Position = vec4(aPos, 1.0) * model;

    data_out.tileType = aTileType;
    data_out.terrainType = aTerrainType;
    data_out.projection = view * projection;
}

Geometry shader:

#version 450 core

layout (points) in;
layout(points, max_vertices = 1) out;

in DATA
{
    int tileType;
    int terrainType;
    mat4 projection;
} data_in[];

out int oTileType;
out int oTerrainType;

void main()
{
    oTileType = data_in[0].tileType;
    oTerrainType = data_in[0].terrainType;
    gl_Position = gl_in[0].gl_Position * data_in[0].projection;
    EmitVertex();

    EndPrimitive();
}

Fragment shader:

#version 450 core

out vec4 FragColor;

in int oTileType;
in int oTerrainType;

void main()
{
    if (oTileType == 0)
        FragColor = vec4(0.0f, 0.0f, 1.0f, 1.00f);
    else if (oTileType == 1)
        FragColor = vec4(0.0f, 1.0f, 0.0f, 1.00f);
    
    else if (oTerrainType == 2)
        FragColor = vec4(1.0f, 1.0f, 0.0f, 1.00f);
}

THIS version of the fragment shader works:

#version 450 core

out vec4 FragColor;

in int oTileType;
in int oTerrainType;

void main()
{
    FragColor = vec4(1.0f, 1.0f, 0.0f, 1.00f);
}

If you are interested in the full source code, check out https://github.com/BanditBloodwyn/SimulateTheWorld

The relevant classes are:

  • SimulateTheWorld.Graphics.Rendering/Rendering/OpenGLRenderer.cs
  • SimulateTheWorld.Graphics.Data/OpenGL/ShaderProgram.cs
  • "point" shaders in SimulateTheWorld.Graphics.Resources/Rendering/Shaders/
  • SimulateTheWorld.Graphics.Data.Components.PointCloud.cs

Solution

  • Has the program compiled and linked at all? In GLSL ES 3.20 it is an error (either a compile or a link error) to pass an integral type between shader stages without the flat qualifier, because integers cannot be interpolated.

    I think that is why querying the uniforms fails: the program never linked, so the uniform locations are gone.

    The GLSL ES error codes are:

    S0055: Vertex output with integral type must be declared as flat

    S0056: Fragment input with integral type must be declared as flat

    I did not find the corresponding wording in the OpenGL and GLSL (core) documents, but the fix is the same: declare the integer varyings flat (see the sketch below).
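For completeness, desktop GLSL 4.50 has the same restriction: fragment shader inputs of integral type must be qualified with flat, because they cannot be interpolated. A minimal sketch of the corrected interface, keeping the names from the question (the geometry-shader change is shown as a comment so the snippet stays a valid fragment shader; the extra else branch is only there so FragColor is always written):

#version 450 core

// Geometry shader side (for reference):
//     flat out int oTileType;
//     flat out int oTerrainType;

out vec4 FragColor;

// Integral fragment inputs must carry the flat qualifier.
flat in int oTileType;
flat in int oTerrainType;

void main()
{
    if (oTileType == 0)
        FragColor = vec4(0.0, 0.0, 1.0, 1.0);
    else if (oTileType == 1)
        FragColor = vec4(0.0, 1.0, 0.0, 1.0);
    else if (oTerrainType == 2)
        FragColor = vec4(1.0, 1.0, 0.0, 1.0);
    else
        FragColor = vec4(1.0, 0.0, 1.0, 1.0); // fallback color
}

If the driver also complains about the integer members of the DATA block between the vertex and geometry shaders, qualifying those with flat as well is harmless.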