Tags: c#, opengl, opentk, opengl-4

OpenGL rendering: All vertices move to bounds of unit sphere


I am writing a renderer for my toy game engine, using OpenTK in C#. Since it will only be used by me, I am targeting OpenGL 4.5. After I implemented the basics, I tried to render the Utah Teapot. This is what I got:

[Screenshot: attempt to render the Utah Teapot]

When I render a simple cube, it renders fine. Once I add more vertices, everything starts to resemble a sphere. I use the .obj file format. My loader isn't the problem: I logged all vertex positions to the console, manually created another .obj file from them, imported it into Blender, and it displayed correctly. My vertex shader simply passes the data through, and my fragment shader outputs plain white. I haven't found anyone else describing this problem. I load the .obj file, build the vertex and index arrays, create a VAO, a VBO with the vertices, and an EBO with the indices. GL.GetError() reports nothing. I suspect the problem is somewhere in loading the model data into the VBO, but I just can't find it. This is my code for loading mesh data (a sketch of the Vertex struct it relies on follows the code):

private bool _initialized = false;
public readonly int _id, _vertexCount, _indexCount;
private readonly int _vboID, _iboID;
public Mesh(Vertex[] vertices, int[] indices)
{
    _vertexCount = vertices.Length;
    _indexCount = indices.Length;
    GL.CreateVertexArrays(1, out _id);

    GL.CreateBuffers(1, out _vboID);
    GL.NamedBufferStorage(
        _vboID,
        Vertex.Size * _vertexCount,   // Vertex is a struct with Vector3 position, Vector2 texCoord, Vector3 normal
        vertices,
        BufferStorageFlags.MapReadBit);

    GL.EnableVertexArrayAttrib(_id, 0); // _id is the VAO id provided by GL.CreateVertexArrays()
    GL.VertexArrayAttribFormat( 
        _id,
        0,                      
        3,                      // Vector3 - position
        VertexAttribType.Float, 
        false,                  
        0);                     // first item, offset 0
    GL.VertexArrayAttribBinding(_id, 0, 0);

    GL.EnableVertexArrayAttrib(_id, 1);
    GL.VertexArrayAttribFormat(
        _id,
        1,
        2,                      // Vector2 - texCoord
        VertexAttribType.Float,
        false,
        12);                    // sizeof(float) * (3) = (Size of Vector3)
    GL.VertexArrayAttribBinding(_id, 1, 0);

    GL.EnableVertexArrayAttrib(_id, 2);
    GL.VertexArrayAttribFormat(
        _id,
        0,
        3,                      // Vector3 - normal
        VertexAttribType.Float,
        false,
        20);                    // sizeof(float) * (3 + 2) = (Size of Vector3 + Vector2)
    GL.VertexArrayAttribBinding(_id, 2, 0);

    GL.VertexArrayVertexBuffer(_id, 0, _vboID, IntPtr.Zero, Vertex.Size);

    GL.CreateBuffers(1, out _iboID);
    GL.NamedBufferStorage(
        _iboID,
        sizeof(int) * _indexCount,
        indices,
        BufferStorageFlags.MapReadBit);
    GL.VertexArrayElementBuffer(_id, _iboID);
    _initialized = true;
}
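
For reference, a Vertex struct consistent with the offsets and the stride used above might look like the following (the question only describes the struct's fields, so the exact declaration here is an assumption):

using System.Runtime.InteropServices;
using OpenTK.Mathematics; // OpenTK 4.x; older versions keep the vector types in the OpenTK namespace

[StructLayout(LayoutKind.Sequential)]
public struct Vertex
{
    public Vector3 Position; // offset 0
    public Vector2 TexCoord; // offset 12 = 3 floats
    public Vector3 Normal;   // offset 20 = (3 + 2) floats

    // 8 floats in total; matches the stride passed to GL.VertexArrayVertexBuffer.
    public const int Size = 8 * sizeof(float);
}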

Solution

  • The issue is in the setup of the normal vector attribute. The second parameter of GL.VertexArrayAttribFormat is the attribute index, and in your case it has to be 2 instead of 0:

    GL.EnableVertexArrayAttrib(_id, 2);
    GL.VertexArrayAttribFormat(
        _id,
        2,  // <----------------------- 2 instead of 0
        3,                     
        VertexAttribType.Float,
        false,
        20);                    
    GL.VertexArrayAttribBinding(_id, 2, 0);
    

    An attribute index of 0 causes the vertex coordinate specification to be overwritten by the normal vectors. Normal vectors are unit vectors (length == 1), so if they are treated as vertex coordinates, every vertex is placed on the surface of the unit sphere, which produces the spherical shape.
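
    The attribute index is also what the layout(location = N) qualifier in the vertex shader refers to, which is why writing the normal format to index 0 silently replaced the position format. As a rough sketch of how the shader side lines up with this attribute setup (the question's actual shader source isn't shown, so the names and details here are assumptions):

    // Hypothetical pass-through shaders held in C# string constants; the
    // layout(location = N) values must match the attribute indices passed to
    // GL.EnableVertexArrayAttrib / GL.VertexArrayAttribFormat.
    const string VertexShaderSource = @"
        #version 450 core
        layout(location = 0) in vec3 aPosition;  // attribute index 0
        layout(location = 1) in vec2 aTexCoord;  // attribute index 1
        layout(location = 2) in vec3 aNormal;    // attribute index 2

        void main()
        {
            // Transformation matrices omitted for brevity.
            gl_Position = vec4(aPosition, 1.0);
        }";

    const string FragmentShaderSource = @"
        #version 450 core
        out vec4 fragColor;

        void main()
        {
            fragColor = vec4(1.0); // plain white, as described in the question
        }";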