Tags: opengl, vertex-buffer, vao

How do you convert an OpenGL project from the older glVertexAttribPointer methods to the newer glVertexAttribBinding methods?


I have an OpenGL project that previously used OpenGL 3.0-era methods for drawing arrays, and I'm trying to convert it to use the newer methods (available as of OpenGL 4.3). So far, however, I have not been able to make it work.

The piece of code I'll use for explanation creates groupings of points and draws lines between them. It can also fill the resulting polygon (using triangles). I'll give an example using the point-drawing routines. Here's the pseudo-code for how it used to work:

[Original] When points are first generated (happens once):

// ORIGINAL, WORKING CODE RUN ONCE DURING SETUP:
    // NOTE: This code is in c# and uses a library called SharpGL
    // to invoke OpenGL calls via a context herein called "gl"

    float[] xyzArray = [CODE NOT SHOWN -- GENERATES ARRAY OF POINTS]

    // Create a buffer for the vertex data
    // METHOD CODE NOT SHOWN, but uses glGenBuffers to fill class-member buffer IDs
    GenerateBuffers(gl);  // Note: we now have a VerticesBufferId

    // Set the vertex buffer as the current buffer and fill it
    gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, this.VerticesBufferId);
    gl.BufferData(OpenGL.GL_ARRAY_BUFFER, xyzArray, OpenGL.GL_STATIC_DRAW);

[Original] Within the loop that does the drawing:

// ORIGINAL, WORKING CODE EXECUTED DURING EACH DRAW LOOP:
    // NOTE: This code is in c# and uses a library called SharpGL
    // to invoke OpenGL calls via a context herein called "gl"

    // Note: positionAttributeId (below) was derived from the active
    // shader program via glGetAttribLocation 
    gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, this.VerticesBufferId);
    gl.EnableVertexAttribArray(positionAttributeId);
    gl.VertexAttribPointer(positionAttributeId, 3, OpenGL.GL_FLOAT, false, 0, IntPtr.Zero);
    
    // Missing code sets some uniforms in the shader program and determines
    // the start (iStart) and length (pointCount) of points to draw

    gl.DrawArrays(OpenGL.GL_LINE_STRIP, iStart, pointCount);

That code has worked for quite a while now, but I'm trying to move to more modern techniques. Most of my code didn't change at all, and this is the new code that replaced the above:

[After Mods] When points are first generated (happens once):

// MODIFIED CODE RUN ONCE DURING SETUP:
    // NOTE: This code is in c# and uses a library called SharpGL
    // to invoke OpenGL calls via a context herein called "gl"

    float[] xyzArray = [CODE NOT SHOWN -- GENERATES ARRAY OF POINTS]

    // Create a buffer for the vertex data
    // METHOD CODE NOT SHOWN, but uses glGenBuffers to fill class-member buffer IDs
    GenerateBuffers(gl);  // Note: we now have a VerticesBufferId.

    // Set the vertex buffer as the current buffer
    gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, this.VerticesBufferId);
    gl.BufferData(OpenGL.GL_ARRAY_BUFFER, xyzArray, OpenGL.GL_STATIC_DRAW);

// ^^^ ALL CODE ABOVE THIS LINE IS IDENTICAL TO THE ORIGINAL ^^^
    // Generate Vertex Arrays
    // METHOD CODE NOT SHOWN, but uses glGenVertexArrays to fill class-member array IDs
    GenerateVertexArrays(gl);  // Note: we now have a PointsArrayId

    // My understanding: I'm telling OpenGL to associate following calls
    // with the vertex array generated with the ID PointsArrayId...
    gl.BindVertexArray(PointsArrayId);

    // Here I associate the positionAttributeId (found from the shader program)
    // with the currently bound vertex array (right?)
    gl.EnableVertexAttribArray(positionAttributeId);

    // Here I tell the bound vertex array about the format of the position
    // attribute as it relates to that array -- I think.
    gl.VertexAttribFormat(positionAttributeId, 3, OpenGL.GL_FLOAT, false, 0);

    // As I understand it, I can define my own "local" buffer index
    // in the following calls (?).  Below I use 0, which I then bind
    // to the buffer with id = this.VerticesBufferId (for the purposes
    // of the currently bound vertex array)
    gl.VertexAttribBinding(positionAttributeId, 0);
    gl.BindVertexBuffer(0, this.VerticesBufferId, IntPtr.Zero, 0);

    gl.BindVertexArray(0); // we no longer want to be bound to PointsArrayId

[After Mods] Within the loop that does the drawing:

// MODIFIED CODE EXECUTED DURING EACH DRAW LOOP:
    // NOTE: This code is in c# and uses a library called SharpGL
    // to invoke OpenGL calls via a context herein called "gl"

    // Here I tell OpenGL to bind the VertexArray I established previously
    // (which should understand how to fill the position attribute in the
    // shader program using the "zeroth" buffer index tied to the 
    // VerticesBufferId data buffer -- because I went through all the trouble
    // of telling it that above, right?)
    gl.BindVertexArray(this.PointsArrayId);

    // \/ \/ \/ NOTE: NO CODE CHANGES IN THE CODE BELOW THIS LINE \/ \/ \/
    // Missing code sets some uniforms in the shader program and determines
    // the start (iStart) and length (pointCount) of points to draw

    gl.DrawArrays(OpenGL.GL_LINE_STRIP, iStart, pointCount);

After the modifications, the routines draw nothing to the screen. No exceptions are thrown, and there is no indication (that I can tell) of a problem executing the commands; it just leaves a blank screen.

General questions:

  1. Does my conversion to the newer vertex array methods look correct? Do you see any errors in the logic?
  2. Do I need to make any specific glEnable calls for this method to work that the old method didn't require?
  3. Can I mix and match between the two methods to fill attributes in the same shader program? (e.g., in addition to the above, I fill triangle data and use it with the same shader program; if I haven't switched that process to the new method, will that cause a problem?)

If there is anything else I'm missing here, I'd really appreciate it if you'd let me know.


Solution

  • A little more sleuthing and I figured out my error:

    When using glVertexAttribPointer, you can set the stride parameter to 0 (zero) and OpenGL will treat the data as tightly packed, determining the stride automatically; the separate attribute format API has no such shortcut. The stride you pass to glBindVertexBuffer is taken literally, so a stride of 0 means every vertex reads the same bytes. (Note that the last parameter of glVertexAttribFormat is the attribute's relative offset within a vertex, not a stride.)

    Once I passed the actual stride of the vertex data to glBindVertexBuffer, everything worked as expected.
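
    For reference, here's a minimal sketch of the corrected setup, assuming the same SharpGL context ("gl"), the same class members used above, and a tightly packed xyz position array (three floats per vertex):

        // CORRECTED SETUP SKETCH -- identical to the modified code above,
        // except that the stride is passed explicitly to BindVertexBuffer.
        gl.BindVertexArray(PointsArrayId);
        gl.EnableVertexAttribArray(positionAttributeId);

        // The last argument is the attribute's RELATIVE OFFSET within a
        // vertex (0 here, since position starts the vertex), not a stride.
        gl.VertexAttribFormat(positionAttributeId, 3, OpenGL.GL_FLOAT, false, 0);
        gl.VertexAttribBinding(positionAttributeId, 0);

        // The stride must be the actual byte distance between consecutive
        // vertices: 3 floats * 4 bytes = 12 for tightly packed xyz data.
        gl.BindVertexBuffer(0, this.VerticesBufferId, IntPtr.Zero, 3 * sizeof(float));

        gl.BindVertexArray(0); // done configuring; unbind the VAO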