Tags: glsl, unsigned-integer

Why is gl_VertexID not an unsigned int?


I am in the process of designing a shader program that makes use of the built-in variable gl_VertexID:

gl_VertexID — contains the index of the current vertex

The variable is defined as a signed int. Why is it not an unsigned int? What happens when it is used with very large arrays (e.g. an array 2^30 elements long)? Does GLSL treat it as an unsigned int?

I want to use its value as an output of my shader (e.g. writing it into an output FBO buffer). I will read it back using glReadPixels with GL_RED_INTEGER as the format and either GL_INT or GL_UNSIGNED_INT as the type. Which one is correct?

  • If I use GL_INT, I will not be able to address very large arrays.

  • To use GL_UNSIGNED_INT, I could cast gl_VertexID to a uint inside my shader, but again, how do I address such a large array?


Solution

  • Most likely historical reasons. gl_VertexID was first defined as part of the EXT_gpu_shader4 extension, which is written against OpenGL 2.0:

    This extension is written against the OpenGL 2.0 specification and version 1.10.59 of the OpenGL Shading Language specification.

    GLSL did not yet support unsigned types at the time. They were not introduced until OpenGL 3.0 (GLSL 1.30).
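Since GLSL 1.30 and later do have uint, the cast the question proposes is straightforward. A hedged sketch of a vertex/fragment pair writing the ID into an unsigned-integer color attachment (e.g. GL_R32UI, to be read back with GL_RED_INTEGER / GL_UNSIGNED_INT); the variable names and layout are illustrative:

```glsl
// Vertex shader: pass gl_VertexID along as a uint.
// 'flat' is required for integer interpolants.
#version 330 core
layout(location = 0) in vec4 position;
flat out uint vid;
void main() {
    gl_Position = position;
    vid = uint(gl_VertexID);   // value-preserving for non-negative IDs
}
```

```glsl
// Fragment shader: write the ID to the unsigned-integer attachment.
#version 330 core
flat in uint vid;
out uint fragID;
void main() {
    fragID = vid;
}
```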