I'm using OpenGL 4.2 and I can't figure out why I'm getting a GL_INVALID_VALUE error in this program. The error occurs when I call glBindAttribLocation. According to the OpenGL 4 reference page, there are only two conditions under which glBindAttribLocation generates GL_INVALID_VALUE:
void glBindAttribLocation(GLuint program, GLuint index, const GLchar *name);
1. index is greater than or equal to GL_MAX_VERTEX_ATTRIBS.
2. program is not a value generated by OpenGL.
As you can see from the program below, condition 1 is not met, since index is 20 and GL_MAX_VERTEX_ATTRIBS is 34921. Condition 2 is not met either, because program is generated by OpenGL using glCreateProgram(). So how could I possibly be getting a GL_INVALID_VALUE error?
// test.cpp
#include <GL/glew.h>
#include <GL/glut.h>
#include <iostream>
int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutCreateWindow("Test");
    glewInit();

    std::cout << "Max Vertex Attributes : " << GL_MAX_VERTEX_ATTRIBS << std::endl;

    // create program
    GLuint program = glCreateProgram();
    if ( program == 0 )
        std::cout << "Program error" << std::endl;

    // clear existing errors
    if ( glGetError() != GL_NO_ERROR )
        std::cout << "Pre-existing error" << std::endl;

    // bind attribute location to index 20
    glBindAttribLocation(program, 20U, "DoesNotExist");

    // why is this generating an INVALID_VALUE error?
    if ( glGetError() == GL_INVALID_VALUE )
        std::cout << "Invalid value error" << std::endl;

    glDeleteProgram(program);
    return 0;
}
Terminal output:
$ g++ test.cpp -lGLEW -lglut
$ ./a.out
Max Vertex Attributes : 34921
Invalid value error
Also, to verify that I'm running OpenGL 4.2:
$ glxinfo | grep OpenGL
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GT 540M/PCIe/SSE2
OpenGL version string: 4.2.0 NVIDIA 304.64
OpenGL shading language version string: 4.20 NVIDIA via Cg compiler
OpenGL extensions:
Note: According to the reference page, "glBindAttribLocation can be called before any vertex shader objects are bound to the specified program object. It is also permissible to bind a generic attribute index to an attribute variable name that is never used in a vertex shader." So the fact that no shaders are loaded and that DoesNotExist doesn't exist is not the problem.
This is the second time in a couple of months that I can recall this question being asked. The last time, though it is not immediately apparent that it is the same question, was here.
What it boils down to is this: GL_MAX_VERTEX_ATTRIBS, as the compiler sees it, is a pre-processor token. It names the enum value (0x8869, which is 34921 in decimal) that you pass to the OpenGL driver at run-time to ask for its implementation-defined limit. When you print this value directly, the only thing you are printing is the universal ID that every OpenGL implementation uses to query this particular limit, not the limit itself.
To get the actual implementation-dependent limit, you need to query it at run-time:
GLint max_attribs;
glGetIntegerv(GL_MAX_VERTEX_ATTRIBS, &max_attribs);
Incidentally, OpenGL implementations are only required to support a minimum of 16 per-vertex attributes; most only give you that minimum, which explains why index 20 is out of bounds.