Our application crashes on old Nvidia drivers.
The debug code is shown below. Looking around, I found that this kind of crash is often said to be caused by an incorrect vertex attribute setup.
This is how I set up my VBO and VAO:
/**
 * Init VBO/VAO.
 */
public static void initDebug(GL3 gl3) { // method signature assumed, it is not part of the original snippet

    float[] vertexData = new float[]{
            0, 0,
            1, 0,
            1, 1};

    // VBO
    debugVbo = new int[1];
    gl3.glGenBuffers(1, debugVbo, 0);
    gl3.glBindBuffer(GL3.GL_ARRAY_BUFFER, debugVbo[0]);
    {
        FloatBuffer buffer = GLBuffers.newDirectFloatBuffer(vertexData);
        gl3.glBufferData(GL3.GL_ARRAY_BUFFER, vertexData.length * Float.BYTES, buffer, GL3.GL_STATIC_DRAW);
    }
    gl3.glBindBuffer(GL3.GL_ARRAY_BUFFER, 0);

    // VAO
    debugVao = new int[1];
    gl3.glGenVertexArrays(1, debugVao, 0);
    gl3.glBindVertexArray(debugVao[0]);
    {
        gl3.glBindBuffer(GL3.GL_ARRAY_BUFFER, debugVbo[0]);
        {
            gl3.glEnableVertexAttribArray(0);
            {
                gl3.glVertexAttribPointer(0, 2, GL3.GL_FLOAT, false, 0, 0);
            }
        }
        gl3.glBindBuffer(GL3.GL_ARRAY_BUFFER, 0);
    }
    gl3.glBindVertexArray(0);
}
And this is how I render:
public static void render(GL3 gl3) {

    gl3.glClear(GL3.GL_DEPTH_BUFFER_BIT | GL3.GL_COLOR_BUFFER_BIT);

    gl3.glUseProgram(textureProgram);
    {
        gl3.glBindVertexArray(debugVao[0]);
        {
            gl3.glActiveTexture(GL3.GL_TEXTURE0);
            gl3.glBindTexture(GL3.GL_TEXTURE_2D, texture[0]);
            gl3.glBindSampler(0, EC_Samplers.pool[EC_Samplers.Id.clampToEdge_nearest_0maxAn.ordinal()]);
            {
                gl3.glDrawArrays(GL3.GL_TRIANGLES, 0, 3);
            }
            gl3.glBindTexture(GL3.GL_TEXTURE_2D, 0);
            gl3.glBindSampler(0, 0);
        }
        gl3.glBindVertexArray(0);
    }
    gl3.glUseProgram(0);
}
This is my vertex shader (VS):
#version 330

layout (location = 0) in vec2 position;

uniform mat4 modelToCameraMatrix;
uniform mat4 cameraToClipMatrix;

out vec2 fragmentUV;

void main()
{
    gl_Position = cameraToClipMatrix * modelToCameraMatrix * vec4(position, 0, 1);
    fragmentUV = position;
}
And my fragment shader (FS):
#version 330

in vec2 fragmentUV;

out vec4 outputColor;

uniform sampler2D textureNode;

void main()
{
    outputColor = texture(textureNode, fragmentUV);
}
I have read and re-read the same code for two days now and I can't find anything wrong. I also tried defining an explicit stride of 2*4 = 8 bytes, but with the same outcome.
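For reference, the explicit-stride variant I tried amounted to something like this (a sketch; for a tightly packed array of 2-component float vertices it is equivalent to passing a stride of 0):

    // Explicit stride of 2 floats (8 bytes) per vertex instead of 0
    gl3.glVertexAttribPointer(0, 2, GL3.GL_FLOAT, false, 2 * Float.BYTES, 0);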
I can't believe it. The problem lay somewhere else, in the code where I was initializing my samplers:
public static void init(GL3 gl3) {

    pool = new int[Id.size.ordinal()];

    gl3.glGenSamplers(Id.size.ordinal(), pool, 0);

    gl3.glSamplerParameteri(pool[Id.clampToEdge_nearest_0maxAn.ordinal()],
            GL3.GL_TEXTURE_WRAP_S, GL3.GL_CLAMP_TO_EDGE);
    gl3.glSamplerParameteri(pool[Id.clampToEdge_nearest_0maxAn.ordinal()],
            GL3.GL_TEXTURE_WRAP_T, GL3.GL_CLAMP_TO_EDGE);
    gl3.glSamplerParameteri(pool[Id.clampToEdge_nearest_0maxAn.ordinal()],
            GL3.GL_TEXTURE_MIN_FILTER, GL3.GL_NEAREST);
    gl3.glSamplerParameteri(pool[Id.clampToEdge_nearest_0maxAn.ordinal()],
            GL3.GL_TEXTURE_MAG_FILTER, GL3.GL_NEAREST);
    // The offending call: max anisotropy must be at least 1 (and it is a float parameter)
    gl3.glSamplerParameteri(pool[Id.clampToEdge_nearest_0maxAn.ordinal()],
            GL3.GL_TEXTURE_MAX_ANISOTROPY_EXT, 0);
}
The crash was caused by setting the max anisotropy to 0; setting it to 1 resolved the crash.
PS: it should also be glSamplerParameterf instead of glSamplerParameteri, since GL_TEXTURE_MAX_ANISOTROPY_EXT is a float value.
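In other words, the corrected call would look something like this (a sketch of the fix described above):

    // Max anisotropy of 1.0 = no anisotropic filtering, which is the minimum legal value
    gl3.glSamplerParameterf(pool[Id.clampToEdge_nearest_0maxAn.ordinal()],
            GL3.GL_TEXTURE_MAX_ANISOTROPY_EXT, 1.0f);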
Anyway, it is weird, because that code had been there for a long time and never triggered the violation before. I don't know; maybe some later code modification changed things in a way that the Nvidia driver could no longer detect the problem and silently fix it by itself, who knows.