
Sampling integers in OpenGL shader doesn't work. Result is always 0


I have a heightmap of 16-bit signed integers. If I sample the texture as normalised floats, it works fine. For normalised floats I upload the texture as:

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16_SNORM, 512, 512, 0, GL_RED, GL_SHORT, data);

Then in the shader I do:

layout (binding = 0) uniform sampler2D heightmap;

float height = texture(heightmap, inTexCoords).r;

Then height is the -32768 to 32767 value normalised to the range -1.0 to 1.0. This works fine. But I want to read the value as an integer, so I do:

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16I, 512, 512, 0, GL_RED, GL_SHORT, data);

And in the shader:

layout (binding = 0) uniform isampler2D heightmap;    // <-- Notice I put isampler here

float height = texture(heightmap, inTexCoords).r;     // <-- This should return an ivec4, and the r component should hold my 16-bit signed value, right?

But no matter what I do, it ALWAYS samples 0. I don't know what else to try; I've already disabled texture filtering and blending (as I've heard they might not work with integer textures), and still nothing.

Are the other arguments to glTexImage2D correct? GL_RED, and GL_SHORT?

Thank you.


Solution

  • Are the other arguments to glTexImage2D correct? GL_RED, and GL_SHORT?

    No. For a GL_R16I internal format, you must use GL_RED_INTEGER as the client-side pixel format. Combining an integer internal format with the plain GL_RED format generates a GL_INVALID_OPERATION error, so the texture data is never uploaded and every sample returns 0. GL_SHORT remains correct as the data type for 16-bit signed values.
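
    A minimal sketch of the corrected upload, assuming the same 512x512 buffer of signed 16-bit values (data) from the question:

    // GL_RED_INTEGER tells GL the client data feeds an integer internal format.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_R16I, 512, 512, 0, GL_RED_INTEGER, GL_SHORT, data);

    // Integer textures cannot be linearly filtered, so keep both filters at GL_NEAREST
    // (this matches the "disable texture filtering" attempt mentioned in the question).
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    The isampler2D declaration and the texture() call from the question can stay as they are: texture() on an isampler2D returns an ivec4 whose r component holds the raw 16-bit value.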