Tags: opengl, textures, color-mapping

OpenGL: translating pixel brightness to a colormap texture produces an incorrect result


See gif switching between RGB and colormap:

[GIF: switching between RGB and colormap]

The problem is that the two images are different.

I am drawing dots that are RGB white (1.0, 1.0, 1.0). The alpha channel controls pixel brightness, which creates the dot blur; that's what you see as the brighter image. I also have a 2-pixel texture of black and white, (0.0, 0.0, 0.0, 1.0) and (1.0, 1.0, 1.0, 1.0), and in a fragment shader I do:

#version 330

precision highp float;

uniform sampler2D originalColor;
uniform sampler1D colorMap;
in vec2 uv;
out vec4 color;

void main()
{
  vec4 oldColor = texture(originalColor, uv);
  color = texture(colorMap, oldColor.a);
}

Very simply: take the fragment's alpha value (0 to 1) from the originalColor texture and translate it into a new color from the black-to-white colorMap texture. There should be no difference between the two images! Or... at least, that's my goal.

Here's my setup for the colormap texture

  glActiveTexture(GL_TEXTURE0);
  glGenTextures(1, &colormap_texture_id); // get texture id
  glBindTexture(GL_TEXTURE_1D, colormap_texture_id);
  glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); // required: stop texture wrapping
  glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); // required: scale texture with linear sampling
  glTexImage1D(GL_TEXTURE_1D, 0, GL_RGBA32F, colormapColors.size(), 0, GL_RGBA, GL_FLOAT, colormapColors.data()); // setup memory   
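As a side note, `colormapColors` isn't shown in the question; a minimal sketch of how such a table might be built (the helper name and the flat RGBA-float layout are assumptions, not the asker's actual code) could look like this. One caution with a flat float vector: the width argument to `glTexImage1D` should then be the texel count, i.e. `colormapColors.size() / 4`, not the raw float count.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical helper: builds an n-texel black-to-white RGBA ramp as a flat
// float vector (4 floats per texel), matching a GL_RGBA / GL_FLOAT upload.
std::vector<float> makeGrayRamp(std::size_t n)
{
    std::vector<float> ramp;
    ramp.reserve(n * 4);
    for (std::size_t i = 0; i < n; ++i)
    {
        // v runs from 0.0 (black) at the first texel to 1.0 (white) at the last.
        float v = (n > 1) ? float(i) / float(n - 1) : 0.0f;
        ramp.push_back(v);    // R
        ramp.push_back(v);    // G
        ramp.push_back(v);    // B
        ramp.push_back(1.0f); // A
    }
    return ramp;
}
```

With n = 2 this reproduces the black/white pair described above, and the texture width to pass is `ramp.size() / 4`.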

Render loop:

  GLuint textures[] = { textureIDs[currentTexture], colormap_texture_id };
  glBindTextures(0, 2, textures);

  colormapShader->use();
  colormapShader->setUniform("originalColor", 0);
  colormapShader->setUniform("colorMap", 1);
  renderFullScreenQuad(colormapShader, "position", "texCoord");

I am using a 1D texture as a colormap because it seems that's the only way to keep a table of 1,000 to 2,000 colormap entries in GPU memory. If there's a better way, let me know. I assume the problem is that the math for interpolating between two pixels isn't right for my purposes.

What should I do to get my expected results?

To make sure there were no shenanigans, I tried the following shader code:

color = texture(colorMap, oldColor.a); //incorrect results
color = texture(colorMap, (oldColor.r + oldColor.g + oldColor.b)/3); //incorrect
color = texture(colorMap, (oldColor.r + oldColor.g + oldColor.b + oldColor.a)/4); //incorrect
color = vec4(oldColor.a); //incorrect
color = oldColor; // CORRECT... obviously...

Solution

  • I think to be more accurate, you'd need to change:

    color = texture(colorMap, oldColor.a);
    

    to

    color = texture(colorMap, oldColor.a * 0.5 + 0.25);
    

    Or more generally

    color = texture(colorMap, oldColor.a * (1.0 - (1.0 / texWidth)) + (0.5 / texWidth));
    

    Normally you wouldn't notice the error; it's only because texWidth is so tiny here that the difference becomes significant.

    The reason is that the texture only starts linearly filtering from black to white once you pass the centre of the first texel (at 0.25 in your 2-texel-wide texture). The interpolation is complete once you pass the centre of the last texel (at 0.75).

    If you had a 1024-texel texture, as you mention you plan to end up with, then interpolation would start at 0.000488 (half a texel, 0.5/1024) and I doubt you'd notice the error.