Tags: image-processing, webgl, fragment-shader, webgl2, glteximage2d

Sampling integer texture in WebGL returns weird values


I'm trying to render a grayscale image from a 16-bit array buffer in WebGL2 by applying window leveling (mapping a window of raw intensity values, defined by a center and a width, to the displayable range) in the fragment shader. I've generated the texture as below:

let typedArray = new Int16Array(data);
gl.texImage2D(
        gl.TEXTURE_2D,
        0,               // mip level
        gl.R16I,         // internal format
        w, h,            // width, height
        0,               // border
        gl.RED_INTEGER,  // source format
        gl.SHORT,        // source type
        typedArray);

and tried to use the data in the fragment shader below:

let fragmentShaderSource = `#version 300 es
    precision highp float;
    precision highp int;
    precision highp isampler2D;

    // our texture
    uniform isampler2D u_image;

    uniform highp float u_windowWidth;
    uniform highp float u_windowCenter;

    in vec2 v_texCoord;
    out vec4 outColor;

    void main() {
        highp float f = float(texture(u_image, v_texCoord).r);
        f = (f - (u_windowCenter - 0.5)) / max(u_windowWidth - 1.0, 1.0) + 0.5;
        f = min(max(f, 0.0), 1.0);
        outColor = vec4(vec3(f), 1.0);
    }
    `;

but this only renders a black screen. After some debugging, I found that texture(u_image, v_texCoord) returned zero in the r, g, and b components for every pixel, while the a (alpha) component held a very large value (around 2^29 to 2^30). I've tried changing the precisions in the shader, but the results were the same.
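
One way to sanity-check what actually landed in the texture (a minimal sketch, assuming tex is the texture object and w, h are the dimensions used above) is to attach it to a framebuffer and read it back; integer textures have to be read back with the RGBA_INTEGER format:

let fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
// R16I is color-renderable in WebGL2, so it can be attached directly
gl.framebufferTexture2D(
        gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
if (gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE) {
    let pixels = new Int32Array(w * h * 4);
    gl.readPixels(0, 0, w, h, gl.RGBA_INTEGER, gl.INT, pixels);
    console.log(pixels.slice(0, 8)); // first two texels, RGBA each
}
gl.bindFramebuffer(gl.FRAMEBUFFER, null);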

In order to narrow down the problem, I've tried a different approach: splitting the 16-bit integer into a gl.RGBA4 texture, which stores 4 bits in each of the RGBA channels:

let typedArray = new Uint16Array(data);
gl.texImage2D(
        gl.TEXTURE_2D,
        0,                          // mip level
        gl.RGBA4,                   // internal format
        w, h,                       // width, height
        0,                          // border
        gl.RGBA,                    // source format
        gl.UNSIGNED_SHORT_4_4_4_4,  // source type
        typedArray);

and combined the RGBA values back into a 16-bit integer in the fragment shader:

let fragmentShaderSource = `#version 300 es
    precision highp float;
    precision highp int;
    precision highp sampler2D;

    // our texture
    uniform sampler2D u_image;

    uniform highp float u_windowWidth;
    uniform highp float u_windowCenter;

    in vec2 v_texCoord;
    out vec4 outColor;

    void main() {
        highp vec4 rgba_map = texture(u_image, v_texCoord);
        // Combining rgba4 back into int16
        highp float f = rgba_map.r * 65536.0 + rgba_map.g * 4096.0 + rgba_map.b * 256.0 + rgba_map.a * 16.0;
        // signed value
        if (f > 32768.0) {
            f = 65536.0 - f;
        }
        f = (f - (u_windowCenter - 0.5)) / max(u_windowWidth - 1.0, 1.0) + 0.5;
        f = min(max(f, 0.0), 1.0);
        outColor = vec4(vec3(f), 1.0);
    }
    `;

and this version rendered the expected image quite well, although the result was a bit noisy due to the conversion. I've also tried some other formats: the float-type formats were all fine, while the integer-type formats all failed in the same way. So I think the other parts of the program are fine. I wonder what is wrong with my program.
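
For reference, the float route that worked looked roughly like this (a sketch, with gl.R32F standing in for whichever float format was actually used; the raw 16-bit values are widened to 32-bit floats before upload):

let floatArray = Float32Array.from(new Int16Array(data));
gl.texImage2D(
        gl.TEXTURE_2D,
        0,            // mip level
        gl.R32F,      // internal format
        w, h,         // width, height
        0,            // border
        gl.RED,       // source format
        gl.FLOAT,     // source type
        floatArray);
// R32F is only filterable with the OES_texture_float_linear
// extension, so use NEAREST here as well
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);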


Solution

  • You haven't really posted enough code to debug, so let's just make something that works.

    function main() {
      const gl = document.querySelector('canvas').getContext('webgl2');
      if (!gl) {
        return alert('need WebGL2');
      }
      const vs = `#version 300 es
      void main() {
        gl_PointSize = 300.0;
        gl_Position = vec4(0, 0, 0, 1);
      }
      `;
      const fs = `#version 300 es
      precision highp float;
      precision highp int;
      precision highp isampler2D;
    
      // our texture
      uniform isampler2D u_image;
    
      out vec4 color;
      
      void main() {
        ivec4 intColor = texture(u_image, gl_PointCoord.xy);
        color = vec4(vec3(intColor.rrr) / 10000.0, 1);
      }
      `;
      
      const program = twgl.createProgram(gl, [vs, fs]);
      const tex = gl.createTexture();
      gl.bindTexture(gl.TEXTURE_2D, tex);
      gl.texImage2D(
          gl.TEXTURE_2D,
          0,               // mip level
          gl.R16I,         // internal format
          10,              // width
          1,               // height
          0,               // border
          gl.RED_INTEGER,  // source format
          gl.SHORT,        // source type
          new Int16Array([
            1000, 2000, 3000, 4000, 5000, 6000, 7000, 8000, 9000, 10000
          ]));
      // can't filter integer textures
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
      
      gl.useProgram(program);
      
      // no need to set any attributes or
      // uniforms as we're not using attributes
      // and uniforms default to zero so will use
      // texture unit zero
      gl.drawArrays(gl.POINTS, 0, 1);
      
      console.log('max point size:', gl.getParameter(gl.ALIASED_POINT_SIZE_RANGE)[1]);
    }
    main();
    /* CSS for the snippet */
    canvas {
      border: 1px solid black;
      background: red;
    }
    <!-- HTML for the snippet -->
    <script src="https://twgljs.org/dist/4.x/twgl-full.min.js"></script>
    <canvas></canvas>

    Should look like this:

    [screenshot: the canvas filled with 10 vertical grayscale bands, dark gray on the left to white on the right]

    but might have red borders if your GPU's max point size is < 300
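
    As the comment in the snippet says, uniforms default to zero, so the sampler reads from texture unit 0 without any explicit setup. Spelled out, the equivalent calls would be (a sketch, not needed above; program and tex are the names from the snippet):

    // the explicit version of what the defaults already do
    const u_imageLoc = gl.getUniformLocation(program, 'u_image');
    gl.useProgram(program);
    gl.uniform1i(u_imageLoc, 0);         // sampler uniforms hold a texture unit index
    gl.activeTexture(gl.TEXTURE0);       // select texture unit 0...
    gl.bindTexture(gl.TEXTURE_2D, tex);  // ...and make sure our texture is bound there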

    a few ideas

    • did you check the JavaScript console for errors?

    • did you turn off filtering for the texture?

      Integer textures cannot be filtered.

    • is your texture width an even number?

      If not, you probably need to set gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1) (see the sketch below), though I'd have expected you to get an error here unless your Int16Array is larger than width * height.
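
      For illustration, the alignment fix would look like this (a sketch, reusing the upload call from the question):

      // with gl.R16I / gl.SHORT each texel is 2 bytes, so an odd width
      // makes the row byte length not a multiple of the default 4-byte
      // unpack alignment; setting the alignment to 1 removes the row
      // padding WebGL would otherwise expect
      gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);
      gl.texImage2D(
          gl.TEXTURE_2D, 0, gl.R16I, w, h, 0,
          gl.RED_INTEGER, gl.SHORT, typedArray);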