The code snippet below attempts to create a texture and then checks gl.getError(). When the internal format is set to RG8, creation succeeds. I want to set the internal format to RG8UI, but this results in the creation failing. In Firefox a warning is also printed to the console:

Mismatched internalFormat and format/type: 0x8238 and 0x8227/0x1401

where 0x8238 = RG8UI, 0x8227 = RG, and 0x1401 = UNSIGNED_BYTE.

As far as I can tell, MDN's documentation on texImage2D indicates that pairing the internal format RG8UI with the format RG is allowed, though it is not "texture filterable", whatever that means. What am I doing wrong here?
const gl = document.createElement('canvas').getContext('webgl2');
const GL = WebGL2RenderingContext;
const w = 8;
const h = 8;
let texture = gl.createTexture();
gl.bindTexture(GL.TEXTURE_2D, texture);
gl.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.NEAREST);
gl.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.NEAREST);
gl.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE);
gl.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE);
// This is the call that fails when the internal format is RG8UI (it succeeds with RG8):
gl.texImage2D(GL.TEXTURE_2D, 0, GL.RG8UI, w, h, 0, GL.RG, GL.UNSIGNED_BYTE, null);
if (gl.getError() !== GL.NO_ERROR) {
  throw new Error("Failed");
}
console.log("passed");
If the internalformat is an integral format, then the format has to be an integer format, too. Otherwise an INVALID_OPERATION error is generated.
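A minimal sketch demonstrating the rule (assuming a WebGL 2 context is available; the mismatched call is the same one as in the question):

const gl = document.createElement('canvas').getContext('webgl2');
const GL = WebGL2RenderingContext;
gl.bindTexture(GL.TEXTURE_2D, gl.createTexture());
// Integral internalformat (RG8UI) combined with the non-integer format RG:
gl.texImage2D(GL.TEXTURE_2D, 0, GL.RG8UI, 8, 8, 0, GL.RG, GL.UNSIGNED_BYTE, null);
console.log(gl.getError() === GL.INVALID_OPERATION); // true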
The allowed combinations of internalformat, format, and type are specified in the table "Valid combinations of format, type, and sized internalformat" in the OpenGL ES 3.0 glTexImage2D reference. The valid combination for RG8UI is RG_INTEGER with UNSIGNED_BYTE.
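For reference, a few of the RG rows from that table, as I read the glTexImage2D reference page (see the full table for the remaining formats):

sized internalformat   format        type
RG8                    RG            UNSIGNED_BYTE
RG8UI                  RG_INTEGER    UNSIGNED_BYTE
RG8I                   RG_INTEGER    BYTE
RG16UI                 RG_INTEGER    UNSIGNED_SHORT
RG32UI                 RG_INTEGER    UNSIGNED_INT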
Change the format argument from GL.RG to GL.RG_INTEGER:
gl.texImage2D(GL.TEXTURE_2D, 0, GL.RG8UI, w, h, 0, GL.RG, GL.UNSIGNED_BYTE, null);         // before
gl.texImage2D(GL.TEXTURE_2D, 0, GL.RG8UI, w, h, 0, GL.RG_INTEGER, GL.UNSIGNED_BYTE, null); // after
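On the "not texture filterable" remark: integer formats such as RG8UI cannot be sampled with LINEAR filtering, so the NEAREST filters already set in the snippet are in fact required; with LINEAR the texture would be incomplete. A minimal corrected version of the question's snippet (same setup, only the format argument changed):

const gl = document.createElement('canvas').getContext('webgl2');
const GL = WebGL2RenderingContext;
const w = 8;
const h = 8;
const texture = gl.createTexture();
gl.bindTexture(GL.TEXTURE_2D, texture);
// Integer textures are not filterable, so NEAREST is required here:
gl.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.NEAREST);
gl.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.NEAREST);
gl.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_S, GL.CLAMP_TO_EDGE);
gl.texParameteri(GL.TEXTURE_2D, GL.TEXTURE_WRAP_T, GL.CLAMP_TO_EDGE);
// RG_INTEGER matches the integral internalformat RG8UI:
gl.texImage2D(GL.TEXTURE_2D, 0, GL.RG8UI, w, h, 0, GL.RG_INTEGER, GL.UNSIGNED_BYTE, null);
if (gl.getError() !== GL.NO_ERROR) {
  throw new Error("Failed");
}
console.log("passed");

Note that in a GLSL ES 3.00 shader such a texture is read through an unsigned integer sampler (e.g. uniform usampler2D u_tex;), not a plain sampler2D.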