Tags: webgl, antialiasing, webgl2

Is antialias possible in WebGL 2 if one of your fragment shader outputs is integer?


From my own attempts and reading through the WebGL/OpenGL documentation it seems that it's impossible to anti-alias if you are also writing integer data to your framebuffer object. Is this correct?

For example:

gl.renderbufferStorageMultisample(gl.RENDERBUFFER, 4, gl.R16I, w,h);

...gives the warning "renderbufferStorage(Multisample)?: 'samples' is out of the valid range." in Firefox, or "INVALID_OPERATION: renderbufferStorageMultisample: for integer formats, samples > 0" in Chrome.
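To illustrate the constraint, here is a minimal sketch of renderbuffer setup that avoids the error: a normalized colour format like RGBA8 accepts a multisampled allocation, while an integer format such as R16I must be allocated with the single-sample call (i.e. samples = 0). The function name and parameters are illustrative; `gl` is assumed to be a WebGL2RenderingContext.

```javascript
// Sketch: integer renderbuffer formats only allow samples = 0,
// while normalized/float colour formats may be multisampled.
// `gl` is assumed to be a WebGL2RenderingContext; w/h are the target size.
function createRenderbuffers(gl, w, h) {
  // Multisampled colour buffer: valid, RGBA8 is a normalized format.
  const colorRb = gl.createRenderbuffer();
  gl.bindRenderbuffer(gl.RENDERBUFFER, colorRb);
  gl.renderbufferStorageMultisample(gl.RENDERBUFFER, 4, gl.RGBA8, w, h);

  // Integer buffer: samples must be 0, so use the single-sample call
  // (equivalent to renderbufferStorageMultisample with samples = 0).
  const idRb = gl.createRenderbuffer();
  gl.bindRenderbuffer(gl.RENDERBUFFER, idRb);
  gl.renderbufferStorage(gl.RENDERBUFFER, gl.R16I, w, h);

  return { colorRb, idRb };
}
```

Note that because the two renderbuffers have different sample counts, they cannot both be attached to the same framebuffer object.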

The WebGL spec says:

Generates INVALID_OPERATION when internalFormat == DEPTH_STENCIL && samples > 0.

The OpenGL spec (§4.4.2.1) goes much further and says:

If internalformat is a signed or unsigned integer format and samples is greater than zero, then the error INVALID_OPERATION is generated.

And later in the same section it says:

Calling glRenderbufferStorage() is equivalent to calling RenderbufferStorageMultisample with samples equal to zero.

Since sample counts must be identical for all FBO attachments, I assume this means it's impossible to have anti-aliased colour output and integer output in the same draw call, so if I want both, I have to do two draw calls.

Is this correct?

I get that antialiasing integer data doesn't make a lot of sense, but I would have thought there would be some equivalent to gl.NEAREST for renderbuffer resolution.


Solution

  • No, it is not possible. You have already quoted the relevant parts of the specification. Technically, an integer cannot be interpolated correctly: when rendering, an integer output is usually an id or an index, for which interpolation makes no sense at all, and likewise there is no meaningful way to interpolate the bits of a stencil buffer. If interpolation does make sense for your data, use a floating-point format instead, and do not use a stencil buffer if you do not need one.