I'm having a hard time finding out why some `int` and `unsigned int` internal storage types are "color renderable" and others aren't. A few examples:
| internal storage type | color renderable |
|---|---|
| GL_RGBA8 | Y |
| GL_RGBA8I | Y |
| GL_RGBA32UI | Y |

| internal storage type | color renderable |
|---|---|
| GL_RGB8 | Y |
| GL_RGB8I | |
| GL_RGB32UI | |
| GL_RGB565 | Y |

| internal storage type | color renderable |
|---|---|
| GL_RG8 | Y |
| GL_RG8UI | Y |
| GL_R8 | Y |
| GL_R8UI | Y |
I understand (I think) why sRGB and floating-point internal storage types are non-color-renderable, but why are the non-normalized, three-channel `int` and `unsigned int` storage types non-color-renderable? And why aren't the one- and two-channel `int` and `unsigned int` storage types non-color-renderable too? I read through the applicable parts of the OpenGL ES 3 spec, and all it seems to say is "because it's not marked as color-renderable in the table". I think I'm missing something embarrassingly basic.
In general, three-channel storage types are disliked by hardware: three is not a power of two, so individual texels/pixels can end up spanning cache lines. To avoid this, in practice most three-channel formats are rounded up internally to a four-channel format with one channel of padding.

Just use a four-channel format; you're no worse off.