Tags: glsl, webgl, webgl2

What is the Difference Between WebGL 2.0 Max Image Texture Units and Max Combined Texture Units?


I'm currently working on a GLSL shader that uses a large number of texture files to layer multiple materials on top of each other, with each layer masked off individually by its own alpha blend image. The effect is supposed to mimic a fabric with a backer, a middle mesh, and a third mesh on top. In working on this, I've quickly run into the limit on texture units in my shader. As a workaround, I'm going to start packing textures into the RGB channels of a single image to free up a few more texture units for the shader, and I'm also looking into combining multiple diffuse images into a single image (a rough sketch is included below). But in the meantime, my question is:

Will WebGL 2.0 offer more texture units, or is this something limited by the graphics card itself? And just as a general knowledge question, is the Max Combined Texture Unit number the total number of texture units I can have on a WebGL context? How is that different from Max Texture Units? I'm just trying to understand the limitations here. Below are the results from Browserleaks for WebGL on my MacBook Pro.

WebGL Browserleaks Report
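
For reference, here is a rough sketch (WebGL 2 / GLSL ES 3.00) of the channel-packing idea described above; the uniform names are just placeholders. Three separate alpha masks collapse into the channels of one RGBA texture, so the layering uses four samplers instead of six.

    #version 300 es
    precision mediump float;

    // Diffuse textures for the three fabric layers (placeholder names).
    uniform sampler2D uBackerDiffuse;
    uniform sampler2D uMiddleDiffuse;
    uniform sampler2D uTopDiffuse;
    // One RGBA texture carrying the three layer masks:
    // r = backer mask, g = middle mesh mask, b = top mesh mask.
    uniform sampler2D uPackedMasks;

    in vec2 vUv;
    out vec4 fragColor;

    void main() {
        vec3 masks = texture(uPackedMasks, vUv).rgb;
        // Start from the backer and blend each layer in by its packed mask.
        vec3 color = texture(uBackerDiffuse, vUv).rgb * masks.r;
        color = mix(color, texture(uMiddleDiffuse, vUv).rgb, masks.g);
        color = mix(color, texture(uTopDiffuse, vUv).rgb, masks.b);
        fragColor = vec4(color, 1.0);
    }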


Solution

  • Will WebGL 2.0 offer more texture units, or is this something limited by the graphics card itself?

    It is a hardware limitation.

  • Is the Max Combined Texture Unit number the total number of texture units I can have on a WebGL context?

    Yes.

  • How is that different from Max Texture Units?

    GL_MAX_TEXTURE_IMAGE_UNITS applies only to textures accessed by the fragment shader (the vertex stage has its own limit, GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS). GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS is the total number of textures that can be accessed across all shader stages combined.

    WebGL 1.0 did not require implementations to support texture access outside the fragment shader (MAX_VERTEX_TEXTURE_IMAGE_UNITS is allowed to be 0 there). WebGL 2.0 guarantees that the vertex stage can sample textures as well, which is when the combined limit matters; see the vertex-stage sketch below.
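
    To make the per-stage vs. combined distinction concrete, here is a minimal sketch (not from the original answer) of a WebGL 2 / GLSL ES 3.00 vertex shader that samples a texture; the uniform and attribute names are placeholders. A sampler used in this stage counts against MAX_VERTEX_TEXTURE_IMAGE_UNITS and the combined limit, but not against the fragment-only MAX_TEXTURE_IMAGE_UNITS. All three limits can be queried at runtime with gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS), gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS), and gl.getParameter(gl.MAX_COMBINED_TEXTURE_IMAGE_UNITS).

        #version 300 es
        precision highp float;

        // Hypothetical displacement map sampled in the vertex stage.
        // This sampler occupies a vertex texture unit, so it counts toward
        // MAX_VERTEX_TEXTURE_IMAGE_UNITS and MAX_COMBINED_TEXTURE_IMAGE_UNITS.
        uniform sampler2D uDisplacementMap;
        uniform mat4 uModelViewProjection;

        in vec3 aPosition;
        in vec3 aNormal;
        in vec2 aUv;

        out vec2 vUv;

        void main() {
            // Explicit LOD: implicit derivatives are not available in the vertex stage.
            float height = textureLod(uDisplacementMap, aUv, 0.0).r;
            vec3 displaced = aPosition + aNormal * height * 0.05;
            vUv = aUv;
            gl_Position = uModelViewProjection * vec4(displaced, 1.0);
        }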