Tags: google-chrome, glsl, webgl

Why doesn't pointsize default to a standard value in webgl?


I have been working on a rendering engine using WebGL, and I've noticed inconsistent behavior with the gl.POINTS primitive. On my Mac in Chrome, points render and are visible as expected, albeit at a very small default size. When I run the same code on my Windows 10 PC in Chrome, no points are visible by default. If I explicitly set the point size in the vertex shader (e.g. `gl_PointSize = 1.0;`), the points become visible, which suggests that on Windows the size defaults to something even smaller than on my Mac.

This is annoying - it introduces unexpected behavior across browsers, and it requires a write in my vertex shaders that would otherwise be unnecessary and presumably adds overhead.

My question: is this expected behavior, or have I uncovered a bug in Chrome? And if it is expected, why would this be the default when we aim for standardization across browsers and devices? Thanks.


Solution

  • According to the OpenGL ES Shading Language documentation:

    If gl_PointSize is not written to, its value is undefined in subsequent pipeline stages

    This means that rendering points requires this variable to be written explicitly; otherwise the results are undefined.
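
    A minimal GLSL ES 1.00 vertex shader that writes gl_PointSize explicitly might look like the sketch below (the attribute name `a_position` is illustrative, not from the question):

    ```glsl
    // WebGL 1.0 / GLSL ES 1.00 vertex shader
    attribute vec4 a_position;   // vertex position; name is an assumption

    void main() {
        gl_Position  = a_position;
        gl_PointSize = 1.0;      // explicit write avoids the undefined point size
    }
    ```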

    I believe that treating the value as 0.0 when unspecified (what you see on Windows) can be considered a good option, since WebGL tends to avoid "undefined" in favor of "zero-defined". Chrome's behavior on macOS could be considered misleading in this case, although drawing points with a size of 1.0 is obviously more useful for seeing something on screen.

    In practice, there is some inconsistency around gl_PointSize across graphics APIs. For instance, in desktop OpenGL writing gl_PointSize in a GLSL program is optional, and the point size specified by glPointSize() is used by default, which in turn is defined as 1.0 in the initial OpenGL state. OpenGL ES 2.0+ (and hence WebGL), by contrast, does not define a glPointSize() function at all, so writing gl_PointSize becomes mandatory in GLSL code to avoid undefined behavior.
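
    The desktop-OpenGL side of that difference can be sketched like this (a desktop GLSL 3.30 shader is assumed; the attribute name is illustrative):

    ```glsl
    // Desktop OpenGL / GLSL 3.30 vertex shader.
    // gl_PointSize is NOT written here: with GL_PROGRAM_POINT_SIZE disabled,
    // the size set via glPointSize() applies instead, and the initial
    // OpenGL state defines that size as 1.0.
    #version 330
    in vec4 a_position;   // vertex position; name is an assumption

    void main() {
        gl_Position = a_position;  // omitting gl_PointSize is legal on desktop GL
    }
    ```

    In OpenGL ES 2.0 and WebGL there is no such glPointSize() fallback, which is why the explicit write in the shader is the only reliable way to get a defined point size.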

    So I believe the different behavior you experienced comes from the different graphics libraries (layers) used on macOS and Windows to implement WebGL in Chrome. But since writing gl_PointSize is mandatory for drawing points anyway, there is no point in investigating which undefined behavior is better or more correct.