I have a vertex shader that sets gl_PointSize = mix( 0., 5., step( foo, bar ) );. My Mac with an Intel HD Graphics 5000 renders only the points whose size is 5, not the ones set to 0. Another Mac with some kind of Radeon renders the zero-sized points as if their size were 1.
I've read through the gl_PointSize documentation and did not understand what the intended behavior of gl_PointSize = 0.; is, nor the part about aliasing and rounding.
If I want to "hide" certain points, should I rely on gl_PointSize = 0.; to keep them from being rasterized at all, or should I use something else?
Two things come to mind:
//frag
...
if( vSomeVarying < 0.5 ) discard; // but why even rasterize? this still processes every point despite the discard
...
//vert
if( someValue < 0.5 ){
    gl_Position.z = 2.; // should clip it?
}
If I want to "hide" certain points, should I rely on gl_PointSize = 0.; to keep them from being rasterized at all, or should I use something else?
Use something else, because the OpenGL ES 2.0.25 specification says on p. 51:
If the value written to gl_PointSize is less than or equal to zero, results are undefined.
Forcing the vertex to be clipped works.
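For reference, here is a minimal sketch of that approach (the visible attribute and the matrix uniforms are just illustrative names, not from the question). Moving the vertex outside the clip volume means the whole point primitive is clipped and never rasterized, unlike a fragment-shader discard, which still rasterizes the point and runs the fragment shader for it:

//vert
attribute vec3 position;
attribute float visible; // hypothetical per-point flag: 1.0 = show, 0.0 = hide

uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;

void main() {
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    gl_PointSize = 5.0; // always positive, so no undefined behavior

    if ( visible < 0.5 ) {
        // push the vertex outside the clip volume ( z > w ),
        // so the point is clipped before rasterization
        gl_Position = vec4( 0.0, 0.0, 2.0, 1.0 );
    }
}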