According to the books I've read, the stencil test is performed by comparing a reference value against the value in the stencil buffer for a given pixel. However, one of the books states:
A mask is bitwise AND-ed with the value in the stencil planes and with the reference value before the comparison is applied
Here I see a third parameter, the mask. Is this mask related to the stencil buffer, or is it another parameter generated by OpenGL itself? Can someone explain the comparison process and the values that play a role in it?
glStencilMask (...) is used to enable or disable writing to individual bits in the stencil buffer. To keep the number of parameters manageable and accommodate stencil buffers of varying bit-depth, it takes a GLuint instead of individual GLbooleans like glColorMask (...) and glDepthMask (...) do.
Typically the stencil buffer is 8 bits wide, though it need not be. The default stencil mask is such that every bit-plane is enabled; in an 8-bit stencil buffer, this means the default mask is 0xff (11111111b). Additionally, stenciling can be done separately for front- and back-facing polygons in OpenGL 2.0+, so there are technically two stencil masks.
In your question, you are likely referring to glStencilFunc (...), which also has a mask. That mask is not associated with the stencil buffer itself, but with the actual stencil test. The principle is the same, however; the above link details how this mask is AND-ed together with the reference value during testing.