The article here says:
Dividing x, y, and z by w accomplishes this. The resulting coordinates are called normalized device coordinates. Now all the visible geometric data lies in a cube with positions between <-1, -1, -1> and <1, 1, 1> in OpenGL, and between <-1, -1, 0> and <1, 1, 1> in Direct3D.
This raises a problem for cross-platform shaders that want to test the Z coordinate for some specific reason. Is there a way to get a Z coordinate in the same range, regardless of platform?
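For concreteness, here is the divide the article describes, as a sketch in GLSL (the uniform and attribute names are mine, not the article's; the hardware performs this divide on gl_Position automatically, it's written out here only to expose ndc z):

```glsl
#version 330 core

uniform mat4 u_modelViewProjection;  // assumed uniform name
in vec4 a_position;

out float v_ndcZ;

void main()
{
    vec4 clipPos = u_modelViewProjection * a_position;

    // Perspective divide: clip space -> normalized device coordinates.
    float ndcZ = clipPos.z / clipPos.w;

    // For visible geometry: ndcZ lies in [-1, 1] under OpenGL conventions,
    // but in [0, 1] under Direct3D conventions.
    v_ndcZ = ndcZ;
    gl_Position = clipPos;
}
```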
Where are you doing this Z testing?
If you're doing it in the fragment shader, then you shouldn't care: gl_FragCoord.z (or whatever Cg's equivalent to this is) is in window space. The window-space Z extent is defined by glDepthRange; by default, it goes from 0 to 1.
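A sketch of such a fragment-shader test in GLSL (the 0.5 threshold and the colors are hypothetical, purely for illustration): with the default glDepthRange(0, 1), gl_FragCoord.z is in [0, 1] on every platform, so the comparison needs no per-API adjustment.

```glsl
#version 330 core

out vec4 fragColor;

void main()
{
    // gl_FragCoord.z is window-space depth: [0, 1] with the default
    // glDepthRange, regardless of whether clip space was GL- or D3D-style.
    if (gl_FragCoord.z > 0.5)                  // hypothetical threshold
        fragColor = vec4(1.0, 0.0, 0.0, 1.0);  // far half of the depth range
    else
        fragColor = vec4(0.0, 1.0, 0.0, 1.0);  // near half
}
```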
If you're doing this test in the vertex shader, then you'll just have to live with it. A better test might be one done in camera space, rather than clip space or NDC space. At least then, you're usually dealing with world-sized distances.
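A minimal sketch of that camera-space alternative (uniform and attribute names are assumptions, not from the post): transform by the model-view matrix only, and test -eyePos.z, which is the distance in front of the camera in world-sized units.

```glsl
#version 330 core

uniform mat4 u_modelView;    // assumed name
uniform mat4 u_projection;   // assumed name
in vec4 a_position;

flat out int v_isNear;

void main()
{
    vec4 eyePos = u_modelView * a_position;

    // In OpenGL eye/camera space the camera looks down -Z, so -eyePos.z is
    // the distance in front of the camera, in the same units as the world.
    float distInFront = -eyePos.z;

    // Hypothetical test: flag vertices closer than 10 world units.
    v_isNear = (distInFront < 10.0) ? 1 : 0;

    gl_Position = u_projection * eyePos;
}
```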