I initialize depth testing here:
def _initGL(self):
    glEnable(GL_DEPTH_TEST)
    glDepthMask(GL_TRUE)
    # GL_NEVER: no fragment ever passes the depth test, so nothing should be drawn
    glDepthFunc(GL_NEVER)
    glDepthRange(0.0, 1.0)
    glClearDepth(1.0)
Then, later, I display some geometry:
def _display(self):
    # Sanity checks: query the depth state back from the driver
    print(glGetBooleanv(GL_DEPTH_TEST))
    print(glGetIntegerv(GL_DEPTH_FUNC) == GL_NEVER)
    print(glGetFloatv(GL_DEPTH_RANGE))
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
    glUseProgram(self._program)
    # This is a wrapped VBO doing the drawing. That actually works.
    self._buffer.draw()
    glutSwapBuffers()
The output from the prints is:
1 True [0. 1.]
So there should be no doubt that depth testing is on and set to never let any fragment pass, and yet the geometry is drawn anyway.
EDIT: I just ran it on a Windows machine instead (this was done on Ubuntu) and everything works perfectly.
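For anyone hitting the same symptom, a quick diagnostic (my addition, not in the original post; both queries are standard, though GL_DEPTH_BITS is a legacy query only valid in a compatibility profile) is to ask how many depth bits the window actually got:

# Inside _display, once the window exists:
print(glutGet(GLUT_WINDOW_DEPTH_SIZE))  # 0 means GLUT allocated no depth buffer
print(glGetIntegerv(GL_DEPTH_BITS))     # legacy query; also 0 without a depth buffer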
Probably the default framebuffer on Ubuntu has no depth buffer. You have to specify the display mode with glutInitDisplayMode before the OpenGL window is created by glutCreateWindow. GLUT_DEPTH selects a window with a depth buffer, e.g.:

glutInitDisplayMode(GLUT_RGBA | GLUT_DEPTH | GLUT_DOUBLE)
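A minimal, self-contained sketch of that initialization order (the boilerplate around it is my assumption; the window title and GL_LESS depth function are illustrative, not from the question):

import sys
from OpenGL.GL import *
from OpenGL.GLUT import *

def display():
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
    # ... draw depth-tested geometry here ...
    glutSwapBuffers()

glutInit(sys.argv)
# The depth buffer must be requested before glutCreateWindow.
glutInitDisplayMode(GLUT_RGBA | GLUT_DEPTH | GLUT_DOUBLE)
glutCreateWindow(b"depth buffer demo")
glEnable(GL_DEPTH_TEST)
glDepthFunc(GL_LESS)  # the usual choice; the question's GL_NEVER discards every fragment
print(glutGet(GLUT_WINDOW_DEPTH_SIZE))  # should now be nonzero, e.g. 24
glutDisplayFunc(display)
glutMainLoop()

With the depth buffer actually allocated, the state queries from the question report the same values as before, but the depth test now runs against a real buffer.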