Tags: c++, opengl, wxwidgets

OpenGL is not discarding my fragments when depth testing is enabled


I am writing an OpenGL renderer with wxWidgets 3.1.3. Fragments that are hidden behind my model, but facing towards the camera, are visible even when they should be obscured. Outputting the depth from my fragment shader shows that the depths are being written correctly, and I have znear = 0.1 and zfar = 100.

Here is my OpenGL initialisation code:

this->m_glcontext = new wxGLContext(this);
this->SetCurrent(*this->m_glcontext);

std::remove(ENGINECANVAS_LOG_PATH);

glewExperimental = GL_TRUE;
glewInit();
glLoadIdentity();

glEnable(GL_CULL_FACE);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);

glEnable(GL_DEBUG_OUTPUT);
glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);

glDebugMessageCallback(MessageCallback, 0);

this->Bind(wxEVT_PAINT, &EngineCanvas::Paint, this);
this->Render();

Here is my render loop:

glViewport(0, 0, (GLint)this->GetSize().x, (GLint)this->GetSize().y);

glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

this->m_active_camera->GenPerspMat((float)this->GetSize().x, (float)this->GetSize().y);
this->m_active_camera->GenViewMat();

for (size_t i = 0; i < this->models.size(); i++)
{
    this->models.at(i)->GenPosMat();

    this->models.at(i)->shader_program->Select(); //selects shader program

    this->m_active_camera->SetUniforms(this->models.at(i)->shader_program);
    this->models.at(i)->SetUniforms();
    glBindVertexArray(this->models.at(i)->vao);

    for (size_t j = 0; j < this->models.at(i)->vertex_buffers_count.size(); j++)
    {
        glBindBuffer(GL_ARRAY_BUFFER, this->models.at(i)->vertex_buffers.at(j));
        glDrawArrays(this->models.at(i)->triangle_mode, 0, this->models.at(i)->vertex_buffers_count.at(j));
    }
}

glFlush();
this->SwapBuffers();

Fragment shader:

#version 400 core
layout(location = 0) out vec4 frag_out;
in vec3 vpos;

void main()
{
    frag_out = vec4(vec3(gl_FragCoord.z), 1.0);
}

My fragments are discarded fine on my NVIDIA graphics card, but testing on two different laptops and on my integrated graphics shows this problem. My geometry itself renders correctly. No errors are reported by the debug callback (and I have verified that the callback works). All cards support at least OpenGL 4.0 (which is specified in my shaders and my wxGLCanvas args).


Solution

  • Thanks to Rabbid76:

    I needed to specify a depth buffer in the arguments for the wxGLCanvas. Instead of the array of ints used in the wiki sample, I needed to build a wxGLAttributes object and pass that to the canvas constructor.

    wxGLAttributes args;
    args.PlatformDefaults().Depth(24).Stencil(8).RGBA().DoubleBuffer().EndList();
    

    Now that I know where to look, this is documented in the pyramid sample provided by wxWidgets.
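    For completeness, the attribute list only takes effect if it is actually passed to the canvas. wxWidgets 3.1 has a wxGLCanvas constructor overload that accepts a wxGLAttributes object (distinct from the legacy int-array overload), so the setup would look roughly like this (a sketch; `parent` is a placeholder for whatever window owns the canvas):

    ```cpp
    wxGLAttributes args;
    args.PlatformDefaults().Depth(24).Stencil(8).RGBA().DoubleBuffer().EndList();

    // Use the wxGLAttributes overload of the wxGLCanvas constructor;
    // the legacy overload taking an int array is what the wiki sample shows.
    wxGLCanvas* canvas = new wxGLCanvas(parent, args);
    ```

    wxGLCanvas::IsDisplaySupported(args) can be called first to check whether the requested pixel format (including the 24-bit depth buffer) is available on the current display.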