Tags: c++, opengl, rendering, glfw, voxel

After about a minute, my OpenGL app freezes and "D3D12: Removing Device." is printed to the console


I am working on a simple OpenGL-based (voxel) engine, performing a lot of updates per frame to some vertex and buffer data. After about a minute or so of running, the screen freezes and D3D12: Removing Device. is printed to the console.

The engine is pretty large, but I'll provide some of the important pseudocode below.

#include "sim.hpp"

int main()
{
    Engine *engine = new Engine();
    do
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        engine->tick(voxelData); // voxelData is produced elsewhere in the engine
        glfwSwapBuffers(engine->getWindow());
        glfwPollEvents();

    } while (glfwGetKey(engine->getWindow(), GLFW_KEY_ESCAPE) != GLFW_PRESS &&
             glfwWindowShouldClose(engine->getWindow()) == 0);

    delete engine;
}

Above is my main file. In engine->tick(), the following is called:

calculateDeltaTime();
camera->update(deltaTime);
world->setVoxelData(voxelData);
renderer->buildBuffer(*world);
renderer->render(*world, *shaderManager, *camera, glm::mat4(1));

Most importantly, in buildBuffer(), voxel data is grabbed from the world object and a bunch of buffers and vertex arrays are created.

void Renderer::buildBuffer(World &world)
{
    glGenVertexArrays(1, &_vertexArrayId);
    glBindVertexArray(_vertexArrayId);

    const GLfloat *vertices = &(world.getVertices()[0]);
    const GLfloat *colors = &(world.getColors()[0]);
    const GLfloat *normals = &(world.getNormals()[0]);

    // Note: getVertices().size() is the number of floats, not vertices
    _numberOfVertices = static_cast<GLsizei>(world.getVertices().size());

    GLsizeiptr vertexSize = _numberOfVertices * sizeof(GLfloat);
    glGenBuffers(1, &_vertexBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _vertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, vertexSize, vertices, GL_STATIC_DRAW);

    GLsizeiptr colorSize = world.getColors().size() * sizeof(GLfloat);
    glGenBuffers(1, &_colorBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _colorBuffer);
    glBufferData(GL_ARRAY_BUFFER, colorSize, colors, GL_STATIC_DRAW);

    GLsizeiptr normalsSize = world.getNormals().size() * sizeof(GLfloat);
    glGenBuffers(1, &_normalBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, _normalBuffer);
    glBufferData(GL_ARRAY_BUFFER, normalsSize, normals, GL_STATIC_DRAW);
}

In the render function, the shaders are given certain parameters, and the buffers are bound and drawn.

Any ideas on a fix for this? Thanks!


Solution

  • I figured out the issue. I was calling buildBuffer every tick, but never deleting the previously created objects. This caused my PC to run out of memory (at one point OpenGL was using 8 GB of memory...). With a simple check, I was able to delete the old vertex array with glDeleteVertexArrays (the buffer objects themselves need glDeleteBuffers) before generating new ones.
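
    A minimal sketch of that fix, reusing the member names from the question. Since real GL calls need a live context, the glGen*/glDelete* functions below are tiny stand-ins I made up so the delete-before-regenerate pattern can actually run; `liveObjects` roughly models the memory the driver holds per buffer/VAO.

    ```cpp
    #include <cassert>
    #include <cstdio>

    // Stand-ins for the real GL entry points (sketch only).
    typedef unsigned int GLuint;
    static GLuint nextId = 1;
    static int liveObjects = 0;

    static void glGenBuffers(int n, GLuint *id)               { *id = nextId++; liveObjects += n; }
    static void glDeleteBuffers(int n, const GLuint *id)      { if (*id != 0) liveObjects -= n; }
    static void glGenVertexArrays(int n, GLuint *id)          { *id = nextId++; liveObjects += n; }
    static void glDeleteVertexArrays(int n, const GLuint *id) { if (*id != 0) liveObjects -= n; }

    struct Renderer
    {
        GLuint _vertexArrayId = 0, _vertexBuffer = 0, _colorBuffer = 0, _normalBuffer = 0;

        void buildBuffer()
        {
            // The fix: release last tick's objects before generating new ones.
            // IDs start at 0 and glDelete* silently ignores the name 0, so the
            // very first call is a harmless no-op.
            glDeleteVertexArrays(1, &_vertexArrayId);
            glDeleteBuffers(1, &_vertexBuffer);
            glDeleteBuffers(1, &_colorBuffer);
            glDeleteBuffers(1, &_normalBuffer);

            glGenVertexArrays(1, &_vertexArrayId);
            glGenBuffers(1, &_vertexBuffer);
            glGenBuffers(1, &_colorBuffer);
            glGenBuffers(1, &_normalBuffer);
        }
    };

    // Run `ticks` frames and report how many GL objects are still alive.
    static int liveAfterTicks(int ticks)
    {
        liveObjects = 0;
        Renderer r;
        for (int t = 0; t < ticks; ++t)
            r.buildBuffer();
        return liveObjects;
    }

    int main()
    {
        // Without the deletes this would be 4 * 1000; with them it stays at 4.
        printf("live GL objects after 1000 ticks: %d\n", liveAfterTicks(1000));
        assert(liveAfterTicks(1000) == 4);
        return 0;
    }
    ```

    An alternative worth considering: create the VAO and buffers once at startup and re-upload each tick with glBufferData (or glBufferSubData for same-size updates), which avoids churning object names entirely.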