Tags: python-3.x, glfw, pyopengl

OpenGL error 1286 when window is minimised


Some information about what I'm using:

  • Ancient Potato integrated GPU (Intel(R) HD Graphics Family)
  • Windows 7
  • OpenGL <=3.1
  • Python 3.7.0

This is the error I get the instant I minimise the window:

$ python main.py
Traceback (most recent call last):
  File "main.py", line 71, in <module>
    main()
  File "main.py", line 60, in main
    renderer.render(mesh)
  File "...\myproject\renderer.py", line 22, in render
    glDrawElements(GL_TRIANGLES, mesh.indices, GL_UNSIGNED_INT, ctypes.c_void_p(0))
  File "...\OpenGL\latebind.py", line 41, in __call__
    return self._finalCall( *args, **named )
  File "...\OpenGL\wrapper.py", line 854, in wrapperCall
    raise err
  File "...\OpenGL\wrapper.py", line 847, in wrapperCall
    result = wrappedOperation( *cArguments )
  File "...\OpenGL\error.py", line 232, in glCheckError
    baseOperation = baseOperation,
OpenGL.error.GLError: GLError(
        err = 1286,
        baseOperation = glDrawElements,
        pyArgs = (
                GL_TRIANGLES,
                6,
                GL_UNSIGNED_INT,
                c_void_p(None),
        ),
        cArgs = (
                GL_TRIANGLES,
                6,
                GL_UNSIGNED_INT,
                c_void_p(None),
        ),
        cArguments = (
                GL_TRIANGLES,
                6,
                GL_UNSIGNED_INT,
                c_void_p(None),
        )
)

When I googled OpenGL error code 1286 I found that it is GL_INVALID_FRAMEBUFFER_OPERATION, i.e. it's raised when something is wrong with a framebuffer. That really doesn't tell me anything on its own...
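To confirm what the number actually maps to, this is enough (a minimal check, assuming only PyOpenGL; nothing here is specific to my project):

from OpenGL.GL import GL_INVALID_FRAMEBUFFER_OPERATION

print(hex(1286))                              # 0x506
print(int(GL_INVALID_FRAMEBUFFER_OPERATION))  # 1286 -- the same code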

# renderer.py
import ctypes

from OpenGL.GL import *

class Renderer:
    def __init__(self, colour=(0.0, 0.0, 0.0)):
        self.colour = colour

    @property
    def colour(self):
        return self._colour

    @colour.setter
    def colour(self, new_colour):
        glClearColor(*new_colour, 1.0)
        self._colour = new_colour

    def render(self, mesh):
        glBindVertexArray(mesh.vao_id)
        glBindTexture(GL_TEXTURE_2D, mesh.texture)
        glDrawElements(GL_TRIANGLES, mesh.indices, GL_UNSIGNED_INT, ctypes.c_void_p(0))

    def clear(self):
        glClear(GL_COLOR_BUFFER_BIT)

As I am using framebuffers, I COULD have done something wrong, but everything works the way I wanted: I render to a texture, then use that texture both as the source for rendering onto a quad and as the source texture for the next frame (basically using the GPU to manipulate grids of arbitrary data). In case it's unclear from the code, I do this by swapping FBOs instead of swapping textures:

# bufferedtexture.py (the place where I use framebuffers)
import ctypes

from OpenGL.GL import *

class BufferedTexture:
    def __init__(self, width, height):
        self._width = width
        self._height = height
        self._textures = glGenTextures(2)
        self._buffers = glGenFramebuffers(2)
        self._previous_buffer = 0
        self._current_buffer = 1

        self.init_buffer(0)
        self.init_buffer(1)

    @property
    def width(self):
        return self._width

    @property
    def height(self):
        return self._height

    @property
    def buffer(self):
        return self._buffers[self._current_buffer]

    @property
    def texture(self):
        return self._textures[self._previous_buffer]

    def init_buffer(self, index):
        glBindFramebuffer(GL_FRAMEBUFFER, self._buffers[index])
        glBindTexture(GL_TEXTURE_2D, self._textures[index])
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST)
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, self.width, self.height, 0, GL_RGBA, GL_UNSIGNED_BYTE, ctypes.c_void_p(0))
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, self._textures[index], 0)

    def set_texture_data(self, image_data):
        glBindTexture(GL_TEXTURE_2D, self.texture)
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, self.width, self.height, 0, GL_RGBA, GL_UNSIGNED_BYTE, image_data)

    def swap_buffers(self):
        self._previous_buffer = self._current_buffer
        self._current_buffer = (self._current_buffer + 1) % 2

    def enable(self):
        glBindFramebuffer(GL_FRAMEBUFFER, self.buffer)

    def disable(self):
        glBindFramebuffer(GL_FRAMEBUFFER, 0)

    def destroy(self):
        glDeleteFramebuffers(len(self._buffers), self._buffers)  # pass the count explicitly; the binding expects (n, framebuffers)
        glDeleteTextures(self._textures)

And I use everything like this:

# main loop
while not window.should_close: # glfwWindowShouldClose(self.hwnd)
    shader.enable() # glUseProgram(self._program)

    buff.enable() # BufferedTexture object, source above
    renderer.clear() # Renderer object, source above
    renderer.render(mesh) # By the way, mesh is just a Quad, nothing fancy
    buff.disable() # Tells driver that we will be drawing to screen again

    buff.swap_buffers()
    mesh.texture = buff.texture # give quad the texture that was rendered off-screen
    renderer.clear()
    renderer.render(mesh)

    window.swap_buffers() # glfwSwapBuffers(self.hwnd)
    window.poll_events() # glfwPollEvents()

I don't even know what could be wrong. Again, this only happens when I minimise the window; otherwise I can leave it running for hours and it's fine, but the moment I minimise it, it dies...

I even tried to add

assert glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE
assert glGetError() == GL_NO_ERROR

at the end of BufferedTexture.init_buffer to quickly check whether it's a problem with the FBO itself, but...

$ python main.py
<no assertion errors to be found>
<same error once I minimise>
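Since the init-time asserts pass but the draw call still fails, the next thing I could check is the framebuffer status right before the off-screen draw (just a sketch, not part of my actual code; buff, renderer and mesh are the objects from above):

# hypothetical per-frame sanity check, placed just before the off-screen draw
buff.enable()
status = glCheckFramebufferStatus(GL_FRAMEBUFFER)
if status != GL_FRAMEBUFFER_COMPLETE:
    raise RuntimeError(f"FBO incomplete at draw time: {status}")
renderer.clear()
renderer.render(mesh)
buff.disable()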

TL;DR

  1. Everything renders properly as intended;
  2. I haven't noticed any problems performance-wise; no errors are thrown or swallowed while I initialise the GLFW and OpenGL objects, and I raise RuntimeError myself wherever PyOpenGL would otherwise be fine with something going wrong (for some reason) without ever catching it;
  3. The program crashes with OpenGL error 1286 when I minimise the window; losing focus does nothing, it only happens when I minimise it...

Send help.

EDIT:

mesh = Mesh(indices, vertices, uvs)

buff = BufferedTexture(800, 600)

with Image.open("cat.jpg") as image:
    w, h = image.size # the image is 800x600
    img_data = np.asarray(image.convert("RGBA"), np.uint8)
    buff.set_texture_data(img_data[::-1])
    buff.swap_buffers()
    buff.set_texture_data(img_data[::-1])

mesh.texture = buff.texture # this is just GL_TEXTURE_2D object ID
buff.disable()
while not window.should_close:
    shader.enable()

    #buff.enable()
    #renderer.clear()
    #renderer.render(mesh)
    #buff.disable()

    #buff.swap_buffers()
    #mesh.texture = buff.texture
    renderer.clear()
    renderer.render(mesh)

    window.swap_buffers()
    window.poll_events()

Once I stop using buffers completely, it works as intended. So there's something wrong with my code, I hope.


Solution

  • Someone pointed out (but deleted their answer/comments for whatever reason) that the window size becomes 0, 0 when the window is minimised, and that was indeed the case. To prevent the crash and avoid wasting resources while the window is minimised, I did the following:

    I created and registered a callback for window resize. This was very easy, as I'm using only a single window and I already had it set up as a singleton; the callback just tells the window whether it should sleep or not.

    def callback(window, width, height):
        Window.instance.sleeping = width == 0 or height == 0
    

    Then (obviously) I registered the callback for my own window:

    glfwSetFramebufferSizeCallback(self.handle, callback)
    

    I don't do anything besides polling events while the window is "sleeping":

    while not window.should_close:
        window.poll_events()
        if window.sleeping:
            time.sleep(0.01)
            continue
    
        shader.enable()
    
        buff.enable()
        renderer.clear()
        renderer.render(mesh)
        buff.disable()
    
        buff.swap_buffers()
        mesh.texture = buff.texture
        renderer.clear()
        renderer.render(mesh)
    
        window.swap_buffers()
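
    For reference, an equivalent approach without the callback would be to query the framebuffer size every frame instead (a sketch only, assuming the pyGLFW bindings imported as glfw; my own wrapper above uses different names):

    import time
    import glfw

    # "window" is the handle returned by glfw.create_window(...)
    while not glfw.window_should_close(window):
        glfw.poll_events()

        width, height = glfw.get_framebuffer_size(window)
        if width == 0 or height == 0:  # minimised: skip rendering entirely
            time.sleep(0.01)
            continue

        # ... render exactly as in the loop above ...
        glfw.swap_buffers(window)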