
GL_INVALID_VALUE from glFramebufferTexture2DEXT only after delete/realloc texture


I have some C# (SharpGL-esque) code which abstracts OpenGL frame buffer handling away to simple "set this texture as a 'render target'" calls. When a texture is first set as a render target, I create an FBO with matching depth buffer for that size of texture; that FBO/depth-buffer combo will then be reused for all same-sized textures.

I have a curious error as follows.

Initially the app runs and renders fine. But if I increase my window size, this can cause some code to need to resize its 'render target' texture, which it does via glDeleteTextures() and glGenTextures(), followed by a bind, glTexImage2D(), and texture parameters setting MIN_FILTER and MAG_FILTER to GL_NEAREST. I've observed that I tend to get the same name (ID) back when doing so, as GL reuses the just-freed name.
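
For concreteness, the resize path looks roughly like this (same bastardised GL-like syntax as below; the field and parameter names are illustrative, not my exact code):

    void ResizeRenderTarget(Texture texture, int newWidth, int newHeight)
    {
        // Free the old texture; the driver may hand the same name straight back.
        _buffer[0] = texture.InternalID;
        GL.DeleteTextures(1, _buffer);

        GL.GenTextures(1, _buffer);
        texture.InternalID = _buffer[0];

        GL.BindTexture(GL.TEXTURE_2D, texture.InternalID);
        GL.TexImage2D(GL.TEXTURE_2D, 0, GL.RGBA8, newWidth, newHeight, 0, GL.RGBA, GL.UNSIGNED_BYTE, null);
        GL.TexParameteri(GL.TEXTURE_2D, GL.TEXTURE_MIN_FILTER, GL.NEAREST);
        GL.TexParameteri(GL.TEXTURE_2D, GL.TEXTURE_MAG_FILTER, GL.NEAREST);

        texture.Width = newWidth;
        texture.Height = newHeight;
    }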

We then hit the following code (with apologies for the slightly bastardised GL-like syntax):

    void SetRenderTarget(Texture texture)
    {
        if (texture != null)
        {
            var size = (texture.Width << 16) | texture.Height;
            FrameBufferInfo info;
            if (!_mapSizeToFrameBufferInfo.TryGetValue(size, out info))
            {
                info = new FrameBufferInfo();
                info.Width = texture.Width;
                info.Height = texture.Height;

                GL.GenFramebuffersEXT(1, _buffer);
                info.FrameBuffer = _buffer[0];

                GL.BindFramebufferEXT(GL.FRAMEBUFFER_EXT, info.FrameBuffer);
                GL.FramebufferTexture2DEXT(GL.FRAMEBUFFER_EXT, GL.COLOR_ATTACHMENT0_EXT, GL.TEXTURE_2D, texture.InternalID, 0);

                GL.GenRenderbuffersEXT(1, _buffer);
                info.DepthBuffer = _buffer[0];
                GL.BindRenderbufferEXT(GL.RENDERBUFFER_EXT, info.DepthBuffer);
                GL.RenderbufferStorageEXT(GL.RENDERBUFFER_EXT, GL.DEPTH_COMPONENT16, texture.Width, texture.Height);
                GL.BindRenderbufferEXT(GL.RENDERBUFFER_EXT, 0);
                GL.FramebufferRenderbufferEXT(GL.FRAMEBUFFER_EXT, GL.DEPTH_ATTACHMENT_EXT, GL.RENDERBUFFER_EXT, info.DepthBuffer);
                _mapSizeToFrameBufferInfo.Add(size, info);
            }
            else
            {
                GL.BindFramebufferEXT(GL.FRAMEBUFFER_EXT, info.FrameBuffer);
                GL.FramebufferTexture2DEXT(GL.FRAMEBUFFER_EXT, GL.COLOR_ATTACHMENT0_EXT, GL.TEXTURE_2D, texture.InternalID, 0);
            }
            GL.CheckFramebufferStatusEXT(GL.FRAMEBUFFER_EXT);
        }
        else
        {
            GL.FramebufferTexture2DEXT(GL.FRAMEBUFFER_EXT, GL.COLOR_ATTACHMENT0_EXT, GL.TEXTURE_2D, 0, 0);
            GL.BindFramebufferEXT(GL.FRAMEBUFFER_EXT, 0);
        }

        ProjectStandardOrthographic();
    }
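
(As an aside, the status check above discards its return value; a fuller version, in the same pseudocode, would compare it against the completeness constant before rendering:)

    var status = GL.CheckFramebufferStatusEXT(GL.FRAMEBUFFER_EXT);
    if (status != GL.FRAMEBUFFER_COMPLETE_EXT)
        throw new InvalidOperationException("Incomplete framebuffer: 0x" + status.ToString("X"));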

After said window resize, GL returns a GL_INVALID_VALUE error from the glFramebufferTexture2DEXT() call (identified with glGetError() and gDEBugger). If I ignore this, glCheckFramebufferStatus() later fails with GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT. If I ignore that too, I see the expected "framebuffer too dubious to do anything" errors if I check for them, and a black screen if I don't.

I'm running on an NVIDIA GeForce GTX 550 Ti, Vista 64 (32-bit app), 306.97 drivers. I'm using GL 3.3 with the Core profile.

Workaround and curiosity: if, when reallocating textures, I call glGenTextures() before glDeleteTextures() - to avoid getting the same ID back - the problem goes away. I don't want to do this as it's a stupid kluge and increases my chances of out-of-memory errors. I'm theorising it's because GL was/is using the texture in a recent FBO and has now decided that texture ID is in use, or is no longer valid in some way, and so isn't acceptable? Maybe?
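
The workaround, in the same sketchy syntax (names illustrative): reserve the replacement name while the old texture still exists, so the driver can't recycle the old ID.

    // Reserve a fresh name first; the old name is still live, so it can't be reused.
    GL.GenTextures(1, _buffer);
    var newID = _buffer[0];

    // Now free the old texture; newID is guaranteed to differ from it.
    _buffer[0] = texture.InternalID;
    GL.DeleteTextures(1, _buffer);
    texture.InternalID = newID;

    GL.BindTexture(GL.TEXTURE_2D, texture.InternalID);
    // ...glTexImage2D and texture parameters at the new size, as before...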

After the problem gDEBugger shows that both FBOs (the original one with the smaller depth buffer and previous texture, and the new one with the larger combination) have the same texture ID attached.

I've tried detaching the texture from the frame buffer (via glFramebufferTexture2DEXT again) before deallocation, but to no avail (gDEBugger reflects the change but the problem still occurs). I've tried taking out the depth buffer entirely. I've tried checking the texture's size via glGetTexLevelParameter() before I use it; the texture does indeed exist.


Solution

  • This sounds like a bug in NVIDIA's OpenGL implementation. Once you delete an object name, that object name becomes invalid, and thus should be a legitimate candidate for glGen* to return.

    You should file a bug report, with a minimal case that reproduces the issue.

    "I don't want to do this as it's a stupid kluge and increases my chances of out of memory errors."

    No, it doesn't. glGenTextures doesn't allocate storage for textures (which is where any real OOM errors might come from); it only creates the texture name. It's unfortunate that you have to use a workaround, but it's of no real concern.
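
    To illustrate the distinction (a sketch in the question's GL-like pseudocode, not the poster's actual code):

        GL.GenTextures(1, _buffer);             // reserves a name only; no video memory is committed yet
        GL.BindTexture(GL.TEXTURE_2D, _buffer[0]);
        GL.TexImage2D(GL.TEXTURE_2D, 0, GL.RGBA8, width, height, 0, GL.RGBA, GL.UNSIGNED_BYTE, null);
                                                // storage is allocated here; this is the call that
                                                // could plausibly raise GL_OUT_OF_MEMORY

    So generating the new name before deleting the old texture holds an extra name, not an extra texture's worth of memory.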