Tags: c++, lua, sdl

SDL_SetRenderTarget doesn't set the target


I am trying to write a C++ lambda that is registered with Lua and called from Lua code, using the Sol2 binding. The callback below should create an SDL_Texture and clear it to a color. A Lua_Texture is just a wrapper for an SDL_Texture, and l_txt.texture is of type SDL_Texture*.

lua.set_function("init_texture",
    [render](Lua_Texture &l_txt, int w, int h)
    {
        // free any previous texture
        l_txt.deleteTexture();

        l_txt.texture = SDL_CreateTexture(render, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, w, h);

        SDL_SetRenderTarget(render, l_txt.texture);
        SDL_Texture *target = SDL_GetRenderTarget(render);

        assert(l_txt.texture == target);
        assert(target == nullptr);

        SDL_SetRenderDrawColor(render, 0xFF, 0x22, 0x22, 0xFF);
        SDL_RenderClear(render);
    });

My problem is that SDL_SetRenderTarget isn't functioning as I'd expect. I try to set the texture as the render target so I can clear its color, but when I later draw the texture to the screen it is still blank. Both asserts in the code above fail: the current target texture is neither the texture I am trying to clear and later use, nor NULL (the expected value when no target texture is set).

I have used this snippet of code before in plain C++ (not as a Lua callback) and it worked as intended. Somehow, calling it from Lua changes the behavior. Any help is much appreciated, as I've been pulling my hair out over this for a while. Thanks!


Solution

  • I may have an answer for you, but you're not going to like it.

    It looks like SDL_GetRenderTarget doesn't work as expected.

    I ran into the exact same problem you have (that's how I found your question), and I could reproduce it reliably with this simple program:

    int rendererIndex;
    
    [snipped code : rendererIndex is set to the index of the DX11 renderer]
    
    SDL_Renderer * renderer = SDL_CreateRenderer(pWindow->pWindow, rendererIndex, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC | SDL_RENDERER_TARGETTEXTURE);
    
    SDL_Texture* rtTexture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, 200, 200);
    
    SDL_SetRenderTarget(renderer, rtTexture);
    
    if(SDL_GetRenderTarget(renderer) != rtTexture)
      printf("ERROR.");
    
    

    This always produces:

    ERROR.

    The workaround I used is to save the pointer to the render target texture myself when I set it on the renderer, and never call SDL_GetRenderTarget.
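    One way to structure that workaround is a tiny wrapper that remembers the pointer you passed to SDL_SetRenderTarget instead of asking SDL for it back. This is a sketch; `RenderTargetCache` and its members are illustrative names I made up, not part of the SDL API:

    ```cpp
    #include <SDL.h>

    // Minimal wrapper that caches the render target we set ourselves,
    // so we never have to trust SDL_GetRenderTarget's return value.
    struct RenderTargetCache {
        SDL_Renderer *renderer = nullptr;
        SDL_Texture  *current  = nullptr;   // the pointer *we* set, not texture->native

        explicit RenderTargetCache(SDL_Renderer *r) : renderer(r) {}

        // Forwards to SDL_SetRenderTarget and records the original pointer
        // on success (0 return), mirroring SDL's own error convention.
        int set(SDL_Texture *texture) {
            int rc = SDL_SetRenderTarget(renderer, texture);
            if (rc == 0) {
                current = texture;
            }
            return rc;
        }

        // Use this instead of SDL_GetRenderTarget(renderer).
        SDL_Texture *get() const {
            return current;
        }
    };
    ```

    With something like this in place, comparisons such as the asserts in the question can be made against `cache.get()`, which returns the exact pointer you set even when SDL internally swapped in `texture->native`.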

    EDIT:

    I was curious why I didn't get back the correct render target, so I looked through SDL2's source code and found out why (code snipped for clarity):

    int
    SDL_SetRenderTarget(SDL_Renderer *renderer, SDL_Texture *texture)
    {
    
    // CODE SNIPPED
    
        /* texture == NULL is valid and means reset the target to the window */
        if (texture) {
            CHECK_TEXTURE_MAGIC(texture, -1);
            if (renderer != texture->renderer) {
                return SDL_SetError("Texture was not created with this renderer");
            }
            if (texture->access != SDL_TEXTUREACCESS_TARGET) {
                return SDL_SetError("Texture not created with SDL_TEXTUREACCESS_TARGET");
            }
    // *** EMPHASIS MINE: this is the problem.
            if (texture->native) {
                /* Always render to the native texture */
                texture = texture->native;
            }
        }
    
    // CODE SNIPPED
    
        renderer->target = texture;
    
    // CODE SNIPPED
    }
    
    SDL_Texture *
    SDL_GetRenderTarget(SDL_Renderer *renderer)
    {
        return renderer->target;
    }
    
    

    In short, the renderer saves the current render target in renderer->target, but only after converting the texture to its native form. When we call SDL_GetRenderTarget, we get that native texture, which may or may not be the same pointer we passed in.
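
    A corollary of the snippet above: `texture->native` only exists when the requested pixel format isn't directly supported by the renderer's backend. So another way to sidestep the mismatch, sketched below under that assumption (the helper name is mine, not SDL's), is to create target textures in a format the renderer reports as natively supported:

    ```cpp
    #include <SDL.h>

    // Pick a pixel format the renderer supports natively, so SDL has no
    // reason to create a hidden texture->native shadow copy.
    static Uint32 NativeTargetFormat(SDL_Renderer *renderer) {
        SDL_RendererInfo info;
        if (SDL_GetRendererInfo(renderer, &info) != 0 ||
            info.num_texture_formats == 0) {
            return SDL_PIXELFORMAT_RGBA8888;  // fallback guess
        }
        // The first entry of texture_formats is a format the backend
        // handles directly, with no conversion texture behind it.
        return info.texture_formats[0];
    }
    ```

    Creating the target texture with `NativeTargetFormat(render)` instead of hard-coding `SDL_PIXELFORMAT_RGBA8888` should then make SDL_GetRenderTarget return the same pointer you set, since no native conversion texture is involved.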