Tags: c++, opengl, shader, alphablending, render-to-texture

Creating and blending a dynamic texture in OpenGL


I need to render a sphere to a texture (done using a Framebuffer Object (FBO)), and then alpha blend that texture with the back buffer. So far I'm not doing any processing with the texture except clearing it at the beginning of every frame.
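
For reference, the offscreen target is set up roughly like this (a minimal sketch rather than my exact code; the names fboTex, depthRb and fbo, and the 1280x720 size, are placeholders):

// Fullscreen RGBA8 color texture plus a depth renderbuffer, attached to an FBO.
GLuint fboTex, depthRb, fbo;

glGenTextures(1, &fboTex);
glBindTexture(GL_TEXTURE_2D, fboTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 1280, 720, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

glGenRenderbuffers(1, &depthRb);
glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, 1280, 720);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, fboTex, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRb);

// glCheckFramebufferStatus(GL_FRAMEBUFFER) should return GL_FRAMEBUFFER_COMPLETE here.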

I should say that my scene consists of nothing but a planet in empty space; the sphere should appear next to or around the planet (kind of like a moon for now). When I render the sphere directly to the back buffer, it displays correctly; but when I do the intermediate step of rendering it to a texture and then blending that texture with the back buffer, the sphere only shows up when it is in front of the planet. The part that isn't in front is simply "cut off":

Sphere that is cut off by the edge of the planet

I render the sphere using glutSolidSphere to an RGBA8 fullscreen texture that's bound to an FBO, making sure that every sphere pixel receives an alpha value of 1.0. I then pass the texture to a fragment shader program and use the following code to render a fullscreen quad - with the texture mapped onto it - to the back buffer while alpha blending:

glBindFramebuffer(GL_FRAMEBUFFER, 0);               // back to the default framebuffer
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  // standard alpha blending
glEnable(GL_BLEND);

glDisable(GL_DEPTH_TEST);

// Reset both matrices so the quad's coordinates cover the whole viewport
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();

glBegin(GL_QUADS);
glTexCoord2i(0, 1);
glVertex3i(-1,  1, -1);   // TOP LEFT
glTexCoord2i(0, 0);
glVertex3i(-1, -1, -1);   // BOTTOM LEFT
glTexCoord2i(1, 0);
glVertex3i( 1, -1, -1);   // BOTTOM RIGHT
glTexCoord2i(1, 1);
glVertex3i( 1,  1, -1);   // TOP RIGHT
glEnd();

// Restore the projection and modelview matrices
glPopMatrix();
glMatrixMode(GL_MODELVIEW);
glPopMatrix();

glEnable(GL_DEPTH_TEST);
glDisable(GL_BLEND);
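
Earlier in the frame, the intent is that the texture contains only the sphere, with alpha 1.0 on the sphere and 0 everywhere else - roughly like this (again a sketch using the placeholder names from the setup sketch above, not my exact code):

// Render the sphere into the FBO, then bind its texture for the blend pass.
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);   // alpha 0 everywhere the sphere doesn't cover
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

glutSolidSphere(1.0, 32, 32);           // drawn so its pixels end up with alpha 1.0

glBindTexture(GL_TEXTURE_2D, fboTex);   // sampled as BlitSamp by the fullscreen quad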

This is the shader code (taken from an FX file written in Cg):

sampler2D BlitSamp = sampler_state
{
    MinFilter = LINEAR;
    MagFilter = LINEAR;
    MipFilter = LINEAR;
    AddressU = Clamp;
    AddressV = Clamp;
};

float4 blendPS(float2 texcoords : TEXCOORD0) : COLOR
{
    float4 outColor = tex2D(BlitSamp, texcoords);
    return outColor;
}
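
On the C++ side, BlitSamp gets hooked up through the Cg runtime. A simplified sketch of that hookup - using a standalone Cg fragment program instead of the full FX effect path, with placeholder names such as blend.cg and fboTex - looks like this:

// Needs <Cg/cg.h> and <Cg/cgGL.h>; simplified, not the exact FX-based code.
CGcontext cgContext = cgCreateContext();
CGprofile fragProfile = cgGLGetLatestProfile(CG_GL_FRAGMENT);
CGprogram blendProg = cgCreateProgramFromFile(cgContext, CG_SOURCE, "blend.cg",
                                              fragProfile, "blendPS", NULL);
cgGLLoadProgram(blendProg);

// Before drawing the fullscreen quad:
cgGLEnableProfile(fragProfile);
cgGLBindProgram(blendProg);
CGparameter sampParam = cgGetNamedParameter(blendProg, "BlitSamp");
cgGLSetTextureParameter(sampParam, fboTex);   // fboTex: the FBO color texture
cgGLEnableTextureParameter(sampParam);

// ...draw the quad...

cgGLDisableTextureParameter(sampParam);
cgGLDisableProfile(fragProfile);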

I don't even know whether this is a problem with the depth buffer or with alpha blending; I've tried many combinations of enabling and disabling depth testing (with a depth buffer attached to the FBO) and alpha blending.

EDIT: I tried just rendering a blank fullscreen quad straight to the back buffer, and even that was cropped around the planet's edges. For some reason, enabling depth testing while rendering the quad (that is, removing the lines glDisable(GL_DEPTH_TEST) and glEnable(GL_DEPTH_TEST) in the code above) got rid of the problem, but now everything except the planet and the sphere appears white:

Sphere with incorrect white background

I made sure (and could confirm) that the alpha channel of the texture is 0 at every pixel except the sphere's, so I don't understand where the whiteness could be coming from. (I'd also still be interested in an explanation of why enabling depth testing has this effect.)


Solution

  • I solved it, or at least came up with a workaround.

    First off, the whiteness stems from the fact that the clear color had been set with glClearColor(1.0f, 1.0f, 1.0f, 1000.0f); everything but the planet was never written to in the end, so those pixels simply kept that white clear color. I now copy the contents of the back buffer (which holds the planet, the atmosphere, and the space around it) into the texture before rendering the sphere, and I render the atmosphere and space before that copy/blit operation so they are included in it (the order of operations is sketched below). Previously, everything but the planet itself was rendered after my quad, which - when depth testing was enabled - apparently placed everything behind the quad, making it invisible.

    The reference implementation of the effect I'm trying to achieve has always used this kind of blit operation in its code, but I didn't think it was necessary for the effect. Now I feel like there might be no other way...
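
    Roughly, the frame now runs in this order (a sketch of the steps above, assuming a glBlitFramebuffer-style copy; fbo, width and height are placeholder names, not my exact code):

    // 1. Render planet, atmosphere and space to the back buffer as usual.

    // 2. Copy the back buffer into the FBO's color texture.
    glBindFramebuffer(GL_READ_FRAMEBUFFER, 0);      // read from the back buffer
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fbo);    // write into the offscreen texture
    glBlitFramebuffer(0, 0, width, height,
                      0, 0, width, height,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);

    // 3. Render the sphere into the FBO, on top of that copy.
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glutSolidSphere(1.0, 32, 32);

    // 4. Blend the FBO texture over the back buffer with the fullscreen quad as before.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);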