
Read back texture coordinates from rendered image in OpenGL?


If I render a scene in OpenGL, is it possible to get back the texture coordinates that were used to paint a given pixel?

For example, suppose I render a triangle that has 3 vertices (x,y,z) and 3 texture coordinates (u,v), and then I select a pixel on the triangle. I can get the color and the depth at that pixel using OpenGL calls, but is it possible to also get the interpolated texture coordinate?

Basically, I want to get the image point on the texture that was used to paint the triangle at a particular pixel.

I am guessing the only real way to do this is to reconstruct the ray from the camera center through the pixel on the image plane, do a ray-triangle intersection to figure out which triangle was hit, look up that triangle's texture coordinates in my texture coordinate array, and then do my own barycentric interpolation. I would like to avoid all of that if possible.

Edit: The code I currently have didn't appear properly formatted in the bounty request below, so I've put it here. This is what I have right now, I would like to add reading texture coordinates u,v to it, ideally without a shader program if possible.

    // First initialize the FBO; I am interested in depth and color

    // Create a framebuffer object
    glGenFramebuffers(1, &fboId);
    glBindFramebuffer(GL_FRAMEBUFFER, fboId);

    // Create a texture to store color info
    glGenTextures(1, &color);
    glBindTexture(GL_TEXTURE_2D, color);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, color, 0);
    glBindTexture(GL_TEXTURE_2D, 0);

    // Create a renderbuffer to store depth info
    glGenRenderbuffers(1, &depth);
    glBindRenderbuffer(GL_RENDERBUFFER, depth);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);
    glBindRenderbuffer(GL_RENDERBUFFER, 0);

    // Attach the renderbuffer to the depth attachment point
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depth);
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    // Then later in the code, I use the actual buffer:

    glBindFramebuffer(GL_FRAMEBUFFER, fboId);
    ...
    // draw model
    ...
    // read color and depth values (want to also read texture coordinates u and v here)
    ...
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

Solution

  • If you are determined to do this without using shaders, you can render your scene without lighting, using a single special texture for every object. Fill this texture with two gradients: the red channel goes from 0 to 255 horizontally and the green channel goes from 0 to 255 vertically. Now you have effectively painted the scene with its texture coordinates (assuming they are in the range 0-1). You can then use glReadPixels to read back the buffer (or the part of it) you have just rendered, and decode u from the red channel and v from the green channel. Note that with an 8-bit-per-channel color buffer you only get 256 distinct values per axis; rendering into a floating-point color attachment gives more precision.