I am trying to use WebGL to render a scene to a framebuffer and read back the color value of a specified pixel (the one that is clicked on). The issue is that when I try to get pixel data using gl.readPixels, every pixel except (0,0) returns all zeros, (0,0,0,0). The (0,0) pixel (the bottom-left corner, in WebGL's coordinate system) returns what is expected: the clear color used for the framebuffer. Here is the main part of the code I am using:
// Render the scene into the offscreen framebuffer.
gl.bindFramebuffer( gl.FRAMEBUFFER, fbo );
gl.viewport( 0, 0, canvas.width, canvas.height );
gl.clearColor( 1.0, 1.0, 1.0, 1.0 );
gl.clearDepth( 1.0 );
gl.clear( gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT );
// render code goes here
// Read back a single RGBA pixel at the clicked position.
var pixelData = new Uint8Array( 4 );
gl.readPixels( screenPositionX, screenPositionY, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixelData );
gl.bindFramebuffer( gl.FRAMEBUFFER, null );
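For context, screenPositionX and screenPositionY come from a mouse click; a rough sketch of that derivation, assuming the canvas is not scaled by CSS (gl.readPixels uses a bottom-left origin while browser events use top-left, so the Y coordinate is flipped):

canvas.addEventListener( 'click', function( event ) {
    var rect = canvas.getBoundingClientRect();
    // Browser click coordinates have a top-left origin; flip Y for gl.readPixels.
    screenPositionX = event.clientX - rect.left;
    screenPositionY = canvas.height - ( event.clientY - rect.top ) - 1;
} );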
If I use gl.readPixels on the scene rendered to the canvas (the default framebuffer), I get the correct results, but not when I use it on my framebuffer. I suppose this means I could draw the framebuffer's contents to the canvas and then use gl.readPixels, but I would prefer not to, since the framebuffer is not meant to be seen and is used only for data collection. Any info or suggestions?
EDIT: I have found the solution and posted it below.
I found the solution, and it was one of those boneheaded mistakes. The texture I was attaching as the color attachment was only 1x1 pixel. That explains why (0,0) was the only coordinate returning accurate data: every other pixel fell outside the bounds of the attached texture. Once I resized the texture to match the canvas, everything worked. Thanks to gman for pointing me toward checking the attachments.
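For anyone hitting the same issue, here is a minimal sketch of the corrected framebuffer setup; the variable name tex is assumed for the color texture. The depth renderbuffer is included because the render pass clears the depth buffer, and the NEAREST/CLAMP_TO_EDGE settings are needed because a canvas-sized texture is usually not a power of two:

var tex = gl.createTexture();
gl.bindTexture( gl.TEXTURE_2D, tex );
// Allocate the color texture at the canvas size, not 1x1.
gl.texImage2D( gl.TEXTURE_2D, 0, gl.RGBA, canvas.width, canvas.height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null );
gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST );
gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST );
gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE );
gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE );

var fbo = gl.createFramebuffer();
gl.bindFramebuffer( gl.FRAMEBUFFER, fbo );
gl.framebufferTexture2D( gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0 );

// Depth renderbuffer, sized to match the color attachment.
var depthBuffer = gl.createRenderbuffer();
gl.bindRenderbuffer( gl.RENDERBUFFER, depthBuffer );
gl.renderbufferStorage( gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, canvas.width, canvas.height );
gl.framebufferRenderbuffer( gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, depthBuffer );

// Sanity check that the attachments form a complete framebuffer.
if ( gl.checkFramebufferStatus( gl.FRAMEBUFFER ) !== gl.FRAMEBUFFER_COMPLETE ) {
    console.log( 'framebuffer is incomplete' );
}
gl.bindFramebuffer( gl.FRAMEBUFFER, null );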