
Depth component readRenderTargetPixels in Three.js?


Can depth pixel values be read back from THREE.WebGLRenderer, similar to the .readRenderTargetPixels functionality? Basically, is there an update to this question? My starting point is three.js r80. Normalized values are fine as long as I can also convert them to distances.

Related methods:

  • I see that WebGL's gl.readPixels does not support gl.DEPTH_COMPONENT like OpenGL's .glReadPixels does.

  • THREE.WebGLRenderTarget does support a .depthTexture via the WEBGL_depth_texture extension (fetched by THREE.WebGLRenderer). However, THREE.DepthTexture does not expose .image.data the way THREE.DataTexture does.

  • I also see that THREE.WebGLShadowMap uses .renderBufferDirect with a THREE.MeshDepthMaterial.
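The .depthTexture route above can be sketched as follows. This is a minimal sketch modeled on the official webgl_depth_texture example; the three-argument render(scene, camera, target) call is the r80-era signature, and the extension must be supported by the browser:

```javascript
// Sketch: attach a THREE.DepthTexture to a render target so depth is
// written to a texture (requires the WEBGL_depth_texture extension).
var target = new THREE.WebGLRenderTarget( window.innerWidth, window.innerHeight );
target.texture.minFilter = THREE.NearestFilter;
target.texture.magFilter = THREE.NearestFilter;
target.depthTexture = new THREE.DepthTexture();
target.depthTexture.type = THREE.UnsignedShortType;

// Depth ends up in target.depthTexture, which can be sampled in a later
// shader pass -- but it has no .image.data to inspect on the CPU.
renderer.render( scene, camera, target );
```

The depth texture can then be bound as a uniform in a post-processing pass; it still cannot be read back directly with readPixels.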

Data types:

  • A non-rendered canvas can use .getContext('2d') and .getImageData(x,y,w,h).data to read the pixels (top to bottom) as a Uint8ClampedArray.
  • A rendered canvas uses getContext('webgl') for render(), and a canvas may only expose one context type, so getImageData cannot be used.
  • Instead, render to a target and use .readRenderTargetPixels(...myArrToCopyInto...) to copy out the pixels (bottom to top) into your Uint8Array.
  • Any canvas can use .toDataURL("image/png") to return a String in the pattern "data:image/png;base64,theBase64PixelData".
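The render-to-target route above can be sketched like this (assuming an existing renderer, scene and camera; the sizes are illustrative, and the three-argument render() call is the r80-era signature):

```javascript
var width = 256, height = 256;
var target = new THREE.WebGLRenderTarget( width, height );

// draw into the off-screen target instead of the canvas
renderer.render( scene, camera, target );

// copy out the bottom-to-top RGBA bytes, 4 per pixel
var pixels = new Uint8Array( width * height * 4 );
renderer.readRenderTargetPixels( target, 0, 0, width, height, pixels );
```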

Solution

  • You can't directly get the content of the FrameBuffer's depth attachment using readPixels, whether it's a RenderBuffer or a (depth) texture. You have to write the depth data into the color attachment instead.

    • You can render your scene using MeshDepthMaterial, as the shadow-mapping technique does. You end up with the depth RGBA-encoded in the color attachment, and you can get it using readPixels (still RGBA-encoded). It means you have to render your scene twice: once for the depth and once to display the scene on screen.

    • If the depth you want matches what you show on screen (same camera/point of view), you can use WEBGL_depth_texture to render the depth and the display in one single render loop. This can be faster if your scene contains lots of objects/materials.

    • Finally, if your hardware supports OES_texture_float, you should be able to draw the depth data into a LUMINANCE/FLOAT texture instead of RGBA. That way you can get floating-point depth data directly and skip a costly decoding step in JS.
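For the MeshDepthMaterial route, the RGBA bytes you copy out with readRenderTargetPixels still need decoding in JS. Below is a sketch of that decode, mirroring three.js's unpackRGBAToDepth shader chunk (this assumes the material uses THREE.RGBADepthPacking, available in recent revisions), plus the perspective depth-to-view-space-Z conversion the question asks about, mirroring perspectiveDepthToViewZ:

```javascript
// Decode one pixel's RGBA bytes (0..255, as returned by readRenderTargetPixels)
// back into a normalized depth in [0, 1). Mirrors the GLSL unpackRGBAToDepth
// used with THREE.RGBADepthPacking.
function unpackRGBAToDepth(r, g, b, a) {
  var UnpackDownscale = 255 / 256; // compensates for the shader's PackUpscale
  return (
    (r / 255) * (UnpackDownscale / (256 * 256 * 256)) +
    (g / 255) * (UnpackDownscale / (256 * 256)) +
    (b / 255) * (UnpackDownscale / 256) +
    (a / 255) * UnpackDownscale
  );
}

// Convert normalized perspective depth to view-space Z (negative, in world
// units): depth 0 maps to -near and depth 1 maps to -far. near/far are the
// camera's clipping planes.
function perspectiveDepthToViewZ(depth, near, far) {
  return (near * far) / ((far - near) * depth - far);
}
```

The distance along the view ray is then just the magnitude of the returned view-space Z (for the pixel at the screen center; off-center pixels need an extra correction for the ray direction).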