Using A-Frame and three.js, I want to render the scene to a WebGLRenderTarget and create a material based on its texture, like so:
var targetPlane = new THREE.Mesh(
  new THREE.PlaneBufferGeometry(2, 2),
  new THREE.MeshBasicMaterial({color: 'blue'})
);
targetPlane.position.set(-2, 1, -2);
scene.add(targetPlane);

var redPlane = new THREE.Mesh(
  new THREE.PlaneBufferGeometry(2, 2),
  new THREE.MeshBasicMaterial({color: 'red'})
);
redPlane.position.set(2, 1, -2);
scene.add(redPlane);

var myCamera = new THREE.PerspectiveCamera(45, 1, 0.1, 1000);
myCamera.position.set(2, 1, 0);
scene.add(myCamera);

var myRenderer = new THREE.WebGLRenderer();
myRenderer.setSize(200, 200);
myRenderer.render(scene, myCamera);

var renderTarget = new THREE.WebGLRenderTarget(200, 200, {
  minFilter: THREE.LinearFilter,
  magFilter: THREE.NearestFilter
});
myRenderer.setRenderTarget(renderTarget);
myRenderer.render(scene, myCamera);

targetPlane.material = new THREE.MeshBasicMaterial({map: renderTarget.texture});
renderer.render(scene, camera);
The outcome is just a blank white material. The camera settings are correct; the following code produces the expected image:
myRenderer.setRenderTarget(null);
myRenderer.render(scene, myCamera);
img.src = myRenderer.domElement.toDataURL();
I made a fiddle to demonstrate.
You can't use a render target produced by one WebGLRenderer with a different instance of WebGLRenderer, because WebGL resources like buffers, textures, and shader programs can't be shared across WebGL contexts. Each WebGLRenderer owns its own context, and renderTarget.texture only exists in the context of the renderer that filled it. Hence, you have to use the same renderer instance (renderer) both to produce the render target and to render the final scene.
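A minimal sketch of the fix, assuming the scene, myCamera, and targetPlane setup from the question: a single renderer performs both the render-to-target pass and the final pass to the canvas.

```javascript
// One WebGLRenderer instance for both passes.
var renderer = new THREE.WebGLRenderer();
renderer.setSize(200, 200);
document.body.appendChild(renderer.domElement);

var renderTarget = new THREE.WebGLRenderTarget(200, 200, {
  minFilter: THREE.LinearFilter,
  magFilter: THREE.NearestFilter
});

// First pass: render the scene into the render target.
renderer.setRenderTarget(renderTarget);
renderer.render(scene, myCamera);

// Use the target's texture, then switch back to the default
// framebuffer and render the final scene with the SAME renderer.
targetPlane.material = new THREE.MeshBasicMaterial({map: renderTarget.texture});
renderer.setRenderTarget(null);
renderer.render(scene, myCamera);
```

Note the setRenderTarget(null) call: it restores rendering to the canvas before the final pass.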
Updated fiddle: https://jsfiddle.net/x4snr9tq/
three.js R112