There are multiple examples of how to display a webcam video in Three.js by creating a video texture like so:
const video = document.getElementById( 'video' );
const texture = new THREE.VideoTexture( video );
texture.colorSpace = THREE.SRGBColorSpace;
const material = new THREE.MeshBasicMaterial( { map: texture } );
const geometry = new THREE.PlaneGeometry(1, 1);
const plane = new THREE.Mesh(geometry, material);
plane.position.set(0.5, 0.5, 0);
Here video is an HTML &lt;video&gt; element that plays the webcam's feed. The problem is that I can't access the feed and manipulate it in my fragment shaders!
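For reference, the video element is wired to the webcam roughly like this. This is a hedged sketch: the helper name startWebcam is my own, and it just wraps the standard navigator.mediaDevices.getUserMedia API:

```javascript
// Hypothetical helper: attach the webcam stream to a <video> element.
// The browser-only APIs are only touched when the function is called.
async function startWebcam(video) {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: false
  });
  video.srcObject = stream;   // feed the stream into the element
  await video.play();         // start playback so frames are available
  return video;
}
```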
How can I manipulate the webcam's video feed in my shaders? My shader material is defined like so:
const vsh = await fetch('vertex-shader.glsl');
const fsh = await fetch('fragment-shader.glsl');
material = new THREE.ShaderMaterial({
uniforms: {
resolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) },
time: { value: 0.0 },
},
vertexShader: await vsh.text(),
fragmentShader: await fsh.text()
});
Any ideas, or a simple example that shows how?
In your shader material, add the video feed as a texture uniform like so:
const videoTexture = new THREE.VideoTexture(video);
// set up the shader material with the webcam texture as a uniform
const material = new THREE.ShaderMaterial({
uniforms: {
resolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) },
time: { value: 0.0 },
uTexture: { value: videoTexture }
},
vertexShader: await vsh.text(),
fragmentShader: await fsh.text()
});
Then update the texture in your render loop like so (recent versions of three.js update a VideoTexture automatically each frame, so setting needsUpdate by hand is a harmless belt-and-braces step):
function animate() {
requestAnimationFrame(animate);
// keep the time uniform ticking for any animated effects
material.uniforms.time.value = performance.now() / 1000;
if (video.readyState === video.HAVE_ENOUGH_DATA) {
videoTexture.needsUpdate = true;
}
renderer.render(scene, camera);
}
animate();
In your fragment shader (GLSL) file, sample the camera feed like so:
uniform vec2 resolution;
varying vec2 vUvs;
uniform sampler2D uTexture;
void main()
{
// vUvs are already normalized [0, 1] texture coordinates,
// so they can be used to sample the video texture directly
vec4 webcamColor = texture2D(uTexture, vUvs);
gl_FragColor = webcamColor;
}
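Note that the fragment shader reads varying vec2 vUvs, so your vertex-shader.glsl has to declare and fill that varying too. A minimal sketch (the name vUvs is arbitrary, it just has to match on both sides; uv, position, projectionMatrix and modelViewMatrix are attributes/uniforms that three.js injects into ShaderMaterial automatically):

```glsl
varying vec2 vUvs;

void main() {
  // pass the built-in uv attribute through to the fragment shader
  vUvs = uv;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```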