There are several examples of how to display webcam video with ThreeJS by creating a video texture:
const video = document.getElementById( 'video' );
const texture = new THREE.VideoTexture( video );
texture.colorSpace = THREE.SRGBColorSpace;
const material = new THREE.MeshBasicMaterial( { map: texture } );
const geometry = new THREE.PlaneGeometry(1, 1);
const plane = new THREE.Mesh(geometry, material);
plane.position.set(0.5, 0.5, 0);
Here video is the HTML element that plays the webcam feed. The problem is that I can't get at that feed and manipulate it with a fragment shader!
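For reference, the feed reaches that element via getUserMedia; a minimal sketch, assuming video is the element fetched above:
navigator.mediaDevices.getUserMedia({ video: true, audio: false })
    .then((stream) => {
        video.srcObject = stream; // route the webcam stream into the video element
        video.play();             // start playback so frames are available to the texture
    })
    .catch((err) => console.error('Unable to access the webcam:', err));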
How can I manipulate the webcam's video feed in a shader file? The material for my shader files is defined like this:
const vsh = await fetch('vertex-shader.glsl');
const fsh = await fetch('fragment-shader.glsl');
material = new THREE.ShaderMaterial({
uniforms: {
resolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) },
time: { value: 0.0 },
},
vertexShader: await vsh.text(),
fragmentShader: await fsh.text()
});
Any ideas or a simple example that demonstrates this?
In the shader material, add the video feed as a texture, like this:
const videoTexture = new THREE.VideoTexture(video);
// set up the material with the webcam texture and shaders
const material = new THREE.ShaderMaterial({
uniforms: {
resolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) },
time: { value: 0.0 },
uTexture: { value: videoTexture }
},
vertexShader: await vsh.text(),
fragmentShader: await fsh.text()
});
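This material then goes on a mesh just like the MeshBasicMaterial in the question, for example (a sketch, reusing the plane geometry from above):
const geometry = new THREE.PlaneGeometry(1, 1);
const plane = new THREE.Mesh(geometry, material);
scene.add(plane);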
Then keep the video texture updated in your animation loop, like this:
function animate() {
requestAnimationFrame(animate);
if (video.readyState === video.HAVE_ENOUGH_DATA) {
videoTexture.needsUpdate = true;
}
renderer.render(scene, camera);
}
animate();
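The material also declares a time uniform; if your fragment shader uses it, you can advance it in the same loop, for example with a THREE.Clock (a sketch, assuming the material variable is still in scope):
const clock = new THREE.Clock();
// inside animate(), before renderer.render(scene, camera):
material.uniforms.time.value = clock.getElapsedTime(); // advance the time uniform each frame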
In your GLSL fragment shader, sample the webcam texture like this:
uniform vec2 resolution;
varying vec2 vUvs;
uniform sampler2D uTexture;
void main()
{
    // vUvs is already normalized to [0, 1], so it can be used directly as the texture coordinate
    vec4 webcamColor = texture2D(uTexture, vUvs);
    gl_FragColor = webcamColor;
}
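The varying vUvs has to come from your vertex shader; a minimal sketch of a matching vertex-shader.glsl, assuming nothing beyond the attributes and matrices ThreeJS injects into ShaderMaterial:
varying vec2 vUvs;

void main()
{
    vUvs = uv; // pass the built-in uv attribute through to the fragment shader
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}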