I'm trying to build a simple live webcam stream (one broadcaster serving many clients) and I'm having trouble displaying the video on the client side.
My basic setup uses MediaRecorder to record the webcam in 1000 ms chunks, sends each chunk to the server over a WebSocket, and the server broadcasts it to all the other users in the room, whose pages continuously convert and play the incoming ArrayBuffers. Here is my code:
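For context, my server-side relay is essentially this (a minimal sketch assuming a Socket.IO-style API; `attachRelay` is just a name I use here, and I'm assuming the message shape `{ stream, room, cam }` from the client code below):

```javascript
// Hypothetical Socket.IO-style relay: rebroadcast each incoming chunk
// to everyone else in the same room, excluding the sender.
function attachRelay(io) {
  io.on('connection', (socket) => {
    socket.on('video-stream', (msg) => {
      // forward only the raw chunk; viewers don't need room/cam metadata here
      socket.to(msg.room).emit('video-stream', msg.stream);
    });
  });
}
```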
// receive video chunks from the server
socket.on('video-stream', (stream) => {
  console.log(stream);
  // Blob takes an array of parts, so the chunk must be wrapped in [ ]
  createVideo(URL.createObjectURL(new Blob([stream])), 1);
});
// record video in chunks, send over websocket
navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then((stream) => {
  setInterval(function () {
    record(stream, 1000).then((recording) => {
      socket.emit('video-stream', {
        stream: recording,
        room: window.location.pathname.split('/')[0] || '/',
        cam: 1,
      });
    });
  }, 1000);
});
// record `stream` for `ms` milliseconds, then resolve with the recorded chunks
var record = (stream, ms) => {
  var rec = new MediaRecorder(stream),
    data = [];
  rec.ondataavailable = (e) => data.push(e.data);
  rec.start();
  var stopped = new Promise(
    (y, n) => ((rec.onstop = y), (rec.onerror = (e) => n(e.error || e.name)))
  );
  return Promise.all([stopped, wait(ms).then(() => rec.stop())]).then(() => data);
};
var wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
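One thing I later realized: instead of creating a new MediaRecorder every second (each 1 s recording is a complete, standalone file with its own header), MediaRecorder can be started once with a timeslice, so every `dataavailable` chunk continues the same stream. A minimal sketch of the wiring, kept as a standalone helper; `streamInChunks` and `onChunk` are names I made up for illustration:

```javascript
// Hypothetical helper: wire one long-lived MediaRecorder so every
// non-empty chunk is forwarded to a callback (e.g. a socket.emit call).
function streamInChunks(recorder, onChunk, timesliceMs) {
  recorder.ondataavailable = (e) => {
    if (e.data.size > 0) onChunk(e.data); // skip empty flushes
  };
  recorder.start(timesliceMs); // one continuous recording, delivered in slices
}
```

Usage against the code above would look roughly like `streamInChunks(new MediaRecorder(stream), (blob) => socket.emit('video-stream', { stream: blob, room: '/', cam: 1 }), 1000);`.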
function createVideo(stream, cam) {
  var video = document.getElementById('cam-' + cam + '-embed');
  video.src = stream;
  //video.addEventListener('click', () => {
  //  if (video.volume != 0) video.volume = 0;
  //  else video.volume = 1;
  //});
}
The problem is that this changes the `src` of the video element on the page every 1000 ms, which makes the video flicker constantly and never play smoothly. I need some way to merge the incoming video buffers on the client instead of repeatedly swapping the video element's source. I've had no luck figuring out how to do this. Can someone help me merge the incoming data into a single video?
I have also tried:
- WebRTC: doesn't work, because the broadcasting user would need too much upload bandwidth.
- Encoding and concatenating the video on the server, then piping it to the response as a readable stream. That didn't work either.
You need MediaSource and SourceBuffer. Something like this:
var mediaSource = new MediaSource();
var sourceBuffer = null;
// the SourceBuffer can only be created once the MediaSource is open
mediaSource.addEventListener("sourceopen", function () {
  sourceBuffer = mediaSource.addSourceBuffer("video/webm;codecs=vp8,opus");
});
var video = document.querySelector("#video");
// attaching the MediaSource to the element fires "sourceopen"
video.src = URL.createObjectURL(mediaSource);
socket.on('onChunk', (d) => {
  if (mediaSource.readyState == 'open') {
    sourceBuffer.appendBuffer(d);
  }
});
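One caveat: `appendBuffer` throws an `InvalidStateError` if it's called while the SourceBuffer is still processing the previous append (`updating === true`), which can easily happen when chunks arrive every second. A minimal sketch of a queue that holds chunks until the buffer is free; `createAppendQueue` is a name I made up for illustration:

```javascript
// Minimal append queue: buffer chunks while the SourceBuffer is busy
// ("updating"), and flush the next one on each 'updateend' event.
function createAppendQueue(sourceBuffer) {
  const queue = [];
  function flush() {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  }
  sourceBuffer.addEventListener('updateend', flush);
  return {
    push(chunk) {
      queue.push(chunk);
      flush(); // append immediately if the buffer is idle
    },
  };
}
```

With this, the socket handler above would call `queue.push(d)` instead of `sourceBuffer.appendBuffer(d)` directly.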