I am trying to dockerize a simple Node application that streams the content of an RTSP camera over WebSocket. For testing I am using this sample stream: rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4
My Node code is the following:
const VideoStream = require('rtsp-multi-stream');

const streamer = new VideoStream.VideoStream({
  debug: true,
  wsPort: 9000,
  ffmpegPath: 'ffmpeg',
  ffmpegArgs: {
    '-b:v': '2048K',
    '-an': '',
    '-r': '24',
  },
});

// Log the keys of the currently active muxers every 10 s.
setInterval(() => console.log([...streamer.liveMuxers.keys()]), 10000);
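For context: the server config above contains no RTSP URL, and judging from the Docker log further down (`Socket connected /?url=rtsp%3A%2F%2F…`), each WebSocket client passes the RTSP source as a percent-encoded `url` query parameter. A minimal sketch of building that connection URL, assuming the server listens on `localhost:9000` as configured:

```javascript
// Build the WebSocket URL a client would use to request a stream.
// The RTSP source goes into a percent-encoded `url` query parameter.
const rtspSource =
  'rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4';
const wsUrl = `ws://localhost:9000/?url=${encodeURIComponent(rtspSource)}`;

console.log(wsUrl);
// A browser player (e.g. JSMpeg) would then open: new WebSocket(wsUrl)
```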
When I run it locally, it works fine:
$ node ./rtsp.js
ffmpeg version 5.1.2-essentials_build-www.gyan.dev Copyright (c) 2000-2022 the FFmpeg developers
  built with gcc 12.1.0 (Rev2, Built by MSYS2 project)
  configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-libass --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
  libavutil      57. 28.100 / 57. 28.100
  libavcodec     59. 37.100 / 59. 37.100
  libavformat    59. 27.100 / 59. 27.100
  libavdevice    59.  7.100 / 59.  7.100
  libavfilter     8. 44.100 /  8. 44.100
  libswscale      6.  7.100 /  6.  7.100
  libswresample   4.  7.100 /  4.  7.100
  libpostproc    56.  6.100 / 56.  6.100
prueba: New WebSocket Connection (1 total)
Input #0, rtsp, from 'rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4':
  Metadata:
    title           : BigBuckBunny_115k.mp4
  Duration: 00:10:34.63, start: 0.000000, bitrate: N/A
  Stream #0:0: Audio: aac (LC), 12000 Hz, stereo, fltp
  Stream #0:1: Video: h264 (High), yuv420p(progressive), 240x160 [SAR 32:27 DAR 16:9], 24 fps, 24.08 tbr, 90k tbn
Stream mapping:
  Stream #0:1 -> #0:0 (h264 (native) -> mpeg1video (native))
  Stream #0:0 -> #0:1 (aac (native) -> mp2 (native))
Press [q] to stop, [?] for help
[mpeg1video @ 00000217d19e8840] too many threads/slices (11), reducing to 10
Output #0, mpegts, to 'pipe:':
  Metadata:
    title           : BigBuckBunny_115k.mp4
    encoder         : Lavf59.27.100
  Stream #0:0: Video: mpeg1video, yuv420p(progressive), 240x160 [SAR 32:27 DAR 16:9], q=2-31, 200 kb/s, 30 fps, 90k tbn
    Metadata:
      encoder         : Lavc59.37.100 mpeg1video
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
  Stream #0:1: Audio: mp2, 16000 Hz, stereo, s16, 160 kb/s
    Metadata:
      encoder         : Lavc59.37.100 mp2
[mpegts @ 00000217d38c64c0] Non-monotonous DTS in output stream 0:1; previous: 3774, current: 2576; changing to 3775. This may result in incorrect timestamps in the output file.
frame=  1 fps=0.0 q=0.0 size=    0kB time=00:00:00.60 bitrate=   0.0kbits/s
frame= 23 fps=0.0 q=2.8 size=   69kB time=00:00:01.46 bitrate= 384.0kbits/s
frame= 35 fps= 31 q=2.6 size=  104kB time=00:00:01.97 bitrate= 430.0kbits/s
frame= 48 fps= 30 q=2.3 size=  135kB time=00:00:02.47 bitrate= 446.3kbits/s
frame= 60 fps= 28 q=2.1 size=  168kB time=00:00:02.98 bitrate= 461.7kbits/s
frame= 72 fps= 27 q=2.0 size=  194kB time=00:00:03.48 bitrate= 454.9kbits/s
Then I move it into Docker with the following Dockerfile:
FROM node:16-alpine
RUN apk update && apk add --no-cache ffmpeg
RUN mkdir -p /home/node/app
WORKDIR /home/node/app
COPY . .
RUN npm install
CMD [ "node", "rtsp-multi.js" ]
When I run the container, I get the following error:
Socket connected /?url=rtsp%3A%2F%2Fwowzaec2demo.streamlock.net%2Fvod%2Fmp4%3ABigBuckBunny_115k.mp4
ffmpeg version 5.1.2 Copyright (c) 2000-2022 the FFmpeg developers
built with gcc 12.2.1 (Alpine 12.2.1_git20220924-r3) 20220924
configuration: --prefix=/usr --enable-avfilter --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-gnutls --enable-gpl --enable-libass --enable-libmp3lame --enable-libpulse --enable-libvorbis --enable-libvpx --enable-libxvid --enable-libx264 --enable-libx265 --enable-libtheora --enable-libv4l2 --enable-libdav1d --enable-lto --enable-postproc --enable-pic --enable-pthreads --enable-shared --enable-libxcb --enable-librist --enable-libsrt --enable-libssh --enable-libvidstab --disable-stripping --disable-static --disable-librtmp --disable-lzma --enable-libaom --enable-libopus --enable-libsoxr --enable-libwebp --enable-vaapi --enable-vdpau --enable-vulkan --enable-libdrm --enable-libzmq --optflags=-O2 --disable-debug --enable-libsvtav1
libavutil 57. 28.100 / 57. 28.100
libavcodec 59. 37.100 / 59. 37.100
libavformat 59. 27.100 / 59. 27.100
libavdevice 59. 7.100 / 59. 7.100
libavfilter 8. 44.100 / 8. 44.100
libswscale 6. 7.100 / 6. 7.100
libswresample 4. 7.100 / 4. 7.100
libpostproc 56. 6.100 / 56. 6.100
Error go live Timeout
[rtsp @ 0x7fdd7d811100] Could not find codec parameters for stream 1 (Video: h264, none, 240x160): unspecified pixel format
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
Input #0, rtsp, from 'rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4':
  Metadata:
    title           : BigBuckBunny_115k.mp4
  Duration: 00:10:34.63, start: 0.000000, bitrate: N/A
  Stream #0:0: Audio: aac, 12000 Hz, stereo, fltp
  Stream #0:1: Video: h264, none, 240x160, 90k tbr, 90k tbn
Socket closed
Stream mapping:
  Stream #0:1 -> #0:0 (h264 (native) -> mpeg1video (native))
Press [q] to stop, [?] for help
Cannot determine format of input stream 0:1 after EOF
Error marking filters as finished
Exiting normally, received signal 15.
I tried putting parameters into ffmpegArgs to increase the analyze duration and probe size, but it does not seem to have any effect. I also tried the node-rtsp-stream library, but it didn't work either.
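One thing worth checking when those options have no effect: ffmpeg treats `-analyzeduration` and `-probesize` as input options, so they must appear before `-i` on the command line. If the wrapper library appends everything in `ffmpegArgs` after the input (I have not verified this for rtsp-multi-stream), the options never influence probing. A minimal sketch of building such an argument list by hand, with the probing options placed correctly; the flag values and the `-rtsp_transport tcp` addition are my assumptions, not taken from the library:

```javascript
// Sketch: place probing options BEFORE -i so they apply to the RTSP input.
// This only builds the argument array; it does not launch ffmpeg.
const input =
  'rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4';

const args = [
  // Input options: must precede -i to affect stream probing.
  '-analyzeduration', '10000000', // 10 s, in microseconds
  '-probesize', '10000000',       // 10 MB
  '-rtsp_transport', 'tcp',       // RTSP over TCP; UDP RTP packets often
                                  // do not reach a container
  '-i', input,
  // Output options: these mirror the ffmpegArgs used above.
  '-b:v', '2048K',
  '-an',
  '-r', '24',
  '-f', 'mpegts',
  'pipe:1',
];

// e.g. const { spawn } = require('child_process'); spawn('ffmpeg', args);
console.log(args.join(' '));
```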
I need this to work inside Docker, but I can't get it to. I have already verified that the Node and ffmpeg versions are the same on my PC and inside the container, and I have tried older versions as well; the error is the same. I don't know what else to try. How can I make this work in Docker?