I'm trying to play RTSP live video in my WPF application using the FFMediaElement component (FFME, the FFmpeg-based WPF MediaElement replacement). I have a good connection to my camera, and I want to play the stream with the lowest latency possible.
I already reduced the latency by setting ProbeSize to its minimum value:
private void Media_OnMediaInitializing(object Sender, MediaInitializingRoutedEventArgs e)
{
e.Configuration.GlobalOptions.ProbeSize = 32;
}
But I still get about one second of delay, right from the start of the stream: when I start playing I have to wait about a second before the video appears, and after that the video stays roughly one second behind.
I also tried changing the following parameters:
e.Configuration.GlobalOptions.EnableReducedBuffering = true;
e.Configuration.GlobalOptions.FlagNoBuffer = true;
e.Configuration.GlobalOptions.MaxAnalyzeDuration = TimeSpan.Zero;
but they made no difference.
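For reference, all of the options mentioned so far can be applied in a single MediaInitializing handler. This is a minimal sketch using only the option names already shown in this question; whether each option actually helps may depend on your FFME version.

```csharp
// Minimal sketch: apply the low-latency container options in one place.
private void Media_OnMediaInitializing(object sender, MediaInitializingRoutedEventArgs e)
{
    // Probe as little of the stream as possible before starting playback.
    e.Configuration.GlobalOptions.ProbeSize = 32;

    // Skip the stream-analysis phase entirely.
    e.Configuration.GlobalOptions.MaxAnalyzeDuration = TimeSpan.Zero;

    // Reduce (or disable) the demuxer's internal buffering.
    e.Configuration.GlobalOptions.EnableReducedBuffering = true;
    e.Configuration.GlobalOptions.FlagNoBuffer = true;
}
```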
I measured the intervals between the FFmpeg log lines (the number in the first column is the time elapsed since the previous line, in ms):
---- OpenCommand: Entered
39 FFInterop.Initialize: FFmpeg v4.0
0 EVENT START: MediaInitializing
0 EVENT DONE : MediaInitializing
379 EVENT START: MediaOpening
0 EVENT DONE : MediaOpening
0 COMP VIDEO: Start Offset: 0,000; Duration: N/A
41 SYNC-BUFFER: Started.
609 SYNC-BUFFER: Finished. Clock set to 1534932751,634
0 EVENT START: MediaOpened
0 EVENT DONE : MediaOpened
0 EVENT START: BufferingStarted
0 EVENT DONE : BufferingStarted
0 OpenCommand: Completed
0 V BLK: 1534932751,634 | CLK: 1534932751,634 | DFT: 0 | IX: 0 | PQ: 0,0k | TQ: 0,0k
0 Command Queue (1 commands): Before ProcessNext
0 Play - ID: 404 Canceled: False; Completed: False; Status: WaitingForActivation; State:
94 V BLK: 1534932751,675 | CLK: 1534932751,699 | DFT: 24 | IX: 1 | PQ: 0,0k | TQ: 0,0k
So the sync-buffering step accounts for most of the startup time.
Is there an FFmpeg parameter that would let me shrink this buffer?
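To separate FFME's behavior from FFmpeg's, it can help to check what latency plain ffplay achieves on the same stream. The flags below are standard FFmpeg options that mirror the settings above; the URL is a placeholder for your camera's address, and this is only a diagnostic sketch, not a guaranteed fix.

```shell
# Low-latency test of the same RTSP stream with ffplay.
# -probesize / -analyzeduration mirror ProbeSize and MaxAnalyzeDuration;
# -fflags nobuffer mirrors FlagNoBuffer.
ffplay -fflags nobuffer -probesize 32 -analyzeduration 0 \
       -rtsp_transport tcp \
       "rtsp://<your-camera-address>/stream"
```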
I don't know whether this applies to WPF, but I used the code below to reduce the latency of the Microsoft H.264 decoder in C++ with the Win32 API. Even with this, I still get a short delay at startup (roughly 0.5 s to 1 s), though it is better than the default: the decoder still absorbs a number of incoming packets before it starts emitting frames. I'm not sure it will improve your one-second delay, but here is some code to give you an idea of what's involved:
IMFTransform* pDecoderTransform = NULL;
// ... set up pDecoderTransform ...

// Query the decoder MFT for its ICodecAPI interface.
ICodecAPI* mpCodecAPI = NULL;
HRESULT hr = pDecoderTransform->QueryInterface(IID_PPV_ARGS(&mpCodecAPI));

// Enable low-latency mode. The VARIANT must be initialized and typed
// as VT_BOOL before being passed to SetValue.
VARIANT var;
VariantInit(&var);
var.vt = VT_BOOL;
var.boolVal = VARIANT_TRUE;
hr = mpCodecAPI->SetValue(&CODECAPI_AVLowLatencyMode, &var);
I am the author of FFME. This is a common question. In addition to the container configuration options in the MediaInitializing event, you can also handle the MediaOpening event and change the following options (this applies only to version 4.1.280 and later):
// You can render audio and video as it becomes available but the downside of disabling time
// synchronization is that video and audio will run on their own independent clocks.
// Do not disable Time Sync for streams that need synchronized audio and video.
e.Options.IsTimeSyncDisabled =
e.Info.Format == "libndi_newtek" ||
e.Info.InputUrl.StartsWith("rtsp://uno");
// You can disable the requirement of buffering packets by setting the playback
// buffer percent to 0. Values of less than 0.5 for live or network streams are not recommended.
e.Options.MinimumPlaybackBufferPercent = e.Info.Format == "libndi_newtek" ? 0 : 0.5;
// The audio renderer will try to keep the audio hardware synchronized
// to the playback position by default.
// A few WMV files I have tested don't have continuous enough audio packets to support
// perfect synchronization between audio and video so we simply disable it.
// Also if time synchronization is disabled, the recommendation is to also disable audio synchronization.
Media.RendererOptions.AudioDisableSync =
e.Options.IsTimeSyncDisabled ||
e.Info.InputUrl.EndsWith(".wmv");