GStreamer appsrc -> x264enc -> rtph264pay -> udpsink pipeline stalls


I am new to GStreamer and am working with GStreamer-Sharp, Visual Basic and Visual Studio 2022. I am trying to create a simple test application that prepares a series of square greyscale images (i.e. video frames) at 7.2 fps, presents them to GStreamer via appsrc, encodes them with x264enc and streams them over UDP as RTP. My pipeline:

appsrc ! video/x-raw,format=GRAY8,width=256,height=256,framerate=72/10 ! x264enc tune=zerolatency qp-max=0 key-int-max=72 bframes=3 intra-refresh=1 noise-reduction=200 ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=5000

Unfortunately, when I run my application I do not see any UDP packets being emitted by udpsink.
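
A quick way to confirm whether anything is leaving udpsink is to listen on the other end with a receiving pipeline along the following lines (the RTP caps string here is an assumption reconstructed from the pt=96 H.264 payloading above, not copied from a verified setup):

gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink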

In the x264enc log I can see that video data is arriving and being compressed. However, this activity stops after roughly 50 frames. Four frames later, appsrc starts emitting the enough-data signal, presumably because x264enc is no longer taking any data and appsrc's finite input buffer has filled up.

Looking at the rtph264pay log, I can see a single input frame arrive, which rtph264pay then tries to send on to udpsink. However, the udpsink log is completely empty. It is as if udpsink was never initialised, or rtph264pay is struggling to hand its output over to udpsink.

Here is the rtph264pay log:

gst_rtp_h264_pay_getcaps:<Payload> returning caps video/x-h264, stream-format=(string)avc, alignment=(string)au; video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au }
gst_rtp_h264_pay_getcaps:<Payload> Intersect video/x-h264, stream-format=(string)avc, alignment=(string)au; video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au } and filter video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string){ avc, byte-stream }, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }
gst_rtp_h264_pay_getcaps:<Payload> returning caps video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string)avc, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }; video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string)byte-stream, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }
gst_rtp_h264_pay_getcaps:<Payload> Intersect video/x-h264, stream-format=(string)avc, alignment=(string)au; video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au } and filter video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string){ avc, byte-stream }, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }
gst_rtp_h264_pay_getcaps:<Payload> returning caps video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string)avc, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }; video/x-h264, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], stream-format=(string)byte-stream, alignment=(string)au, profile=(string){ high-4:4:4, high-4:2:2, high-10, high, main, baseline, constrained-baseline, high-4:4:4-intra, high-4:2:2-intra, high-10-intra }
gst_rtp_h264_pay_sink_event:<Payload> New stream detected => Clear SPS and PPS
gst_rtp_h264_pay_send_bundle:<Payload> no bundle, nothing to send
gst_rtp_h264_pay_getcaps:<Payload> Intersect video/x-h264, stream-format=(string)avc, alignment=(string)au; video/x-h264, stream-format=(string)byte-stream, alignment=(string){ nal, au } and filter video/x-h264, codec_data=(buffer)01640014ffe1001967640014f159010086c05b2000000300a000000911e28532c001000568efb2c8b0, stream-format=(string)avc, alignment=(string)au, level=(string)2, profile=(string)high, width=(int)256, height=(int)256, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)36/5, interlace-mode=(string)progressive, colorimetry=(string)1:4:0:0
gst_rtp_h264_pay_getcaps:<Payload> returning caps video/x-h264, codec_data=(buffer)01640014ffe1001967640014f159010086c05b2000000300a000000911e28532c001000568efb2c8b0, stream-format=(string)avc, alignment=(string)au, level=(string)2, profile=(string)high, width=(int)256, height=(int)256, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)36/5, interlace-mode=(string)progressive, colorimetry=(string)1:4:0:0
gst_rtp_h264_pay_setcaps:<Payload> have packetized h264
gst_rtp_h264_pay_setcaps:<Payload> profile 640014
gst_rtp_h264_pay_setcaps:<Payload> nal length 4
gst_rtp_h264_pay_setcaps:<Payload> num SPS 1
gst_rtp_h264_pay_setcaps:<Payload> SPS 0 size 25
gst_rtp_h264_pay_setcaps:<Payload> num PPS 1
gst_rtp_h264_pay_setcaps:<Payload> PPS 0 size 5
gst_rtp_h264_pay_handle_buffer:<Payload> got 861 bytes
gst_rtp_h264_pay_handle_buffer:<Payload> got NAL of size 2
gst_rtp_h264_pay_payload_nal:<Payload> payloading NAL Unit: datasize=2 type=9 pts=1000:00:00.000000000
gst_rtp_h264_pay_payload_nal_fragment:<Payload> sending NAL Unit: datasize=2 mtu=1400

Sadly, nothing further is ever written to the rtph264pay log, presumably because it cannot pass its current data on to udpsink and is therefore unable to process any new data arriving from x264enc.

To be clear, I am looking for help understanding why udpsink is not receiving data from the upstream rtph264pay, and how to correct the situation. I suspect this is because my code creates an invalid pipeline configuration, so I have included a copy of the code below.

I have run the code with a simpler, more conventional set of x264enc parameters and got the same result. I have also tested the pipeline with gst-launch-1.0 (replacing appsrc with videotestsrc and a capsfilter) and it works fine. After setting GST_DEBUG_DUMP_DOT_DIR and running both gst-launch-1.0 and my own code, the '.dot' files show very similar topologies, so I believe my code must be close to correct.
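
For reference, that gst-launch-1.0 test was essentially the pipeline above with appsrc swapped for videotestsrc plus a capsfilter; reconstructed from that description it would look roughly like this (illustrative, not the exact command line that was run):

gst-launch-1.0 videotestsrc ! video/x-raw,format=GRAY8,width=256,height=256,framerate=72/10 ! x264enc tune=zerolatency qp-max=0 key-int-max=72 bframes=3 intra-refresh=1 noise-reduction=200 ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=5000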

Here is the topology reported by my application:

[pipeline graph image]

Using GStreamer-Sharp, my application configures GStreamer as follows:

Private Sub ConfigurePipeline()

    Gst.Application.Init()

    VInfo1 = New VideoInfo()
    VInfo1.SetFormat(VideoFormat.Gray8, SrcSize.Width, SrcSize.Height)
    VInfo1.FpsN = 72
    VInfo1.FpsD = 10
    VCaps = VInfo1.ToCaps()
    Diagnostics.Debug.WriteLine(VCaps.ToString)

    FrameInterval = VInfo1.FpsD / VInfo1.FpsN
    FrameDuration = Util.Uint64ScaleInt(VInfo1.FpsD, Gst.Constants.SECOND, VInfo1.FpsN)

    Dim FrameBytes As UInteger = 256 * 256
    ReDim FrameData(FrameBytes - 1)
    System.Array.Fill(FrameData, 127)

    'Pipe = Parse.Launch("appsrc ! video/x-raw,format=GRAY8,width=256,height=256,framerate=72/10 " &
    '                    "! x264enc tune=zerolatency qp-max=0 key-int-max=72 bframes=3 intra-refresh=1 noise-reduction=200 " &
    '                    "! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=5000")

    Pipe = New Pipeline("Pipe")

    PBus = Pipe.Bus
    PBus.AddSignalWatch()
    AddHandler PBus.Message, AddressOf Handle_PBus_Message

    Source = New AppSrc("Source")
    Compress = ElementFactory.Make("x264enc", "Compress")
    Payload = ElementFactory.Make("rtph264pay", "Payload")
    UDPSink = ElementFactory.Make("udpsink", "UDPSink")

    Source.Caps = VCaps
    Source.SetProperty("stream-type", New GLib.Value(AppStreamType.Stream))
    Source.SetProperty("format", New GLib.Value(Gst.Constants.TIME_FORMAT))
    Source.SetProperty("emit-signals", New GLib.Value(True))

    AddHandler Source.NeedData, AddressOf Handle_Source_NeedData
    AddHandler Source.EnoughData, AddressOf Handle_Source_EnoughData

    Compress.SetProperty("tune", New GLib.Value("zerolatency"))
    Compress.SetProperty("qp-max", New GLib.Value(0))
    Compress.SetProperty("key-int-max", New GLib.Value(72))
    Compress.SetProperty("bframes", New GLib.Value(3))
    Compress.SetProperty("intra-refresh", New GLib.Value(1))
    Compress.SetProperty("noise-reduction", New GLib.Value(200))

    Payload.SetProperty("pt", New GLib.Value(96))

    UDPSink.SetProperty("host", New GLib.Value("127.0.0.1"))
    UDPSink.SetProperty("port", New GLib.Value(5000))

    Pipe.Add(Source, Compress, Payload, UDPSink)
    Source.Link(Compress)
    Compress.Link(Payload)
    Payload.Link(UDPSink)

    Dim Result As StateChangeReturn = Pipe.SetState(State.Playing)
    If Result = StateChangeReturn.Failure Then
        Diagnostics.Debug.WriteLine("Unable to set the pipeline to the playing state")
    Else

        MainLoop = New MainLoop()
        MainLoop.Run()

        FrameTimer.Stop()
        Diagnostics.Debug.WriteLine("Mainloop has exited, stopping pipeline")
        Pipe.SetState(State.Null)

    End If

    Diagnostics.Debug.WriteLine("Disposing pipeline elements")
    Pipe.Dispose()
    Source.Dispose()
    Compress.Dispose()
    Payload.Dispose()
    UDPSink.Dispose()

End Sub

The AppSrc.NeedData event handler starts a System.Timers.Timer ticking at 7.2 Hz, and the Timer.Elapsed event handler calls the following method to push data into appsrc:

Private Sub NewFrame()

    Using GSTBuffer As New Buffer(FrameData)
        GSTBuffer.Pts = Timestamp
        GSTBuffer.Dts = Timestamp
        GSTBuffer.Duration = FrameDuration
        Timestamp += FrameDuration
        Source.PushBuffer(GSTBuffer)
    End Using

End Sub

The AppSrc.EnoughData event handler simply prints a message to the console.
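
For completeness, here is a minimal sketch of those two handlers, assuming the usual GStreamer-Sharp signal signatures; the NeedDataArgs/EnoughDataArgs argument types and the FrameTimer interval calculation are assumptions based on the description above, not a copy of the real code:

Private Sub Handle_Source_NeedData(o As Object, args As NeedDataArgs)
    ' Start pushing frames at 7.2 Hz; FrameTimer.Elapsed calls NewFrame().
    If Not FrameTimer.Enabled Then
        FrameTimer.Interval = 1000.0 * VInfo1.FpsD / VInfo1.FpsN   ' ~138.9 ms per frame
        FrameTimer.Start()
    End If
End Sub

Private Sub Handle_Source_EnoughData(o As Object, args As EnoughDataArgs)
    ' Just report the back-pressure from appsrc.
    Diagnostics.Debug.WriteLine("appsrc signalled enough-data")
End Sub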

I would be grateful if anyone could look over the above and offer suggestions on where to look for my mistake.

I have also posted this question to discourse.gstreamer.com but have had no reply yet. If I receive a useful response there, I will pass it on here as an update.

Thanks

gstreamer h.264 gstreamer-sharp
1 Answer

With the help of someone on the GStreamer Discourse I was steered towards looking more closely at timestamping. I now have working code, although exactly why it works is still something of a mystery. The solution was to NOT set timestamps on the buffers pushed to appsrc. Here is the working code:

Private Sub ConfigurePipeline()

    Gst.Application.Init()

    VInfo1 = New VideoInfo()
    VInfo1.SetFormat(VideoFormat.Gray8, FrameDimensions.Width, FrameDimensions.Height)
    VInfo1.FpsN = FrameRateN
    VInfo1.FpsD = FrameRateD
    VCaps = VInfo1.ToCaps()
    Diagnostics.Debug.WriteLine(VCaps.ToString)

    FrameDuration_ns = Util.Uint64ScaleInt(FrameRateD, Gst.Constants.SECOND, FrameRateN)

    'Pipe = Parse.Launch("appsrc ! video/x-raw,format=GRAY8,width=256,height=256,framerate=72/10 " &
    '                    "! x264enc tune=zerolatency qp-max=0 key-int-max=72 bframes=3 intra-refresh=1 noise-reduction=200 " &
    '                    "! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=5000")

    Pipe = New Pipeline("Pipe")

    PBus = Pipe.Bus
    PBus.AddSignalWatch()
    AddHandler PBus.Message, AddressOf Handle_PBus_Message

    Source = New AppSrc("Source")
    Compress = ElementFactory.Make("x264enc", "Compress")
    Payload = ElementFactory.Make("rtph264pay", "Payload")
    UDPSink = ElementFactory.Make("udpsink", "UDPSink")

    Source.Caps = VCaps
    Source.SetProperty("format", New GLib.Value(Gst.Constants.TIME_FORMAT))

    Compress.SetProperty("tune", New GLib.Value(4)) ' "zerolatency"
    Compress.SetProperty("bitrate", New GLib.Value(2048))
    'Compress.SetProperty("is-live", New GLib.Value(True))
    'Compress.SetProperty("do-timestamp", New GLib.Value(True))
    'Compress.SetProperty("qp-max", New GLib.Value(0))
    'Compress.SetProperty("key-int-max", New GLib.Value(72))
    'Compress.SetProperty("bframes", New GLib.Value(3))
    'Compress.SetProperty("intra-refresh", New GLib.Value(1))
    'Compress.SetProperty("noise-reduction", New GLib.Value(200))

    Payload.SetProperty("pt", New GLib.Value(96))

    UDPSink.SetProperty("host", New GLib.Value("127.0.0.1"))
    UDPSink.SetProperty("port", New GLib.Value(5000))

    Pipe.Add(Source, Compress, Payload, UDPSink)
    If Not (Source.Link(Compress) AndAlso Compress.Link(Payload) AndAlso Payload.Link(UDPSink)) Then
        Diagnostics.Debug.WriteLine("Unable to link elements")
    Else

        Diagnostics.Debug.WriteLine("Setting Pipeline to state PLAYING")
        Dim Result As StateChangeReturn = Pipe.SetState(State.Playing)
        If Result = StateChangeReturn.Failure Then
            Diagnostics.Debug.WriteLine("Unable to set Pipeline to the PLAYING state")
        Else

            MainLoop = New MainLoop()
            FrameTimer.Start()
            MainLoop.Run()

            FrameTimer.Stop()
            Diagnostics.Debug.WriteLine("Mainloop has exited, stopping Pipeline")
            Pipe.SetState(State.Null)

        End If

    End If

    Diagnostics.Debug.WriteLine("Disposing Pipeline elements")
    Pipe.Dispose()
    Source.Dispose()
    Compress.Dispose()
    Payload.Dispose()
    UDPSink.Dispose()

End Sub

And here is the method that pushes new buffers (i.e. video frames) into appsrc:

Private Sub NewFrame()

    UpdateFrame(FrameNumber)

    Using GSTBuffer As New Buffer(FrameData)
        Source.PushBuffer(GSTBuffer)
    End Using

    FrameNumber += 1

End Sub
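
As an aside, if explicit timestamps are ever needed again, appsrc (through its basesrc parent class) offers a do-timestamp property that stamps each incoming buffer with the pipeline clock instead. The line below is an untested suggestion for this application, not part of the working code above:

Source.SetProperty("do-timestamp", New GLib.Value(True))  ' let appsrc timestamp buffers itself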
