Ubuntu 14.04 Gstreamer autovideosink
I'm trying to get a GStreamer video stream running on Ubuntu 14.04, but the receiver does not display the video correctly. I have a sender pipeline that streams MJPEG images, which I start like this:
gst-launch-1.0 -v videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5200
The output is:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw, width=(int)320, height=(int)240, framerate=(fraction)30/1, format=(string)I420, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:sink: caps = video/x-raw, width=(int)320, height=(int)240, framerate=(fraction)30/1, format=(string)I420, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:src: caps = image/jpeg, sof-marker=(int)0, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96, ssrc=(uint)1970481773, timestamp-offset=(uint)1012832172, seqnum-offset=(uint)21614
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96, ssrc=(uint)1970481773, timestamp-offset=(uint)1012832172, seqnum-offset=(uint)21614
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, sof-marker=(int)0, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 1012832172
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 21614
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
The receiving pipeline is:
gst-launch-1.0 udpsrc port=5200 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96" ! rtpjpegdepay ! jpegdec ! xvimagesink
Note that I copied the caps from the sender's verbose output.
However, I keep getting a "could not send sticky events" error and the receiver terminates immediately. What am I doing wrong? My output is below:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.016961467 12229 0x266a400 WARN GST_PADS gstpad.c:3669:gst_pad_peer_query:<jpegdec0:src> could not send sticky events
0:00:00.017297845 12229 0x266a400 WARN basesrc gstbasesrc.c:2865:gst_base_src_loop:<udpsrc0> error: Internal data flow error.
0:00:00.017306308 12229 0x266a400 WARN basesrc gstbasesrc.c:2865:gst_base_src_loop:<udpsrc0> error: streaming task paused, reason not-negotiated (-4)
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:00.000651039
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I have already tried autovideosink and ximagesink, with the same result.
Thanks.
OK, you just need to add videoconvert, because jpegdec produces video that xvimagesink cannot handle:
gst-launch-1.0 udpsrc port=5200 ! application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)JPEG\,\ a-framerate\=\(string\)\"30\\,000000\"\,\ payload\=\(int\)26\,\ ssrc\=\(uint\)2512101225\,\ timestamp-offset\=\(uint\)1627080146\,\ seqnum-offset\=\(uint\)4727 ! rtpjpegdepay ! jpegdec ! videoconvert ! xvimagesink
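In terms of the receiver pipeline from the question, the fix amounts to inserting videoconvert between jpegdec and the sink. A minimal sketch reusing the caps string from the question (untested here, since the exact caps on your machine may differ):
gst-launch-1.0 udpsrc port=5200 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96" ! rtpjpegdepay ! jpegdec ! videoconvert ! xvimagesink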
How did I figure this out? By using fakesink, like this:
udpsrc ! fakesink -v
That worked, then
udpsrc ! rtpjpegdepay ! fakesink
worked again. So I kept going like that and found that the problem really is in xvimagesink. Oddly, we don't get a clearer error message about it.
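Applied to the receiver from the question, that debugging approach is just growing the pipeline one element at a time until it stops working. A sketch (CAPS stands for the full quoted caps string from the question):
gst-launch-1.0 -v udpsrc port=5200 caps=CAPS ! fakesink
gst-launch-1.0 -v udpsrc port=5200 caps=CAPS ! rtpjpegdepay ! fakesink
gst-launch-1.0 -v udpsrc port=5200 caps=CAPS ! rtpjpegdepay ! jpegdec ! fakesink
If the last one still runs, the not-negotiated error has to come from the link between jpegdec and xvimagesink.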
You can also debug the application with:
GST_DEBUG=4 gst-launch-1.0 udpsrc .....
Then you can scroll to the point where negotiation fails and look around there.
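If level 4 is too noisy, you can raise the level only for the category that appears in the warning above (GST_PADS) and keep everything else low. A sketch, assuming the usual GST_DEBUG category:level syntax; the exact levels are just a suggestion:
GST_DEBUG=2,GST_PADS:6 gst-launch-1.0 udpsrc ..... 2>&1 | grep -i negotiat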