Cannot receive gstreamer UDP stream from OpenCV Gstreamer
I am working with Gazebo Sim, which streams the camera video over UDP using the GStreamer plugin. The simulation runs on Ubuntu 18.04.
There are some resources that explain the backend of this setup:
Gazebo Simulation PX4 Guide
It mentions how to create the pipeline:
The video from Gazebo should then display in QGroundControl just as it
would from a real camera.
It is also possible to view the video using the Gstreamer Pipeline.
Simply enter the following terminal command:
gst-launch-1.0 -v udpsrc port=5600 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' \
! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink fps-update-interval=1000 sync=false
And it works fine in a terminal. I read this question:
using gstreamer with python opencv to capture live stream?
Then I tried to implement this pipeline in OpenCV with the following lines:
import cv2
import sys

video = cv2.VideoCapture('udpsrc port=5600 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink fps-update-interval=1000 sync=false', cv2.CAP_GSTREAMER)
# video.set(cv2.CAP_PROP_BUFFERSIZE, 3)

# Exit if video not opened.
if not video.isOpened():
    print("Could not open video")
    sys.exit()

# Read first frame.
ok, frame = video.read()
if not ok:
    print('Cannot read video file')
    sys.exit()
But it only prints the error:
Could not open video
I tried different variations of this pipeline in OpenCV, but none of them helped.
Right now your pipeline does not let OpenCV pull the decoded video frames out of it, because every frame goes to the autovideosink element at the end, which is responsible for displaying frames on screen. Instead, you should use an appsink element, which exists specifically to let an application receive frames from the pipeline.
video = cv2.VideoCapture(
    'udpsrc port=5600 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264"'
    ' ! rtph264depay'
    ' ! avdec_h264'
    ' ! videoconvert'
    ' ! appsink', cv2.CAP_GSTREAMER)
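Once the capture is opened with the appsink pipeline, the application can pull frames in a plain read loop. A minimal sketch (the `read_frames` helper and the frame limit are illustrative, not part of the original code):

```python
import sys

def read_frames(capture, max_frames=100):
    """Pull decoded frames from an opened VideoCapture-like object.

    `capture` is anything exposing isOpened()/read()/release(),
    e.g. the appsink-backed cv2.VideoCapture created above.
    """
    if not capture.isOpened():
        print("Could not open video", file=sys.stderr)
        return 0
    count = 0
    while count < max_frames:
        ok, frame = capture.read()
        if not ok:       # stream ended or a read failed
            break
        count += 1       # `frame` is a BGR numpy array; process it here
    capture.release()
    return count
```

The loop stops on the first failed read rather than exiting, so a transient end-of-stream does not kill the whole script.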
The following code works fine:
# Read video
video = cv2.VideoCapture("udpsrc port=5600 ! application/x-rtp,payload=96,encoding-name=H264 ! rtpjitterbuffer mode=1 ! rtph264depay ! h264parse ! decodebin ! videoconvert ! appsink", cv2.CAP_GSTREAMER)
I think the decoding options are not correct.
I tried this code but it did not work.
I also tried different pipelines. Here are my terminal pipelines:
Sender:
gst-launch-1.0 -v realsensesrc serial=$rs_serial timestamp-mode=clock_all enable-color=true ! rgbddemux name=demux demux.src_depth ! queue ! colorizer near-cut=300 far-cut=3000 ! rtpvrawpay ! udpsink host=192.168.100.80 port=9001
Receiver:
gst-launch-1.0 udpsrc port=9001 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)1280, height=(string)720, payload=(int)96" ! rtpvrawdepay ! videoconvert ! queue ! fpsdisplaysink sync=false
I can see the video on the receiver with the terminal pipeline above.
But when I convert it to Python code, the output is:
Could not open Video
gst_receiver.py
import cv2
import sys
video = cv2.VideoCapture(
    'udpsrc port=9001 ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, '
    'sampling=(string)RGB, depth=(string)8, width=(string)1280, height=(string)720, payload=(int)96'
    ' ! rtpvrawdepay ! decodebin ! videoconvert ! queue ! appsink', cv2.CAP_GSTREAMER)
# video.set(cv2.CAP_PROP_BUFFERSIZE, 3)

# Exit if video not opened.
if not video.isOpened():
    print("Could not open Video")
    sys.exit()

# Read first frame.
ok, frame = video.read()
if not ok:
    print('Cannot read Video file')
    sys.exit()
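One variant that may be worth trying (an assumption, not a verified fix): `rtpvrawdepay` already emits raw `video/x-raw` buffers, so the `decodebin` after it should be unnecessary and may interfere with caps negotiation. The sketch below drops it and pins BGR before `appsink`; the `build_raw_receiver` helper is hypothetical:

```python
def build_raw_receiver(port=9001, width=1280, height=720):
    # Caps must match the sender's rtpvrawpay output exactly.
    caps = (
        'application/x-rtp, media=(string)video, clock-rate=(int)90000, '
        'encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, '
        'width=(string)%d, height=(string)%d, payload=(int)96' % (width, height)
    )
    elements = [
        'udpsrc port=%d caps="%s"' % (port, caps),
        'rtpvrawdepay',                      # emits raw video/x-raw; no decoder needed
        'videoconvert',
        'video/x-raw, format=(string)BGR',   # OpenCV expects BGR frames
        'appsink drop=true sync=false',
    ]
    return ' ! '.join(elements)

# video = cv2.VideoCapture(build_raw_receiver(), cv2.CAP_GSTREAMER)
```

Note this only helps if the Windows OpenCV build actually includes the GStreamer backend; `print(cv2.getBuildInformation())` shows whether GStreamer is listed as YES.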
System:
Sender-PC = Ubuntu 18.04
Receiver-PC = Windows 10
Python = 3.7.9
OpenCV = 4.5.5