I have the following pipeline for transmitting video:

Sender:

gst-launch-1.0 rpicamsrc preview=0 ! 'video/x-h264,width=1280,height=720,framerate=15/1,profile=high' ! queue ! rtph264pay ! udpsink host=192.168.0.8 port=50000

Receiver:

gst-launch-1.0 udpsrc port=50000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! decodebin ! autovideosink

This works fine, but I would like to do the receiving side in Python and direct the video stream into a window, somehow like this:
import gi
gi.require_version('Gst', '1.0')
from gi.repository import GObject, Gst, Gtk, GdkX11, GstVideo

GObject.threads_init()
Gst.init(None)

class VideoReceiver:
    def __init__(self):
        self.window = Gtk.Window()
        self.window.connect('destroy', self.stop)
        self.window.set_default_size(320, 200)

        self.drawingarea = Gtk.DrawingArea()
        self.window.add(self.drawingarea)
        self.window.show_all()
        self.xid = self.drawingarea.get_property('window').get_xid()

        self.pipeline = Gst.parse_launch('udpsrc name=udpsrc port=50000'
            ' caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! '
            'rtph264depay ! decodebin ! autovideosink')

        self.bus = self.pipeline.get_bus()
        self.bus.add_signal_watch()
        self.bus.connect('message::error', self.on_error)
        self.bus.enable_sync_message_emission()
        self.bus.connect('sync-message::element', self.on_sync_message)

    def start(self):
        self.pipeline.set_state(Gst.State.PLAYING)
        Gtk.main()

    def stop(self, window):
        self.pipeline.set_state(Gst.State.NULL)
        Gtk.main_quit()

    def on_sync_message(self, bus, msg):
        if msg.get_structure().get_name() == 'prepare-window-handle':
            print('prepare-window-handle')
            msg.src.set_property('force-aspect-ratio', True)
            msg.src.set_window_handle(self.xid)

    def on_error(self, bus, msg):
        print('on_error():', msg.parse_error())

vr1 = VideoReceiver()
vr1.start()
However, when the streaming starts, the window simply closes and the program ends without errors. Any ideas what might be wrong and how I can direct the video output into the window?

Output when run without root privileges:

$ GST_DEBUG=3 python3.2 test.py
** (test.py:3275): WARNING **: Error retrieving accessibility bus address: org.freedesktop.DBus.Error.ServiceUnknown: The name org.a11y.Bus was not provided by any .service files
prepare-window-handle
0:00:04.134038733 3275 0x1c72260 ERROR egladaption gstegladaptation_egl.c:311:gst_egl_adaptation_create_surface:<autovideosink0-actual-sink-eglgles> Can't create surface
0:00:04.135032949 3275 0x1c72260 ERROR egladaption gstegladaptation.c:461:gst_egl_adaptation_init_surface:<autovideosink0-actual-sink-eglgles> Can't create surface
0:00:04.135378104 3275 0x1c72260 ERROR egladaption gstegladaptation.c:657:gst_egl_adaptation_init_surface:<autovideosink0-actual-sink-eglgles> Couldn't setup EGL surface
0:00:04.135678780 3275 0x1c72260 ERROR eglglessink gsteglglessink.c:2132:gst_eglglessink_configure_caps:<autovideosink0-actual-sink-eglgles> Couldn't init EGL surface from window
0:00:04.135971436 3275 0x1c72260 ERROR eglglessink gsteglglessink.c:2144:gst_eglglessink_configure_caps:<autovideosink0-actual-sink-eglgles> Configuring caps failed
0:00:04.137130443 3275 0x1c78a60 ERROR eglglessink gsteglglessink.c:2167:gst_eglglessink_setcaps:<autovideosink0-actual-sink-eglgles> Failed to configure caps
0:00:04.137830336 3275 0x1c78a60 ERROR eglglessink gsteglglessink.c:2167:gst_eglglessink_setcaps:<autovideosink0-actual-sink-eglgles> Failed to configure caps
0:00:04.138175544 3275 0x1c78a60 WARN GST_PADS gstpad.c:3620:gst_pad_peer_query:<sink:proxypad1> could not send sticky events
0:00:04.157868139 3275 0x1c78a60 ERROR eglglessink gsteglglessink.c:2167:gst_eglglessink_setcaps:<autovideosink0-actual-sink-eglgles> Failed to configure caps
0:00:04.158217826 3275 0x1c78a60 ERROR eglglessink gsteglglessink.c:2167:gst_eglglessink_setcaps:<autovideosink0-actual-sink-eglgles> Failed to configure caps
0:00:04.158321940 3275 0x1c78a60 WARN GST_PADS gstpad.c:3620:gst_pad_peer_query:<sink:proxypad1> could not send sticky events
0:00:04.184023215 3275 0x1c78a60 ERROR eglglessink gsteglglessink.c:2167:gst_eglglessink_setcaps:<autovideosink0-actual-sink-eglgles> Failed to configure caps
0:00:04.184216600 3275 0x1c78a60 WARN GST_PADS gstpad.c:3620:gst_pad_peer_query:<sink:proxypad1> could not send sticky events
0:00:04.185187274 3275 0x1c78a60 ERROR eglglessink gsteglglessink.c:2167:gst_eglglessink_setcaps:<autovideosink0-actual-sink-eglgles> Failed to configure caps
0:00:04.185499825 3275 0x1c78a60 ERROR eglglessink gsteglglessink.c:2167:gst_eglglessink_setcaps:<autovideosink0-actual-sink-eglgles> Failed to configure caps
0:00:04.186118000 3275 0x1c78a60 WARN omxvideodec gstomxvideodec.c:2817:gst_omx_video_dec_loop:<omxh264dec-omxh264dec0> error: Internal data stream error.
0:00:04.186551488 3275 0x1c78a60 WARN omxvideodec gstomxvideodec.c:2817:gst_omx_video_dec_loop:<omxh264dec-omxh264dec0> error: stream stopped, reason not-negotiated
0:00:04.187462163 3275 0x1c78a60 ERROR eglglessink gsteglglessink.c:2167:gst_eglglessink_setcaps:<autovideosink0-actual-sink-eglgles> Failed to configure caps
0:00:04.187758151 3275 0x1c78a60 ERROR eglglessink gsteglglessink.c:2167:gst_eglglessink_setcaps:<autovideosink0-actual-sink-eglgles> Failed to configure caps
on_error(): (GError('Internal data stream error.',), 'gstomxvideodec.c(2817): gst_omx_video_dec_loop (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0:\nstream stopped, reason not-negotiated'
Output when run as root:

$ GST_DEBUG=3 sudo python3.2 test.py
** (test.py:3205): WARNING **: Error retrieving accessibility bus address: org.freedesktop.DBus.Error.ServiceUnknown: The name org.a11y.Bus was not provided by any .service files
prepare-window-handle
So there are no errors from GStreamer; the window simply closes when the stream starts.
I haven't used Gtk for this purpose, only wxPython, so these are just some thoughts.

Try changing:

self.bus.connect('message::error', self.on_error)

to

self.bus.connect('message', self.on_error)

and then pick the bones out of the messages there, i.e.:
t = message.type
if t == Gst.MessageType.ERROR:
    error_0 = Gst.Message.parse_error(message)[0]
    error_1 = Gst.Message.parse_error(message)[1]
    # report errors here...
elif t == Gst.MessageType.EOS:
    print("End of Audio")
return True
Always return True from on_error and on_sync_message, and finally, try calling set_window_handle on the window at startup rather than on the drawing area.