Gstreamer plugin cannot play video correctly on Android when using udpsrc

I am running into some problems implementing a GStreamer plugin to play RTP video on Android. I have the following code (which works correctly):

full_pipeline_description = g_strdup_printf("playbin3 uri=%s", uri);
gub_log_pipeline(pipeline, "Using pipeline: %s", full_pipeline_description);
pipeline->pipeline = gst_parse_launch(full_pipeline_description, &err);
g_free(full_pipeline_description);
if (err) {
    gub_log_pipeline(pipeline, "Failed to create pipeline: %s", err->message);
    return;
}

vsink = gst_parse_bin_from_description(gub_get_video_branch_description(), TRUE, NULL);
gub_log_pipeline(pipeline, "Using video sink: %s", gub_get_video_branch_description());
g_object_set(pipeline->pipeline, "video-sink", vsink, NULL);
g_object_set(pipeline->pipeline, "flags", 0x0003, NULL); /* GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO */

bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline->pipeline));
gst_bus_add_signal_watch(bus);
gst_object_unref(bus);
g_signal_connect(bus, "message", G_CALLBACK(message_received), pipeline);

if (vsink) {
    // Plant a pad probe to answer context queries
    GstElement *sink;
    sink = gst_bin_get_by_name(GST_BIN(vsink), "sink");
    if (sink) {
        GstPad *pad = gst_element_get_static_pad(sink, "sink");
        if (pad) {
            gulong id = gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM, pad_probe, pipeline, NULL);
            gst_object_unref(pad);
        }
        gst_object_unref(sink);
    }
}
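
For reference, the pad_probe callback is not listed above. A minimal sketch of what a probe answering context queries typically looks like (assuming the same gub_provide_graphic_context helper that appears later in this post) is:

static GstPadProbeReturn pad_probe(GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
    GUBPipeline *pipeline = (GUBPipeline *)user_data;
    GstQuery *query = GST_PAD_PROBE_INFO_QUERY(info);

    /* Only context queries are interesting here */
    if (GST_QUERY_TYPE(query) == GST_QUERY_CONTEXT) {
        const gchar *context_type;
        GstContext *context;

        gst_query_parse_context_type(query, &context_type);
        context = gub_provide_graphic_context(pipeline->graphic_context, context_type);
        if (context) {
            gst_query_set_context(query, context);
            gst_context_unref(context);
            return GST_PAD_PROBE_HANDLED;  /* query answered, do not forward it */
        }
    }
    return GST_PAD_PROBE_OK;
}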

However, using the same code with another pipeline (based on udpsrc instead of playbin3) does not work. The pipeline I am using in this case is:

udpsrc port=53512 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96 ! rtph264depay ! decodebin3 ! glupload ! glcolorconvert ! video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D ! fakesink sync=0 qos=1 name=sink

The code is the following:

  full_pipeline_description = g_strdup_printf("%s", pipeline_cmd);
  gub_log_pipeline(pipeline, "Using pipeline: %s", full_pipeline_description);
  pipeline->pipeline = gst_parse_launch(full_pipeline_description, &err);
  g_free(full_pipeline_description);
  if (err) {
    gub_log_pipeline(pipeline, "Failed to create pipeline: %s", err->message);
    return;
  }

  vsink = gst_parse_bin_from_description(gub_get_video_branch_description(), TRUE, NULL);
  gub_log_pipeline(pipeline, "Using video sink: %s", gub_get_video_branch_description());
  g_object_set(pipeline->pipeline, "sink", vsink, NULL);
  g_object_set(pipeline->pipeline, "flags", 0x0003, NULL);

  bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline->pipeline));
  gst_bus_add_signal_watch(bus);
  gst_object_unref(bus);
  g_signal_connect(bus, "message", G_CALLBACK(message_received), pipeline);

  // Plant a pad probe to answer context queries
  GstElement *sink;
  sink = gst_bin_get_by_name(GST_BIN(vsink), "sink");
  if (sink) {
    GstPad *pad = gst_element_get_static_pad(sink, "sink");
    if (pad) {
      gulong id = gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM, pad_probe, pipeline, NULL);
      gst_object_unref(pad);
    }
    gst_object_unref(sink);
  }

Basically, in this case I only see a blank window (with a different color). The only difference I can see in the execution is that pad_probe is called when using playbin3 but not when using udpsrc; that is the only difference I could spot after adding some logging. I would like to understand why this callback is not invoked when using udpsrc, and whether I am missing something or using it incorrectly.

I am facing the same issue with both gstreamer-1.14.4 and 1.16.2. Any hint is very welcome.

g_object_set(pipeline->pipeline, "sink", vsink, NULL); actually does nothing: GstPipeline has no "sink" property (unlike playbin). Usually it spits out a log warning saying exactly that.

To add a sink to the pipeline you need to do it the way you normally would in a GStreamer application: find the source pad you need to connect to, or wait for the right pad to show up via the "pad-added" signal (which is what happens with e.g. decodebin) and link it there.
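
A rough sketch of that approach could look like the following, assuming the decoder in the launch string is given a name (decodebin3 name=dec), the branch after it is dropped, and the vsink bin is linked in via "pad-added". The element name "dec" and the on_pad_added callback are illustrative only, not part of the original code:

static void on_pad_added(GstElement *decoder, GstPad *new_pad, gpointer user_data)
{
    GstElement *vsink = GST_ELEMENT(user_data);
    /* The ghost pad created by gst_parse_bin_from_description() is assumed to be named "sink" */
    GstPad *sink_pad = gst_element_get_static_pad(vsink, "sink");

    if (sink_pad && !gst_pad_is_linked(sink_pad))
        gst_pad_link(new_pad, sink_pad); /* a real application should check the return value */
    if (sink_pad)
        gst_object_unref(sink_pad);
}

/* ... after gst_parse_launch() and gst_parse_bin_from_description() ... */
GstElement *dec = gst_bin_get_by_name(GST_BIN(pipeline->pipeline), "dec");
if (dec) {
    gst_bin_add(GST_BIN(pipeline->pipeline), vsink);
    gst_element_sync_state_with_parent(vsink);
    g_signal_connect(dec, "pad-added", G_CALLBACK(on_pad_added), vsink);
    gst_object_unref(dec);
}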

After some investigation, and based on this Gstreamer devel thread, I finally found the root cause of my problem. Basically, as I suspected, the pad probe callback was not being invoked when using udpsrc; it was only invoked with playbin3. As a consequence, no graphic context was provided and the video was not rendered correctly. To fix it, I had to add logic to handle messages on the bus so that GST_MESSAGE_NEED_CONTEXT requests are answered properly. To do that, first you have to connect a callback to handle the bus messages, like this:

g_signal_connect(bus, "message", G_CALLBACK(message_received), pipeline);

Then, in the message_received function, I added code like the following:

static void message_received(GstBus *bus, GstMessage *message, GUBPipeline *pipeline) {
    switch (GST_MESSAGE_TYPE(message)) {
    ...
    case GST_MESSAGE_NEED_CONTEXT:
    {
        const gchar *context_type;
        GstContext *context = NULL;

        gst_message_parse_context_type(message, &context_type);
        context = gub_provide_graphic_context(pipeline->graphic_context, context_type);
        if (context) {
            gst_element_set_context(GST_ELEMENT(message->src), context);
            gst_context_unref(context);
        }
        break;
    }
    ...
    }
}
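
gub_provide_graphic_context is part of the plugin and is not shown here; for the GL case it essentially follows the standard GStreamer GL context pattern. A hedged sketch of such a provider (the display and app_context handles are placeholders for whatever the plugin keeps in pipeline->graphic_context) would be:

#include <gst/gl/gl.h>

/* Sketch only: build a GstContext for the two GL context types GStreamer may ask for.
 * "display" and "app_context" stand for the application's GstGLDisplay / GstGLContext. */
static GstContext *provide_gl_context(GstGLDisplay *display, GstGLContext *app_context,
                                      const gchar *context_type)
{
    GstContext *context = NULL;

    if (g_strcmp0(context_type, GST_GL_DISPLAY_CONTEXT_TYPE) == 0) {
        context = gst_context_new(GST_GL_DISPLAY_CONTEXT_TYPE, TRUE);
        gst_context_set_gl_display(context, display);
    } else if (g_strcmp0(context_type, "gst.gl.app_context") == 0) {
        GstStructure *s;

        context = gst_context_new("gst.gl.app_context", TRUE);
        s = gst_context_writable_structure(context);
        gst_structure_set(s, "context", GST_TYPE_GL_CONTEXT, app_context, NULL);
    }
    return context; /* the caller (message_received) sets it on message->src and unrefs it */
}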

With these modifications I am now able to receive and render the video correctly. I simulated the RTP video stream with ffmpeg's testsrc as follows:

ffmpeg -f lavfi -i testsrc -vf scale=1280:960 -vcodec libx264 -profile:v baseline -pix_fmt yuv420p -f rtp rtp://YOUR_IP:PORT