
RTMP streaming via gstreamer-1.0 appsrc to rtmpsink
Stack Overflow user

Asked 2016-07-21 04:55:20
1 answer · 2.5K views · 0 followers · 2 votes

I am trying to stream my webcam over RTMP. Streaming with the following pipeline:

gst-launch-1.0 -v v4l2src ! 'video/x-raw,width=640,height=480,framerate=30/1' ! queue ! videoconvert ! omxh264enc ! h264parse ! flvmux ! rtmpsink location='rtmp://{MY_IP}/rtmp/live'

works like a charm. I can see the video on my website.

Then I wanted to capture the frames first and do some processing on them. I stream the processed data through the same pipeline as before, pushing the data into an appsrc, but something goes wrong.

I cannot see any stream on my website. Neither the server side nor the client side raises any error or warning. I can, however, still grab the stream with:

gst-launch-1.0 rtmpsrc location='rtmp://{MY_IP}/rtmp/live' ! filesink location='rtmpsrca.flv'

Does anyone know what is going on?

Here are the relevant snippets of my web page and my GStreamer pipeline.

The GStreamer pipeline:

void threadgst () {

    App *app = &s_app;
    GstCaps *srccap;
    GstCaps *filtercap;
    GstBus *bus;
    GstElement *pipeline;

    gst_init (NULL, NULL);

    loop = g_main_loop_new (NULL, TRUE);

    /* create the pipeline */
    pipeline = gst_pipeline_new ("gstreamer-encoder");
    if (!pipeline) {
        g_print ("Error creating pipeline, exiting...");
    }

    /* create the appsrc element */
    app->videosrc = gst_element_factory_make ("appsrc", "videosrc");
    if (!app->videosrc) {
        g_print ("Error creating source element, exiting...");
    }

    /* create the queue element */
    app->queue = gst_element_factory_make ("queue", "queue");
    if (!app->queue) {
        g_print ("Error creating queue element, exiting...");
    }

    app->videocoverter = gst_element_factory_make ("videoconvert", "videocoverter");
    if (!app->videocoverter) {
        g_print ("Error creating videoconvert, exiting...");
    }

    /* create the capsfilter element */
    app->filter = gst_element_factory_make ("capsfilter", "filter");
    if (!app->filter) {
        g_print ("Error creating filter, exiting...");
    }

    app->h264enc = gst_element_factory_make ("omxh264enc", "h264enc");
    if (!app->h264enc) {
        g_print ("Error creating omxh264enc, exiting...");
    }

    app->h264parse = gst_element_factory_make ("h264parse", "h264parse");
    if (!app->h264parse) {
        g_print ("Error creating h264parse, exiting...");
    }

    app->flvmux = gst_element_factory_make ("flvmux", "flvmux");
    if (!app->flvmux) {
        g_print ("Error creating flvmux, exiting...");
    }

    app->rtmpsink = gst_element_factory_make ("rtmpsink", "rtmpsink");
    if (!app->rtmpsink) {
        g_print ("Error creating rtmpsink, exiting...");
    }

    g_print ("Elements are created\n");

    g_object_set (G_OBJECT (app->rtmpsink), "location",
                  "rtmp://192.168.3.107/rtmp/live live=1", NULL);

    g_print ("end of settings\n");

    srccap = gst_caps_new_simple ("video/x-raw",
            "format", G_TYPE_STRING, "RGB",
            "width", G_TYPE_INT, 640,
            "height", G_TYPE_INT, 480,
            "framerate", GST_TYPE_FRACTION, 30, 1,
            NULL);

    filtercap = gst_caps_new_simple ("video/x-raw",
            "format", G_TYPE_STRING, "I420",
            "width", G_TYPE_INT, 640,
            "height", G_TYPE_INT, 480,
            "framerate", GST_TYPE_FRACTION, 30, 1,
            NULL);

    gst_app_src_set_caps (GST_APP_SRC (app->videosrc), srccap);
    g_object_set (G_OBJECT (app->filter), "caps", filtercap, NULL);

    bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
    g_assert (bus);
    gst_bus_add_watch (bus, (GstBusFunc) bus_call, app);

    gst_bin_add_many (GST_BIN (pipeline), app->videosrc, app->queue,
                      app->videocoverter, app->filter, app->h264enc,
                      app->h264parse, app->flvmux, app->rtmpsink, NULL);

    g_print ("Added all the elements to the pipeline\n");

    gboolean ok = gst_element_link_many (app->videosrc, app->queue,
                      app->videocoverter, app->filter, app->h264enc,
                      app->h264parse, app->flvmux, app->rtmpsink, NULL);

    if (ok)
        g_print ("Linked all the elements together\n");
    else
        g_print ("*** Linking error ***\n");

    g_assert (app->videosrc);
    g_assert (GST_IS_APP_SRC (app->videosrc));

    g_signal_connect (app->videosrc, "need-data", G_CALLBACK (start_feed), app);
    g_signal_connect (app->videosrc, "enough-data", G_CALLBACK (stop_feed), app);

    g_print ("Playing the video\n");
    gst_element_set_state (pipeline, GST_STATE_PLAYING);

    g_print ("Running...\n");
    g_main_loop_run (loop);

    g_print ("Returned, stopping playback\n");
    gst_element_set_state (pipeline, GST_STATE_NULL);
    gst_object_unref (bus);
    g_main_loop_unref (loop);
    g_print ("Deleting pipeline\n");
}
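
The `start_feed`/`stop_feed` handlers wired up above are not shown in the question. A minimal sketch of what they might look like follows; the frame size is an assumption derived from the 640x480 RGB caps above, and the frame-filling step is left as a placeholder:

```c
/* Hypothetical handlers for appsrc's "need-data"/"enough-data" signals.
 * Assumes the same App struct as above and 640x480 RGB frames (3 bytes/pixel). */
static void start_feed (GstElement *source, guint size, App *app)
{
    const gsize frame_size = 640 * 480 * 3;
    GstBuffer *buffer = gst_buffer_new_allocate (NULL, frame_size, NULL);

    /* ... copy one processed frame into the buffer, e.g. with gst_buffer_fill() ... */

    GstFlowReturn ret;
    g_signal_emit_by_name (source, "push-buffer", buffer, &ret);
    gst_buffer_unref (buffer);
}

static void stop_feed (GstElement *source, App *app)
{
    /* stop pushing until "need-data" fires again, e.g. clear a feeding flag */
}
```

Note that this sketch pushes buffers with no PTS/duration set, which is exactly the situation the accepted answer below diagnoses.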

The source of my web page:

<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8">
<title>Live Streaming</title>

<!-- strobe -->
<script type="text/javascript" src="strobe/lib/swfobject.js"></script>
<script type="text/javascript">
  var parameters = {  
     src: "rtmp://192.168.3.107/rtmp/live",  
     autoPlay: true,  
     controlBarAutoHide: false,  
     playButtonOverlay: true,  
     showVideoInfoOverlayOnStartUp: true,  
     optimizeBuffering : false,  
     initialBufferTime : 0.1,  
     expandedBufferTime : 0.1,  
     minContinuousPlayback : 0.1,  
     //poster: "images/poster.png"  
  };  
  swfobject.embedSWF(
    "strobe/StrobeMediaPlayback.swf"
    , "StrobeMediaPlayback"
    , 1024
    , 768
    , "10.1.0"
    , "strobe/expressInstall.swf"
    , parameters
    , {
      allowFullScreen: "true"
    }
    , {
      name: "StrobeMediaPlayback"
    }
  );
</script>

</head>
<body>
<div id="StrobeMediaPlayback"></div>
</body>
</html>


1 Answer

Stack Overflow user

Answer accepted

Posted 2016-07-22 07:38:50

When using appsrc and appsink, people usually do something with the buffers: they grab the data somehow, process it, and create new buffers, but forget to timestamp them properly.

What is timestamping? It attaches timing information to audio/video buffers. Why? So that the playback application (VLC, a web player, ...) can present the video/audio to the user at the right rate (this is the PTS, the presentation timestamp).

This is related to the framerate (for video) or the frequency (for audio, although timestamping works a bit differently there; it is not applied per audio sample).

So what is probably happening on your web side: it receives the buffers, but without this timestamp information. The application therefore does not know how or when to display the video, so it fails silently and displays nothing.

The gst-launch pipeline works because it apparently has some algorithm for guessing the framerate, etc.

As I said, you have two options.

1. Calculate the PTS and duration yourself and set them on the buffers you push:

guint64 calculated_pts = some_cool_algorithm();
GstBuffer *buffer = gst_buffer_new_wrapped (data, size); /* your processed data */
GST_BUFFER_PTS (buffer) = calculated_pts;   /* in nanoseconds */
GST_BUFFER_DURATION (buffer) = 1234567890;  /* in nanoseconds */
/* push the buffer into appsrc */

2. Or turn on do-timestamp on the appsrc, which makes it generate timestamps automatically. Off the top of my head I do not know exactly how it does this: it either derives them from the framerate in the caps, or generates the PTS according to how you push the frames into it.
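
In code, option 2 is just a property set on the appsrc before the pipeline starts. A sketch, reusing `app->videosrc` from the question's pipeline code; `is-live` and `format` are additional appsrc properties commonly set alongside it for live pushing:

```c
/* Option 2: let appsrc timestamp buffers automatically */
g_object_set (G_OBJECT (app->videosrc),
              "do-timestamp", TRUE,       /* stamp buffers as they arrive */
              "is-live",      TRUE,       /* behave like a live source */
              "format",       GST_FORMAT_TIME,
              NULL);
```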

1 vote
Content from Stack Overflow. Original question:

https://stackoverflow.com/questions/38495163