
IP camera capture


I am trying to capture the streams of two IP cameras connected directly to a mini PCIe dual gigabit expansion card on an NVIDIA Jetson TK1.

I managed to capture the stream of both cameras using gstreamer with the following command:

gst-launch-0.10 rtspsrc location=rtsp://admin:[email protected]:554/mpeg4cif latency=0 ! decodebin ! ffmpegcolorspace ! autovideosink rtspsrc location=rtsp://admin:[email protected]:554/mpeg4cif latency=0 ! decodebin ! ffmpegcolorspace ! autovideosink

It displays one window per camera, but prints this warning as soon as the capture starts:

    WARNING: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink1/GstXvImageSink:autovideosink1-actual-sink-xvimage: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2875): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink1/GstXvImageSink:autovideosink1-actual-sink-xvimage:
There may be a timestamping problem, or this computer is too slow.
---> TVMR: Video-conferencing detected !!!!!!!!!

The stream plays well, with reasonable synchronization between the cameras, but after a while one of the cameras suddenly stops, and usually a few seconds later the other one stops too. Using an interface sniffer like Wireshark I can verify that RTSP packets are still being sent from the cameras.

My goal is to use these cameras as a stereo pair with OpenCV. I am able to capture the streams with OpenCV as follows:

camera[0].open("rtsp://admin:[email protected]:554/mpeg4cif");//right
camera[1].open("rtsp://admin:[email protected]:554/mpeg4cif");//left

The capture randomly starts well or badly, synchronized or not, with or without delay, but after a while the captured images become impossible to use, as shown in the screenshot below:

(screenshot: frames corrupted by decoding errors)

And the output while running the OpenCV program is usually like this (the most complete trace I captured):

[h264 @ 0x1b9580] slice type too large (2) at 0 23
[h264 @ 0x1b9580] decode_slice_header error

[h264 @ 0x1b1160] left block unavailable for requested intra mode at 0 6
[h264 @ 0x1b1160] error while decoding MB 0 6, bytestream (-1)

[h264 @ 0x1b1160] mmco: unref short failure

[h264 @ 0x1b9580] too many reference frames

[h264 @ 0x1b1160] pps_id (-1) out of range

The cameras are two SIP-1080J modules.

Does anyone know how to achieve a reliable capture using OpenCV? First of all, getting rid of those h264 messages and having stable images while the program runs.

If not, how can I improve the pipelines and buffers in gstreamer to get a stable capture without the stream suddenly stopping? Although I have never captured through OpenCV using gstreamer, perhaps some day I will learn how, and that might solve this problem.

Thanks a lot.


Solution

  • After some days of deep searching and several attempts, I turned directly to the gstreamer-0.10 API. First I learned how to use it with the tutorials from http://docs.gstreamer.com/pages/viewpage.action?pageId=327735

    For most of the tutorials you just need to install libgstreamer0.10-dev and a few other packages. I installed everything with:

    sudo apt-get install libgstreamer0*
    

    Then copy the code of the example you want to try into a .c file and compile it from the terminal, in the folder where the .c file is located (for some examples you have to add more libraries to the pkg-config call):

    gcc basic-tutorial-1.c $(pkg-config --cflags --libs gstreamer-0.10) -o basic-tutorial-1
    

    Once I no longer felt lost, I started mixing some C and C++ code. You can compile it with a proper g++ command, with a CMakeLists.txt, or however you prefer. As I am developing on an NVIDIA Jetson TK1, I use Nsight Eclipse Edition, and I had to configure the project properties to be able to use both the gstreamer-0.10 libs and the OpenCV libs.
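    If you go the CMake route instead of Nsight's project settings, a minimal CMakeLists.txt along these lines should work (a sketch, assuming pkg-config can find gstreamer-0.10 and its app library on your system; the target and source file names are placeholders):

    ```cmake
    cmake_minimum_required(VERSION 2.8)
    project(ipcam_capture)

    # Locate gstreamer-0.10 (core + appsink/appsrc library) via pkg-config
    find_package(PkgConfig REQUIRED)
    pkg_check_modules(GST REQUIRED gstreamer-0.10 gstreamer-app-0.10)

    # Locate OpenCV through its own CMake config
    find_package(OpenCV REQUIRED)

    include_directories(${GST_INCLUDE_DIRS} ${OpenCV_INCLUDE_DIRS})

    add_executable(ipcam_capture main.cpp)
    target_link_libraries(ipcam_capture ${GST_LIBRARIES} ${OpenCV_LIBS})
    ```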

    Mixing some code, I am finally able to capture the streams of my two IP cameras in real time, with no appreciable delay, no badly decoded frames, and both streams synchronized. The only thing I have not solved yet is obtaining the frames in color rather than grayscale (I have tried other CV_ values, which result in a segmentation fault):

    v = Mat(Size(640, 360),CV_8U, (char*)GST_BUFFER_DATA(gstImageBuffer));
    
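    A likely explanation for the grayscale frames and the segfault (an assumption based on typical gstreamer-0.10 behavior, not verified against these cameras): ffdec_h264 outputs I420 (planar YUV 4:2:0) by default, so the GstBuffer holds width*height*3/2 bytes, with the full-resolution luma (Y) plane first. Wrapping it as CV_8U reads only the first width*height bytes, which is exactly the luma plane, i.e. a grayscale image; wrapping it as CV_8UC3 tries to read width*height*3 bytes from a smaller buffer and runs past its end. The arithmetic:

    ```cpp
    #include <cassert>
    #include <cstdio>

    // Buffer sizes (in bytes) for a w x h frame in each layout.
    static size_t gray_size(size_t w, size_t h) { return w * h; }          // CV_8U: one byte per pixel
    static size_t bgr_size(size_t w, size_t h)  { return w * h * 3; }      // CV_8UC3: three bytes per pixel
    static size_t i420_size(size_t w, size_t h) { return w * h * 3 / 2; }  // I420: Y plane + quarter-size U and V planes

    int main() {
        const size_t w = 640, h = 360;
        printf("Y plane: %zu bytes\n", gray_size(w, h));  // 230400
        printf("I420:    %zu bytes\n", i420_size(w, h));  // 345600
        printf("BGR:     %zu bytes\n", bgr_size(w, h));   // 691200
        // CV_8U fits inside the I420 buffer (it sees just the Y plane -> grayscale)
        assert(gray_size(w, h) <= i420_size(w, h));
        // CV_8UC3 needs more bytes than the I420 buffer holds -> reads past the
        // end of the GstBuffer, which matches the observed segmentation fault
        assert(bgr_size(w, h) > i420_size(w, h));
        return 0;
    }
    ```

    One hedged fix: since the pipeline already contains ffmpegcolorspace before the appsink, forcing RGB output by capping the sink, e.g. `gst_app_sink_set_caps((GstAppSink*)sink, gst_caps_from_string("video/x-raw-rgb,bpp=24,depth=24"))` (gstreamer-0.10 caps syntax), should let the Mat be built with CV_8UC3.
    
    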

    The complete code follows. It captures with gstreamer, wraps the captured buffer in an OpenCV Mat object and then displays it. The code captures a single IP camera; you can replicate the objects and methods to capture multiple cameras at the same time.

    #include <opencv2/core/core.hpp>
    #include <opencv2/contrib/contrib.hpp>
    #include <opencv2/highgui/highgui.hpp>
    #include <opencv2/imgproc/imgproc.hpp>
    #include <opencv2/video/video.hpp>
    
    #include <gst/gst.h>
    #include <gst/app/gstappsink.h>
    #include <gst/app/gstappbuffer.h>
    #include <glib.h>
    
    #include <cstdio>    // printf
    #include <cstdlib>   // setenv
    #include <cstring>   // strcpy, memset
    
    #define DEFAULT_LATENCY_MS  1
    
    using namespace cv;
    
    typedef struct _vc_cfg_data {
        char server_ip_addr[100];
    } vc_cfg_data;
    
    typedef struct _vc_gst_data {
        GMainLoop *loop;
        GMainContext *context;
        GstElement *pipeline;
        GstElement *rtspsrc,*depayloader, *decoder, *converter, *sink;
        GstPad *recv_rtp_src_pad;
    } vc_gst_data;
    
    typedef struct _vc_data {
        vc_gst_data gst_data;
        vc_cfg_data cfg;
    } vc_data;
    
    /* Global data */
    vc_data app_data;
    
    static void vc_pad_added_handler (GstElement *src, GstPad *new_pad, vc_data *data);
    
    
    #define VC_CHECK_ELEMENT_ERROR(e, name) \
    if (!e) { \
    g_printerr ("Element %s could not be created. Exiting.\n", name); \
    return -1; \
    }
    
    /*******************************************************************************
    Gstreamer pipeline creation and init
    *******************************************************************************/
    int vc_gst_pipeline_init(vc_data *data)
    {
        GstStateChangeReturn ret;
    
        // Template
        GstPadTemplate* rtspsrc_pad_template;
    
        // Create a new GMainLoop
        data->gst_data.loop = g_main_loop_new (NULL, FALSE);
        data->gst_data.context = g_main_loop_get_context(data->gst_data.loop);
    
        // Create gstreamer elements
        data->gst_data.pipeline = gst_pipeline_new ("videoclient");
        VC_CHECK_ELEMENT_ERROR(data->gst_data.pipeline, "pipeline");
    
        //RTP UDP Source - for received RTP messages
        data->gst_data.rtspsrc = gst_element_factory_make ("rtspsrc", "rtspsrc");
        VC_CHECK_ELEMENT_ERROR(data->gst_data.rtspsrc,"rtspsrc");
    
        printf("URL: %s\n",data->cfg.server_ip_addr);
        g_print ("Setting RTSP source properties: \n");
        g_object_set (G_OBJECT (data->gst_data.rtspsrc), "location", data->cfg.server_ip_addr, "latency", DEFAULT_LATENCY_MS, NULL);
    
        //RTP H.264 Depayloader
        data->gst_data.depayloader = gst_element_factory_make ("rtph264depay","depayloader");
        VC_CHECK_ELEMENT_ERROR(data->gst_data.depayloader,"rtph264depay");
    
        //ffmpeg decoder
        data->gst_data.decoder = gst_element_factory_make ("ffdec_h264", "decoder");
        VC_CHECK_ELEMENT_ERROR(data->gst_data.decoder,"ffdec_h264");
    
        data->gst_data.converter = gst_element_factory_make ("ffmpegcolorspace", "converter");
        VC_CHECK_ELEMENT_ERROR(data->gst_data.converter,"ffmpegcolorspace");
    
        // Appsink: hands decoded buffers to the application
        data->gst_data.sink = gst_element_factory_make ("appsink", "sink");
        VC_CHECK_ELEMENT_ERROR(data->gst_data.sink,"appsink");
        gst_app_sink_set_max_buffers((GstAppSink*)data->gst_data.sink, 1);
        gst_app_sink_set_drop ((GstAppSink*)data->gst_data.sink, TRUE);
        g_object_set (G_OBJECT (data->gst_data.sink),"sync", FALSE, NULL);
    
        //Get the pad template for rtspsrc's RTP receive source pad.
        //This pad delivers the depacketized RTP data from the network.
        rtspsrc_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (data->gst_data.rtspsrc),"recv_rtp_src_0");
    
        // Use the template to request the pad
        data->gst_data.recv_rtp_src_pad = gst_element_request_pad (data->gst_data.rtspsrc, rtspsrc_pad_template,
        "recv_rtp_src_0", NULL);
    
        // Print the name for confirmation
        g_print ("A new pad %s was created\n",
        gst_pad_get_name (data->gst_data.recv_rtp_src_pad));
    
        // Add elements into the pipeline
        g_print(" Adding elements to pipeline...\n");
        gst_bin_add_many (GST_BIN (data->gst_data.pipeline),
                data->gst_data.rtspsrc,
                data->gst_data.depayloader,
                data->gst_data.decoder,
                data->gst_data.converter,
                data->gst_data.sink,
            NULL);
    
        // Link some of the elements together
        g_print(" Linking some elements ...\n");
        if(!gst_element_link_many (data->gst_data.depayloader, data->gst_data.decoder, data->gst_data.converter, data->gst_data.sink, NULL))
            g_print("Error: could not link all elements\n");
    
        // Connect to the pad-added signal of rtspsrc. This allows us to link
        //the dynamic RTP source pad to the depayloader when it is created.
        if(!g_signal_connect (data->gst_data.rtspsrc, "pad-added",
        G_CALLBACK (vc_pad_added_handler), data))
            g_print("Error: could not add signal handler\n");
    
        // Set the pipeline to "playing" state
        g_print ("Now playing A\n");
        ret = gst_element_set_state (data->gst_data.pipeline, GST_STATE_PLAYING);
        if (ret == GST_STATE_CHANGE_FAILURE) {
            g_printerr ("Unable to set the pipeline A to the playing state.\n");
            gst_object_unref (data->gst_data.pipeline);
            return -1;
        }
    
        return 0;
    }
    
    static void vc_pad_added_handler (GstElement *src, GstPad *new_pad, vc_data *data) {
        GstPad *sink_pad = gst_element_get_static_pad (data->gst_data.depayloader, "sink");
        GstPadLinkReturn ret;
        GstCaps *new_pad_caps = NULL;
        GstStructure *new_pad_struct = NULL;
        const gchar *new_pad_type = NULL;
        g_print ("Received new pad '%s' from '%s':\n", GST_PAD_NAME (new_pad), GST_ELEMENT_NAME (src));
    
        /* Check the new pad's name */
        if (!g_str_has_prefix (GST_PAD_NAME (new_pad), "recv_rtp_src_")) {
            g_print (" It is not the right pad. Need recv_rtp_src_. Ignoring.\n");
            goto exit;
        }
    
        /* If our converter is already linked, we have nothing to do here */
        if (gst_pad_is_linked (sink_pad)) {
            g_print (" Sink pad from %s already linked. Ignoring.\n", GST_ELEMENT_NAME (src));
            goto exit;
        }
    
        /* Check the new pad's type */
        new_pad_caps = gst_pad_get_caps (new_pad);
        new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
        new_pad_type = gst_structure_get_name (new_pad_struct);
    
        /* Attempt the link */
        ret = gst_pad_link (new_pad, sink_pad);
        if (GST_PAD_LINK_FAILED (ret)) {
            g_print (" Type is '%s' but link failed.\n", new_pad_type);
        } else {
            g_print (" Link succeeded (type '%s').\n", new_pad_type);
        }
    
        exit:
        /* Unreference the new pad's caps, if we got them */
        if (new_pad_caps != NULL)
            gst_caps_unref (new_pad_caps);
        /* Unreference the sink pad */
        gst_object_unref (sink_pad);
    }
    
    
    
    int vc_gst_pipeline_clean(vc_data *data) {
        GstStateChangeReturn ret;
        GstStateChangeReturn ret2;
    
        /* Cleanup Gstreamer */
        if(!data->gst_data.pipeline)
            return 0;
    
        /* Send the main loop a quit signal */
        g_main_loop_quit(data->gst_data.loop);
        g_main_loop_unref(data->gst_data.loop);
        ret = gst_element_set_state (data->gst_data.pipeline, GST_STATE_NULL);
        if (ret == GST_STATE_CHANGE_FAILURE) {
            g_printerr ("Unable to set the pipeline A to the NULL state.\n");
            gst_object_unref (data->gst_data.pipeline);
            return -1;
        }
    
        g_print ("Deleting pipeline\n");
        gst_object_unref (GST_OBJECT (data->gst_data.pipeline));
        /* Zero out the structure */
        memset(&data->gst_data, 0, sizeof(vc_gst_data));
        return 0;
    }
    
    
    void handleKey(char key)
    {
        switch (key)
        {
        case 27:
    
            break;
        }
    }
    
    
    int vc_mainloop(vc_data* data)
    {
    
        GstBuffer *gstImageBuffer;
    
        Mat v;
    
        namedWindow("view",WINDOW_NORMAL);
    
        while (1) {
    
            gstImageBuffer = gst_app_sink_pull_buffer((GstAppSink*)data->gst_data.sink);
    
            if (gstImageBuffer != NULL )
            {
                    v = Mat(Size(640, 360),CV_8U, (char*)GST_BUFFER_DATA(gstImageBuffer));
    
                    imshow("view", v);
    
                    handleKey((char)waitKey(3));
    
                    gst_buffer_unref(gstImageBuffer);
            }else{
                g_print("Error: appsink did not return a buffer.\n");
            }
        }
        return 0;
    }
    
    
    int main (int argc, char *argv[])
    {
        setenv("DISPLAY", ":0", 0);
    
        strcpy(app_data.cfg.server_ip_addr, "rtsp://admin:[email protected]:554/mpeg4cif");
    
        gst_init (&argc, &argv);
    
        if(vc_gst_pipeline_init(&app_data) == -1) {
            printf("Gstreamer pipeline creation and init failed\n");
            goto cleanup;
        }
    
        vc_mainloop(&app_data);
    
        printf ("Returned, stopping playback\n");
        cleanup:
        return vc_gst_pipeline_clean(&app_data);
    }
    

    I hope this helps!! ;)