Tags: qt5, gstreamer, embedded-linux, qtquick2, qtwidgets

Embed separate video streams in Qt widgets on embedded Linux using GStreamer


I'm looking for any hints I can get on this problem:

I have an i.MX6 device running an embedded-Linux Buildroot OS, and a Qt5 widget-based app that runs on its screen. I also have a custom SDK, which I can't change and which is limited to Qt 5.5.1 libraries, that I need to use to build for the i.MX6. And since this is an embedded target, I cannot use commands such as 'dpkg' or 'ldconfig' on it.

Objective: My goal is to add a feature to that Qt 5.5 app that displays live video streams from several cameras (about 4 to 6 of them), inside separate widgets. It needs to be hardware accelerated.

Having no cameras for testing, I use VLC to stream 3 videos (they run in the background locally) and my app reads these streams via RTSP.

What I've tried: I've been learning about Qt and GStreamer to find a solution. I tried a number of different things, used all the promising video sinks already installed on the device, and am currently trying QtQuick-based solutions.

I made a simple Qt widget app for testing, which works well when I run it on my x86 system (Ubuntu 16.04), as you can see in these screenshots:

[Screenshot: Tab 1]

[Screenshot: Tab 2]

the streams are well integrated within the separate tabs and I can switch between them by clicking on the tabs. However, running this same app on the target device (with a different sink, e.g. imxipuvideosink) displays all the streams "above" the window and on top of each other. They are not embedded in the tab widgets; maybe that is because the device cannot rely on X.

I tried many different approaches to embed the streams in widgets; here are a few of them:

  1. autovideosink, imxipuvideosink, imxeglvivsink: These are the sinks I tested with the example above when deploying on the target, and I got the un-embedded streams overlapping each other.
  2. imxpxpvideosink: I could not get an output.
  3. qtvideosink, qtglvideosink: According to this documentation, these require connecting the "update" and "paint" signals. I tried the following line: QGlib::connect(sink, "update", widget, &QWidget::update); but it throws a "no matching function for call to ..." error, with the update slot marked as "<unresolved overloaded function type>". QObject::connect does the same. I probably did something wrong here; see the sketch after this list.
  4. qwidgetvideosink: Using this pipeline: rtspsrc location=rtsp://10.0.1.1:8554/stream ! videoparse width=400 height=300 format=i420 ! videoconvert ! qwidget5videosink, I get an output, but it is corrupted on x86, and the target device does not have the videoparse plugin installed.
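
Regarding point 3, the "unresolved overloaded function type" message suggests that QWidget::update is simply overloaded, so the member-function pointer probably has to be cast to the parameterless overload explicitly. Something like this untested sketch (sink and widget being the sink element and the target QWidget, and assuming the QtGStreamer header that declares QGlib::connect is available):

    // Sketch: the cast selects the no-argument QWidget::update() overload,
    // so the compiler no longer sees an ambiguous member-function pointer.
    #include <QGlib/Connect>   // QtGStreamer header (assumed available)
    #include <QWidget>

    QGlib::connect(sink, "update", widget,
                   static_cast<void (QWidget::*)()>(&QWidget::update));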

Since I am bound to using a QWidget app, the only way to use QML is to create a QQuickWidget inside the widget-based app. I set the source of this QQuickWidget to a simple .qml file (a sketch of this setup follows the list below) and tried the following:

  1. MediaPlayer + VideoOutput: This actually works the way I want it to on both systems, i.e. the videos are displayed and embedded in separate widgets. But AFAIK MediaPlayer does not benefit from hardware acceleration, so some instances crash at the start and the remaining ones play, but with strong frame jitter. Because my SDK is limited to Qt and QtMultimedia 5.5, I can't use MediaPlayer's pipeline definition feature, which was introduced in 5.12.
  2. qmlglsink: Supposedly the most promising sink for the job, but it is only available in the package gstreamer1.0-qt5, so I would need to install it on the embedded device, which is technically possible but difficult and risky.
  3. VideoSurface + VideoItem: I don't have the QtGStreamer 1.0 library installed.
  4. GstGLVideoItem: I don't have the org.freedesktop.gstreamer.GLVideoItem 1.0 library installed.
  5. I have qtquick2videosink installed, but I don't know how to use it. The documentation isn't clear on what QML element it should be used with, and I haven't found any usage example. Is it VideoSurface? GraphicsVideoSurface?
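
For reference, this is roughly how the QQuickWidget is created inside the widget app (sketch; parentTab and video.qml are placeholder names):

    // Sketch: embed a QML scene inside the widget-based app.
    // parentTab is the QWidget hosting the stream, video.qml the QML scene.
    #include <QQuickWidget>
    #include <QUrl>

    QQuickWidget *quickWidget = new QQuickWidget(parentTab);
    quickWidget->setResizeMode(QQuickWidget::SizeRootObjectToView);
    quickWidget->setSource(QUrl(QStringLiteral("qrc:/video.qml")));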

I know this is very restricted, but since this is embedded I wanted to keep installing more packages and libraries as a last resort. And if I must install something, in what order should I try the options, from most to least promising? Any suggestions or links to examples are appreciated; I'm looking for general guidance to see what I missed or how I should approach the issue.

EDIT: I searched some more with qwidgetvideosink and found a way to make it work. The following pipeline gave me the best result: rtspsrc location=rtsp://10.0.1.1:8554/stream ! decodebin ! imxg2dvideotransform ! clockoverlay ! qwidget5videosink sync=false. Unfortunately the performance here is just as poor as with MediaPlayer, if not worse, but at least with this method I can customize the pipeline. I tried a few different things but couldn't find a better solution. I also can't find a way to replace decodebin with more specific elements. Any help with this would also be appreciated.
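
For reference, this is roughly how such a pipeline can be built from code, which is what lets me customize it (untested sketch; error handling trimmed, and attaching the sink to the target QWidget is not shown):

    // Sketch: build the pipeline above with gst_parse_launch().
    // gst_init() is assumed to have been called already.
    #include <gst/gst.h>
    #include <QDebug>

    GError *error = NULL;
    GstElement *pipeline = gst_parse_launch(
        "rtspsrc location=rtsp://10.0.1.1:8554/stream ! decodebin ! "
        "imxg2dvideotransform ! clockoverlay ! qwidget5videosink sync=false",
        &error);
    if (!pipeline) {
        qWarning() << "Failed to create pipeline:" << error->message;
        g_clear_error(&error);
    } else {
        gst_element_set_state(pipeline, GST_STATE_PLAYING);
    }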


Solution

  • I wanted to share the solution we went with for this particular problem, in case it can help anyone who has a similar issue.

    After failing with all the solutions that required missing libraries (QtMultimedia, qmlglsink, etc.), or that just didn't work for unknown reasons, I learned about framebuffers - which are basically just layers for the GPU as far as I'm concerned - and how to use them for this case.

    It turns out the embedded Linux device I've been working with has 3 framebuffers, which allowed us to split the application into a "background" framebuffer for video stream playback and a "foreground" framebuffer for the overlay display. The overlay (the Qt MainWindow) needed to be transparent whenever we wanted the video in the background to become visible. For this we used alpha blending and a color key.

    After testing the individual parts of this solution, we ended up with an app that launches two pipelines (because I want 2 cameras displayed on the screen at once, each of which can be switched to another stream using an input-selector). The pipeline structure looked like this, for example:

    input-selector name=selector ! decodebin ! textoverlay name=text0 ! queue !
    imxg2dvideosink framebuffer=/dev/fb0 name=end0 force-aspect-ratio=false
        window-x-coord=0 window-y-coord=0 window-width=512 window-height=473
    rtspsrc location=rtsp://10.0.1.1:8554/stream name=src0 ! queue name=qs_0 ! selector.sink_0
    rtspsrc location=rtsp://10.0.1.1:8556/stream name=src2 ! queue name=qs_2 ! selector.sink_1
    rtspsrc location=rtsp://10.0.1.1:8558/stream name=src4 ! queue name=qs_4 ! selector.sink_2
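
    Switching one of the displays to another camera is then just a matter of changing the selector's active pad, roughly like this (sketch, assuming the pipeline variable holds the pipeline described above):

    // Sketch: switch the input-selector to another camera stream.
    GstElement *selector = gst_bin_get_by_name(GST_BIN(pipeline), "selector");
    GstPad *newPad = gst_element_get_static_pad(selector, "sink_1");
    g_object_set(selector, "active-pad", newPad, NULL);
    gst_object_unref(newPad);
    gst_object_unref(selector);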
    

    We pass the framebuffer property to the sink so that it sends the video to framebuffer 0, while the application itself is displayed on framebuffer 1, which appears on top of fb0. To achieve this, we simply set the QT_QPA_EGLFS_FB environment variable to /dev/fb1 before calling the app executable, since our device runs with the EGLFS platform plugin.
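
    If wrapping the launch in a script is not convenient, the same thing can presumably be done from main() before the application object is constructed (untested sketch):

    // Sketch: equivalent of exporting QT_QPA_EGLFS_FB=/dev/fb1 before launch.
    // Must run before the QApplication is created, since EGLFS reads the
    // variable when the platform plugin is initialized.
    qputenv("QT_QPA_EGLFS_FB", "/dev/fb1");
    QApplication app(argc, argv);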

    For the alpha blending and color keying part, we had to do this in the app:

    #include <fcntl.h>
    #include <linux/mxcfb.h>
    #include <sys/ioctl.h>
    #include <QDebug>   // qWarning()
    #include <glib.h>   // guint32, g_ascii_strtoull()
    
    ...
    
    // Open the overlay framebuffer fb1
    int fb = open("/dev/fb1", O_RDWR);
    if (fb < 0)
        qWarning() << "Error, framebuffer cannot be opened";
    
    // Enable alpha
    struct mxcfb_gbl_alpha alphaStruct;
    alphaStruct.enable = 1;
    alphaStruct.alpha = 255;
    if (ioctl(fb, MXCFB_SET_GBL_ALPHA, &alphaStruct) < 0)
        qWarning() << "Error, framebuffer alpha cannot be set";
    
    // Set color key to pure blue
    struct mxcfb_color_key colorKeyStruct;
    guint32 colorKeyValue = g_ascii_strtoull("0x0000FF", NULL, 16);
    colorKeyStruct.color_key = colorKeyValue;
    colorKeyStruct.enable = 1;
    if (ioctl(fb, MXCFB_SET_CLR_KEY, &colorKeyStruct) < 0)
        qWarning() << "Error, framebuffer color key cannot be set";
    
    ...
    

    Basically this opens the framebuffer that the overlay app is running on, enables alpha on it, and then sets one color (blue) as the transparent color. So every pixel with this exact color value will display the video that's running in the background.
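
    In the overlay app itself, the widget that should show the video is then simply filled with that exact blue, for example (sketch; videoAreaWidget is a placeholder name):

    // Sketch: any pixel painted with the color key (#0000FF) lets fb0 show through.
    videoAreaWidget->setStyleSheet("background-color: #0000FF;");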

    So now we have an app that plays video streams through a custom GStreamer pipeline that uses a hardware-accelerated video sink.