
Using v4l2loopback and GStreamer with MJPEG cameras


I have a 4K camera that supports MJPEG and YUY2 formats. Currently, I can run

$ gst-launch-1.0 v4l2src device=/dev/video1 ! "video/x-raw,format=YUY2,width=640,height=480,framerate=30/1" ! tee name=t ! queue ! v4l2sink device=/dev/video20 t. ! queue ! v4l2sink device=/dev/video21
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

which streams the image from video1 to two different v4l2loopback devices.

Q: How can I pass the MJPEG stream from video1 to both video20 and video21, which expect YUY2 format?


Solution

  • In the MJPEG case you need to add image/jpeg caps to v4l2src, and after v4l2src you need to decode the JPEG stream to raw video.

    GStreamer has jpegdec and avdec_mjpeg plugins. In my current version jpegdec does not support YUY2 output, so I would use avdec_mjpeg. Alternatively, you can use jpegdec followed by videoconvert (i.e. ... ! jpegdec ! videoconvert ! ...).

    The following line should do it:

    gst-launch-1.0 v4l2src device=/dev/video1 ! "image/jpeg,width=3840,height=2160,framerate=30/1" ! avdec_mjpeg ! "video/x-raw,format=YUY2,width=3840,height=2160,framerate=30/1" ! tee name=t ! queue ! v4l2sink device=/dev/video20 t. ! queue ! v4l2sink device=/dev/video21
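If avdec_mjpeg is not available on your system (it ships with the gst-libav package), the jpegdec + videoconvert variant mentioned above should look roughly like the sketch below. It assumes the same device paths and 4K caps as the pipeline above; videoconvert handles the conversion from whatever raw format jpegdec produces to YUY2:

```shell
# Sketch: decode MJPEG with jpegdec, then let videoconvert produce YUY2
# before fanning out to the two loopback devices via tee.
gst-launch-1.0 v4l2src device=/dev/video1 \
  ! "image/jpeg,width=3840,height=2160,framerate=30/1" \
  ! jpegdec \
  ! videoconvert \
  ! "video/x-raw,format=YUY2" \
  ! tee name=t \
  ! queue ! v4l2sink device=/dev/video20 \
  t. ! queue ! v4l2sink device=/dev/video21
```

Note that decoding and converting a 3840x2160@30 stream in software is CPU-intensive; placing the tee after the conversion, as here, means the decode and convert run only once for both outputs.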