I'm trying to save the video stream from a camera (saving it frame by frame would also be fine). I can display the stream like this:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! autovideosink
I want to save this video to a file, either as a video or frame by frame, so I tried running:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! decodebin ! videoconvert ! avimux ! filesink location=video.avi
But the resulting video.avi file is empty. What am I doing wrong? I'm a beginner with GStreamer, and I haven't found useful information online, so I can't figure out what each part of the pipeline is doing.
EDIT
Running with verbose output (-v) I get this:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding=(string)JPEG, payload=(int)26, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
[INFO] bitstreamMode 1, chromaInterleave 0, mapType 0, tiled2LinearEnable 0
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstImxVpuDecoder:imxvpudecoder0.GstPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstImxVpuDecoder:imxvpudecoder0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstDecodePad:src_0.GstProxyPad:proxypad1: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)bt601, framerate=(fraction)0/1
I was eventually able to save the stream using the following command:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! matroskamux ! filesink location=video.mkv
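For frame-by-frame output, something like the following should also work, since rtpjpegdepay already outputs complete JPEG images per buffer. I haven't fully verified this variant, and the frame%05d.jpg filename pattern is just an example:
gst-launch-1.0 udpsrc port=50004 ! application/x-rtp,encoding=JPEG,payload=26 ! rtpjpegdepay ! multifilesink location=frame%05d.jpg
When writing to a muxed container like MKV or AVI, it also seems important to run gst-launch-1.0 with the -e flag, so that stopping with Ctrl+C sends an EOS and the muxer can finalize the file.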