I'm creating a GStreamer pipeline in Python that stacks consecutive camera images into a buffer and pipes them through the pipeline to create a video (the container format does not matter). Each image in the buffer is encoded as a numpy array with all three RGB channels. I source the pipeline via GstAppSrc, convert, encode, mux with the appropriate caps, and then save via filesink. The code runs, but the resulting video file is either empty or does not open in VLC or QuickTime Player, regardless of the format. Below is the code:
import sys

import numpy as np
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst


class VideoStream():
    def __init__(self):
        self.buffer_size = 10
        self.width = 448
        self.height = 256
        self.buffer = np.zeros(
            (self.height, self.width, 3, self.buffer_size), dtype='uint8')
        # Height, Width, Channels (3 for RGB), buffer size
        self.index = 0
        Gst.init(sys.argv[1:])
        # Declare elements
        self.appsrc = Gst.ElementFactory.make("appsrc", "source")
        self.appsrc.set_property("is_live", True)
        self.appsrc.set_property("emit_signals", True)
        converter = Gst.ElementFactory.make("autovideoconvert", "conv")
        encoder = Gst.ElementFactory.make("x264enc", "encoder")
        muxcaps = Gst.ElementFactory.make("capsfilter", "capsmux")
        mux = Gst.ElementFactory.make("qtmux", "mux")
        filesave = Gst.ElementFactory.make("filesink", "sink")
        caps = Gst.Caps.from_string(
            "video/x-raw,format=RGB,width=448,height=256,framerate=30/1")
        # Set properties of each element
        self.appsrc.set_property('caps', caps)
        muxcaps.set_property('caps', Gst.Caps.from_string("video/x-h264"))
        filesave.set_property("location", "video.qt")
        self.pipeline = Gst.Pipeline.new("test-pipeline")
        self.pipeline.add(self.appsrc)
        self.pipeline.add(converter)
        self.pipeline.add(encoder)
        self.pipeline.add(filesave)
        self.pipeline.add(muxcaps)
        self.pipeline.add(mux)
        # Link pipeline elements
        self.appsrc.link(converter)
        converter.link(encoder)
        encoder.link(muxcaps)
        muxcaps.link(mux)
        mux.link(filesave)

    def update_buffer(self, image):
        # cv2.imwrite('test.jpg', image)
        if self.index < self.buffer_size:
            self.buffer[:, :, :, self.index] = image
            self.index += 1
        else:
            self.index = 0
            # return buffer to pipeline
            self.source_images()

    def source_images(self):
        self.pipeline.set_state(Gst.State.PLAYING)
        duration = 10**9 // 30  # nanoseconds per frame at 30/1 fps
        pts = 0
        for image in range(self.buffer_size):
            image_bytes = self.buffer[:, :, :, image].tobytes()
            gst_buffer = Gst.Buffer.new_wrapped(image_bytes)
            pts += duration
            gst_buffer.pts = pts
            gst_buffer.duration = duration
            self.appsrc.emit("push-buffer", gst_buffer)
        self.appsrc.emit("end-of-stream")
I've tried different encoders and muxers, but the file is still unreadable. Interestingly, the problem does not occur when I use videotestsrc as the source instead of appsrc, which led me to believe there is some sort of format mismatch in the pipeline, but I can't figure out the specifics.
Your code is mostly fine: the caps you set, the pipeline you build, and the buffers you push are all OK. The only thing missing to make it work is to set the appsrc format property to Gst.Format.TIME.
The available options for that property (as shown by gst-inspect-1.0 appsrc) are:

  format              : The format of the segment events and seek
                        flags: readable, writable
                        Enum "GstFormat" Default: 2, "bytes"
                           (0): undefined        - GST_FORMAT_UNDEFINED
                           (1): default          - GST_FORMAT_DEFAULT
                           (2): bytes            - GST_FORMAT_BYTES
                           (3): time             - GST_FORMAT_TIME
                           (4): buffers          - GST_FORMAT_BUFFERS
                           (5): percent          - GST_FORMAT_PERCENT
This property specifies the format in which the segment events (and therefore the positions of the buffers) are expressed. Since we are sinking video to a file, buffer positions should be given in time: the segment starts at 0 and ends at the total duration the output file will have. See Segments in the GStreamer documentation for more details.
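For example, once the format is TIME, the pts and duration you attach to each buffer are interpreted as nanoseconds. A minimal sketch of time-based timestamping, assuming 30 fps (frame_bytes is a hypothetical list of raw RGB frames):

fps = 30
duration = Gst.SECOND // fps               # nanoseconds per frame (Gst.SECOND == 10**9)
for i, frame in enumerate(frame_bytes):    # frame_bytes: hypothetical raw RGB frames
    buf = Gst.Buffer.new_wrapped(frame)
    buf.pts = i * duration                 # first frame starts at time 0
    buf.duration = duration
    appsrc.emit("push-buffer", buf)
appsrc.emit("end-of-stream")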
So, add this line to your code to make it work:
self.appsrc.set_property("format", Gst.Format.TIME)