Why does using an image sequence as input in FFmpeg cause it to insist on realtime encoding? Is there no way to turn it off?
For example: when I include an image sequence in a filtergraph, I always get a warning about thread message queue blocking: `image2` (the image sequence demuxer) is implying the thread queue isn't large enough for realtime encoding... but I never intended realtime encoding.
`quality`, `preset`, and `deadline` all appear to have no effect on whether this warning occurs.
Here is an example demonstrating the issue:
ffmpeg -loop 1 -i mysequence/%04d.png -loop 1 -i gfx/background.png -filter_complex [0:v]format=rgba -to 10 -ss 5 -vcodec libvpx-vp9 -an out.webm
If you tell the filter to use `[1:v]` instead, everything goes fine. But if you target `[0:v]` (the image sequence), the warning occurs.
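For reference, this is the variant that runs cleanly; the only change from the command above is the filter input label:

ffmpeg -loop 1 -i mysequence/%04d.png -loop 1 -i gfx/background.png -filter_complex [1:v]format=rgba -to 10 -ss 5 -vcodec libvpx-vp9 -an out.webm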
FFmpeg, by default, reads each input as fast as possible, in a dedicated thread. In this case, the processing rate is slower than the read rate of that input, so the input FIFO fills up before older packets get dequeued. Either set a slow read rate (`-readrate 0.6 -i mysequence/%04d.png`) or increase the FIFO size (`-thread_queue_size 256 -i mysequence/%04d.png`).
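Applied to your command, a minimal sketch looks like this. Both are per-input options, so they must appear before the `-i` they modify; the value 256 is just an illustrative queue size, not a recommended setting:

ffmpeg -thread_queue_size 256 -loop 1 -i mysequence/%04d.png -loop 1 -i gfx/background.png -filter_complex [0:v]format=rgba -to 10 -ss 5 -vcodec libvpx-vp9 -an out.webm

Alternatively, put `-readrate 0.6` in the same position to throttle how fast that input is demuxed instead of enlarging the queue.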