I have an app that streams from a camera to a file, with a preview, using libav.
Now there is a requirement to stream from 2 cameras simultaneously and write to a single file. The preview should look like a CCTV display showing both cameras, written to a single output. Is this possible with libav?
Before writing any code I tried it with ffmpeg.exe directly and came up with this:
ffmpeg -f dshow -i video="Camera1" -i video="Camera2" -filter_complex "nullsrc=size=640x480 [base];[0:v] setpts=PTS-STARTPTS, scale=640x480 [upperleft];[1:v] setpts=PTS-STARTPTS, scale=640x480 [upperright];[base][upperleft] overlay=shortest=1 [tmp1];[tmp1][upperright] overlay=shortest=1:x=640:y=480 [tmp2];"-c:v libx264 output.mp4
But it always throws the error 'No such file or directory' for the second camera, even though I've verified the camera works when used as a single input. Am I missing something?
Overall, is it possible to achieve this?
There are several typos in your command: -f dshow is missing for Camera2, your nullsrc base is too small to show both inputs side by side, and hstack (or vstack) is easier to use anyway. Try:
ffmpeg -f dshow -i video="Camera1" -f dshow -i video="Camera2" -filter_complex "[0:v]setpts=PTS-STARTPTS,scale=640:-2[left];[1:v]setpts=PTS-STARTPTS,scale=640:-2[right];[left][right]hstack=inputs=2:shortest=1,format=yuv420p[v]" -map "[v]" -c:v libx264 -movflags +faststart output.mp4
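As for the libav part of the question: yes, the same filter chain can be built programmatically with libavfilter. Below is a minimal sketch, assuming you already have opened decoder contexts for the two cameras (dec_ctx0/dec_ctx1 and the time bases tb0/tb1 are placeholder names for whatever your existing capture code provides) and omitting most error handling.

/* Sketch: two-input hstack filter graph with libavfilter.
 * dec_ctx0/dec_ctx1 are already-opened decoder contexts for the cameras,
 * tb0/tb1 the corresponding stream time bases. Check every return value
 * in real code. */
#include <stdio.h>
#include <libavfilter/avfilter.h>
#include <libavfilter/buffersrc.h>
#include <libavfilter/buffersink.h>
#include <libavutil/mem.h>

static AVFilterGraph   *graph;
static AVFilterContext *src_ctx[2], *sink_ctx;

static int init_filters(AVCodecContext *dec_ctx0, AVCodecContext *dec_ctx1,
                        AVRational tb0, AVRational tb1)
{
    const char *descr =
        "[in0]setpts=PTS-STARTPTS,scale=640:-2[l];"
        "[in1]setpts=PTS-STARTPTS,scale=640:-2[r];"
        "[l][r]hstack=inputs=2:shortest=1,format=yuv420p[out]";
    AVFilterInOut *outputs = NULL, *inputs = NULL, *cur;
    char args[256];
    int ret;

    graph = avfilter_graph_alloc();

    /* One buffer source per camera; decoded frames are pushed in later. */
    snprintf(args, sizeof(args), "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d",
             dec_ctx0->width, dec_ctx0->height, dec_ctx0->pix_fmt,
             tb0.num, tb0.den);
    ret = avfilter_graph_create_filter(&src_ctx[0],
                                       avfilter_get_by_name("buffer"),
                                       "cam0", args, NULL, graph);
    if (ret < 0) return ret;

    snprintf(args, sizeof(args), "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d",
             dec_ctx1->width, dec_ctx1->height, dec_ctx1->pix_fmt,
             tb1.num, tb1.den);
    ret = avfilter_graph_create_filter(&src_ctx[1],
                                       avfilter_get_by_name("buffer"),
                                       "cam1", args, NULL, graph);
    if (ret < 0) return ret;

    /* Sink where the stitched frames come out. */
    ret = avfilter_graph_create_filter(&sink_ctx,
                                       avfilter_get_by_name("buffersink"),
                                       "sink", NULL, NULL, graph);
    if (ret < 0) return ret;

    /* Wire [in0]/[in1] to the buffer sources and [out] to the sink. */
    outputs = avfilter_inout_alloc();
    outputs->name       = av_strdup("in0");
    outputs->filter_ctx = src_ctx[0];
    outputs->pad_idx    = 0;
    outputs->next = cur = avfilter_inout_alloc();
    cur->name       = av_strdup("in1");
    cur->filter_ctx = src_ctx[1];
    cur->pad_idx    = 0;
    cur->next       = NULL;

    inputs = avfilter_inout_alloc();
    inputs->name       = av_strdup("out");
    inputs->filter_ctx = sink_ctx;
    inputs->pad_idx    = 0;
    inputs->next       = NULL;

    ret = avfilter_graph_parse_ptr(graph, descr, &inputs, &outputs, NULL);
    if (ret >= 0)
        ret = avfilter_graph_config(graph, NULL);

    avfilter_inout_free(&inputs);
    avfilter_inout_free(&outputs);
    return ret;
}

At runtime, push each decoded AVFrame into the matching source with av_buffersrc_add_frame_flags(src_ctx[i], frame, AV_BUFFERSRC_FLAG_KEEP_REF), then pull the combined frames with av_buffersink_get_frame(sink_ctx, out_frame) and hand them to your encoder/muxer exactly as you do today for the single-camera case.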