I am making a video animator that first generates .jpg images from an HTML canvas tag and then uses those images as frames for the video. I am using ffmpeg (through fluent-ffmpeg) to generate the video.
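For context, the frames end up on disk roughly like the sketch below. This is not my actual frame-writing code, just a minimal illustration that assumes each canvas frame reaches Node as a toDataURL("image/jpeg") string (how the data gets from the browser to Node isn't part of this question); the numbered file names turn out to matter later.
const fs = require('fs');

// Hypothetical helper: write one canvas frame (a "data:image/jpeg;base64,..."
// string) to a numbered .jpg file so ffmpeg can later read the files as a sequence.
function saveFrame(dataUrl, frameIndex) {
  const base64 = dataUrl.replace(/^data:image\/jpeg;base64,/, '');
  fs.writeFileSync(`./image${frameIndex}.jpg`, Buffer.from(base64, 'base64'));
}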
It works when using only 1 image
const ffmpeg = require('fluent-ffmpeg');
ffmpeg.setFfmpegPath("C://Program Files/ffmpeg/bin/ffmpeg.exe");
var command = ffmpeg();
command
.input("./test.jpg")
.outputFPS(1)
.on('end', () => {
console.log("done");
})
// save() starts the ffmpeg process, so it goes last
.save("./test.mp4");
With this code I got it to make an mp4 video that is 1 second long, but only with one image/frame.
What I'm trying to do (2+ images)
const ffmpeg = require('fluent-ffmpeg');
ffmpeg.setFfmpegPath("C://Program Files/ffmpeg/bin/ffmpeg.exe");
var command = ffmpeg();
command
.input("./test.jpg")
.input("./another.jpg")
.outputFPS(1)
// limit the output to 2 frames
.frames(2)
.on('end', () => {
console.log("done");
})
.save("./test.mp4");
This doesn't work: it generates the same video as the first example. Instead of showing the first image for one second and the second image for the next second, it produces a one-second video containing only the first image.
Can someone please show me how to fix this?
OK. After looking through the documentation, it seems that the only way to feed multiple images as a single input is to use an image file pattern; adding each image with its own .input() call gives ffmpeg several separate inputs, and by default it picks just one video stream for the output instead of concatenating them.
I just had to name the images in a pattern like
image0.jpg
image1.jpg
image2.jpg
Leading zeros (with a matching pattern such as %03d) are only needed if I zero-pad the file names; with a plain %d-style pattern, 10+ or 100+ images work without padding.
Now I can just do command.input("./image%1d.jpg").
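For completeness, here is a minimal sketch of the fixed command. The .inputFPS(1) input option is my addition (the pattern itself doesn't require it); it tells ffmpeg to read the image sequence at 1 frame per second so each image is shown for one second in the output:
const ffmpeg = require('fluent-ffmpeg');
ffmpeg.setFfmpegPath("C://Program Files/ffmpeg/bin/ffmpeg.exe");
var command = ffmpeg();
command
// one pattern input matches image0.jpg, image1.jpg, image2.jpg, ...
.input("./image%1d.jpg")
// read one source image per second (assumption: one second per image is still the goal)
.inputFPS(1)
.outputFPS(1)
.on('end', () => {
console.log("done");
})
.save("./test.mp4");
With three files named image0.jpg through image2.jpg, this should produce a three-second video with each image shown for one second.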
Hope this helps people in the future who run into the same problem I did.