Tags: javascript, html, canvas, video, ffmpeg-wasm

Is it possible to bind an HTML canvas to a JavaScript port of ffmpeg to get a video file as output, without a server?


First of all, I've found this repository, which hosts a WebAssembly/JavaScript port of ffmpeg: https://github.com/ffmpegwasm/ffmpeg.wasm

I'm curious whether I can somehow bind my canvas output and pass some frame times to get a video as output (for example, to visualize a physics object).

So far, I've set up a basic physics simulator in JS. I have a bunch of squares being rendered based on their x and y coordinates.

class PhysicsObject {
  // ...
  render(canvas, ctx) {
    ctx.fillStyle = this.color;
    ctx.fillRect(this.x - this.w / 2, this.y - this.h / 2, this.w, this.h);
  }
  // ...
}

let timer = performance.now();
// ...

function draw() {
  // ...
  let now = performance.now();
  const dt = (now - timer) / 1000; // elapsed time in seconds since the last frame
  timer = now;
  // ...

  for (let object of physicsObjects) {
    // ...
    object.update(dt);
    object.render(canvas, ctx);
    // ...
  }
  requestAnimationFrame(draw);
}

I now need a way to feed my canvas output and some other parameters into ffmpeg, but I have no idea where to even start.

If there is a way to bind the canvas output to the ffmpeg port, I'd like to dig deeper into the ffmpeg.wasm documentation.


Solution

  • A similar request, with a solution and example code, is answered here. The ffmpeg.wasm code in that answer looks a little old, but the basic technique should be what you are after.

    You record the canvas to a .webm video using MediaRecorder (MDN docs with example) and then (optionally) use ffmpeg.wasm to transcode the .webm video to .mp4. All of it is done in the browser.

    Another reference on how to record a canvas to a video file:
    How to record a canvas element
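    The recording step can be sketched roughly like this, entirely with standard browser APIs: `canvas.captureStream(fps)` produces a `MediaStream` sampled from the canvas, and a `MediaRecorder` collects it into `.webm` chunks. The frame rate (30), duration handling, and function name here are choices for illustration, not anything fixed by the question.

    ```javascript
    // Sketch: record a canvas element to a .webm Blob using MediaRecorder.
    // Runs in the browser; assumes `canvas` is the element your draw() loop paints.
    function recordCanvas(canvas, durationMs) {
      return new Promise((resolve) => {
        const stream = canvas.captureStream(30); // sample the canvas at ~30 fps
        const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
        const chunks = [];
        recorder.ondataavailable = (e) => chunks.push(e.data);
        // When recording stops, stitch the chunks into a single Blob.
        recorder.onstop = () => resolve(new Blob(chunks, { type: "video/webm" }));
        recorder.start();
        setTimeout(() => recorder.stop(), durationMs);
      });
    }
    ```

    Because the stream is captured live, the recording reflects whatever `requestAnimationFrame` actually renders, so the physics simulation runs (and is recorded) in real time.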
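    The optional transcoding step could then look something like the sketch below, assuming the ffmpeg.wasm UMD build has been loaded via a `<script>` tag so a global `FFmpeg` object is available (the 0.11.x-style `createFFmpeg`/`fetchFile` API); the filenames `in.webm` and `out.mp4` are arbitrary names in ffmpeg's in-memory filesystem.

    ```javascript
    // Sketch: transcode a recorded .webm Blob to .mp4 with ffmpeg.wasm.
    // Assumes a global `FFmpeg` from the UMD build of @ffmpeg/ffmpeg.
    async function transcodeToMp4(webmBlob) {
      const { createFFmpeg, fetchFile } = FFmpeg;
      const ffmpeg = createFFmpeg({ log: true });
      await ffmpeg.load(); // fetches and instantiates the wasm core
      // Write the recording into ffmpeg's virtual filesystem.
      ffmpeg.FS("writeFile", "in.webm", await fetchFile(webmBlob));
      // Re-encode the VP8/VP9 video stream as H.264 in an MP4 container.
      await ffmpeg.run("-i", "in.webm", "-c:v", "libx264", "out.mp4");
      const data = ffmpeg.FS("readFile", "out.mp4");
      return new Blob([data.buffer], { type: "video/mp4" });
    }
    ```

    The resulting Blob can be offered for download via `URL.createObjectURL`, so the whole pipeline, from simulation to `.mp4`, never leaves the browser.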