Wondering if anyone can help me out here!
I'm currently saving an HTML5 canvas as an MP4 file via the MediaRecorder API. My canvas doesn't contain any audio, but I need an audio channel built in anyway: a file with just H.264 and no audio codec isn't compatible with a piece of software I'm using.
Is there any way to force Safari to bake an audio codec into the stream even though no audio is being used in the canvas?
Essentially I'm trying to achieve the following: AAC, H.264
Rather than what I have right now: H.264
Here is what I have so far (minus some other details):
// setup media recording
const recordedChunks = [];
const stream = canvas.captureStream(60);
mediaRecorder = new MediaRecorder(stream);
mediaRecorder.ondataavailable = (e) => recordedChunks.push(e.data);

mediaRecorder.onstop = async (e) => {
  const download = (fileName, url) => {
    const a = document.createElement("a");
    document.body.appendChild(a);
    a.style = "display: none";
    a.href = url;
    a.download = fileName;
    a.click();
    window.URL.revokeObjectURL(url);
  };

  // download video
  const videoData = new Blob(recordedChunks, { type: "video/mp4" });
  download("1.mp4", URL.createObjectURL(videoData));
};
// start recording
mediaRecorder.start();
// do some canvas related operations
// ...
mediaRecorder.stop();
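One thing worth probing before recording (not something the post above does) is which container/codec combinations the browser will actually record, via MediaRecorder.isTypeSupported. A small sketch of that idea, with the support check injected as a parameter so the helper itself is browser-independent; the helper name and candidate list are my own, not from the post:

```javascript
// pickSupportedType: return the first MIME type the recorder supports.
// `isSupported` is passed in; in the browser you would pass
// (t) => MediaRecorder.isTypeSupported(t).
function pickSupportedType(candidates, isSupported) {
  for (const type of candidates) {
    if (isSupported(type)) return type;
  }
  return ""; // empty string lets MediaRecorder fall back to its default
}

// Candidates: MP4 with H.264 + AAC first, then broader fallbacks.
const candidates = [
  'video/mp4;codecs="avc1.42E01E,mp4a.40.2"',
  "video/mp4",
  "video/webm;codecs=vp9,opus",
  "video/webm",
];

// In the browser:
// const mimeType = pickSupportedType(candidates, (t) => MediaRecorder.isTypeSupported(t));
// mediaRecorder = new MediaRecorder(stream, mimeType ? { mimeType } : {});
```

Note that Safari is fairly restrictive about which of these it accepts, so checking at runtime beats hard-coding one string.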
I guess if there's no workaround here I might just resort to adding a silent audio channel to the video via FFmpeg.
UPDATE: The accepted answer didn't actually work for me, so I resorted to adding the audio channel through FFmpeg, which worked. Accepted anyway, as it does add an audio channel to the output file.
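For anyone taking the same FFmpeg route, a sketch of the kind of command that muxes a silent AAC track into an existing file, using FFmpeg's anullsrc audio source (the file names are placeholders, not from the post):

```shell
# Add a silent stereo AAC track to 1.mp4 without re-encoding the video.
# -shortest trims the generated silence to the video's duration.
ffmpeg -i 1.mp4 \
       -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 \
       -c:v copy -c:a aac -shortest 1-with-audio.mp4
```

Copying the video stream (`-c:v copy`) keeps this fast and avoids any quality loss from a second H.264 encode.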
Thank you!
I'm not familiar with codecs and such but you can add a silent audio channel to a video stream as follows:
const stream = canvas.captureStream(60);

// a 0 Hz oscillator produces a silent audio signal
const audioContext = new AudioContext();
const oscillator = audioContext.createOscillator();
oscillator.frequency.value = 0;

// route the oscillator into a MediaStream destination node
const streamAudioDestination = audioContext.createMediaStreamDestination();
oscillator.connect(streamAudioDestination);
oscillator.start();

// add the silent audio track to the canvas stream
const audioStream = streamAudioDestination.stream;
const audioTracks = audioStream.getAudioTracks();
const firstAudioTrack = audioTracks[0];
stream.addTrack(firstAudioTrack);

const mediaRecorder = new MediaRecorder(stream);
Note that initialization of the AudioContext should happen in response to a user action (e.g. within a click handler), otherwise the context may start suspended. Thank you @Nikola Lukic for noticing this!
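To make that concrete, the setup above can be deferred until a click like this; the function name, `button` element, and `onRecorderReady` callback are illustrative, not part of the original answer:

```javascript
// Wire the whole audio + recorder setup to a click, so the AudioContext
// is created inside a user gesture (required by Safari's autoplay policy).
function wireUpRecording(button, canvas, onRecorderReady) {
  button.addEventListener("click", () => {
    const stream = canvas.captureStream(60);

    // created inside the click handler, so the context starts running
    const audioContext = new AudioContext();
    const oscillator = audioContext.createOscillator();
    oscillator.frequency.value = 0; // silent signal

    const destination = audioContext.createMediaStreamDestination();
    oscillator.connect(destination);
    oscillator.start();

    stream.addTrack(destination.stream.getAudioTracks()[0]);

    const recorder = new MediaRecorder(stream);
    recorder.start();
    onRecorderReady(recorder);
  });
}
```

Everything that touches the AudioContext stays inside the handler, so no gesture-gated API runs at page load.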