My web application allows users to dynamically create both video and audio recordings from the same page, often switching between the two.
Each recording action has its own instantiation and logic, including its own MediaStream.
The problem occurs when I initialize an audio stream and then try to initialize a video stream. The recording itself works (both video and audio can be played back once recording is done), but the live video stream shows up blank.
This does not happen when I instantiate a video stream without first instantiating an audio stream: video works fine until I record audio and then come back to record video.
I've confirmed that the video stream has both audio and video tracks when it's instantiated. I cannot figure out why the video track appears empty.
What am I missing?
Setup to record video using the basic getUserMedia API:
// Start stream
var constraints = {
audio: true,
video: true
};
var mediaStream;
navigator.mediaDevices.getUserMedia(constraints)
  .then(function (stream) {
    mediaStream = stream; // keep a reference so the tracks can be stopped later
    // do stuff with stream
  });
// Kill stream, both audio and video
mediaStream.getTracks().forEach(function (track) { track.stop(); });
Setup to record audio. I am using RecorderJS for recording:
navigator.getUserMedia({ audio: true }, function (stream) {
  that.audio_stream = stream; // keep a reference so the track can be stopped later
  var input = that.audio_context.createMediaStreamSource(stream);
  // Initialize the Recorder Library
  that.recorder = new Recorder(input);
  that.recorder.record();
}, function (err) {
  console.error('getUserMedia failed:', err);
});
// Stop recording
this.recorder.stop();
this.audio_stream.getTracks()[0].stop();
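For completeness, the same audio setup can be expressed with the promise-based navigator.mediaDevices.getUserMedia instead of the deprecated callback form. This is only a sketch: the wrapper function names are mine, and Recorder is the RecorderJS constructor used above.

```javascript
// Sketch: promise-based equivalent of the callback-style audio setup above.
// Returns a promise resolving to the recorder plus the stream, so the
// stream's tracks can be stopped later.
function startAudioRecording(audioContext) {
  return navigator.mediaDevices.getUserMedia({ audio: true })
    .then(function (stream) {
      var input = audioContext.createMediaStreamSource(stream);
      var recorder = new Recorder(input); // RecorderJS, as above
      recorder.record();
      return { recorder: recorder, stream: stream };
    });
}

// Stop the recorder and release the microphone by stopping every track.
function stopAudioRecording(state) {
  state.recorder.stop();
  state.stream.getTracks().forEach(function (track) { track.stop(); });
}
```

Stopping every track (not just index 0) releases the microphone regardless of how many tracks the stream carries.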
It turns out my problem was NOT with WebRTC or the media streams, but with how I was serving the stream to the <video> element. I got rid of the black stream by disposing of and recreating the video element each time a stream starts, rather than just reassigning its source to the new stream.
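A minimal sketch of that fix, assuming a wrapper element around the player. The helper name and structure here are illustrative, not the exact code from my app.

```javascript
// Illustrative helper: instead of reusing one <video> element and
// reassigning its source, dispose of the old element and build a
// fresh one every time a new MediaStream starts.
function attachStreamToFreshVideo(container, mediaStream) {
  // Dispose of the previous <video> element, if one exists
  var oldVideo = container.querySelector('video');
  if (oldVideo) {
    oldVideo.srcObject = null; // release the old stream reference
    container.removeChild(oldVideo);
  }
  // Recreate the element and hand it the new stream
  var video = document.createElement('video');
  video.autoplay = true;
  video.muted = true; // mute local playback to avoid audio feedback
  video.srcObject = mediaStream;
  container.appendChild(video);
  return video;
}
```

Called from the getUserMedia success handler, this guarantees the element never carries state left over from a previous stream.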