When a video is played during a WebRTC call, the local stream goes mute without the track actually being muted (the audio track's enabled property is still true). It does not happen when the 'muted' prop of the Video component is set to true, or when the video starts playing before the call is started, but after muting or unmuting the video during the call the microphone still disconnects. The Video component must be accessing the microphone. Do you know how to fix it? Happens on iOS 14, Xcode 12.2.
I fixed it thanks to this post. It was a problem with the AVAudioSession configuration in ios/Video/RCTVideo.m. It turns out that the default AVFoundation settings allow the use of either the microphone or the speaker (one at a time), so mounting the Video component, or performing any action that establishes an AVAudioSession, interferes with other components using the microphone.
I changed this part of the code:
- (void)setPaused:(BOOL)paused
{
  if (paused) {
    [_player pause];
    [_player setRate:0.0];
  } else {
    if ([_ignoreSilentSwitch isEqualToString:@"ignore"]) {
      // [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil]; // OLD
      // NEW: PlayAndRecord keeps the microphone available to the call while the video plays.
      [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    } else if ([_ignoreSilentSwitch isEqualToString:@"obey"]) {
      [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:nil];
    }
    [_player play];
    [_player setRate:_rate];
  }
  _paused = paused;
}
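For some intuition about why this works: WebRTC's audio capture on iOS generally runs under AVAudioSessionCategoryPlayAndRecord, so keeping video playback in that same category means starting the video no longer switches the shared session into a playback-only mode that drops the microphone. The 'obey' branch still uses AVAudioSessionCategoryAmbient, which is why the change does not help there.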
This fix will only work if the ignoreSilentSwitch prop of the Video component is set to 'ignore', as shown in the example below.
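For reference, here is a minimal sketch of how the component might be configured on the JavaScript side so the patched code path is actually taken. The component name, video source, and surrounding state are assumptions for illustration; only the ignoreSilentSwitch="ignore" prop is what the fix relies on.

// Minimal sketch (assumed setup): render react-native-video with
// ignoreSilentSwitch="ignore" so the PlayAndRecord branch above runs.
import React, { useState } from 'react';
import Video from 'react-native-video';

export function CallOverlayVideo() {
  const [paused, setPaused] = useState(false);

  return (
    <Video
      // Hypothetical source; replace with your own asset or URL.
      source={{ uri: 'https://example.com/clip.mp4' }}
      paused={paused}
      // Required for the fix: this routes playback through the
      // AVAudioSessionCategoryPlayAndRecord branch in setPaused:.
      ignoreSilentSwitch="ignore"
      onEnd={() => setPaused(true)}
      style={{ width: '100%', height: 200 }}
    />
  );
}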