ios · swift · webrtc · janus-gateway · webrtc-ios

Unable to render remote video with WebRTC


I have been unable to render a remote video using WebRTC. For context, I'm using Janus's streaming plugin.

I'm following what I've read so far. Whenever peerConnection(_ peerConnection:, didAdd stream:) is called on my RTCPeerConnectionDelegate, I create a remote renderer and add it to the first video track of the stream the delegate provided, like this:

#if arch(arm64)
// Metal-backed renderer on arm64 devices
let remoteRenderer = RTCMTLVideoView(frame: self.view.frame)
remoteRenderer.videoContentMode = .scaleAspectFill
#else
// OpenGL ES fallback (e.g. on the simulator)
let remoteRenderer = RTCEAGLVideoView(frame: self.view.frame)
#endif

stream.videoTracks.first?.add(remoteRenderer)
self.view.addSubview(remoteRenderer)

But the video does not show; I only get a black screen.

My delegate has also received peerConnection(_ peerConnection:, didChange newState:) with a newState of RTCIceConnectionState.connected, which makes me think the connection itself is fine.
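
For completeness, the state-change callback I'm observing looks roughly like this (a minimal sketch; the print is only there for logging):

func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceConnectionState) {
    // Fires as ICE negotiation progresses; I see .connected here.
    print("ICE connection state changed: \(newState)")
}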


Solution

  • Try attaching the renderer when you receive the peerConnection(_:didStartReceivingOn:) delegate callback:

    func peerConnection(_ peerConnection: RTCPeerConnection, didStartReceivingOn transceiver: RTCRtpTransceiver) {
        switch transceiver.mediaType {
        case .video:
            DispatchQueue.main.async { [weak self] in
                // Keep a reference to the incoming video track and attach the renderer on the main thread.
                self?.remoteVideoTrack = transceiver.receiver.track as? RTCVideoTrack
                if let renderer = self?.delegate?.viewForRemoteVideoTrack() {
                    self?.remoteVideoTrack?.add(renderer)
                }
            }
        default:
            break
        }
    }
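
    Note that viewForRemoteVideoTrack() is not a WebRTC API; it is just a hook back to whatever view should render the incoming track, implemented on whichever object acts as that delegate (assumed here to be the view controller). A minimal sketch of what it could return, reusing the renderer setup from the question:

    func viewForRemoteVideoTrack() -> RTCVideoRenderer {
        #if arch(arm64)
        // Metal-backed renderer on arm64 devices
        let remoteRenderer = RTCMTLVideoView(frame: self.view.frame)
        remoteRenderer.videoContentMode = .scaleAspectFill
        #else
        // OpenGL ES fallback (e.g. on the simulator)
        let remoteRenderer = RTCEAGLVideoView(frame: self.view.frame)
        #endif

        // Adding the renderer as a subview keeps it alive and on screen.
        self.view.addSubview(remoteRenderer)
        return remoteRenderer
    }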