python, reactjs, webrtc, video-streaming, django-channels

Video chat via WebRTC does not work in either Firefox or Chrome


I'm a bit new at this. I've spent about four straight days on this and I still cannot get it to work, so forgive me if I'm missing anything, as I'm a bit exhausted, but I'll do my best to explain.

Goal:

As soon as the second caller enters the room, the call is initiated. For simplicity, the username and room are in the URL. I am trying to set up a WebRTC mesh network. I understand the drawbacks of this topology, but performance is not my concern; I want to be able to implement each style, and I will address SFU and MCU later. Also, the code isn't 100% bug free, but it is simply a rough sketch.

As the ICE candidates "trickle" in, I try to defer adding them if a remote description has not been set yet. Judging by the (many) console logs, the code does eventually add all of the candidates, as I understand it should. Each call to setLocalDescription produces about 40 candidates, so roughly 80 in total; there will therefore be about 80 messages over the socket, and each candidate should be added to a single connection, which seems to be happening. In the end, I cannot tell whether the application is failing because of the code or because of the server. Any help is appreciated; thank you in advance.
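
For context, the defer-and-flush idea looks roughly like this (a minimal sketch with illustrative names such as `pending` and `onSignal`, assuming a single peer connection; it is not the exact component code below):

```javascript
// Sketch only: buffer candidates until the remote description exists, then flush.
const pending = []

function onSignal(data, peerConnection) {
    if (data.webrtc_ice_candidate) {
        const candidate = new RTCIceCandidate({
            sdpMLineIndex: data.webrtc_ice_candidate.label,
            candidate: data.webrtc_ice_candidate.candidate,
        })
        if (!peerConnection.remoteDescription) {
            pending.push(candidate) // too early: park it for later
        } else {
            peerConnection.addIceCandidate(candidate).catch(console.error)
        }
    } else if (data.webrtc_answer) {
        peerConnection
            .setRemoteDescription({ type: 'answer', sdp: data.webrtc_answer })
            .then(() => {
                // remote description is now set: flush everything that arrived early
                pending.splice(0).forEach(c =>
                    peerConnection.addIceCandidate(c).catch(console.error))
            })
    }
}
```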


STACK:

Django backend using Django Channels for WebSockets

Coturn (running on Linux)

React frontend

What is happening:

I want to test that I have a good configuration, of course. Here are the screenshots from both browsers. Both produce questionable results, although they seem to "gather" candidates OK, and each test has its own unique response.

CHROME: The response returns a 701, although it shows Done for srflx and relay. I think "relay" is what I want, because it is a TURN server, which I have.

FIREFOX

The response says the server is not reachable?

I have opened both ports 3478 and 5439 for TCP and UDP


CommunicationScreen.js

import React, { useState, useEffect } from 'react'
import useWebSocket from "react-use-websocket"

import {Row, Col, Button } from "react-bootstrap"




function RemoteVideo({stream}){
    const vidRef = React.useRef(null)
    useEffect(()=>{
        if (!vidRef.current){
            vidRef.current.srcObject = stream
            vidRef.current.play()
        }
    },[vidRef])

    return(<div>
        <video ref={vidRef} src={vidRef.current} autoPlay />

    </div>)
}

function MyVideo(props){

    const { username } = props

    const mediaConstraints = {
        audio:true,
        video: { width: 600, height: 350 }
    }

    const videoRef = React.useRef(null) 

    const startCamera = async ()=>{
        let stream = await navigator.mediaDevices.getUserMedia(mediaConstraints)
        videoRef.current.srcObject = stream
        console.log('my stream', stream)
        props.setLocalStream(stream)
    
    }
    
    useEffect(()=>{
        startCamera()
    }, [])

    return(
        <div>
            <h2>User {username}</h2>
            <video ref={videoRef} src={videoRef.current} autoPlay/>
            
        </div>)
}

const MyMemoVideo = React.memo(MyVideo)
const RemoteMemoVideo = React.memo(RemoteVideo)

export default function CommunicationScreen({history, match}) {
    const iceServers = {
        iceServers: [
            {
                urls: 'turn:picksixcity.com:3478',
                username: <username>, // these don't matter, as turnserver.conf is set to no-auth for testing
                credential: <password>
            }
        ]
    }
    const { roomName, username } = match.params 

    const { sendMessage, lastMessage, readyState } = useWebSocket('ws://localhost:8000/ws/' + roomName + '/'+username+'/')
    const [message, setMessage] = useState("")
    const [chat, setChat] = useState("")

    const [peers, setPeers] = useState([])
    const [pendingCandidates, setPendingCandidates] = useState([])
    const [socketId, setSocketId] = useState(null)
    const [localStream, setLocalStream] = useState()

    const [remoteStreams, setRemoteStreams] = useState([])
    let peerId;
    

    const setMyLocalStreamCallback = React.useCallback((stream)=>{
        // Prevents re-creating this function on each render,
        // for the memoized video component
        setLocalStream(stream)
    },[])

    const createPeerConnection = React.useCallback(()=> {
        return new RTCPeerConnection(iceServers)
    }, [])

    function addPendingCandidates(peerId){
        console.log('adding pendingCandidates to connection', peerId)
        pendingCandidates.forEach(candidate=>{ // for each pending candidate, find the matching peer
            let foundPeer = peers.find(peer=>peer.id === peerId)
            console.log('adding candidate from pending')
            foundPeer.connection.addIceCandidate(candidate).catch(err=>console.log('Err adding candi', err))
        })
        console.log('completed adding candidates')
    }

    function onIceCandidate(e, peerId){ // send ICE candidate
        if (e.candidate) {
            sendMessage(JSON.stringify({
                'webrtc_ice_candidate': {
                    room: roomName,
                    label: e.candidate.sdpMLineIndex,
                    candidate: e.candidate.candidate,
                    // peerId:peerId
                },
            }))
        }
    }

    function submitMessage(){
        sendMessage(JSON.stringify({'message':message}))
        setMessage("")
    }

    function addRemoteStream(e){
        // Called when tracks are added
        console.log('remote stream object', e)
        let newStream = e.streams[0]
        setRemoteStreams(streams=>[
            ...streams.filter(stream=> stream.id !== newStream.id),
            newStream])
    }

    function addLocalTracks(rtcPeerConnection){
        // Add local tracks, media Track and Audio track
        // Tracks can be added to PeerConnection object before connecting to 
        // a Peer
        localStream.getTracks().forEach(track=>
            rtcPeerConnection.addTrack(track, localStream))
    }

    async function createOffer(newPeer, myPeerConnection){
        // Send an SDP offer for this peer connection, via the signalling
        // socket, to everyone in the room. Setting the local description
        // kicks off ICE gathering, which invokes onIceCandidate.
        console.log('Creating offer')
        let sessionDescription;
        try{
            sessionDescription = await myPeerConnection.createOffer()
            await myPeerConnection.setLocalDescription(sessionDescription)
        }catch(error){
            console.log('Error creating offer', error)
        }
        sendMessage(JSON.stringify({"webrtc_offer" : { sdpInfo:sessionDescription, room:roomName, user:username, peerId:newPeer } }))
        console.log('offer sent')
    }


    async function createAnswer(newPeer, myPeerConnection){
        let sessionDescription;
        try{
            sessionDescription = await myPeerConnection.createAnswer()
            await myPeerConnection.setLocalDescription(sessionDescription) // starts ICE candidate gathering (onicecandidate)
        }catch(error){
            console.log('Error creating answer', error)
            console.log('session description ', sessionDescription)
        }
        sendMessage(JSON.stringify({"webrtc_answer": { sdpInfo: sessionDescription, room: roomName, peerId:newPeer } }))
    }


    const handleLastMessage = React.useCallback((messageDict)=>{
        // Handles the last WebSocket message received (could be anything).
        // messageDict is the message event object returned by react-use-websocket
        // (it wraps onmessage); we parse its .data and read our own properties.
        let data = JSON.parse(messageDict.data)
        if (data.message){
            setChat(chat + "\n" + data.message)
        }else if (data.connection_made){
            // addLocalTracks(connection)
            setSocketId(data.connection_made.peerId)
            addPendingCandidates(data.connection_made.peerId)
            let {users} = data.connection_made
            if (users >= 2){
                // start call
                if (localStream){
                    sendMessage(JSON.stringify({ 'start_call': true }))
                }
            }
        }else if(data.start_call){
            // When the 2nd person joins the room, everyone except the newly
            // joined user is signalled to automatically start the call process
            let newPeer = data.start_call.skip
            if (newPeer !== socketId){
                // Once a user has joined, the existing user invokes an offer
                let peerConnection = createPeerConnection()
                peerConnection.addEventListener('icecandidateerror', e => console.log('candidate error', e))
                addLocalTracks(peerConnection) // side note: negotiationneeded might fire here, but we don't use it
                peerConnection.ontrack = addRemoteStream
                peerConnection.onicecandidate = e => onIceCandidate(e, newPeer)

                console.log('offer made')
                createOffer(newPeer, peerConnection)
                    .then(res => {
                        setPeers(peers => ([
                            ...peers.filter(peer => peer.id !== newPeer),
                            {
                                id: newPeer,
                                connection: peerConnection
                            }
                        ]))
                        addPendingCandidates(newPeer)
                    })
            }
        }  else if(data.webrtc_ice_candidate){
            // console.log('this is peer', data.peerId)
            let icePeerId = data.peerId
            let found = false;
            let add = false;
            peers.forEach(peer=>{
                // if (icePeerId === peer.id){
                    found = true;
                    // Offers and answers need to be set on connection 
                    // for this to work 
                    if (!peer.connection.remoteDescription){
                        console.log('no remote description yet, deferring candidate')
                        if(!add){
                            setPendingCandidates(candidates=>[...candidates, data.webrtc_ice_candidate])
                            add = true; // keep iterating, but only add this candidate to the pending list once
                        }
                    }else{
                        console.log('attach icecandidate to connection')
                        peer.connection.addIceCandidate(new RTCIceCandidate({
                            sdpMLineIndex: data.webrtc_ice_candidate.label,
                            candidate: data.webrtc_ice_candidate.candidate
                        }))
                    }
                // }
            })

            
        }else if (data.webrtc_offer){
            // the offer is sent to individual channels;
            // create a new connection for each offer received
            console.log('got an offer')
            peerId  = data.peerId
            let newPeerConnection = createPeerConnection()
            newPeerConnection.addEventListener('icecandidateerror', e => console.log('candidate error', e))
            newPeerConnection.ontrack = addRemoteStream

            newPeerConnection.onicecandidate = e => onIceCandidate(e, peerId)

            newPeerConnection.setRemoteDescription({type:"offer", sdp:data.webrtc_offer})
            .then(res=>{
                addLocalTracks(newPeerConnection)
                createAnswer(peerId, newPeerConnection) // answer over this connection
            }).catch(err=>console.log('Error handling offer', err))
        
            setPeers(peers=>([...peers, 
                { 
                    id:peerId,
                    connection:newPeerConnection}]))
            addPendingCandidates(peerId)

        }else if (data.webrtc_answer){
            let setPeer = peers.find(peer=>peer.id === data.peerId)
            setPeer.connection.setRemoteDescription({ type: "answer", sdp:data.webrtc_answer})
        }      
    }, [lastMessage, localStream])



    useEffect(()=>{
    
            if (lastMessage){
                handleLastMessage(lastMessage)
            }
        
    }, [lastMessage, handleLastMessage])


    return (
        <div>
            <h2>{roomName}</h2>
            <Row>
                <Col>
                    <MyMemoVideo setLocalStream={setMyLocalStreamCallback}  username={username} /> </Col>
                <Col>
                {!console.log(remoteStreams) && remoteStreams.map((stream, ix)=>(
                    <RemoteMemoVideo key={ix} stream={stream} />
                ))}</Col>
                <div style={{ textAlign: "center", margin: "0 auto", width: "100%" }}>
                    <input onChange={e => setMessage(e.target.value)} value={message} />
                    <Button onClick={submitMessage}>Submit</Button>
                </div>
                <Col>
                </Col>
                <Col>
                    <textarea value={chat} onChange={() => { }} rows="10" cols="50"/>
                </Col>
            </Row>
            
        </div>
    )
}



consumers.py

import json
from collections import defaultdict


from channels.generic.websocket import AsyncWebsocketConsumer
from channels.layers import get_channel_layer



room_count = defaultdict(int)
peers = {}  # channel_name -> placeholder, one entry per connected peer


class ChatConsumer(AsyncWebsocketConsumer):
    
    async def connect(self):
        self.room_name = self.scope['url_route']['kwargs']['room_name']
        self.user_name = self.scope['url_route']['kwargs']['user_name']
        self.room_group_name = 'chat_%s' % self.room_name

        # once a connection is made, join the room group via the channel layer
        # (async_to_sync, if used, turns the channel-layer method into a plain callable)
        await self.channel_layer.group_add(
            self.room_group_name,
            self.channel_name
        )


        room_count[self.room_group_name] += 1
        # tell the new client the connection was made, how many users are in
        # the room, and which channel name identifies this peer
        peerId = self.channel_name
        peers[peerId] = ''
        await self.send(text_data=json.dumps({
            "connection_made": {
                "users": room_count[self.room_group_name],
                "peerId": peerId
            }
        }))

    async def disconnect(self, close_code): # runs anytime frontend loses connection
        room_count[self.room_group_name] -=1
        # make room leave group
        await self.channel_layer.group_discard(
            self.room_group_name,
            self.channel_name
        )


    async def receive(self, text_data):
        print(text_data)
        text_data_json = json.loads(text_data)
        message = text_data_json.get('message')
        start_call = text_data_json.get('start_call')
        webrtc_ice_candidate = text_data_json.get('webrtc_ice_candidate')
        webrtc_offer = text_data_json.get('webrtc_offer')
        webrtc_answer = text_data_json.get('webrtc_answer')

        if message:

            # broadcast a message to the group
            await self.channel_layer.group_send(
                self.room_group_name,
                {
                    'type': 'chat_message', # mandatory: names the handler method below, which is invoked
                                            # on every consumer in the group that receives this message
                    'message':message
                }
            )
        elif start_call:
            await self.channel_layer.group_send(
                self.room_group_name,
                {
                    'type': 'start_call',
                    'skip': self.channel_name
                }
            )

        elif webrtc_ice_candidate: 
            # broadcast to all peers that there is a new candidate
            await self.channel_layer.group_send(
                self.room_group_name,
                {
                    'type': 'webrtc_ice_candidate',
                    'candidate': webrtc_ice_candidate,
                    'peerId':webrtc_ice_candidate.get('peerId')
                }
            )

        elif webrtc_offer: # user 1's channel; peerId identifies user 2
            peer_to_send = webrtc_offer.get('peerId')
            if peer_to_send: # send the offer directly to that peer, attaching the caller's channel name as peerId
                channel_layer = get_channel_layer()
                await channel_layer.send(peer_to_send,
                {
                    "type":"single_offer",
                    'webrtc_offer':webrtc_offer['sdpInfo']['sdp'], "peerId":self.channel_name})
            else:
                await self.channel_layer.group_send(
                    self.room_group_name,
                    {
                        'type': 'webrtc_offer',
                        'webrtc_offer': webrtc_offer['sdpInfo']['sdp']
                    }
                )            
        elif webrtc_answer:
            peer_to_send = webrtc_answer.get('peerId')
            if peer_to_send:
                channel_layer = get_channel_layer()
                
                await channel_layer.send(peer_to_send,
                {
                    'type':"single_answer", 
                    'webrtc_answer':webrtc_answer['sdpInfo']['sdp'], "peerId":self.channel_name})

            else:
                await self.channel_layer.group_send(
                self.room_group_name,
                {
                    'type': 'webrtc_answer',
                    'webrtc_answer': webrtc_answer['sdpInfo']['sdp']
                }
            )               

    async def start_call(self, event):
        # Broadcast to the entire room that the call is starting
        await self.send(text_data=json.dumps({
            "start_call" : {'skip':event['skip']}
        }))
               
        
    async def chat_message(self, event):
        # Handler for the 'chat_message' event type sent via group_send; every
        # consumer instance has this method, so it acts like a 'listener' for the group
        message = event['message']

        await self.send(text_data=json.dumps({
            'message':message
        }))

    async def webrtc_ice_candidate(self, event):
        # broadcast the newly arrived candidate to every channel in the group
        candidate = event['candidate']
        peerId = event['peerId']
        await self.send(text_data=json.dumps({
            'webrtc_ice_candidate':candidate,
        }))


    async def single_offer(self, event):
        print('single offer')
        webrtc_offer = event['webrtc_offer']
        await self.send(text_data=json.dumps({'webrtc_offer':webrtc_offer,'peerId':event['peerId']}))

    async def webrtc_offer(self, event):
        print('group offer')
        sdp = event['webrtc_offer']
        await self.send(text_data=json.dumps({
            'webrtc_offer': sdp
        }))

    async def webrtc_answer(self, event):
        sdp = event['webrtc_answer']
        await self.send(text_data=json.dumps({
            'webrtc_answer': sdp
        })) 

    async def single_answer(self, event):
        webrtc_answer = event['webrtc_answer']
        await self.send(text_data=json.dumps({'webrtc_answer':webrtc_answer, 'peerId':event['peerId']}))

---

Final Result
----

**Firefox** 
displays "there has been an ICE error"


**Chrome**
The icecandidateerror callback fires and reports that the host does not resolve.

In both cases, one can see **remoteStreams** containing a MediaStream object.

Server logs: I'm not sure where each browser's traffic starts and ends (I can re-test if necessary), but the point is that they do seem to be hitting the server, so the host is 'resolving', no?

...

18889: session 000000000000000035: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
18889: session 000000000000000034: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
18890: session 000000000000000036: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
18891: handle_udp_packet: New UDP endpoint: local addr 142.11.199.8:3478, remote addr 68.80.137.255:63401
18891: IPv4. Local relay addr: 142.11.199.8:63707
18891: session 000000000000000037: new, realm=<picksixcity.com>, username=<>, lifetime=3600
18891: session 000000000000000037: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
...
18894: session 000000000000000034: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
18895: session 000000000000000037: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
18897: session 000000000000000037: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
18902: session 000000000000000037: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19507: handle_udp_packet: New UDP endpoint: local addr 142.11.199.8:3478, remote addr 68.80.137.255:55567
19507: session 000000000000000038: realm <picksixcity.com> user <>: incoming packet BINDING processed, success
19507: IPv4. Local relay addr: 142.11.199.8:56232
19507: session 000000000000000038: new, realm=<picksixcity.com>, username=<>, lifetime=600
19507: session 000000000000000038: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19507: session 000000000000000038: refreshed, realm=<picksixcity.com>, username=<>, lifetime=0
19507: session 000000000000000038: realm <picksixcity.com> user <>: incoming packet REFRESH processed, success
19508: session 000000000000000038: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:55567, reason: allocation timeout
19508: session 000000000000000038: delete: realm=<picksixcity.com>, username=<>
19589: handle_udp_packet: New UDP endpoint: local addr 142.11.199.8:3478, remote addr 68.80.137.255:53749
19589: IPv4. Local relay addr: 142.11.199.8:58705
19589: session 000000000000000039: new, realm=<picksixcity.com>, username=<>, lifetime=3600
19589: session 000000000000000039: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19589: handle_udp_packet: New UDP endpoint: local addr 142.11.199.8:3478, remote addr 68.80.137.255:53751
19589: IPv4. Local relay addr: 142.11.199.8:60027
19589: session 000000000000000040: new, realm=<picksixcity.com>, username=<>, lifetime=3600
19589: session 000000000000000040: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
...
19591: session 000000000000000039: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19591: session 000000000000000040: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19592: session 000000000000000039: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19592: session 000000000000000040: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19596: session 000000000000000039: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19596: session 000000000000000040: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19626: handle_udp_packet: New UDP endpoint: local addr 142.11.199.8:3478, remote addr 68.80.137.255:65336
19626: IPv4. Local relay addr: 142.11.199.8:59891
19626: session 001000000000000025: new, realm=<picksixcity.com>, username=<>, lifetime=3600
19626: session 001000000000000025: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19626: handle_udp_packet: New UDP endpoint: local addr 142.11.199.8:3478, remote addr 68.80.137.255:60802
19626: IPv4. Local relay addr: 142.11.199.8:54372
19626: session 000000000000000041: new, realm=<picksixcity.com>, username=<>, lifetime=3600
19626: session 000000000000000041: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19626: session 001000000000000025: realm <picksixcity.com> user <>: 
...

19626: session 001000000000000025: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
...
19716: handle_udp_packet: New UDP endpoint: local addr 142.11.199.8:3478, remote addr 68.80.137.255:50814
19716: session 000000000000000042: realm <picksixcity.com> user <>: incoming packet BINDING processed, success
19716: IPv4. Local relay addr: 142.11.199.8:64560
19716: session 000000000000000042: new, realm=<picksixcity.com>, username=<>, lifetime=600
19716: session 000000000000000042: realm <picksixcity.com> user <>: incoming packet ALLOCATE processed, success
19716: session 000000000000000042: refreshed, realm=<picksixcity.com>, username=<>, lifetime=0
19716: session 000000000000000042: realm <picksixcity.com> user <>: incoming packet REFRESH processed, success
19717: session 000000000000000042: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:50814, reason: allocation timeout
19717: session 000000000000000042: delete: realm=<picksixcity.com>, username=<>
20814: session 000000000000000029: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:54339, reason: allocation timeout
20814: session 000000000000000029: delete: realm=<picksixcity.com>, username=<>
20814: session 000000000000000030: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:54347, reason: allocation timeout
20814: session 000000000000000030: delete: realm=<picksixcity.com>, username=<>
20815: session 000000000000000031: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:54377, reason: allocation timeout
20815: session 000000000000000031: delete: realm=<picksixcity.com>, username=<>
20815: session 001000000000000019: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:54357, reason: allocation timeout
20815: session 001000000000000019: delete: realm=<picksixcity.com>, username=<>
20815: session 000000000000000032: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:54367, reason: allocation timeout
20815: session 000000000000000032: delete: realm=<picksixcity.com>, username=<>
21237: IPv4. tcp or tls connected to: 34.230.34.4:29009
21237: HTTPS connection has been disabled due Vulnerability in the Web interface !!!
...
21463: session 001000000000000021: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:60122, reason: allocation timeout
21463: session 001000000000000021: delete: realm=<picksixcity.com>, username=<>
21463: session 000000000000000033: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:53251, reason: allocation timeout
21463: session 000000000000000033: delete: realm=<picksixcity.com>, username=<>
21464: session 001000000000000022: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:53261, reason: allocation timeout
21464: session 001000000000000022: delete: realm=<picksixcity.com>, username=<>
21486: session 001000000000000023: closed (2nd stage), user <> realm <picksixcity.com> origin <>, local 142.11.199.8:3478, remote 68.80.137.255:60417, reason: allocation timeout
21486: session 001000000000000023: delete: realm=<picksixcity.com>, username=<>

Solution

  • Wow. It turns out I was much closer than I thought. First off, I went back and re-checked someone else's "simple example": https://github.com/shanet/WebRTC-Example. Surprisingly, even with deprecated code, that example worked flawlessly.

    ICE candidates

    "You dont want our own" Meaning the channel to signal the call to start sending iceCandidates should not also be recieving that SAME signal. So depending the backend, you can omit the channel that should not recieve it. So whether or not filtering is done on the back/frontend, to implement this, I sent the ID of the channel that created the offer/answer, along with the ICE candidates and made sure to disregard those messages 'ice_candidate' coming back to the same channel. If channel A sends offer, chaneel A disregards messages return after invoking (setLocalDescription).

    RemoteVideo

    With React, I think the problem was the way I was using useEffect in the main component (CommunicationScreen), combined with using RemoteVideo as a memoized component. I removed the memoization and added remoteStreams to the dependencies of the useEffect in the main component. I also used plain vanilla JS to add the remote videos, just to verify they were actually working, and then "worked my way back" (a non-memoized version is sketched after this answer).

    WEBRTC

    The best way for me to test whether the configuration is correct is with known working code. I took the link above, simply added my own TURN URLs, and voilà! I had read all over that you can get 701 errors, and that if you attach a callback to 'onicecandidateerror' it can and will fire in Chrome, but that you should ignore it. Ignoring it did not make sense to me until I ran the code: even with a good, working connection, my console still displays 701 errors and onicecandidateerror still fires (I added a callback to verify). A handler that treats 701 as noise is sketched after this answer.

    My best advice, if you are trying this with your own TURN server, is this: once the candidates seem to gather on the *Trickle ICE* test site (that step could probably even be skipped), just take someone's working example and drop in your own TURN server configuration. That is a better indicator of whether it works or not. I spent too much time trying to get a more "thorough" understanding of why I was getting mixed responses in Chrome and Firefox with the Trickle ICE tool.
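
As a rough sketch of the ICE candidate filtering described above, assuming the signalling server attaches the sending channel's ID to every candidate message (the `senderId` field and `myChannelId` argument below are illustrative names, not part of the original code), the receiving side can drop its own echoes before touching a peer connection:

```javascript
// Sketch only: ignore ICE candidate messages that originated from this browser.
function handleIceCandidateMessage(data, myChannelId, peers) {
    if (data.senderId === myChannelId) {
        return // this candidate came from us; ignore the echo
    }
    const peer = peers.find(p => p.id === data.senderId)
    if (peer && peer.connection.remoteDescription) {
        peer.connection.addIceCandidate(new RTCIceCandidate({
            sdpMLineIndex: data.webrtc_ice_candidate.label,
            candidate: data.webrtc_ice_candidate.candidate,
        })).catch(err => console.log('Error adding candidate', err))
    }
}
```

The same filtering could equally live in the consumer, by comparing the incoming channel against self.channel_name before sending the candidate back out.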
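
A non-memoized RemoteVideo along the lines of the fix described above might look roughly like this (again just a sketch, keyed on the stream rather than the ref, not the exact code I ended up with):

```javascript
// Sketch only: plain component, re-attached whenever the stream prop changes.
function RemoteVideo({ stream }) {
    const vidRef = React.useRef(null)

    React.useEffect(() => {
        if (vidRef.current && stream) {
            vidRef.current.srcObject = stream // attach the MediaStream directly
        }
    }, [stream]) // depend on the stream, not the ref

    return <video ref={vidRef} autoPlay playsInline />
}

// Rendered without React.memo, keyed by the stream id:
// {remoteStreams.map(stream => <RemoteVideo key={stream.id} stream={stream} />)}
```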
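
Finally, for the 701 errors: one option (again just a sketch) is to keep the icecandidateerror callback but treat 701 as noise, since, as noted above, it can fire even when the connection ends up working:

```javascript
peerConnection.onicecandidateerror = (e) => {
    if (e.errorCode === 701) {
        return // seen even on working connections; safe to ignore here
    }
    console.warn('ICE candidate error', e.errorCode, e.errorText, e.url)
}
```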