Tags: node.js, firebase, webrtc

Online WebRTC audio stream server configuration


I'd like to build a simple online WebRTC application where:

  • User A connects to the webpage, grants access to their microphone and starts streaming (see the capture sketch after this list)
  • User B connects from another tab, receives the audio stream and is able to hear it.

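For the broadcasting side (User A), a minimal capture sketch might look like the following, assuming a plain RTCPeerConnection in the browser; the Google STUN URL is just a commonly used public server, and the offer/answer signaling is handled separately (see below):

```typescript
// User A (broadcaster): capture the microphone and attach it to a peer connection.
// Sketch only; the offer/answer and ICE exchange still has to happen via signaling.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }], // public STUN server
});

async function startBroadcast(): Promise<void> {
  // Ask the browser for microphone access (audio only, no video).
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  // Add every audio track to the connection so it gets offered to the remote peer.
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
}
```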
This, from what I've been studying for a couple of days, should be fairly simple, but I'm having severe problems actually sending and receiving the streams. In particular, I've been following this guide, which uses Firestore to store and retrieve offers, but it is all very confusing and aimed at two very different, changing devices that need to call each other.
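Assuming the Firestore-style signaling that guide describes, User A's side could store its offer roughly like this. The `calls/broadcast` document and the `offerCandidates`/`answerCandidates` sub-collections are hypothetical names chosen for this sketch, not something the guide mandates:

```typescript
import {
  getFirestore, doc, setDoc, onSnapshot, collection, addDoc,
} from "firebase/firestore";

// Hypothetical layout: one "calls/broadcast" document holding { offer, answer },
// plus one sub-collection of ICE candidates per side.
const db = getFirestore();
const callDoc = doc(db, "calls", "broadcast");
const offerCandidates = collection(callDoc, "offerCandidates");
const answerCandidates = collection(callDoc, "answerCandidates");

async function publishOffer(pc: RTCPeerConnection): Promise<void> {
  // Push local ICE candidates to Firestore as they are gathered.
  pc.onicecandidate = (event) => {
    if (event.candidate) addDoc(offerCandidates, event.candidate.toJSON());
  };

  // Create the offer, set it locally and store it where User B can read it.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  await setDoc(callDoc, { offer: { type: offer.type, sdp: offer.sdp } });

  // Apply User B's answer once it shows up on the document.
  onSnapshot(callDoc, (snapshot) => {
    const data = snapshot.data();
    if (data?.answer && !pc.currentRemoteDescription) {
      pc.setRemoteDescription(new RTCSessionDescription(data.answer));
    }
  });

  // Add User B's ICE candidates as they arrive.
  onSnapshot(answerCandidates, (snapshot) => {
    snapshot.docChanges().forEach((change) => {
      if (change.type === "added") pc.addIceCandidate(new RTCIceCandidate(change.doc.data()));
    });
  });
}
```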
My question is: with my first device being fixed, what steps should I take on the receiving end (User B) to actually connect to the webpage and start hearing the audio?
Thank you!


Solution

  • Found a good, already implemented solution using MultiRTC
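This doesn't use MultiRTC's own API, but staying with the plain WebRTC + Firestore approach from the question, the receiving end (User B) would roughly read the stored offer, answer it, and route the incoming track into an audio element. A sketch, reusing the same hypothetical document layout as above:

```typescript
import {
  getFirestore, doc, collection, getDoc, updateDoc, addDoc, onSnapshot,
} from "firebase/firestore";

// User B (listener): answer the stored offer and play the incoming audio.
async function joinBroadcast(): Promise<void> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });

  // Route whatever audio track arrives into an <audio> element so it becomes audible.
  const audioEl = new Audio();
  audioEl.autoplay = true;
  pc.ontrack = (event) => {
    audioEl.srcObject = event.streams[0];
  };

  const db = getFirestore();
  const callDoc = doc(db, "calls", "broadcast"); // same hypothetical document as User A
  const offerCandidates = collection(callDoc, "offerCandidates");
  const answerCandidates = collection(callDoc, "answerCandidates");

  // Publish local ICE candidates for User A to pick up.
  pc.onicecandidate = (event) => {
    if (event.candidate) addDoc(answerCandidates, event.candidate.toJSON());
  };

  // Read the stored offer, answer it, and write the answer back.
  const callSnapshot = await getDoc(callDoc);
  const offer = callSnapshot.data()?.offer;
  if (!offer) throw new Error("No offer stored yet");
  await pc.setRemoteDescription(new RTCSessionDescription(offer));
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);
  await updateDoc(callDoc, { answer: { type: answer.type, sdp: answer.sdp } });

  // Add User A's ICE candidates as they arrive.
  onSnapshot(offerCandidates, (snapshot) => {
    snapshot.docChanges().forEach((change) => {
      if (change.type === "added") pc.addIceCandidate(new RTCIceCandidate(change.doc.data()));
    });
  });
}
```

One thing to keep in mind: browser autoplay policies usually require a user gesture (e.g. a click) before the audio element is allowed to start playing.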