I'm working on a school project that needs to send an audio signal through WebRTC. The server is a Raspberry Pi 3 (ARM chip) and the system has to work without internet access, intranet only. We have devices connected to the server and need to cast the audio signal from certain devices to all the others.
Peer-to-server-to-peer schema ( ^.^)♪
                                      / --> 🔊 WebRTC Client 1
🎤 WebRTC Capture ---\     The       / --> 🔊 WebRTC Client 2
🎤 WebRTC Capture ----  Raspberry  ---> 🔊 WebRTC Client ...
🎤 WebRTC Capture ---/ Pi 3 Server   \ --> 🔊 WebRTC Client 20
                                      \ --> 🔊 WebRTC Client 21
I found node-webrtc, but there is very little documentation, I can't get it working, and the performance with an audio-only channel seems poor. Have you ever built something like this? The alternative seems to be using a DataChannel instead of a MediaStream, but that doesn't appear to be well supported, and it would probably lead to choppy audio since the data would be sent in chunks, no?
How would you implement this? I would like to stick with Node.js if possible.
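For reference, each capture device would just be a browser page doing the standard getUserMedia / RTCPeerConnection offer flow, roughly like the sketch below; the WebSocket signaling URL and message format are placeholders, nothing is decided yet:

```js
// Capture client (runs in the browser on each capture device).
// The WebSocket URL and the message shapes are placeholders for whatever
// signaling ends up being used; only the WebRTC calls are standard APIs.
const signaling = new WebSocket("ws://raspberrypi.local:8080");

signaling.onopen = async () => {
  const pc = new RTCPeerConnection({ iceServers: [] }); // intranet only, no STUN/TURN

  // Grab the microphone and add the audio track to the connection.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // Trickle ICE candidates to the server as they show up.
  pc.onicecandidate = ({ candidate }) => {
    if (candidate) signaling.send(JSON.stringify({ type: "candidate", candidate }));
  };

  // Handle the answer and remote candidates coming back from the server.
  signaling.onmessage = async ({ data }) => {
    const msg = JSON.parse(data);
    if (msg.type === "answer") await pc.setRemoteDescription(msg.sdp);
    else if (msg.type === "candidate") await pc.addIceCandidate(msg.candidate);
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signaling.send(JSON.stringify({ type: "offer", sdp: pc.localDescription }));
};
```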
What you are looking for is a media server with WebRTC support.
After a bit of googling, I found one written by medooze for Node.js that has Raspberry Pi support, but I haven't tested it myself.
https://github.com/medooze/media-server-node
It looks like there are some examples and docs.
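Whichever media server you pick, you will still need a small signaling piece in Node.js to shuttle SDP offers/answers and ICE candidates between the capture pages, the media server, and the listener pages. Here is a minimal sketch using the ws package; the port and message shapes are just assumptions and are unrelated to the medooze API itself:

```js
// Minimal WebSocket signaling relay (Node.js, "ws" package).
// It only forwards JSON messages between connected peers; all media
// handling stays inside whatever WebRTC media server you plug in.
const WebSocket = require("ws");

const wss = new WebSocket.Server({ port: 8080 }); // port is an arbitrary choice

wss.on("connection", (socket) => {
  socket.on("message", (data) => {
    let msg;
    try {
      msg = JSON.parse(data);
    } catch {
      return; // ignore anything that isn't JSON
    }

    // Relay offers/answers/candidates to every other connected client.
    // A real setup would route by peer id instead of broadcasting.
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(JSON.stringify(msg));
      }
    }
  });
});

console.log("Signaling server listening on ws://0.0.0.0:8080");
```

That part is independent of which media server you end up choosing.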
Other widely used media servers: