Tags: javascript, network-programming, udp, webrtc

UDP stream into WebRTC


I am just starting to learn web/network programming (hence I'm not sure what information is relevant), but my goal is to play video acquired by a computer X on a webpage hosted on computer Y, as close to real time as possible. I currently have an awkward solution that is just about OK for my needs: the video acquired on computer X is sent to computer Y through a UDP socket. That data is then fed (via ffmpeg) into a 'fake webcam' created using v4l2loopback, which is then read using getUserMedia(). With this setup I get a somewhat choppy video that lags by less than 2 s.

My question, simply, is whether it is possible to read the UDP stream directly within the WebRTC framework (and if so, how), rather than going awkwardly through the fake webcam.


Solution

  • You can't do that directly with WebRTC, since it doesn't expose a raw socket, but you have a few options:

    Convert it to an HLS live stream

    • Convert the UDP stream (I assume RTMP? What is ffmpeg outputting?) to an HLS stream on your server, e.g. with the nginx-rtmp module.
    • Use hls.js to play that stream in the browser.
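For the first option, a minimal nginx configuration might look like the sketch below. This assumes the nginx-rtmp module is compiled in; the application name `live`, the ports, and the `/tmp/hls` path are placeholders you'd adapt:

```nginx
rtmp {
    server {
        listen 1935;             # RTMP ingest port (push from ffmpeg)
        application live {
            live on;
            # Turn the incoming RTMP stream into HLS segments
            hls on;
            hls_path /tmp/hls;   # where the .m3u8 playlist and .ts segments go
            hls_fragment 2s;     # shorter fragments -> lower latency, more requests
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;
            add_header Cache-Control no-cache;  # playlists must not be cached
        }
    }
}
```

With something like this, ffmpeg would push to `rtmp://server:1935/live/<streamkey>` and hls.js would load `http://server:8080/hls/<streamkey>.m3u8` (the stream key here is whatever name you choose).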

    Convert it to a DataChannel and send it over SCTP

    • Set up a media-streaming WebRTC server, for example with wrtc.
    • Connect it to the web client via WebRTC (simple-peer can help here, for example).
    • Set up a DataChannel, which runs over SCTP (implemented with usrsctp over DTLS/UDP in Chrome, Safari, and Firefox).
    • Send the data over the data channel and decode it on the client.
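One practical detail for the DataChannel route: browsers cap individual message sizes, and 16 KiB is the usual safe cross-browser limit, so the video bytes have to be chunked before sending and reassembled on the far side. A minimal sketch (ignoring backpressure via `bufferedAmount` for brevity; the function names are my own, not a WebRTC API):

```javascript
const CHUNK_SIZE = 16 * 1024; // 16 KiB: a conservative cross-browser message limit

// Split a Uint8Array of encoded video into DataChannel-sized messages.
function chunk(bytes, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < bytes.length; offset += chunkSize) {
    chunks.push(bytes.subarray(offset, offset + chunkSize));
  }
  return chunks;
}

// Reassemble the received chunks back into one buffer on the client.
function reassemble(chunks) {
  const total = chunks.reduce((n, c) => n + c.length, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const c of chunks) {
    out.set(c, offset);
    offset += c.length;
  }
  return out;
}

// Sending side would then be roughly:
//   for (const c of chunk(frameBytes)) dataChannel.send(c);
```

On the receiving side you'd collect `message` events into an array and call `reassemble` once a frame is complete (which means you also need some framing to know where one frame ends, e.g. a length prefix).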

    Connect directly from X to Y via WebRTC

    • You can also set up a WebRTC server and connect ffmpeg to it.
    • Open a media channel between the WebRTC 'server' peer and the client.
    • Stream the video.
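Whichever peer acts as the 'server', the two sides still need a signaling channel to exchange SDP offers/answers and ICE candidates; WebRTC deliberately leaves that part to the application. A minimal in-memory relay sketch (the class, peer IDs, and message shape here are illustrative inventions, not part of any WebRTC API; in a real deployment this would sit behind a WebSocket server):

```javascript
// A tiny in-memory signaling relay: peers register a callback under an ID,
// and signaling messages (SDP offers/answers, ICE candidates) are forwarded
// to the target peer by ID.
class SignalingRelay {
  constructor() {
    this.peers = new Map(); // id -> onMessage callback
  }

  register(id, onMessage) {
    this.peers.set(id, onMessage);
  }

  // Forward a signaling message from one peer to another.
  send(fromId, toId, message) {
    const deliver = this.peers.get(toId);
    if (!deliver) throw new Error(`unknown peer: ${toId}`);
    deliver({ from: fromId, ...message });
  }
}

// Usage: the ffmpeg-side 'server' peer and the browser peer each register,
// then exchange an offer/answer produced by RTCPeerConnection, e.g.:
//   relay.send('server', 'browser', { type: 'offer', sdp: offer.sdp });
```

Once the offer/answer and ICE candidates have been relayed, the media flows peer-to-peer and the relay is no longer involved.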

    Of these options, the first is by far the simplest. The disadvantage is that HLS is not well suited to low-latency streaming: each segment adds a few seconds of delay, plus some HTTP overhead.

    I recommend you start with it and work your way up.