I am just starting to learn web/network programming (hence I'm not sure what information is relevant), but my goal is to play video acquired on computer X in a webpage hosted on computer Y, as close to real time as possible. I currently have an awkward solution that is just about OK for my needs: the video acquired on computer X is sent to computer Y through a UDP socket, and that data is then fed (via ffmpeg) into a 'fake webcam' created with v4l2loopback, which the page reads using getUserMedia(). Doing this I get a somewhat choppy video that lags by less than 2s.
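For concreteness, the fake-webcam leg of the pipeline looks roughly like this; the port (1234) and the loopback device (/dev/video10) are placeholders for my actual setup:

```sh
# On computer Y: decode the incoming UDP stream and write raw frames
# into the v4l2loopback device so the browser sees it as a webcam.
# udp://0.0.0.0:1234 and /dev/video10 are placeholders.
ffmpeg -i udp://0.0.0.0:1234 -f v4l2 -pix_fmt yuv420p /dev/video10
```

and the page on computer Y reads it back with the usual getUserMedia() call:

```js
// Grab the fake webcam and attach it to a <video> element on the page.
navigator.mediaDevices.getUserMedia({ video: true })
  .then((stream) => {
    document.querySelector('video').srcObject = stream;
  });
```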
My question, simply, is whether it is possible to read the UDP stream directly within the WebRTC framework (and if so, how), rather than going awkwardly through the fake webcam.
You can't do that directly with WebRTC, since it doesn't expose a raw socket to the page, but you have options:
- nginx-rtmp-hls: nginx with its RTMP module, which ingests the stream over RTMP and repackages it as HLS

Of these options, the first is by far the simplest. The disadvantage is that HLS isn't well suited to low-latency streaming, and the segmenting adds some overhead.
I recommend you start with it and work your way up; a rough sketch of that setup follows.
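Here's a minimal sketch of the nginx side; the ports, the application name `live`, the stream key `stream`, and `/tmp/hls` are all assumptions, not fixed values:

```nginx
worker_processes 1;
events {}

# Ingest RTMP on port 1935 and repackage it as HLS segments on disk.
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            hls on;                # produce HLS output
            hls_path /tmp/hls;     # where the .m3u8 and .ts segments land
            hls_fragment 2s;       # shorter fragments trade overhead for latency
        }
    }
}

# Serve the HLS playlist and segments over plain HTTP.
http {
    server {
        listen 8080;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;             # serves /tmp/hls/stream.m3u8
        }
    }
}
```

The sender (computer X, or the relay already running on Y) would then push into it with something like:

```sh
# Encode the incoming UDP stream and push it to nginx over RTMP.
# The input URL and RTMP target are assumptions for illustration;
# -an drops audio to keep the example simple.
ffmpeg -i udp://0.0.0.0:1234 -an -c:v libx264 -preset veryfast -tune zerolatency \
       -f flv rtmp://computer-y:1935/live/stream
```

The page then plays `http://computer-y:8080/hls/stream.m3u8`, natively in Safari or via hls.js in other browsers. Expect a few seconds of latency from the segmenting; that's the overhead mentioned above.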