Tags: ffmpeg, video-streaming, streaming, webrtc, h.264

WebRTC H264 video live streaming (with FFmpeg) from OpenGL


I am trying to build a peer-to-peer game streaming platform. So far I can capture the OpenGL frames, I have a functional Java WebSocket server, and two clients can establish a peer-to-peer connection (the STUN/TURN part is solved) and exchange text.

What I do not quite understand is how to stream a video built from the OpenGL frames with low latency (<100 ms). The problem mainly lies in the FFmpeg part: I want to use it to encode the frames, get the result back (stdin/stdout redirection for ffmpeg?), and somehow hand it to the JS API of the host (perhaps through a local websocket that the host's JS connects to).

I tried several FFmpeg arguments/commands with stdin and stdout pipes, but none of them worked.
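For reference, here is a minimal sketch of the pipe approach in Java. It assumes ffmpeg is on the PATH, that frames come back from glReadPixels as packed rgb24, and a 1280x720 stream at 30 fps; all of those values are placeholders, not anything from the question. The key points are reading raw video from stdin (`-i -`) and emitting Annex-B H.264 on stdout.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class FfmpegPipe {
    public static void main(String[] args) throws IOException {
        // Hypothetical frame geometry; must match what glReadPixels produces.
        int width = 1280, height = 720, fps = 30;

        ProcessBuilder pb = new ProcessBuilder(
            "ffmpeg",
            "-f", "rawvideo",            // raw frames arrive on stdin
            "-pix_fmt", "rgb24",         // match the OpenGL readback format
            "-s", width + "x" + height,
            "-r", String.valueOf(fps),
            "-i", "-",                   // read input from stdin
            "-c:v", "libx264",
            "-preset", "ultrafast",      // trade compression for latency
            "-tune", "zerolatency",      // no look-ahead / B-frame buffering
            "-profile:v", "baseline",
            "-pix_fmt", "yuv420p",
            "-f", "h264", "-"            // Annex-B H.264 on stdout
        );
        pb.redirectError(ProcessBuilder.Redirect.INHERIT); // keep ffmpeg logs visible
        Process ffmpeg = pb.start();

        OutputStream toFfmpeg = ffmpeg.getOutputStream();  // write raw frames here
        InputStream fromFfmpeg = ffmpeg.getInputStream();  // read encoded H.264 here

        // Sketch only: feed one black frame, then read whatever ffmpeg emits.
        byte[] frame = new byte[width * height * 3]; // rgb24 = 3 bytes per pixel
        toFfmpeg.write(frame);
        toFfmpeg.flush();

        // In a real loop, a separate thread must drain fromFfmpeg continuously,
        // or ffmpeg will block once the OS pipe buffer fills up.
        byte[] buf = new byte[64 * 1024];
        int n = fromFfmpeg.read(buf);
        System.out.println("read " + n + " encoded bytes");

        toFfmpeg.close();
        ffmpeg.destroy();
    }
}
```

A common reason stdin/stdout piping "does not work" is exactly the deadlock noted in the comments: if nothing drains ffmpeg's stdout (or stderr) while frames are being written, both processes stall.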



Solution

  • Which WebRTC client are you using? What is the H264 live stream flowing into?

    WebRTC in the browser has a few restrictions (simply because the implementation is naive). Try the Constrained Baseline profile, and use a very small keyframe interval (one keyframe per second is usually good for a prototype); see the sketch after this list.

    If you don't have a WebRTC client yet, you can start from something like webrtc-remote-screen
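As a rough illustration of those two suggestions (constrained baseline plus a one-second keyframe interval), the encode command might look like the sketch below. The resolution, frame rate, and level are assumptions, not values from the answer.

```sh
# -g/-keyint_min 30 = one keyframe every 30 frames, i.e. one per second at 30 fps.
# libx264's "baseline" profile is emitted with the constrained-baseline flags set,
# which is what browser WebRTC stacks generally expect.
ffmpeg -f rawvideo -pix_fmt rgb24 -s 1280x720 -r 30 -i - \
  -c:v libx264 -profile:v baseline -level 3.1 \
  -tune zerolatency -g 30 -keyint_min 30 \
  -pix_fmt yuv420p -f h264 -
```

The short, fixed GOP matters for live streaming: a viewer joining mid-stream (or recovering from loss) can only start decoding at a keyframe, so a one-second interval bounds that wait to about a second.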