android, javafx, webrtc, android-camera2, tornadofx

Advice on streaming application for Android via WebRTC


I have to build an Android application that streams video and audio to a desktop application through a server. Latency is important. I also have to make sure the Android stream can be controlled from the PC (the user should be able to switch the camera or turn off the microphone).

I thought of using WebRTC for communication, but it seems I would have to write the signaling server myself to support the control requirement mentioned above.

Is there a better way to implement this whole thing? Also, I can't find any good docs or libraries for Android streaming (no Retrofit analogues, obviously).

P.S. I'm thinking about using JavaFX via TornadoFX for the desktop application.


Solution

  • You certainly don't need to write your own signaling server. I would suggest something like Kurento Media Server or a platform built on top of it, such as OpenVidu. Both are open source and free, and have great, active support via Google Groups. Which one is better for you depends on how much customization you need: OpenVidu allows less customization, since most of the stuff under the hood is already done for you, whereas Kurento lets you modify and customize almost everything, under the hood and on the front end, starting from examples that can be changed at the code level. I have used it extensively on projects in the past and think it meets most, if not all, of your requirements. Scaling can be a bit challenging, but it is still much easier than pure P2P WebRTC, since everything is relayed through a central server, and it is certainly doable depending on your requirements and implementation. Additionally, you can record, process, and transcode video server side.
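
For the camera/mic control part specifically: whichever server you pick, you can relay small control messages from the desktop over the signaling channel and act on them with Google's WebRTC Android library (org.webrtc). Here is a minimal Kotlin sketch, assuming you already hold the local `CameraVideoCapturer` and `AudioTrack` from your peer connection setup; the class name and the message strings are invented for illustration, not part of any library's API:

```kotlin
import org.webrtc.AudioTrack
import org.webrtc.CameraVideoCapturer

// Hypothetical controller: reacts to control messages the desktop sends
// through the signaling server. The command names ("switch-camera",
// "mute-mic", "unmute-mic") are made up for this sketch; Kurento and
// OpenVidu both let you pass arbitrary data alongside signaling.
class RemoteStreamController(
    private val videoCapturer: CameraVideoCapturer, // from org.webrtc
    private val localAudioTrack: AudioTrack         // from org.webrtc
) {
    fun onControlMessage(command: String) {
        when (command) {
            // switchCamera flips between front and back cameras;
            // passing null skips the optional completion callback.
            "switch-camera" -> videoCapturer.switchCamera(null)
            // Disabling the track silences outgoing audio without
            // renegotiating the peer connection.
            "mute-mic"      -> localAudioTrack.setEnabled(false)
            "unmute-mic"    -> localAudioTrack.setEnabled(true)
        }
    }
}
```

The nice part of this approach is that the desktop never needs any direct access to the phone's hardware: it just sends a message, and the Android side flips the capturer or track state locally, which takes effect on the outgoing stream immediately.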