Tags: android, rtsp, http-live-streaming, live-streaming

How do I prepare a file for live video streaming on Android?


I see a lot of resources outlining how to view live video streams on Android using various protocols such as HLS and RTSP.

But I can't find a clear outline of how to prepare a file on the Android device and get it to the server so that it can then be distributed.

I understand the video needs to be encoded as H.264 (to be compatible with most streaming protocols) and then ideally cut into .ts chunks. But at which point in the flow is this done? Does the compressed H.264 content get streamed to the server with the chunking happening there, or should the chunking be performed on the device and then streamed?
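
For reference, here is my rough understanding of the device-side encoding step, as a minimal sketch using the stock MediaRecorder API (the output path is a placeholder local file, not a stream, and a real app would also need the CAMERA permission and a preview surface):

    import android.media.MediaRecorder;

    // Rough device-side H.264 encode with the stock API.
    // "/sdcard/capture.mp4" is a placeholder local file, not a live stream.
    MediaRecorder recorder = new MediaRecorder();
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setVideoSize(1280, 720);
    recorder.setVideoFrameRate(30);
    recorder.setOutputFile("/sdcard/capture.mp4");
    recorder.prepare();  // throws IOException
    recorder.start();

What I don't understand is how to go from this to something a server can redistribute live.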

Ideally this question could serve as a repository for basic instructions on how to create a live streaming feed from an Android device, regardless of playback protocols.

I hope this question makes sense - happy to amend given any feedback from the community.


Solution

  • Libraries like libstreaming are fantastic wrappers around a basic streaming flow that works like this on Android:

    • Create a MediaRecorder.
    • Set its output file to a remote location with an rtsp:// address.

    This will force Android to push the recording to that remote location. (In practice, setOutputFile() takes a path or file descriptor rather than a URL, so libraries achieve this by handing the recorder the file descriptor of a socket or pipe connected to the server.)

    This is the basic way a broadcast is sent from the phone to a media server. It's then the media server's job to transcode (and, for HLS, segment into .ts chunks) the content so it can be consumed live - so the chunking you asked about happens server-side in this setup.

    Libraries out there may use custom tools and different protocols for this - but for anyone who wonders how this would be done manually, that's the basic process; a bare-bones sketch follows.
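
    A stripped-down version of that manual process might look like the sketch below. Treat it as an illustration rather than working streaming code: the server host and port are placeholders, and a real client (like libstreaming) parses H.264 NAL units out of the recorder's output and packetizes them into RTP/RTSP instead of dumping raw container bytes over TCP. That parsing step matters because MP4/3GPP containers only write their index when recording stops, so the piped output is not a playable file on its own.

        import android.media.MediaRecorder;
        import android.os.ParcelFileDescriptor;

        import java.io.FileInputStream;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.Socket;

        public class ManualPushSketch {

            public void startStreaming() throws Exception {
                // A pipe: MediaRecorder writes into one end, we read from the other.
                ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
                final ParcelFileDescriptor readEnd = pipe[0];
                ParcelFileDescriptor writeEnd = pipe[1];

                // Same recorder setup as in the question; a real app would also
                // attach a camera preview surface and request the CAMERA permission.
                MediaRecorder recorder = new MediaRecorder();
                recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
                recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
                recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
                // Hand the recorder the write end of the pipe instead of a file path.
                recorder.setOutputFile(writeEnd.getFileDescriptor());
                recorder.prepare();
                recorder.start();

                // Forward whatever the recorder produces to the media server.
                new Thread(new Runnable() {
                    @Override
                    public void run() {
                        try {
                            InputStream in = new FileInputStream(readEnd.getFileDescriptor());
                            // Placeholder address; a real client would packetize
                            // the stream into RTP here rather than use raw TCP.
                            Socket socket = new Socket("media.example.com", 5000);
                            OutputStream out = socket.getOutputStream();
                            byte[] buffer = new byte[8192];
                            int read;
                            while ((read = in.read(buffer)) != -1) {
                                out.write(buffer, 0, read);
                            }
                            out.close();
                            socket.close();
                            in.close();
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                }).start();
            }
        }

    The pipe trick is the key design choice: since MediaRecorder only knows how to write to a file descriptor, giving it one that is actually a socket or pipe is what turns a "recording" API into a "broadcasting" one.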