Tags: ios, swift, http-live-streaming

How can live video be streamed from an iOS device to a server?


I want to be able to stream live video from an iOS device to a server. I tried to use an AVCaptureOutput that captures each frame as a CMSampleBuffer and appends it using an AVAssetWriter, but I don't know when or how to take the input from the file and send it to the server. How should it be formatted? How do I know when to send it?


Solution

  • Though I am not sharing any code, I can share the approach I used in one of my apps.

    First way (the easy one): there are plenty of low-cost third-party libraries available that handle live streaming for you.

    Second way (the hard one): record the video in small chunks, for example 2 seconds or less, keep them in a queue, and upload them to the server. Don't upload each chunk with a plain HTTP request (e.g. via AFNetworking); the per-request overhead will slow things down. Instead, use a persistent connection, such as a socket-based server built on Node.js or similar. Also keep a text file or database entry that tracks each chunk file and its sequence number. Once the first chunk has been uploaded, you can use ffmpeg on the server to build the actual video file from that chunk, and as more chunks arrive, append them to the main video file. If a client plays that video from the server, no further work is needed on the playback side: it will automatically fetch the new parts as the file grows on the server.
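    The "keep them in a queue and track the sequence" bookkeeping above can be sketched in Swift. This is a minimal, hypothetical sketch: the `Chunk` and `ChunkQueue` types are illustrative names, not part of any real SDK, and the capture/upload sides that would call them are omitted.

    ```swift
    import Foundation

    // One recorded chunk file (e.g. a ~2 s movie segment) plus its
    // position in the stream, so the server can reassemble in order.
    struct Chunk {
        let sequence: Int
        let fileURL: URL
    }

    // In-order queue of chunks awaiting upload, mirroring the
    // "text file or DB entry" sequence tracking described above.
    final class ChunkQueue {
        private var pending: [Chunk] = []
        private var nextSequence = 0

        // Called whenever the capture side finishes writing a chunk file.
        func enqueue(fileURL: URL) -> Chunk {
            let chunk = Chunk(sequence: nextSequence, fileURL: fileURL)
            nextSequence += 1
            pending.append(chunk)
            return chunk
        }

        // The uploader drains chunks strictly in sequence order.
        func dequeue() -> Chunk? {
            guard !pending.isEmpty else { return nil }
            return pending.removeFirst()
        }

        // A manifest line ("<sequence> <filename>") the server could log
        // for each uploaded chunk to know what to append next.
        func manifestLine(for chunk: Chunk) -> String {
            "\(chunk.sequence) \(chunk.fileURL.lastPathComponent)"
        }
    }
    ```

    On the server side, ffmpeg's concat demuxer is one way to append the uploaded chunks into a single growing file, using the manifest to order them.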

    Thank you. Hope this helps.