Tags: ios, video-streaming, http-live-streaming

How can I convert iPhone-captured-video to HTTP Live Streaming files?


I have an iOS app and Django backend on Amazon Web Services EC2/S3. Current process:

  1. The iOS app captures video using UIImagePickerController, which outputs an MP4.
  2. The MP4 gets uploaded to my EC2 (Ubuntu) server running Django.
  3. Django reads the file and uploads to S3 for storage.
  4. The iOS app can then access the movie on S3 to watch at a later time. This uses progressive download (i.e. pseudo-streaming: the file simply plays while it downloads).

Goal: use HTTP Live Streaming (HLS) here instead.

Can someone offer suggestions on how to alter my current workflow to get HLS files (.m3u8 playlists and .ts segments) onto S3 to allow streaming? Thanks.


Solution

  • The question is really old and I guess you have moved on...

    But for the sake of completeness, you have at least two options:

    a. Convert each file just ONCE to the HLS format, at all the bitrates you need, and host the converted files on S3.

    You can do this with a third-party encoding service such as Encoding.com or Zencoder, by deploying your own stack on a platform like Kaltura (there are quite a few comparable platforms), or by rolling your own transcoding server with ffmpeg.


    b. Use a media server that can transcode the MP4 you already have into the format of choice on the fly, depending on the requesting client.

    Wowza Media Server is a great example of this. Microsoft's IIS Media Services and Adobe's Flash Media Server are also widely used for delivering to iOS as well as several other platforms. There are plenty of options here; all you have to do is configure the server correctly for HLS. Google can easily find the right samples for you.


    In both cases, you can set up CloudFront to read the files from S3 or from the media server of your choice. The latter is somewhat harder and I haven't done it myself, so unfortunately I don't have useful links for you.
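Since the backend is Django, the roll-your-own ffmpeg route (option a) can be sketched as a small helper the server might call after each upload. This is a minimal single-bitrate sketch, not a full multi-bitrate pipeline; the function names and paths are illustrative, and it assumes an `ffmpeg` binary on the server and an iPhone MP4 already encoded as H.264/AAC (so the streams can be copied without re-encoding — producing multiple bitrates would require re-encoding with different `-b:v` values instead of `-codec copy`):

```python
import subprocess
from pathlib import Path

def build_hls_command(src, out_dir, segment_seconds=10):
    """Build an ffmpeg command that segments an MP4 into an HLS
    playlist (index.m3u8) plus MPEG-TS segments (.ts)."""
    playlist = Path(out_dir) / "index.m3u8"
    return [
        "ffmpeg", "-i", str(src),
        "-codec", "copy",                   # reuse the existing H.264/AAC streams (no re-encode)
        "-start_number", "0",               # first segment is index 0
        "-hls_time", str(segment_seconds),  # target duration of each .ts segment
        "-hls_list_size", "0",              # keep every segment in the playlist (VOD-style)
        "-f", "hls",
        str(playlist),
    ]

def transcode_to_hls(src, out_dir):
    """Run ffmpeg; the resulting .m3u8 and .ts files in out_dir
    can then be uploaded to S3 as in the existing workflow."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    subprocess.run(build_hls_command(src, out_dir), check=True)
```

After `transcode_to_hls("movie.mp4", "movie_hls/")` completes, uploading the contents of `movie_hls/` to S3 (keeping the playlist and segments in the same prefix, since the playlist references the segments by relative name) gives a URL an iOS player can stream directly.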