ios, video-streaming, http-live-streaming

Encoding Video for HTTP Live Streaming with Subtitles


I've been following Apple's guide for HTTP Live Streaming and have it working nicely in my app, but I'd like to embed subtitles in the video. This thread on Quora suggests that Netflix has done it.

My source videos have subtitles embedded from a .srt file using QuickTime Pro, and the subtitles are visible when the video plays in QuickTime and in my iOS app. Once I split the video into smaller .ts files with mediafilesegmenter (as required for HLS), the subtitles disappear. I'm using MPMoviePlayerController.

Is there a special way to encode the source video or to use mediafilesegmenter for subtitling to work?


Solution

  • It is possible to carry captions in the TS stream itself (e.g. EIA-608 or EIA-708). I don't know of any good free tool for inserting such captions into a TS stream, but Manzanita will probably sell you something (for an absurd price).

    Unfortunately, many HLS players will ignore those captions, because the HLS specification says nothing about how subtitle tracks should be handled. To test it in your own app, find a ts file that already contains EIA-608/EIA-708 text and segment that.

    To be frank, I think embedding subtitles in the TS is a dead end, and you'll find it easier to write your own subtitle displayer triggered by the TimedMetadata.
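    The "roll your own displayer" idea boils down to: keep the .srt cues in memory and, on each tick of playback time (e.g. polling `MPMoviePlayerController.currentPlaybackTime`, or reacting to a timed-metadata notification), look up the cue covering the current time and show it in a label laid over the player view. A minimal sketch of that lookup core, assuming cues have already been parsed out of the .srt (the `SubtitleCue` type and `activeCue` function are hypothetical helpers, not Apple API):

    ```swift
    import Foundation

    // One parsed subtitle cue from the .srt file.
    struct SubtitleCue {
        let start: TimeInterval  // cue start, in seconds
        let end: TimeInterval    // cue end, in seconds
        let text: String
    }

    /// Returns the cue (if any) that should be on screen at time `t`.
    /// Binary search over non-overlapping cues sorted by start time.
    func activeCue(at t: TimeInterval, in cues: [SubtitleCue]) -> SubtitleCue? {
        var lo = 0, hi = cues.count - 1
        while lo <= hi {
            let mid = (lo + hi) / 2
            let cue = cues[mid]
            if t < cue.start {
                hi = mid - 1
            } else if t >= cue.end {
                lo = mid + 1
            } else {
                return cue
            }
        }
        return nil  // playback time falls in a gap between cues
    }
    ```

    On the UI side you would call this a few times per second from a timer and assign the result to a `UILabel` subview of the player's view; subtitle granularity is coarse enough that polling at 4 Hz is plenty.
    
    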

    === UPDATE ===

    Since version 9 of the HLS draft (September 22, 2012), WebVTT subtitles are supported in HLS. I don't know of any tool for preparing streams with them yet.
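    For reference, the draft advertises a WebVTT subtitle track from the master playlist with an EXT-X-MEDIA tag of TYPE=SUBTITLES, whose URI points to a media playlist listing .webvtt segments instead of .ts segments. A minimal sketch of such a master playlist (all file names and the bandwidth figure are made up for illustration):

    ```
    #EXTM3U

    # Subtitle rendition group "subs"; its media playlist lists WebVTT segments.
    #EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",LANGUAGE="en",DEFAULT=YES,AUTOSELECT=YES,URI="subtitles/eng/prog_index.m3u8"

    # Video variant that opts in to the subtitle group above.
    #EXT-X-STREAM-INF:BANDWIDTH=800000,SUBTITLES="subs"
    video/prog_index.m3u8
    ```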