I am totally new to video encoding and options, and just learned about Apple's HLS requirements.
So far, I've been able to get something working for my iOS app. However, I find the entire process to be very slow and manual. Now, having to repeat this for several more locales (video translations), I can't imagine there isn't a better way.
To have control over the bitrate, I use HandBrake to create a new .mp4 file with the appropriate video encoder settings for each bitrate I want (192k, 400k, 1M, etc.). Then I move on to creating the playlists. This alone takes several minutes--is there a better way? `tsrecompressor` seemed close, but it just streams to a local port and doesn't save any playlists.
Then I use Apple's suite of command-line tools (`mediafilesegmenter`, `variantplaylistcreator`, `mediastreamvalidator`, `hlsreport`) to segment each rendition, generate the playlists, combine them into a master playlist, validate, etc. I suppose this part could be somewhat automated with a script. I've seen others use FFmpeg, but I think the latter three Apple tools would still need to be applied sequentially.
Do you see anything that can be obviously optimized?
From your description, it sounds like you are creating a separate mp4 at each size, and then the HLS streams from those?
With FFmpeg, you can create all the HLS streams/playlists directly from the source, without creating the intermediate mp4s first. That will cut out roughly half of the encoding you are doing :)
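As a rough sketch of what that looks like (the input name, resolutions, and bitrates below are placeholders for your own ladder), a single FFmpeg invocation can split the source once, encode every rendition, and write the variant playlists plus a master playlist:

```shell
# One-pass HLS ladder: split the decoded video, scale each branch,
# encode each branch at its own bitrate, and mux everything to HLS.
# source.mp4 and the sizes/bitrates are placeholders.
ffmpeg -i source.mp4 \
  -filter_complex "[0:v]split=3[v1][v2][v3];\
[v1]scale=w=416:h=234[v1out];\
[v2]scale=w=640:h=360[v2out];\
[v3]scale=w=1280:h=720[v3out]" \
  -map "[v1out]" -c:v:0 libx264 -b:v:0 192k \
  -map "[v2out]" -c:v:1 libx264 -b:v:1 400k \
  -map "[v3out]" -c:v:2 libx264 -b:v:2 1000k \
  -map a:0 -c:a:0 aac -b:a:0 64k \
  -map a:0 -c:a:1 aac -b:a:1 96k \
  -map a:0 -c:a:2 aac -b:a:2 128k \
  -f hls -hls_time 6 -hls_playlist_type vod \
  -hls_segment_filename "v%v/segment%03d.ts" \
  -master_pl_name master.m3u8 \
  -var_stream_map "v:0,a:0 v:1,a:1 v:2,a:2" \
  v%v/playlist.m3u8
```

The `-var_stream_map` option pairs each video stream with an audio stream into one variant, and `%v` in the output paths expands to the variant index, so each rendition lands in its own directory. You would still run `mediastreamvalidator` on the resulting `master.m3u8` afterwards.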
There are also services that can create, host, and deliver your HLS streams for you, saving you all the time of creating the videos yourself (full disclosure: I work for one, api.video).