
FFmpeg, videotoolbox and avplayer in iOS


I have a question about how these things are connected and what exactly they do.

FYI, I have some experience with video players and with encoding and decoding.

In my job I handle a UDP stream from a server: I receive it with FFmpeg, decode it, and draw it with OpenGL. I also use FFmpeg for a video player.

These are my questions:

1. Is FFmpeg the only thing that can decode the UDP stream (encoded with FFmpeg on the server), or not?

I found some useful information about VideoToolbox, which can decode streams with hardware acceleration on iOS. So could I also decode the stream from the server with VideoToolbox?

2. If it is possible to decode with VideoToolbox (I mean, if VideoToolbox could replace FFmpeg), then what is the VideoToolbox source code in FFmpeg? Why is it there?

In my decoder I create an AVCodecContext from the stream, and it has hwaccel and hwaccel_context fields, both of which are set to null. I thought VideoToolbox was a kind of API that could help FFmpeg use the hardware acceleration on iOS, but that doesn't seem to be the case so far...

3. If VideoToolbox can decode a stream, can it also decode a local H.264 file, or is only streaming possible?

AVPlayer is a good tool for playing a video, but if VideoToolbox could replace AVPlayer, what would the benefit be? Or is that impossible?

4. Does FFmpeg only use the CPU for decoding (a software decoder), or can it use hardware acceleration too?

When I play a video with my FFmpeg-based player, CPU usage goes over 100%. Does that mean FFmpeg is only using a software decoder, or is there a way to use hardware acceleration?

Please excuse my poor English; any answer would be appreciated.

Thanks.


Solution

  • 1. Is FFmpeg the only thing that can decode the UDP stream (encoded with FFmpeg on the server), or not?

    I don't think so, since VideoToolbox can do that too.

    2. If it is possible to decode with VideoToolbox (I mean, if VideoToolbox could replace FFmpeg), then what is the VideoToolbox source code in FFmpeg? Why is it there?

    One solution is to use FFmpeg to read the data from the TCP socket and hand the compressed packets to VideoToolbox for decoding.

    This avoids the high CPU usage of decoding with FFmpeg alone, and also avoids some drawbacks of using only VideoToolbox, such as delays when reading the data from the socket.

    3. If VideoToolbox can decode a stream, can it also decode a local H.264 file, or is only streaming possible?

    VideoToolbox can decode both local files and streams. (Correct me if I'm wrong.)

    4. Does FFmpeg only use the CPU for decoding (a software decoder), or can it use hardware acceleration too?

    By default FFmpeg decodes in software on the CPU, which explains the high CPU usage you are seeing, while VideoToolbox is a hardware codec. FFmpeg can, however, drive VideoToolbox through its hwaccel support when it is built with it.