I've developed an iOS application that plays a list of videos. The flow is that, after my viewController's views are created, I send a request to my own server and get the URLs for all the videos at once.
After that, I start playing the URLs one after the other. The problem is that the first video takes more than 10 seconds to actually load, while the rest of the videos, which are sometimes longer and larger than the first one, take far less time (maybe 1 or 2 seconds).
Usually my first video is very short and very small (200 KB on average), yet it still takes much longer to load than, say, my second video, which is 1 MB (5 times larger).
I've been studying the issue for the last 3 days and I've tried a lot of different approaches, which I mention below, but my question is "Why does this happen?", not "How might I work around it?". I want to know why it happens so I can solve it with my knowledge of AVFoundation, or else write my own player that finally solves the problem for me.
This is my code to initialize it:
self.player = [AVPlayer new];
self.playerView = [[NZPlayerView alloc] initWithPlayer:self.player];
self.playerView.videoGravity = AVLayerVideoGravityResizeAspect;
self.playerView.translatesAutoresizingMaskIntoConstraints = NO;
[self.view addSubview:self.playerView];
// Constraints
Note that NZPlayerView above is my own class: it's simply a view that has an AVPlayerLayer inside it and handles a few extra things for me, like the application going to the background and coming back. I don't think this view is causing any problems, because the issue seems to persist for other developers using other ways of initializing their player.
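For reference, a wrapper like that essentially boils down to an AVPlayerLayer-backed view; here is a simplified sketch of the idea (not my exact NZPlayerView code, and the background/foreground handling is omitted):
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface NZPlayerView : UIView
- (instancetype)initWithPlayer:(AVPlayer *)player;
@property (nonatomic, copy) AVLayerVideoGravity videoGravity;
@end

@implementation NZPlayerView

// Back the view with an AVPlayerLayer instead of a plain CALayer.
+ (Class)layerClass
{
    return [AVPlayerLayer class];
}

- (instancetype)initWithPlayer:(AVPlayer *)player
{
    self = [super initWithFrame:CGRectZero];
    if (self) {
        ((AVPlayerLayer *)self.layer).player = player;
    }
    return self;
}

// Forward the video gravity straight to the backing layer.
- (void)setVideoGravity:(AVLayerVideoGravity)videoGravity
{
    ((AVPlayerLayer *)self.layer).videoGravity = videoGravity;
}

- (AVLayerVideoGravity)videoGravity
{
    return ((AVPlayerLayer *)self.layer).videoGravity;
}

@end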
After I create my view and send the request to my own server, I get a list of URLs that I then play one by one using the code below:
AVPlayerItem *newCurrentItem = [[AVPlayerItem alloc] initWithURL:[NSURL URLWithString:[[self.selectedMovementModel videoURL] url]]];
[self.player replaceCurrentItemWithPlayerItem:newCurrentItem];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playerDidFinishPlaying:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.player.currentItem];
Then I start playback with [self.player playImmediatelyAtRate:1], which is one of the things I tried in order to minimize the initial stall; I previously started playback with [self.player play]. After a video finishes playing, I decide whether or not to loop it; if not, control comes back to this same function and the next selectedMovementModel is played.
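The end-of-playback handling is conceptually something like this (a simplified sketch; shouldLoopCurrentMovement and playNextMovement are placeholders for my actual model logic):
// Simplified sketch of the notification handler registered above.
- (void)playerDidFinishPlaying:(NSNotification *)notification
{
    if ([self shouldLoopCurrentMovement]) {
        // Loop: rewind the same item and keep playing.
        [self.player seekToTime:kCMTimeZero];
        [self.player playImmediatelyAtRate:1];
    } else {
        // Otherwise build an AVPlayerItem for the next URL in the list.
        [self playNextMovement];
    }
}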
I'm also observing self.player.status, self.player.rate, self.player.timeControlStatus and self.player.reasonForWaitingToPlay using RACObserve, which works the same way as KVO except that I don't have to remove the observer when my viewController is dismissed, so there are no issues there either.
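In plain KVO terms, that observation is roughly equivalent to this (a sketch; the logging is just for seeing what the player does during the initial stall):
static void *PlayerObservationContext = &PlayerObservationContext;

- (void)observePlayer
{
    for (NSString *keyPath in @[@"status", @"rate", @"timeControlStatus", @"reasonForWaitingToPlay"]) {
        [self.player addObserver:self forKeyPath:keyPath options:NSKeyValueObservingOptionNew context:PlayerObservationContext];
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context != PlayerObservationContext) {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
        return;
    }
    // Log whatever changed on the player.
    NSLog(@"%@ -> %@", keyPath, change[NSKeyValueChangeNewKey]);
}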
The approaches I've tried: calling loadValuesAsynchronouslyForKeys:completionHandler: after creating my new currentItem, both on the item and on its asset; setting preferredForwardBufferDuration on the new currentItem to a very small number; and setting my player's automaticallyWaitsToMinimizeStalling property to NO. All to no avail.
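Concretely, the combination I experimented with looked roughly like this (a sketch; the exact values and ordering varied between attempts):
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[NSURL URLWithString:[[self.selectedMovementModel videoURL] url]] options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"playable", @"duration"] completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        AVPlayerItem *newCurrentItem = [AVPlayerItem playerItemWithAsset:asset];
        // Ask for only a tiny forward buffer before playback is allowed to start.
        newCurrentItem.preferredForwardBufferDuration = 1;
        // Don't let the player hold playback back to avoid stalls.
        self.player.automaticallyWaitsToMinimizeStalling = NO;
        [self.player replaceCurrentItemWithPlayerItem:newCurrentItem];
        [self.player playImmediatelyAtRate:1];
    });
}];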
I've also profiled my network usage with Xcode's Instruments, and there seems to be nothing different between my first video and the others. iOS downloads each video in chunks and plays those chunks as soon as they are available, and even if I configure my player to download the whole video before starting, my first video should start much earlier than, say, my second one, because it's much smaller.
One of my guesses is that it takes longer because the player is establishing a connection to my host for the first time and then maintains that session for some unknown amount of time. That would make sense if two videos from two different websites took relatively similar times to load, but they don't: the first one still takes longer.
One workaround that I'd rather avoid at all costs is to create an instance of my player before my server responds, or even on the previous screen, and load it with a very small, short video, so that it gets to do whatever setup it has to do before the user has to stare at a long loading indicator. But I'd much rather know what's causing the problem before taking such desperate measures.
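For completeness, that warm-up trick would be something along these lines (a sketch; warmup.mp4 is a placeholder asset bundled with the app):
// Give the player a tiny placeholder clip early, so whatever one-time setup it does
// happens before the real URLs arrive from the server.
NSURL *warmupURL = [[NSBundle mainBundle] URLForResource:@"warmup" withExtension:@"mp4"];
AVPlayerItem *warmupItem = [AVPlayerItem playerItemWithURL:warmupURL];
self.player.muted = YES; // the user never needs to hear this
[self.player replaceCurrentItemWithPlayerItem:warmupItem];
[self.player play];
// Later, when the real URLs arrive, replaceCurrentItemWithPlayerItem: swaps in the actual video.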
EDIT 1: I created a singleton in the application that holds an instance of the video player I use in all my view controllers. I initialized it with a URL and loaded it, but I still have the same problem in my view controllers with the first video of the list. I also thought that maybe my video player wasn't loading because it lacked an interface, but that made no difference either. I'm going to try reading Telegram's code, because I know they use AVPlayer; even though they download their videos before playing them in most cases, there might be some clue as to how they initialize their video players.
Thank you guys in advance.
The reason for the AVPlayer initial delay turned out to be a very rare case that may never happen to anyone else, but since it might be interesting, and it was oddly satisfying to find the answer myself, I'm going to post it.
After further investigation, I noticed logs in my Xcode debugger saying that a certain task involving my videos' URLs was being "canceled" with error code -999. That was certainly interesting, because I had not told my video player to start playing yet.
I also realized that my application was using more than 180 MB of memory roughly around the time the viewController's viewDidLoad method was called.
Since I had no way of seeing which task was being canceled (my own request was not the one being canceled), I decided to swizzle the cancel method of the NSURLSessionDataTask class so it would log the URL of whatever was being canceled, and that's how I found my problem.
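The swizzle was roughly along these lines (a simplified, debug-only sketch; the xyz_ prefix is a placeholder, and the concrete task classes Apple hands back are private subclasses, so the class being swizzled may need adjusting in practice):
#import <Foundation/Foundation.h>
#import <objc/runtime.h>

@implementation NSURLSessionDataTask (NZCancelLogging)

+ (void)load
{
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        // Swap -cancel with a logging version so every cancellation prints its URL.
        Method original = class_getInstanceMethod(self, @selector(cancel));
        Method swizzled = class_getInstanceMethod(self, @selector(xyz_loggingCancel));
        method_exchangeImplementations(original, swizzled);
    });
}

- (void)xyz_loggingCancel
{
    NSLog(@"Canceling task for %@", self.originalRequest.URL);
    [self xyz_loggingCancel]; // implementations are swapped, so this calls the original -cancel
}

@end
Here is the per-model looping code that turned out to be the culprit: when initializing my list of models, I was adding a function to each model that determines whether its video should loop.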
_defineWeakSelf;
// Loading the asset's "duration" key here kicks off a request for the video of every model in the list.
AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:[self.videoURL url]]];
[asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
    AVKeyValueStatus status = [asset statusOfValueForKey:@"duration" error:nil];
    weakSelf._shouldLoop = YES;
    if (status == AVKeyValueStatusLoaded)
    {
        // Time-based movements loop only when the clip is shorter than the required duration.
        CGFloat fullDurationSeconds = CMTimeGetSeconds([asset duration]);
        if (weakSelf._movementType == MovementTypeTime)
            weakSelf._shouldLoop = (fullDurationSeconds < weakSelf.duration);
        else
            weakSelf._shouldLoop = YES;
    }
}];
Note that _defineWeakSelf is something like @weakify/@strongify, except that it's a lot easier to use.
And that was the problem. Each time this function ran while my models were being created, a new background thread was created and a new request went out to actually retrieve the video, all at the same time. On my laptop that wouldn't be much of a problem with at most 30 to 40 models, but there were more than 90 models, and imagine that happening on an iPhone. Naturally, no device can send that many requests at once, so some of them (like my first video's) were being canceled over and over again, and once they were finally done, the delay went away.
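The general idea behind the fix is simply not to fire all of those loads at once; one way to do that (a sketch of the idea, not necessarily my exact change) is to compute _shouldLoop lazily, only when a model is actually about to be played:
// Load the duration right before this model's video is played, instead of kicking off
// ~90 asset loads while the list is being built.
- (void)loadShouldLoopWithCompletion:(void (^)(void))completion
{
    AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:[self.videoURL url]]];
    [asset loadValuesAsynchronouslyForKeys:@[@"duration"] completionHandler:^{
        if ([asset statusOfValueForKey:@"duration" error:nil] == AVKeyValueStatusLoaded) {
            CGFloat fullDurationSeconds = CMTimeGetSeconds([asset duration]);
            self._shouldLoop = (self._movementType == MovementTypeTime)
                ? (fullDurationSeconds < self.duration)
                : YES;
        } else {
            self._shouldLoop = YES;
        }
        if (completion) {
            completion();
        }
    }];
}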
Now that this problem is solved, all the other suggestions, including the solutions provided by other members of the community, are a lot more effective and actually reduce the initial instantiation delay to a bare minimum. So, I thank you all.
I also checked Telegram's use of AVPlayer and found that my initialization is very similar to theirs, so you can check that out on their GitHub as well.