ios, swift, video, avfoundation, avplayerlooper

AVPlayerLooper not seamlessly looping - Swift 4


My problem: I am trying to achieve seamless looping, i.e. make my AVQueuePlayer loop without any delay between playbacks. For example, if I record a video and go to the playback, it should loop endlessly with no blips or delays at the loop point.

I have written the code below (it's taken straight from example code, too):

    var playerQueue: AVQueuePlayer!
    var playerLayer: AVPlayerLayer!
    var playerItem: AVPlayerItem!
    var playerLooper: AVPlayerLooper!

    func playRecordedVideo(videoURL: URL) {
        playerQueue = AVQueuePlayer()
        playerLayer = AVPlayerLayer(player: playerQueue)
        playerLayer.frame = (camBaseLayer?.bounds)!
        camBaseLayer?.layer.insertSublayer(playerLayer, above: previewLayer)

        playerItem = AVPlayerItem(url: videoURL)
        playerLooper = AVPlayerLooper(player: playerQueue, templateItem: playerItem)
        playerQueue.play()
    }

The code above does not loop seamlessly; there are blips between the end of the current item and the start of the next. I have searched for the problem extensively and have not found a solution. I've also tried NSNotification-based approaches, including seeking the player back to the start when it finishes playback, but nothing has worked at all.
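For reference, the notification-based attempt looked roughly like this minimal sketch (using the property names from the code above):

    // Observe the end of the current item, then seek back to zero and resume.
    // Keep the token if the observer needs to be removed later.
    let didPlayToEnd = NotificationCenter.default.addObserver(
        forName: .AVPlayerItemDidPlayToEndTime,
        object: playerQueue.currentItem,
        queue: .main
    ) { [weak self] _ in
        self?.playerQueue.seek(to: kCMTimeZero)  // Swift 4 era constant; CMTime.zero in later SDKs
        self?.playerQueue.play()
    }

Even with this in place, there is still a visible pause at the loop point.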

Any help would be appreciated :)


Solution

  • One thing to keep in mind when looping assets is that audio and video tracks can have different offsets and different durations, resulting in 'blips' when looping. Such small differences are quite common in recorded assets.

    Iterating over the tracks and printing their time ranges can help to detect such situations:

        for track in asset.tracks {
            print(track.mediaType)
            CMTimeRangeShow(track.timeRange)
        }

    To trim audio and video tracks to equal start times and equal durations, get the common time range of the tracks, and then insert this time range from the original asset into a new AVMutableComposition. Normally, you also want to preserve properties like the orientation of the video track:

        let asset: AVAsset = (your asset initialization here)

        let videoTrack: AVAssetTrack = asset.tracks(withMediaType: .video).first!
        let audioTrack: AVAssetTrack = asset.tracks(withMediaType: .audio).first!

        // calculate the common time range of the audio and video tracks
        let timeRange: CMTimeRange = CMTimeRangeGetIntersection(videoTrack.timeRange, audioTrack.timeRange)

        let composition = AVMutableComposition()

        // call from a throwing context, or wrap in do/catch
        try composition.insertTimeRange(timeRange, of: asset, at: kCMTimeZero)

        // preserve the orientation of the original video track
        composition.tracks(withMediaType: .video).first!.preferredTransform = videoTrack.preferredTransform
    

    Since AVMutableComposition is a subclass of AVAsset, it can be used for AVPlayerLooper-based looping playback, or exporting with AVAssetExportSession.
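    For example, a minimal sketch of wiring the trimmed composition into the question's looping setup (the playerQueue/playerLooper names are assumptions mirroring the question's code):

        // Loop the trimmed composition instead of the raw recorded asset.
        let playerItem = AVPlayerItem(asset: composition)
        let playerQueue = AVQueuePlayer()
        // Note: keep a strong reference to the looper (e.g. in a property),
        // otherwise looping stops when it is deallocated.
        let playerLooper = AVPlayerLooper(player: playerQueue, templateItem: playerItem)
        playerQueue.play()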

    I've put a more complete trimming implementation on GitHub: https://github.com/fluthaus/NHBAVAssetTrimming. It's more robust, handles multiple tracks, preserves more properties, and can either be easily integrated into projects or built as a standalone macOS command-line movie trimming utility.