ios, swift, uikit, avfoundation

Cannot update UIImageView in while loop with AVAssetReaderTrackOutput


I am trying to extract each frame from a video, do some image manipulation, and then display the result back in a UIImageView.

If I tap the UIButton manually, the image shows and I can iterate over the whole video, seeing each frame.

However, if I embed the display update in a while loop, the view does not update (i.e., nothing changes on the device).

I thought it was because the frames were being processed faster than the updated image could be drawn on screen, so I added a sleep call to slow things down to the frame rate of the video, but that doesn't work either.

Here is the code:

import UIKit
import Foundation
import Vision
import AVFoundation
import Darwin

class ViewController: UIViewController {

    var uiImage: UIImage?

    var displayLink: CADisplayLink?

    var videoAsset: AVAsset!
    var videoTrack: AVAssetTrack!
    var videoReader: AVAssetReader!
    var videoOutput: AVAssetReaderTrackOutput!


    @IBOutlet weak var topView: UIImageView!

    @IBOutlet weak var bottomView: UIImageView!

    @IBOutlet weak var rightLabel: UILabel!

    @IBOutlet weak var leftLabel: UILabel!

    @IBAction func tapButton(_ sender: Any) {

        while let sampleBuffer = videoOutput.copyNextSampleBuffer() {
            print ("sample at time \(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))")
            if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {

                let ciImage = CIImage(cvImageBuffer: imageBuffer)
                uiImage = UIImage(ciImage: ciImage)
                self.topView.image = uiImage
                self.topView.setNeedsDisplay()
                usleep(useconds_t(24000))

            }

        }
    }


    override func viewDidLoad() {

        super.viewDidLoad()

    }



    override func viewDidAppear(_ animated: Bool) {

        super.viewDidAppear(animated)

        guard let urlPath = Bundle.main.path(forResource: "video1", ofType: "mp4")  else {
            print ("No File")
            return
        }

        videoAsset = AVAsset(url: URL(fileURLWithPath: urlPath))
        let array = videoAsset.tracks(withMediaType: AVMediaType.video)
        videoTrack = array[0]


        do {
            videoReader = try AVAssetReader(asset: videoAsset)
        } catch {
            print ("No reader created")
            return
        }

        videoOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange])
        videoReader.add(videoOutput)
        videoReader.startReading()


    }

}

Solution

  • Instead of sleeping, you should use a timer in your case. Try something like the following:

    let frameDuration: TimeInterval = 1.0/60.0 // Using 60 FPS
    Timer.scheduledTimer(withTimeInterval: frameDuration, repeats: true) { [weak self] timer in
        // Properties must be accessed through self inside the closure.
        guard let self = self,
              let sampleBuffer = self.videoOutput.copyNextSampleBuffer() else {
            timer.invalidate()
            return
        }

        print ("sample at time \(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))")
        if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            let ciImage = CIImage(cvImageBuffer: imageBuffer)
            self.uiImage = UIImage(ciImage: ciImage)
            self.topView.image = self.uiImage
            self.topView.setNeedsDisplay()
        }
    }
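
    Since the view controller already declares a displayLink property, a CADisplayLink would be another option; it fires in step with the screen refresh instead of on a fixed Timer interval. A minimal sketch, assuming the same videoOutput and topView as above (the method names startDisplayLink and stepFrame are made up here):

    func startDisplayLink() {
        displayLink = CADisplayLink(target: self, selector: #selector(stepFrame))
        displayLink?.preferredFramesPerSecond = 30   // assumed to roughly match the video's frame rate
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc func stepFrame() {
        // Pull the next frame; stop the link when the reader runs out of samples.
        guard let sampleBuffer = videoOutput.copyNextSampleBuffer() else {
            displayLink?.invalidate()
            displayLink = nil
            return
        }
        if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            topView.image = UIImage(ciImage: CIImage(cvImageBuffer: imageBuffer))
        }
    }

    Calling startDisplayLink() from the button action would then drive playback without blocking the main thread.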
    

    The problem in your case is that you are still on the main thread when doing this. While you sleep, the main thread cannot proceed, so it spends all its time building images and sleeping and never gets a chance to actually update the user interface.

    You could do a similar thing using a separate thread. A simple dispatch to a background queue would do, for instance:

    DispatchQueue(label: "Updating images").async { [weak self] in
        guard let self = self else { return }
        // Now on a separate thread; properties are accessed through self.
        while let sampleBuffer = self.videoOutput.copyNextSampleBuffer() {
            print ("sample at time \(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))")
            if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                let ciImage = CIImage(cvImageBuffer: imageBuffer)
                self.uiImage = UIImage(ciImage: ciImage)

                // The UI part now needs to go back to the main thread
                DispatchQueue.main.async {
                    self.topView.image = self.uiImage
                    self.topView.setNeedsDisplay()
                }

                usleep(useconds_t(24000))
            }
        }
    }
    

    but it is important that you still do (at least) the UIKit work on the main thread. It may be that even the Core Image part needs to run on the main thread; it is best to just try it.
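
    One thing worth trying in that direction: UIImage(ciImage:) defers the actual Core Image rendering until the image is drawn, so another option is to render each frame to a CGImage yourself on the background queue and hand the main thread a ready-made bitmap. A rough sketch, assuming a reusable CIContext stored on the view controller (the names ciContext and renderedImage(from:) are made up here):

    // Core Image contexts are expensive to create, so build one and reuse it.
    let ciContext = CIContext()

    // Renders a CVImageBuffer to a bitmap-backed UIImage; safe to call off the main thread.
    func renderedImage(from imageBuffer: CVImageBuffer) -> UIImage? {
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }

    The background loop would then call renderedImage(from: imageBuffer) and dispatch only the finished UIImage to the main thread.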