Tags: ios, swift, avfoundation, video-processing, cmsamplebufferref

Square video output in iOS


Is there a way to get square video output with AVFoundation in iOS?

I use OpenGL to process every frame (CMSampleBuffer) of video. Every frame is rotated, so I need to crop and rotate each CMSampleBuffer. I don't know how to do that, so I'm hoping there is a way to get frames that are already cropped and rotated by setting properties (videoSettings) on AVCaptureVideoDataOutput.

I have googled and googled but found nothing. A code example in Swift would be great.

Update:

My full final solution in Swift:

override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)

    // Set up a capture session with the 640x480 preset.
    captureSession = AVCaptureSession()
    captureSession!.sessionPreset = AVCaptureSessionPreset640x480

    // Use the default video device (the back camera) as input.
    let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    var error: NSError?
    var input: AVCaptureDeviceInput!
    do {
        input = try AVCaptureDeviceInput(device: backCamera)
    } catch let error1 as NSError {
        error = error1
        input = nil
    }
    if error == nil && captureSession!.canAddInput(input) {
        captureSession!.addInput(input)

        // Still image output for JPEG photo capture.
        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput!.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG, kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_32BGRA)]
        if captureSession!.canAddOutput(stillImageOutput) {
            captureSession!.addOutput(stillImageOutput)
        }
    }

    // Video data output that delivers each frame to the sample buffer delegate.
    // BGRA is for the OpenGL processing; the width/height keys were my attempt to get cropped frames.
    videoOutput = AVCaptureVideoDataOutput()
    videoOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_32BGRA), AVVideoWidthKey: 100, AVVideoHeightKey: 100]
    videoOutput!.setSampleBufferDelegate(self, queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))

    if captureSession!.canAddOutput(self.videoOutput) {
        captureSession!.addOutput(self.videoOutput)
    }

    // Rotate and mirror the frames delivered on the video connection.
    videoOutput!.connectionWithMediaType(AVMediaTypeVideo).videoOrientation = AVCaptureVideoOrientation.PortraitUpsideDown
    videoOutput!.connectionWithMediaType(AVMediaTypeVideo).videoMirrored = true

    captureSession!.startRunning()
}

It mirrors and rotates the video output perfectly for me, but it isn't cropping!
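
For reference, the frames registered with setSampleBufferDelegate above arrive in the AVCaptureVideoDataOutputSampleBufferDelegate callback. A minimal sketch of that callback in the same Swift 2-era syntax, assuming the view controller itself is the delegate, looks like this (any cropping would still have to be done here by hand):

func captureOutput(captureOutput: AVCaptureOutput!,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                   fromConnection connection: AVCaptureConnection!) {
    // Each video frame arrives here as a CMSampleBuffer on the serial queue set above.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // Still the full (uncropped) frame dimensions, rotated by the connection settings above.
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    print("Got a \(width)x\(height) frame")
}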


Solution

  • To rotate the CMSampleBuffer, you should attend to this Apple technote:

    https://developer.apple.com/library/ios/qa/qa1744/_index.html

    In particular, if you want to physically rotate the video (as opposed to just setting an orientation flag), you can set the orientation on the capture connection.

    For example, in the callback:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput 
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
           fromConnection:(AVCaptureConnection *)connection 
    

    if you do this:

        [connection setVideoOrientation:AVCaptureVideoOrientationPortraitUpsideDown];
    

    you will get an upside-down video.

    To crop the video, you need to use an AVAssetWriterInput; its outputSettings dictionary (the videoSettings below) is where you set the output dimensions.

    For example:

    NSDictionary *videoSettings = @{
        AVVideoCodecKey  : AVVideoCodecH264,
        AVVideoWidthKey  : @(100),
        AVVideoHeightKey : @(100)
    };
    

    used here:

      self.assetWriterVideoInput = [[AVAssetWriterInput alloc]
                                   initWithMediaType:AVMediaTypeVideo
                                      outputSettings:videoSettings];
    

    will give you a video sized to 100 x 100 px: the full width is kept, and the height is cropped to make the frame square.

    Check out AVVideoSettings.h for the full list of keys. A rough Swift version of the same setup is sketched below.
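
    Since the question asks for Swift, here is a minimal sketch of the same AVAssetWriterInput setup translated to Swift, using the same Swift 2-era constants as the question's code. The output URL, the MPEG-4 file type, the real-time flag and the scaling-mode comment are illustrative assumptions, not something stated in the answer above:

        import AVFoundation

        // Square 100 x 100 output settings, matching the Objective-C dictionary above.
        let videoSettings: [String : AnyObject] = [
            AVVideoCodecKey  : AVVideoCodecH264,
            AVVideoWidthKey  : 100,
            AVVideoHeightKey : 100
            // AVVideoScalingModeKey (see AVVideoSettings.h) can be added here to
            // control how the source frame is fitted or cropped into these dimensions.
        ]

        // Hypothetical destination file, purely for illustration.
        let outputURL = NSURL(fileURLWithPath: NSTemporaryDirectory() + "square.mp4")
        let assetWriter = try! AVAssetWriter(URL: outputURL, fileType: AVFileTypeMPEG4)

        let assetWriterVideoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo,
                                                       outputSettings: videoSettings)
        assetWriterVideoInput.expectsMediaDataInRealTime = true

        if assetWriter.canAddInput(assetWriterVideoInput) {
            assetWriter.addInput(assetWriterVideoInput)
        }

        // Later, inside captureOutput(_:didOutputSampleBuffer:fromConnection:),
        // each incoming frame is handed to the writer input:
        //
        //     if assetWriterVideoInput.readyForMoreMediaData {
        //         assetWriterVideoInput.appendSampleBuffer(sampleBuffer)
        //     }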