ios, swift, video, avcapturesession

How can I record an MP4 video in an MPEG-4 container with AVCaptureSession in Swift (iOS 8)?


I will answer my own question to share my experience, since I could not find complete working code for this on the internet.

iOS devices usually record videos as .mov files in the QuickTime format. Even if the output video uses the AVC Baseline video codec and the AAC audio codec, the resulting file is still in a QuickTime container, and such videos may not play on Android devices. Apple's AVFoundation classes such as AVCaptureSession and AVCaptureMovieFileOutput do not directly support MP4 file output. How can I record an actual MP4 video in an MPEG-4 container with Swift and iOS 8 support?


Solution

  • First things first: This may not be the best solution, but this is a complete solution.

    The code below captures video and audio with AVCaptureSession and converts the recording into MPEG-4 with AVAssetExportSession. There is also zoom in, zoom out and camera-switching functionality, plus permission checking. You can record in 480p or 720p, and you can set minimum and maximum frame rates to keep the files small. Hope this helps as a complete guide. (A stripped-down sketch of just the .mov-to-.mp4 export step follows the full code.)

    Note: These keys must be added to Info.plist to ask for camera, microphone and photo library permissions:

    <key>NSCameraUsageDescription</key>
    <string>Yo, this is a cam app.</string>
    
    <key>NSPhotoLibraryUsageDescription</key>
    <string>Yo, i need to access your photos.</string>
    
    <key>NSMicrophoneUsageDescription</key>
    <string>Yo, i can't hear you</string>
    

    And the code:

    import UIKit
    import Photos
    import AVFoundation
    
    class VideoAct: UIViewController, AVCaptureFileOutputRecordingDelegate
    {
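        // captures video and audio with AVCaptureSession + AVCaptureMovieFileOutput (a QuickTime .mov),
        // then re-exports the recording into an MPEG-4 container in stopVideoAction()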
        let captureSession : AVCaptureSession = AVCaptureSession()
        var captureDevice : AVCaptureDevice!
        var microphone : AVCaptureDevice!
        var previewLayer : AVCaptureVideoPreviewLayer!
        let videoFileOutput : AVCaptureMovieFileOutput = AVCaptureMovieFileOutput()
        var duration : Int = 30
        var v_path : URL = URL(fileURLWithPath: "")
        var my_timer : Timer = Timer()
        var cameraFront : Bool = false
        var cameras_number : Int = 0
        var max_zoom : CGFloat = 76
        var devices : [AVCaptureDevice] = []
        var captureInput : AVCaptureDeviceInput = AVCaptureDeviceInput()
        var micInput : AVCaptureDeviceInput = AVCaptureDeviceInput()
    
        @IBOutlet weak var cameraView: UIView!
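        // outlets referenced below for the camera-switch button, the countdown text view,
        // and the quality / resolution switches (assumed to be wired up in the storyboard)
        @IBOutlet weak var btnSwitchCamera: UIButton!
        @IBOutlet weak var timerTextView: UITextView!
        @IBOutlet weak var switch_quality: UISwitch!
        @IBOutlet weak var switch_res: UISwitch!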
    
        override func viewDidLoad()
        {
            super.viewDidLoad()
            if (check_permissions())
            {
                initialize()
            }
            else
            {
                AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo, completionHandler: { (granted) in
                // the completion handler may run off the main thread, so hop back before touching the UI
                    DispatchQueue.main.async {
                        if (granted)
                        {
                            self.initialize()
                        }
                        else
                        {
                            self.dismiss(animated: true, completion: nil)
                        }
                    }
                })
            }
        }
    
        func check_permissions() -> Bool
        {
            return AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo) ==  AVAuthorizationStatus.authorized
        }
    
        @available(iOS 4.0, *)
        func capture(_ captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAt outputFileURL: URL!, fromConnections connections: [Any]!, error: Error!)
        {
        //you can call stopVideoAction() here if you want (this delegate fires when recording finishes, e.g. when maxRecordedDuration is reached)
        }
    
        func initialize()
        {
            let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            v_path = directory.appendingPathComponent("temp_video.mp4")
        // we just set the extension to .mp4, but the recording is still a mov file
        // in a QuickTime container (it may not play on Android devices);
        // it will be converted after recording
            self.duration = 30
            devices = AVCaptureDevice.devices() as! [AVCaptureDevice]
            for device in devices
            {
                if (device.hasMediaType(AVMediaTypeVideo))
                {
                    // count the cameras so the switch button can be hidden on single-camera devices
                    cameras_number += 1
                    if (device.position == AVCaptureDevicePosition.back)
                    {
                        captureDevice = device as AVCaptureDevice
                    }
                }
                if (device.hasMediaType(AVMediaTypeAudio))
                {
                    microphone = device as AVCaptureDevice
                }
            }
            if (cameras_number == 1)
            {
            //only 1 camera available
                btnSwitchCamera.isHidden = true
            }
            if captureDevice != nil
            {
                beginSession()
                // read the max zoom only after the capture device is known and its format is set
                max_zoom = captureDevice.activeFormat.videoMaxZoomFactor
            }
        }
    
        func beginSession()
        {
            if (captureSession.isRunning)
            {
                captureSession.stopRunning()
            }
            do
            {
                try captureInput = AVCaptureDeviceInput(device: captureDevice)
                try micInput = AVCaptureDeviceInput(device: microphone)
                try captureDevice.lockForConfiguration()
            }
            catch
            {
                print("errorrrrrrrrrrr \(error)")
            }
        // beginconfig before adding input and setting settings
            captureSession.beginConfiguration()
            captureSession.addInput(captureInput)
            captureSession.addInput(micInput)
            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        // UIDeviceOrientation raw values do not map 1:1 to AVCaptureVideoOrientation (e.g. faceUp),
        // so reuse returnedOrientation() instead of force-unwrapping the raw-value initializer
            previewLayer.connection.videoOrientation = returnedOrientation()
            if (previewLayer.connection.isVideoStabilizationSupported)
            {
                previewLayer.connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationMode.auto
            }
            if (captureDevice.isSmoothAutoFocusSupported)
            {
                captureDevice.isSmoothAutoFocusEnabled = false
            }
            if (captureDevice.isFocusModeSupported(AVCaptureFocusMode.continuousAutoFocus))
            {
                captureDevice.focusMode = .continuousAutoFocus
            }
            set_preview_size_thing()
            set_quality_thing()
            if (captureDevice.isLowLightBoostSupported)
            {
                captureDevice.automaticallyEnablesLowLightBoostWhenAvailable = true
            }
        //to prevent preview layers stacking on every camera switch
            if let firstLayer = cameraView.layer.sublayers?.first, firstLayer is AVCaptureVideoPreviewLayer
            {
                cameraView.layer.sublayers?.remove(at: 0)
            }
            cameraView.layer.insertSublayer(previewLayer, at: 0)
            previewLayer?.frame = cameraView.layer.frame
            captureSession.commitConfiguration()
            captureSession.startRunning()
        }
    
        func duration_thing()
        {
        // there is a textview to write remaining time left
            self.duration = self.duration - 1
            timerTextView.text = "remaining seconds: \(self.duration)"
            timerTextView.sizeToFit()
            if (self.duration == 0)
            {
                my_timer.invalidate()
                stopVideoAction()
            }
        }
    
        func switch_cam()
        {
            captureSession.removeInput(captureInput)
            captureSession.removeInput(micInput)
            cameraFront = !cameraFront
        // capturedevice will be locked again
            captureDevice.unlockForConfiguration()
            for device in devices
            {
                if (device.hasMediaType(AVMediaTypeVideo))
                {
                    if (device.position == AVCaptureDevicePosition.back && !cameraFront)
                    {
                        captureDevice = device as AVCaptureDevice
                    }
                    else if (device.position == AVCaptureDevicePosition.front && cameraFront)
                    {
                        captureDevice = device as AVCaptureDevice
                    }
                }
            }
            beginSession()
        }
    
        func zoom_in()
        {
        // 10x zoom would be enough
            if (captureDevice.videoZoomFactor * 1.5 < 10)
            {
                captureDevice.videoZoomFactor = captureDevice.videoZoomFactor * 1.5
            }
            else
            {
                captureDevice.videoZoomFactor = 10
            }
        }
    
        func zoom_out()
        {
            if (captureDevice.videoZoomFactor * 0.67 > 1)
            {
                captureDevice.videoZoomFactor = captureDevice.videoZoomFactor * 0.67
            }
            else
            {
                captureDevice.videoZoomFactor = 1
            }
        }
    
        func set_quality_thing()
        {
        // there is a switch on the screen (steady 30 fps for high quality, or 15-23 fps for normal quality)
        // you may not have to do this because the export session also has presets and a
        // shouldOptimizeForNetworkUse property, but it is better to make sure the output file
        // is not huge with an unnecessary 90 fps video
        // note: min frame duration = 1 / (max fps) and max frame duration = 1 / (min fps)
            captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, switch_quality.isOn ? 30 : 23)
            captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, switch_quality.isOn ? 30 : 15)
        }
    
        func set_preview_size_thing()
        {
        //there is a switch for resolution (720p or 480p)
            captureSession.sessionPreset = switch_res.isOn ? AVCaptureSessionPreset1280x720 : AVCaptureSessionPreset640x480
        //this for loop is probably unnecessary and ridiculous but you can make sure you are using the right format
            for some_format in captureDevice.formats as! [AVCaptureDeviceFormat]
            {
                let some_desc : String = String(describing: some_format)
                if (switch_res.isOn)
                {
                    if (some_desc.contains("1280x") && some_desc.contains("720") && some_desc.contains("420v") && some_desc.contains("30 fps"))
                    {
                        captureDevice.activeFormat = some_format
                        break
                    }
                }
                else
                {
                    if (some_desc.contains("640x") && some_desc.contains("480") && some_desc.contains("420v"))
                    {
                        captureDevice.activeFormat = some_format
                        break
                    }
                }
            }
        }
    
        func takeVideoAction()
        {
        // movieFragmentInterval is important !! or you may end up with a video without audio
            videoFileOutput.movieFragmentInterval = kCMTimeInvalid
        // canAddOutput is false if the movie output was already added on a previous recording
            if (captureSession.canAddOutput(videoFileOutput))
            {
                captureSession.addOutput(videoFileOutput)
            }
            (videoFileOutput.connections.first as! AVCaptureConnection).videoOrientation = returnedOrientation()
            videoFileOutput.maxRecordedDuration = CMTime(seconds: Double(self.duration), preferredTimescale: 1)
            videoFileOutput.startRecording(toOutputFileURL: v_path, recordingDelegate: self)
        //timer will tell the remaining time
            my_timer = Timer.scheduledTimer(timeInterval: 1, target: self, selector: #selector(duration_thing), userInfo: nil, repeats: true)
        }
    
        func stopVideoAction()
        {
            captureDevice.unlockForConfiguration()
            videoFileOutput.stopRecording()
            captureSession.stopRunning()
        // turn temp_video into an .mpeg4 (mp4) video
            let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            let avAsset = AVURLAsset(url: v_path, options: nil)
        // there are other presets than AVAssetExportPresetPassthrough
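        // passthrough keeps the original H.264/AAC samples and only rewrites the container,
        // so the export is fast and does not re-encode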
            let exportSession = AVAssetExportSession(asset: avAsset, presetName: AVAssetExportPresetPassthrough)!
            let outputURL = directory.appendingPathComponent("main_video.mp4")
        // the export fails if a file already exists at the output url, so remove any previous export first
            try? FileManager.default.removeItem(at: outputURL)
            exportSession.outputURL = outputURL
        // now it is actually in an mpeg4 container
            exportSession.outputFileType = AVFileTypeMPEG4
        // export the whole recording, from time zero to the asset duration
            let range = CMTimeRangeMake(kCMTimeZero, avAsset.duration)
            exportSession.timeRange = range
            exportSession.exportAsynchronously(completionHandler: {
                if (exportSession.status == AVAssetExportSessionStatus.completed)
                {
            // you don’t need temp video after exporting main_video
                    do
                    {
                        try FileManager.default.removeItem(atPath: self.v_path.path)
                    }
                    catch
                    {
                    }
            // v_path now points to the mp4 main_video
                    self.v_path = directory.appendingPathComponent("main_video.mp4")
            // the completion handler runs on a background queue, so switch to the main thread before the segue
                    DispatchQueue.main.async {
                        self.performSegue(withIdentifier: "ShareVideoController", sender: nil)
                    }
                }
            })
        }
    
        func btn_capture_click_listener()
        {
            if (videoFileOutput.isRecording)
            {
                stopVideoAction()
            }
            else
            {
                takeVideoAction()
            }
        }
    
        func returnedOrientation() -> AVCaptureVideoOrientation
        {
            var videoOrientation: AVCaptureVideoOrientation!
            let orientation = UIDevice.current.orientation
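            // UIDevice and AVCaptureVideo orientations are mirrored in landscape:
            // a device held in landscapeLeft produces landscapeRight video, and vice versa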
            switch orientation
            {
            case .landscapeLeft:
                videoOrientation = .landscapeRight
            case .landscapeRight:
                videoOrientation = .landscapeLeft
            default:
                videoOrientation = .landscapeLeft
            }
            return videoOrientation
        }
    
        override func prepare(for segue: UIStoryboardSegue, sender: Any?)
        {
            if (segue.identifier == "ShareVideoController")
            {
                //to make it visible in the camera roll (main_video.mp4)
                PHPhotoLibrary.shared().performChanges({
                    PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: self.v_path)
                }, completionHandler: { completed, error in })
                let destVC : ShareVideoController = segue.destination as! ShareVideoController
            // use the path in the other screen to upload it or whatever
                destVC.videoFilePath = v_path
            // bla bla
            }
        }
    
        override var supportedInterfaceOrientations: UIInterfaceOrientationMask
        {
        // screen will always be in landscape (remove this override if you want)
            return .landscape
        }
    }
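
    If you only need the .mov-to-.mp4 conversion step on its own, here is that part in isolation. This is just a minimal sketch of the same AVAssetExportSession approach used in stopVideoAction above; sourceURL, destURL and the completion closure are placeholder names, not part of the class:

    import AVFoundation

    func exportToMP4(sourceURL: URL, destURL: URL, completion: @escaping (Bool) -> Void)
    {
        let asset = AVURLAsset(url: sourceURL, options: nil)
        // passthrough copies the recorded H.264/AAC samples and only rewrites the container
        guard let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetPassthrough) else
        {
            completion(false)
            return
        }
        // remove any previous file at the destination, otherwise the export fails
        try? FileManager.default.removeItem(at: destURL)
        session.outputURL = destURL
        session.outputFileType = AVFileTypeMPEG4
        session.shouldOptimizeForNetworkUse = true
        session.exportAsynchronously {
            completion(session.status == .completed)
        }
    }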