Use Case
I'm using the iOS 11 ReplayKit framework to try to record frames from the screen, plus audio from both the app and the microphone.
Problem
Randomly, when I call .append(sampleBuffer), the writer transitions to AVAssetWriterStatus.failed, with assetWriter.error showing:
Error Domain=AVFoundationErrorDomain Code=-11823 "Cannot Save" UserInfo={NSLocalizedRecoverySuggestion=Try saving again., NSLocalizedDescription=Cannot Save, NSUnderlyingError=0x1c044c360 {Error Domain=NSOSStatusErrorDomain Code=-12412 "(null)"}}
Side issue: I play a repeating sound while the app is recording to verify that audio is captured, but the sound stops as soon as I start recording, even though the video and the external mic audio are working.
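(For the side issue: app playback stopping when capture starts is often an AVAudioSession configuration matter rather than a ReplayKit bug. The sketch below is an assumption, not code from this project; the category and options shown are one common combination that lets playback continue while the mic is active.)

```swift
import AVFoundation

// Hypothetical sketch: configure the shared audio session before starting
// capture so the app's own playback is not silenced when the mic engages.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        // .playAndRecord allows simultaneous playback and recording;
        // .defaultToSpeaker keeps playback on the speaker instead of the receiver.
        try session.setCategory(AVAudioSessionCategoryPlayAndRecord,
                                with: [.defaultToSpeaker, .mixWithOthers])
        try session.setActive(true)
    } catch {
        print("AVAudioSession setup failed: \(error)")
    }
}
```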
If you require more info, I can upload the other code to GitHub too.
Ideas
Since the recording sometimes saves successfully (I can export to the Photos app and replay the video), I think it must be an async issue where I'm calling things out of order. Please let me know if you spot any!
One idea I will try is saving to my own subfolder of /Documents instead of directly to /Documents, in case of weird permissions errors. Although I'd expect that to cause consistent errors, rather than only sometimes breaking.
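(That idea can be sketched as below; the folder name "Recordings" and the helper name are assumptions for illustration.)

```swift
import Foundation

// Hypothetical sketch: create (once) and return a dedicated subfolder of
// Documents, instead of writing straight into Documents.
func recordingsDirectory() throws -> URL {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let recordings = documents.appendingPathComponent("Recordings", isDirectory: true)
    // withIntermediateDirectories: true makes this a no-op if it already exists.
    try FileManager.default.createDirectory(at: recordings,
                                            withIntermediateDirectories: true,
                                            attributes: nil)
    return recordings
}
```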
My Code
func startRecording() {
    guard let firstDocumentDirectoryPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first else { return }
    let directoryContents = try! FileManager.default.contentsOfDirectory(at: URL(fileURLWithPath: firstDocumentDirectoryPath), includingPropertiesForKeys: nil, options: [])
    print(directoryContents)
    videoURL = URL(fileURLWithPath: firstDocumentDirectoryPath.appending("/\(arc4random()).mp4"))
    print(videoURL.absoluteString)
    assetWriter = try! AVAssetWriter(url: videoURL, fileType: AVFileType.mp4)
    let compressionProperties: [String: Any] = [...]
    let videoSettings: [String: Any] = [...]
    let audioSettings: [String: Any] = [...]
    videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    audioMicInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
    audioAppInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
    guard let assetWriter = assetWriter else { return }
    guard let videoInput = videoInput else { return }
    guard let audioAppInput = audioAppInput else { return }
    guard let audioMicInput = audioMicInput else { return }
    videoInput.mediaTimeScale = 60
    videoInput.expectsMediaDataInRealTime = true
    audioMicInput.expectsMediaDataInRealTime = true
    audioAppInput.expectsMediaDataInRealTime = true
    if assetWriter.canAdd(videoInput) {
        assetWriter.add(videoInput)
    }
    if assetWriter.canAdd(audioAppInput) {
        assetWriter.add(audioAppInput)
    }
    if assetWriter.canAdd(audioMicInput) {
        assetWriter.add(audioMicInput)
    }
    assetWriter.movieTimeScale = 60
    RPScreenRecorder.shared().startCapture(handler: recordingHandler(sampleBuffer:sampleBufferType:error:)) { (error: Error?) in
        if error != nil {
            print("RPScreenRecorder.shared().startCapture: \(error.debugDescription)")
        } else {
            print("start capture complete")
        }
    }
}
func recordingHandler(sampleBuffer: CMSampleBuffer, sampleBufferType: RPSampleBufferType, error: Error?) {
    if error != nil {
        print("recordingHandler: \(error.debugDescription)")
    }
    if CMSampleBufferDataIsReady(sampleBuffer) {
        guard let assetWriter = assetWriter else { return }
        guard let videoInput = videoInput else { return }
        guard let audioAppInput = audioAppInput else { return }
        guard let audioMicInput = audioMicInput else { return }
        if assetWriter.status == AVAssetWriterStatus.unknown {
            print("AVAssetWriterStatus.unknown")
            if !assetWriter.startWriting() {
                return
            }
            assetWriter.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        }
        if assetWriter.status == AVAssetWriterStatus.failed {
            print("AVAssetWriterStatus.failed")
            print("assetWriter.error: \(assetWriter.error.debugDescription)")
            return
        }
        if sampleBufferType == RPSampleBufferType.video {
            if videoInput.isReadyForMoreMediaData {
                print("=appending video data")
                videoInput.append(sampleBuffer)
            }
        }
        if sampleBufferType == RPSampleBufferType.audioApp {
            if audioAppInput.isReadyForMoreMediaData {
                print("==appending app audio data")
                audioAppInput.append(sampleBuffer)
            }
        }
        if sampleBufferType == RPSampleBufferType.audioMic {
            if audioMicInput.isReadyForMoreMediaData {
                print("===appending mic audio data")
                audioMicInput.append(sampleBuffer)
            }
        }
    }
}
func stopRecording() {
    RPScreenRecorder.shared().stopCapture { (error) in
        guard let assetWriter = self.assetWriter else { return }
        guard let videoInput = self.videoInput else { return }
        guard let audioAppInput = self.audioAppInput else { return }
        guard let audioMicInput = self.audioMicInput else { return }
        if error != nil {
            print("recordingHandler: \(error.debugDescription)")
        } else {
            videoInput.markAsFinished()
            audioMicInput.markAsFinished()
            audioAppInput.markAsFinished()
            assetWriter.finishWriting(completionHandler: {
                print(self.videoURL)
                self.saveToCameraRoll(URL: self.videoURL)
            })
        }
    }
}
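(The settings dictionaries are elided above with `[...]`. For readers who want something concrete, a typical H.264 + AAC configuration might look like the sketch below. Every key and value here is an assumption about what such code usually contains, not the original settings.)

```swift
import AVFoundation

// Hypothetical example settings; adjust dimensions and bitrates to your needs.
let compressionProperties: [String: Any] = [
    AVVideoAverageBitRateKey: 6_000_000,
    AVVideoProfileLevelKey: AVVideoProfileLevelH264HighAutoLevel
]
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,   // AVVideoCodecType is iOS 11+
    AVVideoWidthKey: 1080,
    AVVideoHeightKey: 1920,
    AVVideoCompressionPropertiesKey: compressionProperties
]
let audioSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44_100.0,
    AVEncoderBitRateKey: 128_000
]
```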
I got it to work. I believe it was indeed an async issue: for some reason, you must make sure that
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
happen strictly serially, with no appends racing in between.
Change your code from this:
if assetWriter.status == AVAssetWriterStatus.unknown {
    print("AVAssetWriterStatus.unknown")
    if !assetWriter.startWriting() {
        return
    }
    assetWriter.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
}
to this:
DispatchQueue.main.async { [weak self] in
    // Unwrap first: optional chaining like `!self?.assetWriter.startWriting()`
    // produces a Bool? and will not compile as a condition.
    guard let assetWriter = self?.assetWriter else { return }
    if assetWriter.status == AVAssetWriterStatus.unknown {
        print("AVAssetWriterStatus.unknown")
        if !assetWriter.startWriting() {
            return
        }
        assetWriter.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }
}
Or, better still, dispatch the whole block inside the CMSampleBufferDataIsReady check, i.e.:
if CMSampleBufferDataIsReady(sampleBuffer) {
    DispatchQueue.main.async { [weak self] in
        ...
        ...
    }
}
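(As an alternative to hopping to the main queue, a private serial DispatchQueue gives the same strict ordering without tying writer work to the UI thread. The sketch below is an assumption about how that could look; the queue label and the `...` body are placeholders, not code from the answer above.)

```swift
import ReplayKit
import AVFoundation

// Hypothetical: one private serial queue owns every touch of the writer, so
// startWriting()/startSession(...) can never interleave with append(...).
let writerQueue = DispatchQueue(label: "com.example.assetWriterQueue")

func recordingHandler(sampleBuffer: CMSampleBuffer, sampleBufferType: RPSampleBufferType, error: Error?) {
    writerQueue.async { [weak self] in
        guard let assetWriter = self?.assetWriter,
              CMSampleBufferDataIsReady(sampleBuffer) else { return }
        if assetWriter.status == AVAssetWriterStatus.unknown {
            guard assetWriter.startWriting() else { return }
            assetWriter.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        }
        // ... append to the matching input exactly as before ...
    }
}
```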
Let me know if it works!