Tags: swift, avfoundation, avaudiorecorder, avaudioengine, avaudiofile

How to use AVFAudio's SDK to Record, Play and save audio


I've been trying to use the AVFAudio framework (part of AVFoundation) to record audio, play audio, and change the audio data according to the user's selected presets. I've also been trying to find out how to save files locally to the user's device; however, upon reading Apple's documentation on AVFAudio, I can hardly make sense of which steps to take when creating these files. I've been following along with https://www.raywenderlich.com/21868250-audio-with-avfoundation/lessons/1 and managed to set up some functions here.

Here I have set up saving the audio, but as you can see, this would only save the audio to a temporary directory. I am wondering how I can save the audio file locally to the user's device.

// MARK: Saving audio
    var urlForVocals: URL {
        let fileManager = FileManager.default
        let tempDirectory = fileManager.temporaryDirectory
        let filePath = "TempVocalRecording.caf"
        return tempDirectory.appendingPathComponent(filePath)
    }
    
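For reference, one way to get a persistent location instead of the temporary directory is to build the URL from the app's Documents directory. A minimal sketch, reusing the file name from above (the property name `persistentVocalsURL` is just an example):

```swift
import Foundation

// A sketch: the same computed property, but rooted in the app's
// Documents directory so the recording survives app restarts.
var persistentVocalsURL: URL {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    return documents.appendingPathComponent("TempVocalRecording.caf")
}
```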
    

I am generally confused about AVFoundation's AVFAudio APIs, and the documentation https://developer.apple.com/documentation/avfaudio does not go into specifics of how to implement each method. For example, the docs state that to create an audio player we need init(contentsOf:), but they don't explain what the url parameter is or why we are using it. Can anyone help me understand what steps to take next? I feel like I'm running around in circles trying to understand this framework and the Apple documentation.
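To address the init(contentsOf:) confusion directly: the url is simply the file URL of an audio file on disk, typically the same URL a recorder previously wrote to. A minimal sketch (the file name here is hypothetical):

```swift
import AVFAudio

// The URL passed to AVAudioPlayer(contentsOf:) is just the location of
// an existing audio file on disk -- here, a hypothetical recording in
// the temporary directory.
let fileURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("TempVocalRecording.caf")

do {
    let player = try AVAudioPlayer(contentsOf: fileURL)
    player.play()
} catch {
    // The initializer throws if no readable audio file exists at fileURL.
    print("Could not create player: \(error)")
}
```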


Solution

  • Here's a relatively bare-bones version. See inline comments for what is happening.

    import SwiftUI
    import AVFAudio

    class AudioManager: ObservableObject {
        @Published var canRecord = false
        @Published var isRecording = false
        @Published var audioFileURL : URL?
        private var audioPlayer : AVAudioPlayer?
        private var audioRecorder : AVAudioRecorder?
        
        init() {
            //ask for record permission. IMPORTANT: Make sure you've set `NSMicrophoneUsageDescription` in your Info.plist
            AVAudioSession.sharedInstance().requestRecordPermission() { [unowned self] allowed in
                DispatchQueue.main.async {
                    if allowed {
                        self.canRecord = true
                    } else {
                        self.canRecord = false
                    }
                }
            }
        }
    
        //the URL where the recording file will be stored
        private var recordingURL : URL {
            getDocumentsDirectory().appendingPathComponent("recording.caf")
        }
    
        private func getDocumentsDirectory() -> URL {
            let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
            return paths[0]
        }
        
        func recordFile() {
            do {
                //set the audio session so we can record
                try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default)
                try AVAudioSession.sharedInstance().setActive(true)
                
            } catch {
                print(error)
                self.canRecord = false
                return //bail out rather than crash if the session can't be configured
            }
            //this describes the format the that the file will be recorded in
            let settings = [
                AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
                AVSampleRateKey: 12000,
                AVNumberOfChannelsKey: 1,
                AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
            ]
            do {
                //create the recorder, pointing towards the URL from above
                audioRecorder = try AVAudioRecorder(url: recordingURL,
                                                    settings: settings)
                audioRecorder?.record() //start the recording
                isRecording = true
            } catch {
                print(error)
                isRecording = false
            }
        }
        
        func stopRecording() {
            audioRecorder?.stop()
            isRecording = false
            audioFileURL = recordingURL
        }
        
        func playRecordedFile() {
            guard let audioFileURL = audioFileURL else {
                return
            }
            do {
                //create a player, again pointing towards the same URL
                self.audioPlayer = try AVAudioPlayer(contentsOf: audioFileURL)
                self.audioPlayer?.play()
            } catch {
                print(error)
            }
        }
    }
    
    struct ContentView: View {
        
        @StateObject private var audioManager = AudioManager()
        
        var body: some View {
            VStack {
                if !audioManager.isRecording && audioManager.canRecord {
                    Button("Record") {
                        audioManager.recordFile()
                    }
                } else {
                    Button("Stop") {
                        audioManager.stopRecording()
                    }
                }
                
                if audioManager.audioFileURL != nil && !audioManager.isRecording {
                    Button("Play") {
                        audioManager.playRecordedFile()
                    }
                }
            }
        }
    }
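The question also asks about changing the audio per user-selected presets. AVAudioPlayer can't do that, but AVAudioEngine can, by routing the file through an effect node. A minimal sketch, assuming a time/pitch effect stands in for a "preset" (the function name and pitch value are examples only; you'd keep the engine and player node alive as properties in a real app):

```swift
import AVFAudio

// Sketch: play a recorded file through an effect node so its sound can
// be altered according to a user-selected preset.
func playWithPitchPreset(fileURL: URL, pitchCents: Float) throws {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitch = AVAudioUnitTimePitch()
    pitch.pitch = pitchCents // e.g. 500 shifts up five semitones

    engine.attach(player)
    engine.attach(pitch)

    let file = try AVAudioFile(forReading: fileURL)
    // player -> pitch effect -> main mixer (speaker)
    engine.connect(player, to: pitch, format: file.processingFormat)
    engine.connect(pitch, to: engine.mainMixerNode, format: file.processingFormat)

    player.scheduleFile(file, at: nil)
    try engine.start()
    player.play()
}
```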