I have been experimenting with AudioKit and built a sample app that plots audio both while recording and during playback. I'm seeing an issue, though: when I record or play back audio, the rolling waveform doesn't show up in the view on a device, yet it shows up perfectly in the simulator (iOS 11.4). I've provided the recording view controller code below for context on how I'm implementing this while recording audio.
Any help, or a pointer in the right direction, would be greatly appreciated.
RecordingVC.m code:
#import "FirstViewController.h"
@interface FirstViewController ()
@end
@implementation FirstViewController
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
[self setupConfig];
[self setupUI];
}
- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}
- (void)setupUI
{
    // Configure the waveform view
    self.recordingPlotView.gain = 2;
    self.recordingPlotView.backgroundColor = [UIColor colorWithRed:0.10 green:0.10 blue:0.10 alpha:1];
    self.recordingPlotView.color = [UIColor colorWithRed:0.44 green:0.44 blue:0.44 alpha:1];
    self.recordingPlotView.plotType = EZPlotTypeRolling;
    self.recordingPlotView.shouldFill = YES;
    self.recordingPlotView.shouldMirror = YES;
    [self.view addSubview:self.recordingPlotView];
}
- (void)setupConfig
{
    self.isRecording = NO;
    [AKSettings setAudioInputEnabled:true];
    [AKSettings setPlaybackWhileMuted:true];
    // Note: AVAudioSessionCategoryAmbient does not allow input, and
    // kAudioSessionProperty_OverrideCategoryDefaultToSpeaker is not a valid
    // AVAudioSessionCategoryOptions value; recording needs PlayAndRecord
    // with the DefaultToSpeaker option.
    [AVAudioSession.sharedInstance setCategory:AVAudioSessionCategoryPlayAndRecord
                                   withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
                                         error:nil];
    self.mic = [[EZMicrophone alloc] initWithMicrophoneDelegate:self];
}
#pragma mark - EZMicrophone Delegate methods

- (void)microphone:(EZMicrophone *)microphone
  hasAudioReceived:(float **)buffer
    withBufferSize:(UInt32)bufferSize
withNumberOfChannels:(UInt32)numberOfChannels
{
    // This is called on the audio thread; hop to the main queue before
    // touching the plot.
    __weak typeof(self) weakSelf = self;
    dispatch_async(dispatch_get_main_queue(), ^{
        [weakSelf.recordingPlotView updateBuffer:buffer[0]
                                  withBufferSize:bufferSize];
    });
}
- (void)microphone:(EZMicrophone *)microphone
     hasBufferList:(AudioBufferList *)bufferList
    withBufferSize:(UInt32)bufferSize
withNumberOfChannels:(UInt32)numberOfChannels
{
    // Append the raw buffers straight to the file while recording.
    if (self.isRecording)
    {
        [self.recorder appendDataFromBufferList:bufferList
                                 withBufferSize:bufferSize];
    }
}
#pragma mark - EZRecorder Delegate methods

- (void)recorderDidClose:(EZRecorder *)recorder
{
    self.recorder.delegate = nil;
}
#pragma mark - Utils

- (NSArray *)applicationDocuments
{
    return NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
}

- (NSString *)applicationDocumentsDirectory
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *basePath = ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
    return basePath;
}

- (NSURL *)testFilePathURL
{
    return [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/%@",
                                   [self applicationDocumentsDirectory],
                                   @"test2.m4a"]];
}
#pragma mark - User interaction

- (IBAction)playButtonTapped:(id)sender {
    if (self.isRecording)
    {
        self.isRecording = NO;
        // Use setTitle:forState: rather than assigning to titleLabel.text,
        // which UIButton overwrites on state changes.
        [self.playButton setTitle:@"Record" forState:UIControlStateNormal];
        [self.mic stopFetchingAudio];
    }
    else
    {
        self.isRecording = YES;
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
        [self.mic startFetchingAudio];
        self.recorder = [EZRecorder recorderWithURL:[self testFilePathURL]
                                       clientFormat:[self.mic audioStreamBasicDescription]
                                           fileType:EZRecorderFileTypeM4A
                                           delegate:self];
    }
}
- (IBAction)stopButtonTapped:(id)sender {
    if (self.isRecording)
    {
        self.isRecording = NO;
        [self.playButton setTitle:@"Record" forState:UIControlStateNormal];
        [self.mic stopFetchingAudio];
        [self.recorder closeAudioFile];
    }
    [self.recordingPlotView clear];
    self.recorder = nil;
}
@end
RecordingVC.h code:
#import <UIKit/UIKit.h>
@import AudioKit;
@import AudioKitUI;

@interface FirstViewController : UIViewController <EZMicrophoneDelegate, EZRecorderDelegate>

@property (strong, nonatomic) IBOutlet EZAudioPlot *recordingPlotView;
@property (nonatomic, strong) EZMicrophone *mic;
@property (nonatomic, strong) EZRecorder *recorder;
@property (nonatomic, assign) BOOL isRecording;
@property (strong, nonatomic) IBOutlet UIButton *playButton;

@end
Small update: I've managed to get the playback waveform displaying on device by setting the plot's gain in Interface Builder, even though I was already setting it in code in viewDidLoad.
I've tried doing the same (setting the gain for the plot in Interface Builder) for the recording VC (the code above), but that did not solve it the way it did for the playback VC.
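In case it's a timing issue, here's a minimal sketch of what I'm trying next (an assumption on my part, not a confirmed fix): deferring the plot configuration to viewDidAppear: so it runs once the view is actually on screen, which is the closest I can get in code to whatever Interface Builder is doing for the playback VC:

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    // Assumption: re-applying the gain after the view is on screen may have
    // the same effect as setting it in Interface Builder did for playback.
    self.recordingPlotView.gain = 2;
    [self.recordingPlotView setNeedsDisplay];
}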
I ran your project, and it works on the device the same as in the simulator, except that the simulator's microphone is the computer's, which seems much more sensitive than the device's, so I had to set the gain higher:
self.recordingPlotView.gain = 20;
before I noticed the waveform.
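One more thing you might rule out on a device (just a hedged sketch; your project already ran for me, so this may not apply): if record permission was never granted, the mic delivers silence and the plot stays flat, and the simulator doesn't always behave the same way. You'd also need an NSMicrophoneUsageDescription entry in Info.plist on iOS 10+. Something like gating the mic start on the permission callback:

// Sketch: only start pulling audio once the user has granted mic access.
[[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            [self.mic startFetchingAudio];
        } else {
            NSLog(@"Microphone permission denied; the waveform will stay flat.");
        }
    });
}];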