Tags: watchkit, avaudioplayer, avaudioengine, avaudioplayernode

Repeating Audio in WatchKit (AVAudioPlayer?)?


I want to loop a local audio file in my Apple Watch app. Currently I am using AVAudioPlayerNode and AVAudioEngine, which work well, but I cannot figure out how to loop the sound.

I noticed that I could use AVAudioPlayer, which has the handy "numberOfLoops" property, but for some reason AVAudioPlayer is not working on the watch, and I have no idea why.

Here is my current code to play a sound:

// Set up the player node and attach it to the engine
_audioPlayer = [[AVAudioPlayerNode alloc] init];
_audioEngine = [[AVAudioEngine alloc] init];
[_audioEngine attachNode:_audioPlayer];

AVAudioFormat *stereoFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100 channels:2];
[_audioEngine connect:_audioPlayer to:_audioEngine.mainMixerNode format:stereoFormat];

if (!_audioEngine.isRunning) {
    NSError *error;
    [_audioEngine startAndReturnError:&error];
}

// Load the file from the app bundle and schedule it for a single playthrough
NSError *error;
NSBundle *appBundle = [NSBundle mainBundle];
NSURL *url = [NSURL fileURLWithPath:[appBundle pathForResource:@"FILE_NAME" ofType:@"mp3"]];
AVAudioFile *asset = [[AVAudioFile alloc] initForReading:url error:&error];

[_audioPlayer scheduleFile:asset atTime:nil completionHandler:nil];
[_audioPlayer play];

Here is the code I've tried with AVAudioPlayer, which does not work:

NSError *audioError;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"FILE_NAME" ofType:@"mp3"]] error:&audioError];
player.numberOfLoops = -1; // numberOfLoops is an NSInteger; -1 means loop indefinitely (MAXFLOAT is not a valid value)
player.delegate = self;
[player play];

I am using WatchKit 5.0(+).


Solution

  • If your audio file fits into memory, you could schedule playback as an AVAudioPCMBuffer with the AVAudioPlayerNodeBufferLoops option (N.B. only tested in the simulator!):

    // Use the player node's output format so the looped buffer matches what the mixer expects
    AVAudioFormat *outputFormat = [_audioPlayer outputFormatForBus:0];

    // Read the whole file into a buffer in its processing format
    __block AVAudioPCMBuffer *srcBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:asset.processingFormat frameCapacity:(AVAudioFrameCount)asset.length];

    if (![asset readIntoBuffer:srcBuffer error:&error]) {
        NSLog(@"Read error: %@", error);
        abort();
    }

    // Destination buffer in the output format (the capacity assumes the sample rates match)
    AVAudioPCMBuffer *dstBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)asset.length];

    // Convert the source buffer to the output format in a single pass
    AVAudioConverter *converter = [[AVAudioConverter alloc] initFromFormat:srcBuffer.format toFormat:dstBuffer.format];
    AVAudioConverterOutputStatus status = [converter convertToBuffer:dstBuffer error:&error withInputFromBlock:^AVAudioBuffer * _Nullable(AVAudioPacketCount inNumberOfPackets, AVAudioConverterInputStatus * _Nonnull outStatus) {
        if (srcBuffer) {
            // Hand the converter the entire source buffer exactly once
            AVAudioBuffer *result = srcBuffer;
            srcBuffer = nil;
            *outStatus = AVAudioConverterInputStatus_HaveData;
            return result;
        } else {
            *outStatus = AVAudioConverterInputStatus_EndOfStream;
            return nil;
        }
    }];

    assert(status != AVAudioConverterOutputStatus_Error);

    // AVAudioPlayerNodeBufferLoops repeats the buffer until the player is stopped
    [_audioPlayer scheduleBuffer:dstBuffer atTime:nil options:AVAudioPlayerNodeBufferLoops completionHandler:nil];
    [_audioPlayer play];
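
  • If the file is too large to keep in memory, a commonly used alternative (just a sketch, not part of the original answer; the scheduleLoopingFile: helper name and the reuse of the question's _audioPlayer ivar are assumptions) is to re-schedule the AVAudioFile from the scheduleFile: completion handler. Note the handler fires once the player has consumed the file, so a brief gap between repeats is possible:

    - (void)scheduleLoopingFile:(AVAudioFile *)file {
        // Hypothetical helper: schedule one pass of the file, then schedule the
        // next pass from the completion handler so playback keeps repeating.
        // A production version needs a flag to stop re-scheduling, because the
        // completion handler is also called when the player node is stopped.
        __weak typeof(self) weakSelf = self;
        [_audioPlayer scheduleFile:file atTime:nil completionHandler:^{
            typeof(self) strongSelf = weakSelf;
            if (strongSelf) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [strongSelf scheduleLoopingFile:file];
                });
            }
        }];
    }

    Usage would then be [self scheduleLoopingFile:asset]; followed by [_audioPlayer play];. With the buffer approach above, the loop instead runs until [_audioPlayer stop] is called.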