Tags: android, android-studio, kotlin, voip, android-broadcast

Why this error in Android AudioTrack? AudioTrackShared: releaseBuffer: mUnreleased out of range


I'm trying to broadcast audio between apps. This is my AudioRecord:

private var audiorecord: AudioRecord? = null
private val SAMPLER = 16000 //Sample Audio Rate
private val CHANNEL_CONFIG: Int = AudioFormat.CHANNEL_IN_MONO
private val AUDIO_FORMAT: Int = AudioFormat.ENCODING_PCM_16BIT
private var BUFFER_SIZE = AudioRecord.getMinBufferSize(SAMPLER, CHANNEL_CONFIG, AUDIO_FORMAT)

    audiorecord = AudioRecord(
        MediaRecorder.AudioSource.MIC,
        SAMPLER,
        CHANNEL_CONFIG,
        AUDIO_FORMAT,
        BUFFER_SIZE
    )
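
Roughly, the capture loop looks like this (just a sketch; the isStreaming flag and the actual WebSocket send are placeholders, not the real code from my app):

    val buffer = ByteArray(BUFFER_SIZE)
    audiorecord?.startRecording()
    while (isStreaming) { // isStreaming: a flag toggled from the UI (placeholder)
        val read = audiorecord?.read(buffer, 0, buffer.size) ?: 0
        if (read > 0) {
            // send the first `read` bytes of `buffer` over the WebSocket here
        }
    }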

I send this audio over a WebSocket connection and try to play it in an AudioTrack, but I get this error and no real audio (I'm testing on a phone, so I hear some noise, but no actual sound):

A/AudioTrackShared: releaseBuffer: mUnreleased out of range, !(stepCount:4 <= mUnreleased:0 <= mFrameCount:22050), BufferSizeInFrames:22050

A/libc: Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 9569 (Thread-3), pid 18292 (mple.testaudio2)

My AudioTrack:

    var BUFFER_SIZE = AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_VOICE_CALL)

    val attribBuilder = AudioAttributes.Builder()
    attribBuilder.setContentType(AudioAttributes.CONTENT_TYPE_SPEECH)
    attribBuilder.setUsage(AudioAttributes.USAGE_VOICE_COMMUNICATION)

    val attributes = attribBuilder.build()

    // Build audio format

    val afBuilder = AudioFormat.Builder()
    afBuilder.setChannelMask(AudioFormat.CHANNEL_OUT_MONO)
    afBuilder.setEncoding(AudioFormat.ENCODING_PCM_8BIT)
    afBuilder.setSampleRate(BUFFER_SIZE)

    val format = afBuilder.build()


    audioTrack = AudioTrack(
        attributes,
        format,
        BUFFER_SIZE,
        AudioTrack.MODE_STREAM,
        AudioManager.AUDIO_SESSION_ID_GENERATE
    )

I'm not sure about the audio attributes and audio format settings. This code crashes my app. Why?


Solution

  • One possible problem is the third argument that you are passing in the constructor:

    audioTrack = AudioTrack(
        attributes,
        format,
        BUFFER_SIZE, // <-- this one
        AudioTrack.MODE_STREAM,
        AudioManager.AUDIO_SESSION_ID_GENERATE
    )
    

    According to the docs, this should be the buffer size in bytes. You are passing BUFFER_SIZE, which sounds right based on the name, but that variable is defined as

    var BUFFER_SIZE = AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_VOICE_CALL)
    

    So it is actually a sample rate, not a buffer size. I would recommend renaming the variable, since the current name is misleading. Pass a proper buffer size in bytes (or simply a generously large value) as the third argument and see if that fixes the crash; a sketch follows below.
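
    A minimal sketch of that fix, keeping the attributes and format from the question unchanged and only replacing the third argument with a byte count obtained from AudioTrack.getMinBufferSize() (the *2 multiplier is an arbitrary choice of headroom):

    // This value is the native output sample rate, not a buffer size.
    val sampleRate = AudioTrack.getNativeOutputSampleRate(AudioManager.STREAM_VOICE_CALL)

    // Minimum buffer size in bytes for this rate / channel mask / encoding.
    val bufferSizeInBytes = AudioTrack.getMinBufferSize(
        sampleRate,
        AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_8BIT
    )

    audioTrack = AudioTrack(
        attributes,
        format,
        bufferSizeInBytes * 2, // buffer size in bytes, not a sample rate
        AudioTrack.MODE_STREAM,
        AudioManager.AUDIO_SESSION_ID_GENERATE
    )

    (Separately from the buffer size: the AudioRecord in the question captures ENCODING_PCM_16BIT audio while the track is built with ENCODING_PCM_8BIT, and at a different sample rate, so those mismatches are worth checking as well.)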