Tags: android, kotlin, buffer-overflow, tensorflow-lite

Why do I get a BufferOverflowException when running a TensorFlowLite Model?


I want to run a custom .tflite model on Android using TensorFlow Lite and Kotlin. Although I use the TFLite support library to create input and output buffers that should be correctly shaped, I get the error message below every time I call my run() method.

Here is my class:

class Inference(context: Context) {
    private val tag = "Inference"
    private var interpreter: Interpreter
    private var inputBuffer: TensorBuffer
    private var outputBuffer: TensorBuffer

    init {
        // Load the model file from the assets folder as a MappedByteBuffer
        val mappedByteBuffer = FileUtil.loadMappedFile(context, "CNN_ReLU.tflite")
        interpreter = Interpreter(mappedByteBuffer as ByteBuffer)
        interpreter.allocateTensors()

        // Read the shapes of the first input and output tensors
        val inputShape = interpreter.getInputTensor(0).shape()
        val outputShape = interpreter.getOutputTensor(0).shape()

        // Allocate fixed-size buffers matching those shapes
        inputBuffer = TensorBuffer.createFixedSize(inputShape, DataType.FLOAT32)
        outputBuffer = TensorBuffer.createFixedSize(outputShape, DataType.FLOAT32)
    }

    fun run() {
        interpreter.run(inputBuffer.buffer, outputBuffer.buffer) // XXX: throws BufferOverflowException
    }
}

And this is the error message:

W/System.err: java.nio.BufferOverflowException
W/System.err:     at java.nio.ByteBuffer.put(ByteBuffer.java:615)
W/System.err:     at org.tensorflow.lite.Tensor.copyTo(Tensor.java:264)
W/System.err:     at org.tensorflow.lite.Tensor.copyTo(Tensor.java:254)
W/System.err:     at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:170)
W/System.err:     at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:347)
W/System.err:     at org.tensorflow.lite.Interpreter.run(Interpreter.java:306)

I have only initialized the input and output buffers; I have not written any data to them yet.

I'm using these gradle dependencies:

implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly'
implementation 'org.tensorflow:tensorflow-lite-gpu:0.0.0-nightly'
implementation 'org.tensorflow:tensorflow-lite-support:0.0.0-nightly'

The .tflite model was built with these TensorFlow versions:

tensorflow                        2.3.0
tensorflow-cpu                    2.2.0
tensorflow-datasets               3.1.0
tensorflow-estimator              2.3.0
tensorflow-gan                    2.0.0
tensorflow-hub                    0.7.0
tensorflow-metadata               0.22.0
tensorflow-probability            0.7.0
tensorflowjs                      1.7.4.post1

Any thoughts or hints are highly appreciated, thank you.


Solution

  • Does adding .rewind() to your input and output buffers before calling run() make it work? If not, check whether your input or output tensor is a dynamic tensor; in that case the shape returned by shape() is not usable this way.
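A short sketch of what the suggestion above looks like inside the asker's class. Rewinding resets each ByteBuffer's position to 0, so the interpreter can read the full input and has the buffer's full capacity available when copying the output; a non-zero position leaves too little remaining space, which is what java.nio.ByteBuffer.put() reports as a BufferOverflowException. The check for dynamic shapes is an assumption based on the convention that TFLite reports unknown dimensions as -1:

```kotlin
fun run() {
    // Reset read/write positions so the interpreter sees the whole buffer.
    inputBuffer.buffer.rewind()
    outputBuffer.buffer.rewind()
    interpreter.run(inputBuffer.buffer, outputBuffer.buffer)
}

// In init, before allocating the buffers: a dynamic tensor reports -1 for
// unknown dimensions, so createFixedSize() cannot size a buffer for it.
// (Hypothetical guard, not part of the original question's code.)
val outputShape = interpreter.getOutputTensor(0).shape()
check(outputShape.all { it > 0 }) {
    "Output tensor has a dynamic shape ${outputShape.contentToString()}; " +
    "resize the input and call allocateTensors() before creating the buffer."
}
```

If a tensor really is dynamic, Interpreter.resizeInput() followed by allocateTensors() fixes the shapes so the buffers can be sized correctly.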