Tags: android, kotlin, local, firebase-mlkit

Custom model [MLKit] - FirebaseMLException: Internal error has occurred when executing Firebase ML tasks


I want to use a custom ML model on Android with MLKit, and neither the local nor the remote model works. I'm focusing on the local one here, since that is where I get a FirebaseMLException.

I've tried to follow the official documentation step by step.

I believe the error is in the input/output format, but I can't figure out what it is. I have 143 classes labeled by integers. Here's how I've built my input/output options:

inputOutputOptions = FirebaseModelInputOutputOptions.Builder()
            .setInputFormat(
                0,
                FirebaseModelDataType.FLOAT32,
                intArrayOf(
                    DIM_BATCH_SIZE,
                    DIM_IMG_SIZE_X,
                    DIM_IMG_SIZE_Y,
                    DIM_PIXEL_SIZE
                )
            )
            .setOutputFormat(
                0,
                FirebaseModelDataType.INT32,
                intArrayOf(1, 143)
            )
            .build()
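
For completeness, here is roughly how I run the interpreter with these options. The FirebaseModelInterpreter setup and the bitmap preprocessing are omitted, so treat this as a simplified sketch of the call that crashes:

    // `interpreter` is the FirebaseModelInterpreter configured for the local
    // custom model (creation not shown). The input is a FLOAT32 array of shape
    // [DIM_BATCH_SIZE, DIM_IMG_SIZE_X, DIM_IMG_SIZE_Y, DIM_PIXEL_SIZE].
    val input = Array(DIM_BATCH_SIZE) {
        Array(DIM_IMG_SIZE_X) {
            Array(DIM_IMG_SIZE_Y) { FloatArray(DIM_PIXEL_SIZE) }
        }
    }
    // ... fill `input` with the normalized pixel values of the bitmap ...

    val inputs = FirebaseModelInputs.Builder()
        .add(input) // input tensor 0, matches the FLOAT32 input format above
        .build()

    interpreter.run(inputs, inputOutputOptions)
        .addOnSuccessListener { _ ->
            // never reached: the task fails with the exception below
        }
        .addOnFailureListener { e -> e.printStackTrace() }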

Find the whole stack trace below:

W/System.err: com.google.firebase.ml.common.FirebaseMLException: Internal error has occurred when executing Firebase ML tasks
W/System.err:     at com.google.firebase.ml.common.internal.zze.zza(com.google.firebase:firebase-ml-common@@20.0.1:38)
        at com.google.firebase.ml.common.internal.zzh.run(Unknown Source:4)
W/System.err:     at android.os.Handler.handleCallback(Handler.java:873)
W/System.err:     at android.os.Handler.dispatchMessage(Handler.java:99)
        at com.google.android.gms.internal.firebase_ml.zzf.dispatchMessage(com.google.firebase:firebase-ml-common@@20.0.1:6)
W/System.err:     at android.os.Looper.loop(Looper.java:193)
        at android.os.HandlerThread.run(HandlerThread.java:65)
W/System.err: Caused by: java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite tensor with type FLOAT32 and a Java object of type [[I (which is compatible with the TensorFlowLite type INT32).
W/System.err:     at org.tensorflow.lite.Tensor.throwExceptionIfTypeIsIncompatible(Tensor.java:233)
W/System.err:     at org.tensorflow.lite.Tensor.copyTo(Tensor.java:116)
        at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:157)
W/System.err:     at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:250)
        at com.google.android.gms.internal.firebase_ml.zzpz.runForMultipleInputsOutputs(com.google.firebase:firebase-ml-model-interpreter@@20.0.1:4)
W/System.err:     at com.google.android.gms.internal.firebase_ml.zzpu.zza(com.google.firebase:firebase-ml-model-interpreter@@20.0.1:85)
W/System.err:     at com.google.android.gms.internal.firebase_ml.zzpu.zza(com.google.firebase:firebase-ml-model-interpreter@@20.0.1:145)
        at com.google.firebase.ml.common.internal.zzi.zza(com.google.firebase:firebase-ml-common@@20.0.1:33)
        at com.google.firebase.ml.common.internal.zzk.call(Unknown Source:8)
W/System.err:     at com.google.firebase.ml.common.internal.zze.zza(com.google.firebase:firebase-ml-common@@20.0.1:32)
        ... 6 more

I don't know how to interpret this line:

Cannot convert between a TensorFlowLite tensor with type FLOAT32 and a Java object of type [[I (which is compatible with the TensorFlowLite type INT32)
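
The only part I could decode is that "[[I" appears to be the JVM's runtime name for a two-dimensional int array (int[][], i.e. a Kotlin Array<IntArray>), which a quick check confirms, but I don't see where an int array comes from:

    // what "[[I" and "[[F" mean as JVM class names
    println(Array(1) { IntArray(143) }.javaClass.name)   // prints [[I  (int[][])
    println(Array(1) { FloatArray(143) }.javaClass.name) // prints [[F  (float[][])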

Solution

  • I found where the problem was: MLKit expects an array of probabilities as output, i.e. a set of float values in the [0.0, 1.0] range. Because I had declared the output as INT32, the interpreter allocated an int[][] output buffer (the "[[I" object from the exception), which cannot receive the model's FLOAT32 output tensor. That means that this snippet (containing the wrong line):

    .setOutputFormat(
        0,
        FirebaseModelDataType.INT32, // Wrong line
        intArrayOf(1, 143)
    )
    

    should be replaced with this one:

    .setOutputFormat(
        0,
        FirebaseModelDataType.FLOAT32, // <== Expected array type is a float
        intArrayOf(1, 143)
    )