tensorflow · arduino · arduino-nano

TFLite: Micro mutable Op Resolver does not name a type


I am trying to compile a TFLite Micro-based Arduino sketch using the MicroMutableOpResolver class (to include only the required operations and reduce memory usage).

I see similar usage in the TF Lite example here: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/micro/examples/micro_speech/micro_speech_test.cc

But I keep hitting the following compilation error:

IMU_Classifier_TinyML:22:1: error: 'micro_op_resolver' does not name a type
 micro_op_resolver.AddFullyConnected();
 ^~~~~~~~~~~~~~~~~
IMU_Classifier_TinyML:23:1: error: 'micro_op_resolver' does not name a type
 micro_op_resolver.AddSoftmax();
 ^~~~~~~~~~~~~~~~~
IMU_Classifier_TinyML:24:1: error: 'micro_op_resolver' does not name a type
 micro_op_resolver.AddRelu();
 ^~~~~~~~~~~~~~~~~
Using library Arduino_LSM9DS1 at version 1.1.0 in folder: /home/balaji/Arduino/libraries/Arduino_LSM9DS1 
Using library Wire in folder: /home/balaji/.arduino15/packages/arduino/hardware/mbed/1.3.2/libraries/Wire (legacy)
Using library Arduino_TensorFlowLite at version 2.4.0-ALPHA in folder: /home/balaji/Arduino/libraries/Arduino_TensorFlowLite 
exit status 1
'micro_op_resolver' does not name a type

The code snippet looks like this:

#include <Arduino_LSM9DS1.h>
#include <TensorFlowLite.h>
#include <tensorflow/lite/micro/micro_mutable_op_resolver.h>
#include <tensorflow/lite/micro/kernels/micro_ops.h>
#include <tensorflow/lite/micro/micro_error_reporter.h>
#include <tensorflow/lite/micro/micro_interpreter.h>
#include <tensorflow/lite/schema/schema_generated.h>
#include <tensorflow/lite/version.h>

// Include the TFlite converted model header file
#include "model.h"

const float accelThreshold = 2.5;
const int numOfSamples = 119; // acceleration sample-rate

int samplesRead = numOfSamples;

tflite::MicroErrorReporter tfLiteErrorReporter;

/*Import only the required ops to reduce the memory usage*/
static tflite::MicroMutableOpResolver<3> micro_op_resolver;
micro_op_resolver.AddFullyConnected();
micro_op_resolver.AddSoftmax();
micro_op_resolver.AddRelu();

Am I missing a dependency, or could this be due to a TF Lite version mismatch?


Solution

  • The function calls like micro_op_resolver.AddFullyConnected(); must be placed inside a function body. In C++, only declarations may appear at file scope; expression statements such as method calls cannot, which is why the compiler reports 'micro_op_resolver' does not name a type (it tries to parse each call as a declaration). In an Arduino sketch, the natural place for one-time registration calls is setup(). Something like this should compile:

    #include <Arduino_LSM9DS1.h>
    #include <TensorFlowLite.h>
    #include <tensorflow/lite/micro/micro_mutable_op_resolver.h>
    #include <tensorflow/lite/micro/kernels/micro_ops.h>
    #include <tensorflow/lite/micro/micro_error_reporter.h>
    #include <tensorflow/lite/micro/micro_interpreter.h>
    #include <tensorflow/lite/schema/schema_generated.h>
    #include <tensorflow/lite/version.h>
    
    // Include the TFlite converted model header file
    #include "model.h"
    
    const float accelThreshold = 2.5;
    const int numOfSamples = 119; // acceleration sample-rate
    
    int samplesRead = numOfSamples;
    
    tflite::MicroErrorReporter tfLiteErrorReporter;
    
    /*Import only the required ops to reduce the memory usage*/
    static tflite::MicroMutableOpResolver<3> micro_op_resolver;
    
    void setup() {
      micro_op_resolver.AddFullyConnected();
      micro_op_resolver.AddSoftmax();
      micro_op_resolver.AddRelu();
    }
    
    void loop() {
      // put your main code here, to run repeatedly:
    
    }