python, tensorflow2.0, tensorflow-lite, dataset

Convert .pb to .tflite for a model of variable input shape


I trained a model with the TensorFlow Object Detection API on a custom dataset. I am using TF version 2.2.0.

output_directory = 'inference_graph'
!python /content/models/research/object_detection/exporter_main_v2.py \
--trained_checkpoint_dir {model_dir} \
--output_directory {output_directory} \
--pipeline_config_path {pipeline_config_path}

I was able to get a .pb file successfully, along with the .ckpt file. But now I need to convert it to .tflite, and I keep running into one error or another.

I tried the basic approach described in the TensorFlow documentation, but that didn't work either. Another piece of code I tried is below:

    import tensorflow as tf
    from tensorflow.keras.models import Sequential, Model
    from tensorflow.keras.layers import Conv2D, Flatten, MaxPooling2D, Dense, Input, Reshape, Concatenate, GlobalAveragePooling2D, BatchNormalization, Dropout, Activation, GlobalMaxPooling2D
    from tensorflow.keras.utils import Sequence

    model = tf.saved_model.load(f'/content/drive/MyDrive/FINAL DNET MODEL/inference_graph/saved_model/')
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.post_training_quantize=True
    converter.inference_type=tf.uint8
    tflite_model = converter.convert()
    open("val_converted_model_int8.tflite", "wb").write(tflite_model)

The error I am getting is:

    AttributeError                    Traceback (most recent call last)
    in ()
          8 converter.post_training_quantize=True
          9 converter.inference_type=tf.uint8
    ---> 10 tflite_model = converter.convert()
         11 open("val_converted_model_int8.tflite", "wb").write(tflite_model)

    /usr/local/lib/python3.6/dist-packages/tensorflow/lite/python/lite.py in convert(self)
        837     # to None.
        838     # Once we have better support for dynamic shapes, we can remove this.
    --> 839     if not isinstance(self._keras_model.call, _def_function.Function):
        840       # Pass keep_original_batch_size=True will ensure that we get an input
        841       # signature including the batch dimension specified by the user.

AttributeError: '_UserObject' object has no attribute 'call'

Can anyone please help me with this?


Solution

  • I think the problem is not the variable input shape (even though the error message is confusing).

    tf.saved_model.load returns a restored SavedModel object, not a Keras model, so tf.lite.TFLiteConverter.from_keras_model cannot handle it; the loaded _UserObject has no call attribute, which is exactly what the traceback says.

    You need to use the TFLiteConverter.from_saved_model API. Something like this:

    saved_model_dir = '/content/drive/MyDrive/FINAL DNET MODEL/inference_graph/saved_model/'
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    tflite_model = converter.convert()
    open("val_converted_model_int8.tflite", "wb").write(tflite_model)
    

    Let us know if you run into other issues.
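    For completeness, a fuller sketch is below. It keeps the path and output filename from the question; the quantization line uses converter.optimizations = [tf.lite.Optimize.DEFAULT], which is the TF 2.x replacement for the deprecated post_training_quantize flag, and the SELECT_TF_OPS fallback is an assumption that the exported detection graph contains ops without TFLite builtin equivalents (common for Object Detection API exports):

    ```python
    import tensorflow as tf

    # Path from the question -- point this at your exported SavedModel directory.
    saved_model_dir = '/content/drive/MyDrive/FINAL DNET MODEL/inference_graph/saved_model/'

    # Load the SavedModel directly with the converter; no tf.saved_model.load() needed.
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)

    # Dynamic-range quantization of the weights (replaces post_training_quantize).
    converter.optimizations = [tf.lite.Optimize.DEFAULT]

    # Allow falling back to TensorFlow ops for anything TFLite builtins don't cover.
    converter.target_spec.supported_ops = [
        tf.lite.OpsSet.TFLITE_BUILTINS,
        tf.lite.OpsSet.SELECT_TF_OPS,
    ]

    tflite_model = converter.convert()
    with open('val_converted_model_int8.tflite', 'wb') as f:
        f.write(tflite_model)
    ```

    Note that full uint8 inference additionally requires a representative dataset for calibration; the flags above only quantize the weights.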