Tags: tensorflow, tensorflow-lite, onnx

Can't convert onnx model to tflite using TF 2.4.1


I have an ONNX model, which I can successfully convert to TF with TF 2.4.1. But when it comes to converting that saved model to TFLite, an error occurs.

The code:

import onnx
import tensorflow as tf
from onnx_tf.backend import prepare

print(tf.__version__)

# Convert model.onnx to Tensorflow
onnx_model = onnx.load('model.onnx')
onnx.checker.check_model(onnx_model) 
tf_rep = prepare(onnx_model)  
tf_rep.export_graph('model')  

# Convert the saved model to TFLite
converter = tf.lite.TFLiteConverter.from_saved_model('model')
tf_lite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tf_lite_model)

Everything goes OK until the conversion step, which ends like so:

 >>> tf_lite_model = converter.convert()
    2021-04-22 18:18:14.715046: W tensorflow/compiler/mlir/lite/python/tf_tfl_flatbuffer_helpers.cc:316] Ignored output_format.
    2021-04-22 18:18:14.715072: W tensorflow/compiler/mlir/lite/python/tf_tfl_flatbuffer_helpers.cc:319] Ignored drop_control_dependency.
    2021-04-22 18:18:14.715078: W tensorflow/compiler/mlir/lite/python/tf_tfl_flatbuffer_helpers.cc:325] Ignored change_concat_input_ranges.
    2021-04-22 18:18:14.716044: I tensorflow/cc/saved_model/reader.cc:32] Reading SavedModel from: model
    2021-04-22 18:18:14.778050: I tensorflow/cc/saved_model/reader.cc:55] Reading meta graph with tags { serve }
    2021-04-22 18:18:14.778083: I tensorflow/cc/saved_model/reader.cc:93] Reading SavedModel debug info (if present) from: model
    2021-04-22 18:18:14.998062: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:196] None of the MLIR optimization passes are enabled (registered 0 passes)
    2021-04-22 18:18:15.043862: I tensorflow/cc/saved_model/loader.cc:206] Restoring SavedModel bundle.
    2021-04-22 18:18:15.438804: I tensorflow/cc/saved_model/loader.cc:190] Running initialization op on SavedModel bundle at path: model
    2021-04-22 18:18:15.809851: I tensorflow/cc/saved_model/loader.cc:277] SavedModel load for tags { serve }; Status: success: OK. Took 1093808 microseconds.
    2021-04-22 18:18:18.757257: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:194] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
    loc(callsite(callsite("Pad_16@__inference___call___16503" at "PartitionedCall@__inference_signature_wrapper_16752") at "PartitionedCall")): error: operand #0 does not dominate this use
    Traceback (most recent call last):
      File "/Users/decades/anaconda3/envs/py38/lib/python3.8/site-packages/tensorflow/lite/python/convert.py", line 210, in toco_convert_protos
        model_str = wrap_toco.wrapped_toco_convert(model_flags_str,
      File "/Users/decades/anaconda3/envs/py38/lib/python3.8/site-packages/tensorflow/lite/python/wrap_toco.py", line 32, in wrapped_toco_convert
        return _pywrap_toco_api.TocoConvert(
    Exception: <unknown>:0: error: loc(callsite(callsite("Pad_16@__inference___call___16503" at "PartitionedCall@__inference_signature_wrapper_16752") at "PartitionedCall")): operand #0 does not dominate this use
    <unknown>:0: note: loc("PartitionedCall"): called from
    <unknown>:0: note: loc(callsite(callsite("Pad_16@__inference___call___16503" at "PartitionedCall@__inference_signature_wrapper_16752") at "PartitionedCall")): operand defined here


    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/Users/decades/anaconda3/envs/py38/lib/python3.8/site-packages/tensorflow/lite/python/lite.py", line 739, in convert
        result = _convert_saved_model(**converter_kwargs)
      File "/Users/decades/anaconda3/envs/py38/lib/python3.8/site-packages/tensorflow/lite/python/convert.py", line 632, in convert_saved_model
        data = toco_convert_protos(
      File "/Users/decades/anaconda3/envs/py38/lib/python3.8/site-packages/tensorflow/lite/python/convert.py", line 216, in toco_convert_protos
        raise ConverterError(str(e))
    tensorflow.lite.python.convert.ConverterError: <unknown>:0: error: loc(callsite(callsite("Pad_16@__inference___call___16503" at "PartitionedCall@__inference_signature_wrapper_16752") at "PartitionedCall")): operand #0 does not dominate this use
    <unknown>:0: note: loc("PartitionedCall"): called from
    <unknown>:0: note: loc(callsite(callsite("Pad_16@__inference___call___16503" at "PartitionedCall@__inference_signature_wrapper_16752") at "PartitionedCall")): operand defined here

    

I have no idea what this message means, but if I switch to TF 2.2 the conversion passes without errors. The bad thing is that, due to another problem, the initial ONNX-to-TF conversion then fails under TF 2.2.

Does anybody have an idea what this message means and what could be done about it?

TIA


Solution

  • Is it possible to share your saved model directory with me? I can help with debugging.

    The general advice is that there are two possibilities:

    (1) The TF Lite converter may not handle the saved model correctly.

    (2) The onnx conversion tool may not create a valid TF saved model.

    Using a recent TF version (2.5 or tf-nightly) might help resolve the problem in case (1), but it's not guaranteed.
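    One way to tell the two cases apart is to load the exported SavedModel with plain TensorFlow (no TFLite involved) and run it once: if that already fails, the onnx-tf export is at fault (case 2); if it runs but `TFLiteConverter` still errors, the converter is at fault (case 1). A minimal sketch, assuming the model was exported with `tf_rep.export_graph('model')` and exposes the default `serving_default` signature:

    ```python
    import numpy as np
    import tensorflow as tf

    def smoke_test_saved_model(export_dir):
        """Load a SavedModel with plain TF and run it once on zero inputs.

        If this raises, the onnx-tf export itself is broken (case 2); if it
        succeeds but TFLite conversion still fails, suspect the converter (case 1).
        """
        loaded = tf.saved_model.load(export_dir)
        infer = loaded.signatures['serving_default']
        # Build a zero-filled tensor for every input in the signature,
        # substituting 1 for any unknown (None) dimensions.
        feed = {}
        for name, spec in infer.structured_input_signature[1].items():
            shape = [d if d is not None else 1 for d in spec.shape]
            feed[name] = tf.constant(np.zeros(shape, dtype=spec.dtype.as_numpy_dtype))
        return infer(**feed)

    # e.g. smoke_test_saved_model('model')  # 'model' is the export_graph directory
    ```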


    I confirmed that the tf-nightly version could convert the attached saved model without any issue:

    converter = tf.lite.TFLiteConverter.from_saved_model("/tmp/onnx_model")
    tflite_model = converter.convert()
    with open("/tmp/onnx.tflite", "wb") as f:
        f.write(tflite_model)
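
    Once conversion succeeds, it's worth running the flatbuffer through `tf.lite.Interpreter` as a sanity check. A small sketch (the `/tmp/onnx.tflite` path matches the snippet above; adjust it to your setup):

    ```python
    import numpy as np
    import tensorflow as tf

    def run_tflite_once(model_path):
        """Run one inference on zero input to verify the .tflite file loads and executes."""
        interpreter = tf.lite.Interpreter(model_path=model_path)
        interpreter.allocate_tensors()
        inp = interpreter.get_input_details()[0]
        out = interpreter.get_output_details()[0]
        interpreter.set_tensor(inp['index'], np.zeros(inp['shape'], dtype=inp['dtype']))
        interpreter.invoke()
        return interpreter.get_tensor(out['index'])

    # e.g. print(run_tflite_once("/tmp/onnx.tflite").shape)
    ```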