python, tensorflow, protocol-buffers, tensorrt

KeyError: Frozen Tensorflow Model to UFF graph


I have trained a custom CNN model with the TensorFlow Estimator API. I have successfully frozen the graph, but the conversion to UFF fails and throws the following error:

KeyError: u'IteratorGetNext:1'

The code that performs the conversion:

import uff

frozen_graph_filename = "Frozen_model.pb"
TMP_UFF_FILENAME = "output.uff"
output_name = "sigmoid"

uff_model = uff.from_tensorflow_frozen_model(
    frozen_file=frozen_graph_filename,
    output_nodes=[output_name],
    output_filename=TMP_UFF_FILENAME,
    text=False,
)

The names of the nodes in the graph are:

prefix/OneShotIterator
prefix/IteratorGetNext
prefix/Reshape/shape
prefix/Reshape
prefix/Reshape_1/shape
prefix/Reshape_1
prefix/conv1/kernel
prefix/conv1/bias
.
.
.
prefix/logits/MatMul
prefix/logits/BiasAdd
prefix/sigmoid

So, is there a way to remove the first two iterator nodes? They are useless outside the training context. I have also tried tf.graph_util.remove_training_nodes, but it does not remove them and the conversion still fails.
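For intuition, stripping those nodes amounts to a reachability walk backwards from the output node: everything the outputs do not depend on gets dropped, which is essentially what a strip-unused-nodes transform does on a real GraphDef. A toy sketch of that idea over a plain name-to-inputs dict (the dict and the input_placeholder stand-in are illustrative, not TensorFlow API):

```python
from collections import deque

def strip_unreachable(nodes, outputs):
    """Keep only nodes reachable (via their inputs) from the given outputs.

    `nodes` maps each node name to the list of node names it consumes.
    """
    keep = set()
    queue = deque(outputs)
    while queue:
        name = queue.popleft()
        if name in keep:
            continue
        keep.add(name)
        queue.extend(nodes.get(name, []))
    return {n: ins for n, ins in nodes.items() if n in keep}

# Toy graph mirroring the node list above: once the iterator output is
# replaced by a placeholder, the iterator nodes are no longer reachable
# from "sigmoid" and can be pruned.
graph = {
    "OneShotIterator": [],
    "IteratorGetNext": ["OneShotIterator"],
    "input_placeholder": [],          # stand-in inserted by the transform
    "Reshape": ["input_placeholder"],
    "conv1": ["Reshape"],
    "logits/MatMul": ["conv1"],
    "sigmoid": ["logits/MatMul"],
}

pruned = strip_unreachable(graph, ["sigmoid"])
print(sorted(pruned))  # iterator nodes are gone
```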


Solution

  • The Graph Transform Tool can do exactly what I wanted to achieve. To use it, clone the tensorflow repository and run its configure script to set up a workspace, then follow the build instructions in [1]. Once built, invoke the tool with:

    bazel-bin/tensorflow/tools/graph_transforms/transform_graph \
    --in_graph=tensorflow_inception_graph.pb \
    --out_graph=optimized_inception_graph.pb \
    --inputs='Mul:0' \
    --outputs='softmax:0' \
    --transforms='
    strip_unused_nodes(type=float, shape="1,299,299,3")
    remove_nodes(op=Identity, op=CheckNumerics)
    fold_old_batch_norms
    '
    

    Once you have the optimized graph, pass it to

    uff.from_tensorflow_frozen_model()
    

    [1] https://github.com/tensorflow/tensorflow/blob/master/tensorflow/tools/graph_transforms/README.md
    [2] https://www.tensorflow.org/mobile/prepare_models#how_do_you_get_a_model_you_can_use_on_mobile