
Failed to parse ONNX model to TensorRT


I'm using a Jetson Nano.

I tried to convert the ONNX model from https://github.com/onnx/models/tree/master/vision/body_analysis/emotion_ferplus

I ran into this error:

https://user-images.githubusercontent.com/28679735/86281506-a75e5380-bbab-11ea-8608-9bf8e2f50cc6.png

Additional Info:

https://user-images.githubusercontent.com/28679735/86281617-d674c500-bbab-11ea-8bbe-16f6d3db7203.png


Solution

  • After you create the ONNX model, use this code to parse it and build the TensorRT engine:

    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
    # The ONNX parser requires a network created in explicit-batch mode
    EXPLICIT_BATCH = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    with trt.Builder(TRT_LOGGER) as builder, builder.create_network(EXPLICIT_BATCH) as network, trt.OnnxParser(network, TRT_LOGGER) as parser:
        with open("modelfile.onnx", "rb") as model:
            if not parser.parse(model.read()):
                # Print every parser error so the failing node is visible
                for error in range(parser.num_errors):
                    print(parser.get_error(error))
        # build_cuda_engine returns None if the build fails
        engine = builder.build_cuda_engine(network)

    You can use the engine directly, as sketched below, or save it and reuse it later.
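    For using it directly, here is a minimal inference sketch, not part of the original answer: it assumes the FER+ engine built above, with binding 0 as the input (a 1x1x64x64 float32 grayscale image) and binding 1 as the output, and that pycuda is installed.

    import numpy as np
    import pycuda.autoinit  # initializes a CUDA context on import
    import pycuda.driver as cuda

    context = engine.create_execution_context()

    # Dummy input; FER+ expects a 1x1x64x64 float32 grayscale image
    input_data = np.random.rand(1, 1, 64, 64).astype(np.float32)
    output_data = np.empty(tuple(engine.get_binding_shape(1)), dtype=np.float32)

    # Copy the input to the GPU, run the network, copy the result back
    d_input = cuda.mem_alloc(input_data.nbytes)
    d_output = cuda.mem_alloc(output_data.nbytes)
    cuda.memcpy_htod(d_input, input_data)
    context.execute_v2(bindings=[int(d_input), int(d_output)])
    cuda.memcpy_dtoh(output_data, d_output)
    print(output_data)  # raw scores for the 8 emotion classes

    To save the engine for later runs: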

    with open("output.engine", "wb") as f:
        f.write(engine.serialize())
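
    To reload it in a later run, deserialize it with a TensorRT runtime. A minimal sketch, assuming the file was serialized by the same TensorRT version that now reads it:

    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
    with open("output.engine", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        # Returns an ICudaEngine ready for create_execution_context()
        engine = runtime.deserialize_cuda_engine(f.read())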