I am converting a frozen .pb model to ONNX, but ONNX does not seem to support uint8. How do I go about replacing uint8 with int8 in the model file? I can also retrain if I have to, but I am not sure which file to modify and how. I would have guessed that I needed to modify the file that contains the network architecture: https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py
but that does not seem to be the case...
There's a MobileNet v2 ONNX model in the ONNX Model Zoo that has already been converted (although the source model was originally trained in MXNet). This might be useful.
ONNX operators do support the uint8 data type; see the list of standard tensor element types in the ONNX IR documentation.