tensorflow, tensorflow-lite, quantization-aware-training

Can we use TF-Lite to retrain?


I converted a pretrained model to TF-Lite and would like to deploy it to an edge device. If we get new training data and want to improve the pretrained model, is it possible to do this on the edge device? For example, is there any method to train the model and save it to TF-Lite (FlatBuffer) again on the edge device?

Thanks for any input!


Solution

  • On-device training is not yet fully supported in TF Lite, but you can refer to this blog post to see how it can be done: https://blog.tensorflow.org/2019/12/example-on-device-model-personalization.html

    The basic idea is:

    • Split your model into a base subgraph (e.g. the feature extractor in an image classification model) and a trainable head.
    • Convert the base subgraph to TF Lite as normal. Convert the trainable head to TF Lite using the experimental tflite-transfer-convert tool (see the sketch after this list).
    • Retrain the trainable head on-device as you wish.
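
    A minimal sketch of the first two steps, assuming a MobileNetV2 feature extractor as the base and a single dense layer as the head; the model choice, NUM_CLASSES, and the output filename are illustrative assumptions, not part of the blog post. Only the base conversion is shown, since it uses the standard TFLiteConverter; the head would be converted separately with the experimental transfer-learning tooling described in the blog post.

    ```python
    import tensorflow as tf

    NUM_CLASSES = 5  # hypothetical number of classes for the personalized head

    # Base subgraph: a frozen feature extractor that is NOT trained on-device.
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3),
        include_top=False,
        pooling="avg",
        weights="imagenet",
    )
    base.trainable = False

    # Trainable head: the small part that gets retrained on the edge device.
    # It consumes the embeddings produced by the base.
    head = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(base.output_shape[-1],)),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

    # Convert only the base to TF Lite as usual; the head is converted
    # separately with the experimental tflite-transfer-convert tool.
    converter = tf.lite.TFLiteConverter.from_keras_model(base)
    tflite_base = converter.convert()
    with open("base_feature_extractor.tflite", "wb") as f:
        f.write(tflite_base)
    ```

    The point of the split is that the large base stays fixed, so only the small head's weights have to be updated and stored on the device during personalization.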