python · tensorflow · raspberry-pi3 · tensorflow-lite

Tensorflow Lite Python Binding on Raspberry Pi 3B+


I am trying to use the TensorFlow Lite Python interpreter for object detection on a Raspberry Pi 3B+ like this:

from tensorflow.contrib.lite.python import interpreter as interpreter_wrapper

But when I run this line:

interpreter = interpreter_wrapper.Interpreter(model_path="mobilenet.tflite")

I am getting this error:

Traceback (most recent call last):
  File "<pyshell#5>", line 1, in <module>
interpreter = interpreter_wrapper.Interpreter(model_path="mobilenet.tflite")
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/contrib/lite/python/interpreter.py", line 50, in __init__ 
_interpreter_wrapper.InterpreterWrapper_CreateWrapperCPPFromFile(
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/util/lazy_loader.py", line 53, in __getattr__
module = self._load()
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/util/lazy_loader.py", line 42, in _load
module = importlib.import_module(self.__name__)
  File "/usr/lib/python3.5/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 986, in _gcd_import
  File "<frozen importlib._bootstrap>", line 969, in _find_and_load
  File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 673, in exec_module
  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/contrib/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 28, in <module>
_tensorflow_wrap_interpreter_wrapper = swig_import_helper()
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/contrib/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 24, in swig_import_helper
_mod = imp.load_module('_tensorflow_wrap_interpreter_wrapper', fp, pathname, description)
  File "/usr/lib/python3.5/imp.py", line 242, in load_module
return load_dynamic(name, filename, file)
  File "/usr/lib/python3.5/imp.py", line 342, in load_dynamic
return _load(spec)
  File "<frozen importlib._bootstrap>", line 693, in _load
  File "<frozen importlib._bootstrap>", line 666, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 577, in module_from_spec
  File "<frozen importlib._bootstrap_external>", line 914, in create_module
  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
ImportError: /usr/local/lib/python3.5/dist-packages/tensorflow/contrib/lite/python/interpreter_wrapper/_tensorflow_wrap_interpreter_wrapper.so: undefined symbol: _ZN6tflite12tensor_utils39NeonMatrixBatchVectorMultiplyAccumulateEPKfiiS2_iPfi

If anyone has tried this, please help me fix this error.

Thanks


Solution

  • This error occurs when using TensorFlow v1.14 or lower to run TensorFlow Lite. To work around it, we can uninstall regular TensorFlow and use the tflite_runtime package provided by Google instead.

    First of all, let me describe my environment:

    • Raspberry Pi Model 3B+.
    • Raspbian OS.
    • I'm working in a folder named "Project".
    • Inside this folder I've created an isolated environment in a folder named "Project-env" using virtualenv.
    • My Python project on the Raspberry Pi contains the line "import tensorflow as tf", which is what triggers the error.

    Since I built the model with TensorFlow in Google Colab, full TensorFlow shouldn't be necessary here just to run inference.

    • First, go to your working folder: "cd Project" in my case.
    • Then activate your environment: "source Project-env/bin/activate" in my case.
    • Uninstall TensorFlow: "pip3 uninstall tensorflow".
    • Download this wheel: "wget https://dl.google.com/coral/python/tflite_runtime-1.14.0-cp35-cp35m-linux_armv7l.whl". This build (Python 3.5, ARMv7) is suitable for the Raspberry Pi 3B+.
    • Now install the wheel: "pip3 install tflite_runtime-1.14.0-cp35-cp35m-linux_armv7l.whl".
    • Finally, in your Python program, delete the line "import tensorflow as tf" and replace it with "from tflite_runtime.interpreter import Interpreter".
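    The steps above can be condensed into a few shell commands. This is a sketch assuming the folder and environment names from this example ("Project" and "Project-env"); substitute your own.

    ```shell
    # Work inside the project's virtualenv (names from the example above)
    cd Project
    source Project-env/bin/activate

    # Remove full TensorFlow, whose interpreter wrapper is broken on the Pi
    pip3 uninstall -y tensorflow

    # Fetch and install the standalone TFLite runtime wheel (Python 3.5, ARMv7)
    wget https://dl.google.com/coral/python/tflite_runtime-1.14.0-cp35-cp35m-linux_armv7l.whl
    pip3 install tflite_runtime-1.14.0-cp35-cp35m-linux_armv7l.whl
    ```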

    So instead of "interpreter = tf.lite.Interpreter(model_path)", use "interpreter = Interpreter(model_path)".
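    After the swap, a minimal inference sketch looks like the following. To be clear about assumptions: "mobilenet.tflite" is the model file from the question (adjust the path to yours), and the input is a dummy zero tensor used only to demonstrate the call sequence.

    ```python
    import numpy as np

    # Lightweight interpreter from tflite_runtime replaces "import tensorflow as tf"
    from tflite_runtime.interpreter import Interpreter

    # "mobilenet.tflite" is the model from the question; adjust the path as needed
    interpreter = Interpreter(model_path="mobilenet.tflite")
    interpreter.allocate_tensors()

    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Dummy input with the model's expected shape and dtype, just to show the API;
    # in a real detector you would feed a preprocessed camera frame here
    dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
    interpreter.set_tensor(input_details[0]["index"], dummy)
    interpreter.invoke()

    output = interpreter.get_tensor(output_details[0]["index"])
    print(output.shape)
    ```

    The Interpreter API (allocate_tensors, get_input_details, set_tensor, invoke, get_tensor) is the same as tf.lite.Interpreter, so the rest of your detection code should work unchanged.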

    That's all. All credit to EdjeElectronics, who helped me solve this problem. Here is his YouTube video: https://www.youtube.com/watch?v=aimSGOAUI8Y&t=26s