Tags: docker, deeppavlov

Running deeppavlov model in a container results in TypeError: Descriptors cannot not be created directly


I'm trying to run one of DeepPavlov's models in a Docker container on Windows 10, but I'm getting an error: 'TypeError: Descriptors cannot not be created directly.' Could someone please explain what's going wrong here? First I ran "docker pull deeppavlov/base-cpu" to get the image, and then this:

   PS C:\Users\user> docker run -e CONFIG=ner_ontonotes -p 5555:5000 -v ~/my_dp_components:/root/.deeppavlov -v ~/my_dp_envs:/venv deeppavlov/base-cpu
2022-07-10 12:13:50.324 INFO in 'deeppavlov.core.common.file'['file'] at line 32: Interpreting 'ner_ontonotes' as '/base/DeepPavlov/deeppavlov/configs/ner/ner_ontonotes.json'
Collecting tensorflow==1.15.2
  Downloading tensorflow-1.15.2-cp37-cp37m-manylinux2010_x86_64.whl (110.5 MB)
Collecting keras-applications>=1.0.8
  Downloading Keras_Applications-1.0.8-py3-none-any.whl (50 kB)
Collecting tensorboard<1.16.0,>=1.15.0
  Downloading tensorboard-1.15.0-py3-none-any.whl (3.8 MB)
Collecting astor>=0.6.0
  Downloading astor-0.8.1-py2.py3-none-any.whl (27 kB)
Requirement already satisfied: six>=1.10.0 in ./venv/lib/python3.7/site-packages/six-1.16.0-py3.7.egg (from tensorflow==1.15.2) (1.16.0)
Collecting opt-einsum>=2.3.2
  Downloading opt_einsum-3.3.0-py3-none-any.whl (65 kB)
Collecting absl-py>=0.7.0
  Downloading absl_py-1.1.0-py3-none-any.whl (123 kB)
Collecting keras-preprocessing>=1.0.5
  Downloading Keras_Preprocessing-1.1.2-py2.py3-none-any.whl (42 kB)
Collecting grpcio>=1.8.6
  Downloading grpcio-1.47.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.5 MB)
Requirement already satisfied: wheel>=0.26 in ./venv/lib/python3.7/site-packages (from tensorflow==1.15.2) (0.36.2)
Collecting wrapt>=1.11.1
  Downloading wrapt-1.14.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting tensorflow-estimator==1.15.1
  Downloading tensorflow_estimator-1.15.1-py2.py3-none-any.whl (503 kB)
Requirement already satisfied: numpy<2.0,>=1.16.0 in ./venv/lib/python3.7/site-packages/numpy-1.18.0-py3.7-linux-x86_64.egg (from tensorflow==1.15.2) (1.18.0)
Collecting google-pasta>=0.1.6
  Downloading google_pasta-0.2.0-py3-none-any.whl (57 kB)
Collecting termcolor>=1.1.0
  Downloading termcolor-1.1.0.tar.gz (3.9 kB)
Collecting protobuf>=3.6.1
  Downloading protobuf-4.21.2-cp37-abi3-manylinux2014_x86_64.whl (407 kB)
Collecting gast==0.2.2
  Downloading gast-0.2.2.tar.gz (10 kB)
Requirement already satisfied: h5py in ./venv/lib/python3.7/site-packages/h5py-2.10.0-py3.7-linux-x86_64.egg (from keras-applications>=1.0.8->tensorflow==1.15.2) (2.10.0)
Requirement already satisfied: setuptools>=41.0.0 in ./venv/lib/python3.7/site-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow==1.15.2) (57.0.0)
Collecting werkzeug>=0.11.15
  Downloading Werkzeug-2.1.2-py3-none-any.whl (224 kB)
Collecting markdown>=2.6.8
  Downloading Markdown-3.3.7-py3-none-any.whl (97 kB)
Collecting importlib-metadata>=4.4
  Downloading importlib_metadata-4.12.0-py3-none-any.whl (21 kB)
Collecting zipp>=0.5
  Downloading zipp-3.8.0-py3-none-any.whl (5.4 kB)
Requirement already satisfied: typing-extensions>=3.6.4 in ./venv/lib/python3.7/site-packages/typing_extensions-3.10.0.0-py3.7.egg (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<1.16.0,>=1.15.0->tensorflow==1.15.2) (3.10.0.0)
Building wheels for collected packages: gast, termcolor
  Building wheel for gast (setup.py): started
  Building wheel for gast (setup.py): finished with status 'done'
  Created wheel for gast: filename=gast-0.2.2-py3-none-any.whl size=7553 sha256=669a2d92bdd23f624a8ead4e4353fa016514b23fad922f801b1109678bfd7d78
  Stored in directory: /root/.cache/pip/wheels/21/7f/02/420f32a803f7d0967b48dd823da3f558c5166991bfd204eef3
  Building wheel for termcolor (setup.py): started
  Building wheel for termcolor (setup.py): finished with status 'done'
  Created wheel for termcolor: filename=termcolor-1.1.0-py3-none-any.whl size=4847 sha256=fed5779a43e12fb9fcc5daab1ad2edd126970ddcf1c270954e198d2000f28e42
  Stored in directory: /root/.cache/pip/wheels/3f/e3/ec/8a8336ff196023622fbcb36de0c5a5c218cbb24111d1d4c7f2
Successfully built gast termcolor
Installing collected packages: zipp, importlib-metadata, werkzeug, protobuf, markdown, grpcio, absl-py, wrapt, termcolor, tensorflow-estimator, tensorboard, opt-einsum, keras-preprocessing, keras-applications, google-pasta, gast, astor, tensorflow
Successfully installed absl-py-1.1.0 astor-0.8.1 gast-0.2.2 google-pasta-0.2.0 grpcio-1.47.0 importlib-metadata-4.12.0 keras-applications-1.0.8 keras-preprocessing-1.1.2 markdown-3.3.7 opt-einsum-3.3.0 protobuf-4.21.2 tensorboard-1.15.0 tensorflow-1.15.2 tensorflow-estimator-1.15.1 termcolor-1.1.0 werkzeug-2.1.2 wrapt-1.14.1 zipp-3.8.0
WARNING: You are using pip version 21.1.2; however, version 22.1.2 is available.
You should consider upgrading via the '/base/venv/bin/python -m pip install --upgrade pip' command.
Collecting gensim==3.8.1
  Downloading gensim-3.8.1-cp37-cp37m-manylinux1_x86_64.whl (24.2 MB)
Collecting smart-open>=1.8.1
  Downloading smart_open-6.0.0-py3-none-any.whl (58 kB)
Requirement already satisfied: numpy>=1.11.3 in ./venv/lib/python3.7/site-packages/numpy-1.18.0-py3.7-linux-x86_64.egg (from gensim==3.8.1) (1.18.0)
Requirement already satisfied: scipy>=0.18.1 in ./venv/lib/python3.7/site-packages/scipy-1.4.1-py3.7-linux-x86_64.egg (from gensim==3.8.1) (1.4.1)
Requirement already satisfied: six>=1.5.0 in ./venv/lib/python3.7/site-packages/six-1.16.0-py3.7.egg (from gensim==3.8.1) (1.16.0)
Installing collected packages: smart-open, gensim
Successfully installed gensim-3.8.1 smart-open-6.0.0
WARNING: You are using pip version 21.1.2; however, version 22.1.2 is available.
You should consider upgrading via the '/base/venv/bin/python -m pip install --upgrade pip' command.
2022-07-10 12:14:42.20 INFO in 'deeppavlov.core.common.file'['file'] at line 32: Interpreting 'ner_ontonotes' as '/base/DeepPavlov/deeppavlov/configs/ner/ner_ontonotes.json'
2022-07-10 12:14:43.7 INFO in 'deeppavlov.core.data.utils'['utils'] at line 95: Downloading from http://files.deeppavlov.ai/embeddings/glove.6B.100d.txt?config=ner_ontonotes to /root/.deeppavlov/downloads/embeddings/glove.6B.100d.txt
347MB [00:13, 25.1MB/s]
2022-07-10 12:14:57.596 INFO in 'deeppavlov.core.data.utils'['utils'] at line 95: Downloading from http://files.deeppavlov.ai/deeppavlov_data/ner_ontonotes_v3_cpu_compatible.tar.gz?config=ner_ontonotes to /root/.deeppavlov/ner_ontonotes_v3_cpu_compatible.tar.gz
100%|██████████| 8.13M/8.13M [00:01<00:00, 7.53MB/s]
2022-07-10 12:14:59.90 INFO in 'deeppavlov.core.data.utils'['utils'] at line 272: Extracting /root/.deeppavlov/ner_ontonotes_v3_cpu_compatible.tar.gz archive into /root/.deeppavlov/models
2022-07-10 12:14:59.749 INFO in 'deeppavlov.core.data.simple_vocab'['simple_vocab'] at line 115: [loading vocabulary from /root/.deeppavlov/models/ner_ontonotes/tag.dict]
2022-07-10 12:14:59.751 INFO in 'deeppavlov.core.data.simple_vocab'['simple_vocab'] at line 115: [loading vocabulary from /root/.deeppavlov/models/ner_ontonotes/char.dict]
2022-07-10 12:14:59.825 INFO in 'deeppavlov.models.embedders.glove_embedder'['glove_embedder'] at line 52: [loading GloVe embeddings from `/root/.deeppavlov/downloads/embeddings/glove.6B.100d.txt`]
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/local/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/base/DeepPavlov/deeppavlov/__main__.py", line 4, in <module>
    main()
  File "/base/DeepPavlov/deeppavlov/deep.py", line 113, in main
    start_model_server(pipeline_config_path, args.https, args.key, args.cert, port=args.port)
  File "/base/DeepPavlov/deeppavlov/utils/server/server.py", line 179, in start_model_server
    model = build_model(model_config)
  File "/base/DeepPavlov/deeppavlov/core/commands/infer.py", line 62, in build_model
    component = from_params(component_config, mode=mode, serialized=component_serialized)
  File "/base/DeepPavlov/deeppavlov/core/common/params.py", line 95, in from_params
    obj = get_model(cls_name)
  File "/base/DeepPavlov/deeppavlov/core/common/registry.py", line 74, in get_model
    return cls_from_str(_REGISTRY[name])
  File "/base/DeepPavlov/deeppavlov/core/common/registry.py", line 42, in cls_from_str
    return getattr(importlib.import_module(module_name), cls_name)
  File "/base/venv/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 967, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/base/DeepPavlov/deeppavlov/models/ner/network.py", line 19, in <module>
    import tensorflow as tf
  File "/base/venv/lib/python3.7/site-packages/tensorflow/__init__.py", line 99, in <module>
    from tensorflow_core import *
  File "/base/venv/lib/python3.7/site-packages/tensorflow_core/__init__.py", line 28, in <module>
    from tensorflow.python import pywrap_tensorflow  # pylint: disable=unused-import
  File "<frozen importlib._bootstrap>", line 1019, in _handle_fromlist
  File "/base/venv/lib/python3.7/site-packages/tensorflow/__init__.py", line 50, in __getattr__
    module = self._load()
  File "/base/venv/lib/python3.7/site-packages/tensorflow/__init__.py", line 44, in _load
    module = _importlib.import_module(self.__name__)
  File "/base/venv/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/base/venv/lib/python3.7/site-packages/tensorflow_core/python/__init__.py", line 52, in <module>
    from tensorflow.core.framework.graph_pb2 import *
  File "/base/venv/lib/python3.7/site-packages/tensorflow_core/core/framework/graph_pb2.py", line 16, in <module>
    from tensorflow.core.framework import node_def_pb2 as tensorflow_dot_core_dot_framework_dot_node__def__pb2
  File "/base/venv/lib/python3.7/site-packages/tensorflow_core/core/framework/node_def_pb2.py", line 16, in <module>
    from tensorflow.core.framework import attr_value_pb2 as tensorflow_dot_core_dot_framework_dot_attr__value__pb2
  File "/base/venv/lib/python3.7/site-packages/tensorflow_core/core/framework/attr_value_pb2.py", line 16, in <module>
    from tensorflow.core.framework import tensor_pb2 as tensorflow_dot_core_dot_framework_dot_tensor__pb2
  File "/base/venv/lib/python3.7/site-packages/tensorflow_core/core/framework/tensor_pb2.py", line 16, in <module>
    from tensorflow.core.framework import resource_handle_pb2 as tensorflow_dot_core_dot_framework_dot_resource__handle__pb2
  File "/base/venv/lib/python3.7/site-packages/tensorflow_core/core/framework/resource_handle_pb2.py", line 16, in <module>
    from tensorflow.core.framework import tensor_shape_pb2 as tensorflow_dot_core_dot_framework_dot_tensor__shape__pb2
  File "/base/venv/lib/python3.7/site-packages/tensorflow_core/core/framework/tensor_shape_pb2.py", line 42, in <module>
    serialized_options=None, file=DESCRIPTOR),
  File "/base/venv/lib/python3.7/site-packages/google/protobuf/descriptor.py", line 560, in __new__
    _message.Message._CheckCalledFromGeneratedFile()
TypeError: Descriptors cannot not be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).
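
If a fixed image is not yet available (see the solution below), either workaround from the message can be applied manually at container start. A minimal, untested sketch of workaround 2, reusing the docker run invocation from the question — docker's -e flag simply forwards the variable into the container, where the protobuf runtime reads it:

   # Workaround 2: force the pure-Python protobuf parser. Slower, but it
   # bypasses the generated-code version check that raises the TypeError.
   docker run -e CONFIG=ner_ontonotes -e PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python -p 5555:5000 -v ~/my_dp_components:/root/.deeppavlov -v ~/my_dp_envs:/venv deeppavlov/base-cpu

Workaround 1 (downgrading protobuf) would instead need a derived image, e.g. a Dockerfile with FROM deeppavlov/base-cpu followed by RUN /base/venv/bin/python -m pip install "protobuf<=3.20.1" — the /base/venv path is taken from the pip warning in the log above, and whether the runtime installer then leaves the pinned version alone is an assumption.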

Solution

  • The image has just been updated. Please try again. (For context: the log above shows pip resolving the unpinned requirement protobuf>=3.6.1 to protobuf 4.21.2, whose generated-code check TensorFlow 1.15's _pb2 modules fail; the refreshed image presumably constrains protobuf to a compatible release.)
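
Note that docker run reuses the locally cached copy of an image, so the fix only takes effect after an explicit re-pull:

   # Re-pull the refreshed image, then start the container as before.
   docker pull deeppavlov/base-cpu
   docker run -e CONFIG=ner_ontonotes -p 5555:5000 -v ~/my_dp_components:/root/.deeppavlov -v ~/my_dp_envs:/venv deeppavlov/base-cpu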