Tags: python-3.x, deep-learning, nlp

Can't import bert.tokenization


I am using Google Colab, and the following import fails:

from bert.tokenization import FullTokenizer

I am getting this error:

ModuleNotFoundError: No module named 'bert.tokenization'

I tried to install bert by running the following command:

!pip install --upgrade bert

Any idea how to resolve this error?


Solution

  • I found it: the `bert.tokenization` module is shipped by the bert-tensorflow package, not by the bert package:

    !pip install bert-tensorflow
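To summarize the fix as a minimal sketch (the verification step is my addition, not from the original answer): install bert-tensorflow instead of bert, then check that the import resolves. Note that bert-tensorflow targets TensorFlow 1.x, so the import may still fail on environments that only have TensorFlow 2.x installed.

```shell
# The PyPI package "bert" is unrelated; "bert-tensorflow" provides bert.tokenization.
pip install bert-tensorflow

# Verify that the module now imports (prints nothing on success, raises on failure).
python -c "from bert.tokenization import FullTokenizer"
```

In a Colab notebook, prefix the install command with `!` (i.e. `!pip install bert-tensorflow`) and then run the `from bert.tokenization import FullTokenizer` import in a cell as before.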