
Why am I unable to tokenize or import tokenize from nltk?


I am receiving the ImportError below:

     1 import nltk
---->2 from nltk.tokenize import tokenize
     3 import re

ImportError: cannot import name 'tokenize' from 'nltk.tokenize'

Solution

  • nltk.tokenize does not export a function named tokenize, which is why the import fails. Import the specific tokenizer you need instead, for example:
    from nltk.tokenize import word_tokenize
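
As a quick illustration, here is a minimal sketch of how word_tokenize is typically used (the sample sentence is made up for the example, and depending on your NLTK version you may first need to download the punkt tokenizer data):

    import nltk
    from nltk.tokenize import word_tokenize

    # word_tokenize relies on the punkt tokenizer models;
    # download them once if they are not already installed
    # (newer NLTK releases may call this resource 'punkt_tab')
    nltk.download('punkt')

    text = "NLTK makes tokenizing text straightforward."
    tokens = word_tokenize(text)
    print(tokens)
    # ['NLTK', 'makes', 'tokenizing', 'text', 'straightforward', '.']

The same pattern works for sentence splitting with sent_tokenize if you need sentences rather than words.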