While learning R, I have been asked to use the "quanteda" package and apply its "tokens" function. Unfortunately, when I try to do so, I get the message
Error: could not find function "tokens".
I can, however, use "tokenize", for example.
My code is:
train.tokens <- tokens(train$Text, what = "word", remove_numbers = TRUE,
                       remove_punct = TRUE, remove_symbols = TRUE,
                       remove_hyphens = TRUE)
As a side note, when I try to update the quanteda package automatically, it reports that I have version 0.9.8.3 and that the newest available version is 0.9.8.5, but after the update nothing changes.
Thank you!
You need to make sure you have a current version of quanteda and the packages that it imports. Then this will work fine:
> quanteda::tokens("This is a test")
tokens from 1 document.
text1 :
[1] "This" "is" "a" "test"
> packageVersion("quanteda")
[1] ‘0.99.22’
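
If tokens() still cannot be found after updating, the old version may still be loaded in your R session. A minimal sketch of reinstalling and verifying, assuming a configured CRAN mirror (the exact steps are a suggestion, not part of the original answer):

# Reinstall the released version of quanteda from CRAN
install.packages("quanteda")

# Restart R so the previously loaded namespace is cleared, then verify
library(quanteda)
packageVersion("quanteda")                      # should report a recent version
"tokens" %in% getNamespaceExports("quanteda")   # TRUE once tokens() is exported

After that, the tokens() call from the question should run as written.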