Tags: python, nlp, word2vec, word-embedding

Where can I download a pretrained word2vec map?


I have been learning about NLP models and came across word embeddings, and I saw examples where relations between words can be explored by computing dot products of their vectors.

What I am looking for is just a dictionary mapping words to their representative vectors, so I can play around with it. I know I could build and train a model myself to create my own map, but I just want an already-trained map as a Python variable.


Solution

  • You can try out Google's pretrained word2vec model, trained on part of the Google News dataset (about 100 billion words). It contains 300-dimensional vectors for 3 million words and phrases, and the file (GoogleNews-vectors-negative300.bin) can be loaded into a dictionary-like object with gensim's KeyedVectors.load_word2vec_format.

    An interesting property of word vectors: w2v(king) - w2v(man) + w2v(woman) ≈ w2v(queen)
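Once you have such a word-to-vector map, the analogy above can be checked with plain vector arithmetic and cosine similarity. Here is a minimal sketch using hand-made toy vectors (the values are invented for illustration and are not real word2vec embeddings, which are typically 300-dimensional):

```python
import numpy as np

# Toy embeddings, invented for this example; a real pretrained map
# would come from a file such as GoogleNews-vectors-negative300.bin.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.5, 0.1, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9]),
    "queen": np.array([0.9, 0.8, 0.9]),
    "apple": np.array([0.1, 0.9, 0.4]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# king - man + woman should land closest to queen.
target = vectors["king"] - vectors["man"] + vectors["woman"]
best = max((w for w in vectors if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, vectors[w]))
print(best)  # queen
```

With real pretrained vectors the same arithmetic works, except the nearest neighbour is found over millions of words rather than a handful.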