Tags: python-3.x, nlp, k-means, gensim, word2vec

Difference between Text Embedding and Word Embedding


I am working on a dataset of Amazon Alexa reviews and wish to cluster them into positive and negative clusters. I am using Word2Vec for vectorization, so I wanted to know the difference between Text Embedding and Word Embedding. Also, which one of them will be useful for my clustering of reviews? (Please consider that I want to predict the cluster of any review that I enter.) Thanks in advance!


Solution

  • Text Embeddings are typically a way to aggregate a number of Word Embeddings for a sentence or a paragraph of text. There are various ways this can be done. The easiest way is to average the word embeddings, though this does not necessarily yield the best results; see the sketch below.
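
    As a minimal sketch of the averaging approach (assuming gensim 4.x and a toy, hypothetical set of tokenized reviews; the helper `text_embedding` is illustrative, not a library function):

    ```python
    import numpy as np
    from gensim.models import Word2Vec

    # Toy tokenized reviews (hypothetical data, stand-ins for real Alexa reviews).
    reviews = [
        ["love", "my", "alexa", "great", "sound"],
        ["terrible", "device", "stopped", "working"],
    ]

    # Train a small Word2Vec model on the tokenized reviews.
    model = Word2Vec(sentences=reviews, vector_size=50, min_count=1, epochs=20)

    def text_embedding(tokens, wv):
        """Average the word vectors of the tokens that are in the vocabulary."""
        vectors = [wv[t] for t in tokens if t in wv]
        if not vectors:
            return np.zeros(wv.vector_size)
        return np.mean(vectors, axis=0)

    review_vec = text_embedding(["alexa", "sound", "is", "great"], model.wv)
    print(review_vec.shape)  # (50,) - one fixed-length vector per review
    ```

    Each review then becomes a single fixed-length vector that can be fed to a clustering algorithm such as k-means.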

    Application-wise: