Tags: machine-learning, nlp, word2vec, word-embedding

Fine-tuning GloVe Embeddings


Has anyone tried to fine-tune GloVe embeddings on a domain-specific corpus?
Fine-tuning word2vec embeddings has proven very effective for me in various NLP tasks, but I am wondering whether generating a co-occurrence matrix from my domain-specific corpus and training GloVe embeddings (initialized with the pre-trained embeddings) on it would yield similar improvements, along the lines of the sketch below.
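Concretely, I had something like the following rough sketch in mind for the co-occurrence counts (the window size and the toy corpus are just placeholders for illustration):

```python
# Rough sketch of building a word co-occurrence matrix from a tokenized
# domain corpus. The window size and the toy corpus are placeholders.
import numpy as np

def build_cooccurrence(tokenized_docs, window=5):
    vocab = sorted({token for doc in tokenized_docs for token in doc})
    index = {word: i for i, word in enumerate(vocab)}
    matrix = np.zeros((len(vocab), len(vocab)), dtype=np.float64)
    for doc in tokenized_docs:
        for i, word in enumerate(doc):
            lo, hi = max(0, i - window), min(len(doc), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    matrix[index[word], index[doc[j]]] += 1.0
    return matrix, vocab

# Placeholder corpus; in practice this would be the tokenized domain corpus.
docs = [["patient", "presented", "with", "acute", "chest", "pain"]]
cooccurrence, vocab = build_cooccurrence(docs)
```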


Solution

  • I myself am trying to do the exact same thing. You can try mittens.

    They have successfully built a framework for it. Christopher D. Manning (a co-author of GloVe) is associated with it; a minimal sketch of the API is shown below.
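
A sketch of what the fine-tuning step could look like with the mittens package (`pip install mittens`). The file names, the 50-dimensional vectors, and the `glove2dict` helper are illustrative assumptions, not part of the original answer; only the `Mittens(...)`/`fit(...)` calls follow the package's documented API:

```python
# Fine-tune pretrained GloVe vectors on a domain co-occurrence matrix with mittens.
import csv
import numpy as np
from mittens import Mittens

def glove2dict(glove_path):
    """Load GloVe vectors in the standard text format into a {word: vector} dict."""
    with open(glove_path, encoding="utf-8") as f:
        reader = csv.reader(f, delimiter=" ", quoting=csv.QUOTE_NONE)
        return {row[0]: np.array(row[1:], dtype=np.float64) for row in reader}

# Assumed inputs: the domain co-occurrence matrix and the word list giving its
# row/column order (e.g. built along the lines of the question's sketch).
cooccurrence = np.load("domain_cooccurrence.npy")
vocab = open("domain_vocab.txt", encoding="utf-8").read().split()

pretrained = glove2dict("glove.6B.50d.txt")   # pretrained vectors; dimension must match n below

mittens_model = Mittens(n=50, max_iter=1000)  # n = embedding dimension
new_embeddings = mittens_model.fit(
    cooccurrence,                             # co-occurrence counts from the domain corpus
    vocab=vocab,                              # words in the same order as the matrix rows
    initial_embedding_dict=pretrained)        # warm start from the pretrained GloVe vectors

fine_tuned = dict(zip(vocab, new_embeddings))
```

Words that appear in your corpus but not in the pretrained vectors are simply trained from scratch, while words present in both are pulled toward their pretrained vectors, which is exactly the "initialize with pre-trained embeddings" behaviour asked about.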