I was looking for all the models developed for NLP from word2vec to the present. I am thinking of writing a detailed article covering the timeline of NLP models. Please help me here. It would be great if the answer could give, for each entry: the model name, the year published, a link to the paper, and a short summary of the model.
This repository contains landmark research papers in Natural Language Processing that came out in this century.
Efficient Estimation of Word Representations in Vector Space (word2vec), 2013, Google
Distributed Representations of Words and Phrases and their Compositionality, 2013, Google
Distributed Representations of Sentences and Documents (doc2vec), 2014, Google
Enriching Word Vectors with Subword Information (fastText), 2016, Facebook
Bag of Tricks for Efficient Text Classification (fastText classifier), 2016, Facebook
Hierarchical Probabilistic Neural Network Language Model, 2005
A Scalable Hierarchical Distributed Language Model, 2008
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2018, Google
Language Models are Unsupervised Multitask Learners (GPT-2), 2019, OpenAI
Wav2Letter: an End-to-End ConvNet-based Speech Recognition System, 2016, Facebook
Misspelling Oblivious Word Embeddings, 2019, Facebook
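To make the word2vec entries above concrete, here is a minimal illustrative sketch (not code from this repository) of how the skip-gram variant in "Efficient Estimation of Word Representations in Vector Space" turns raw text into training examples: each word is paired with the words inside a small context window, and the model learns embeddings by predicting context from center.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) training pairs for a token sequence.

    This only sketches the data-preparation step of skip-gram;
    the actual papers then train embeddings on these pairs with
    hierarchical softmax or negative sampling.
    """
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "cat", "sat"], window=1))
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```

The `window` parameter here corresponds to the context-window size hyperparameter discussed in the word2vec papers; larger windows capture more topical similarity, smaller ones more syntactic similarity.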
Refer to this repo: https://github.com/Akshat4112/NLP-research-papers
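The fastText entry ("Enriching Word Vectors with Subword Information") represents a word as the sum of its character n-gram vectors, so rare or misspelled words still share subword vectors with known words. A short sketch of that decomposition (illustrative only, not taken from the repository), using the paper's own example word "where" with n = 3:

```python
def char_ngrams(word, n=3):
    """Return the character n-grams of a word, wrapped in the
    boundary markers '<' and '>' as in the fastText paper."""
    w = "<" + word + ">"
    return [w[i:i + n] for i in range(len(w) - n + 1)]

print(char_ngrams("where"))
# ['<wh', 'whe', 'her', 'ere', 're>']
```

Note that the boundary markers let the model distinguish, say, the trigram "her" inside "where" from the standalone word "her", which would be represented as "<her>".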