•𝑛-gram Model for NLP
•Drawbacks associated with 𝑛-gram models
•Distributed representation of words as vectors
•Why word vector models?
•Learning distributed word vector representations
•A Neural Probabilistic Language Model
•New Log-linear Models
•Continuous Bag-of-words Model
•Continuous Skip-gram Model
•Analyzing language models
•Enriching Word Vectors with Subword Information
•Bag of Tricks for Efficient Text Classification
•CNN Architecture for Sentence Classification
•Recurrent Neural Network Language Model
•Sequence to Sequence Learning with Neural Networks
•Recurrent NN Encoder–Decoder for Statistical Machine Translation