Abstract. Presents the first neural probabilistic language model: it concatenates the word embeddings of a fixed-size context window and passes them through a feedforward neural network to predict the next word. The paper introduced learned word embeddings into language modelling.
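The forward pass described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: all sizes, weight names, and the random initialisation are assumptions, and the paper's optional direct connections from the input to the output layer are omitted.

```python
import numpy as np

# Illustrative sketch of a feedforward neural language model forward pass.
# Sizes below are hypothetical, chosen for readability.
rng = np.random.default_rng(0)

V, d, n, h = 50, 8, 3, 16   # vocab size, embedding dim, context length, hidden units

C = rng.normal(scale=0.1, size=(V, d))      # embedding matrix (learned during training)
H = rng.normal(scale=0.1, size=(n * d, h))  # hidden-layer weights
U = rng.normal(scale=0.1, size=(h, V))      # output-layer weights
b_h = np.zeros(h)
b_o = np.zeros(V)

def next_word_probs(context_ids):
    """Concatenate the context-word embeddings, then apply the feedforward net."""
    x = C[context_ids].reshape(-1)           # (n*d,) concatenated embeddings
    a = np.tanh(x @ H + b_h)                 # hidden layer
    logits = a @ U + b_o
    e = np.exp(logits - logits.max())        # numerically stable softmax
    return e / e.sum()

p = next_word_probs([4, 17, 9])              # hypothetical context of n=3 word ids
print(p.shape)                               # distribution over the V-word vocabulary
```

In training, the embedding matrix `C` is updated by backpropagation along with the network weights, which is how the model learns the word representations the summary refers to.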
Tags: language-models, embeddings, nlm, url-only