Abstract. Introduces the Word2Vec family of models (CBOW and skip-gram), which learn dense word embeddings using shallow neural networks trained on large text corpora. The resulting vectors famously exhibit linear analogical structure: king - man + woman ≈ queen.
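The analogy arithmetic can be sketched with a toy example. The vectors below are hand-picked 2D illustrations (one axis loosely "gender", one loosely "royalty"), not real learned embeddings, which are typically 100-300 dimensional; the point is only to show the vec(king) - vec(man) + vec(woman) nearest-neighbour computation:

```python
import numpy as np

# Toy hand-crafted "embeddings" -- illustrative only, not trained vectors.
# Axis 0 ~ gender, axis 1 ~ royalty.
vectors = {
    "man":    np.array([ 1.0,  0.0]),
    "woman":  np.array([-1.0,  0.0]),
    "king":   np.array([ 1.0,  1.0]),
    "queen":  np.array([-1.0,  1.0]),
    "prince": np.array([ 1.0,  0.8]),
    "apple":  np.array([ 0.0, -1.0]),
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    # Word nearest to vec(a) - vec(b) + vec(c) by cosine similarity,
    # excluding the three query words themselves (standard practice
    # in analogy evaluation).
    target = vectors[a] - vectors[b] + vectors[c]
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(target, candidates[w]))

print(analogy("king", "man", "woman"))  # queen
```

With trained embeddings the same query is a one-liner in gensim, e.g. `model.wv.most_similar(positive=["king", "woman"], negative=["man"])`.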
Tags: embeddings, word2vec, nlp