References

Efficient Estimation of Word Representations in Vector Space

Tomas Mikolov, Kai Chen, Greg Corrado, & Jeffrey Dean (2013)

arXiv preprint arXiv:1301.3781.

DOI: https://doi.org/10.48550/arXiv.1301.3781

Abstract. Introduces the Word2Vec family of models (CBOW and skip-gram), which learn dense word embeddings using shallow neural networks trained on large text corpora. The resulting vectors famously exhibit linear analogical structure: king - man + woman ≈ queen.

Tags: embeddings word2vec nlp
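
The analogy result from the abstract can be reproduced with pretrained vectors. The sketch below is one possible way to do this, assuming the gensim library and its downloadable Google News word2vec model ("word2vec-google-news-300") are available; the package choice and the example score are assumptions, not part of the original paper.

    # Minimal sketch of the king - man + woman ≈ queen analogy,
    # assuming gensim and its pretrained Google News vectors.
    import gensim.downloader

    # Downloads the model on first use (~1.6 GB) and returns a KeyedVectors instance.
    vectors = gensim.downloader.load("word2vec-google-news-300")

    # Vector arithmetic: nearest word to (king - man + woman), input words excluded.
    result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
    print(result)  # e.g. [('queen', 0.71...)]
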
