References

Deep Contextualized Word Representations

Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, & Luke Zettlemoyer (2018)

Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), 2227-2237.

DOI: https://doi.org/10.18653/v1/n18-1202

Abstract. Introduces ELMo (Embeddings from Language Models), which derives contextualised word representations from the internal states of a deep bidirectional LSTM language model pre-trained on a large corpus. Because each token's embedding is a function of the entire input sentence, the same word receives different vectors in different contexts. Adding ELMo representations to existing models significantly improved the state of the art across a range of NLP benchmarks, including question answering, textual entailment, and sentiment analysis.
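The paper's central mechanism, a task-specific weighted combination of the biLM's layer representations (ELMo_k = γ · Σ_j s_j · h_{k,j}, with softmax-normalised weights s and a scalar γ), can be sketched in a few lines. This is a minimal NumPy illustration of the formula only; the layer vectors and dimensions below are made up, not outputs of a real biLM:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def elmo_combine(layer_reps, s_logits, gamma=1.0):
    """Task-specific ELMo representation for one token:
    gamma * sum_j s_j * h_j, where s = softmax(s_logits)
    over the L+1 biLM layers (here just a list of vectors)."""
    s = softmax(s_logits)
    return gamma * sum(w * h for w, h in zip(s, layer_reps))

# Toy example: 3 "biLM layers", each a 4-dim vector for one token.
rng = np.random.default_rng(0)
layers = [rng.standard_normal(4) for _ in range(3)]

# Zero logits give uniform weights, so the result is
# gamma times the mean of the layer vectors.
vec = elmo_combine(layers, s_logits=np.zeros(3), gamma=0.5)
```

In the paper both `s_logits` and `gamma` are learned jointly with the downstream task model, while the biLM layers themselves stay frozen.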

Tags: embeddings elmo contextualised nlp
