References

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, & Illia Polosukhin (2017)

arXiv.

DOI: https://doi.org/10.48550/arXiv.1706.03762

Abstract. Introduces the Transformer, an encoder-decoder architecture based entirely on self-attention, with no recurrence or convolution. The Transformer achieved state-of-the-art results in machine translation and became the foundational architecture for modern language, vision, and multimodal AI.
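The paper's core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch of that formula (illustrative only; the input shapes and variable names are assumptions, not the paper's implementation, and masking and multi-head projection are omitted):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al., 2017).

    Q, K, V: arrays of shape (seq_len, d_k) / (seq_len, d_k) / (seq_len, d_v).
    Returns the attended values and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    # Similarity scores, scaled by sqrt(d_k) to keep softmax gradients stable
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Hypothetical usage with random queries, keys, and values
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value rows, with mixing weights given by the softmaxed query-key similarities; the paper applies this in parallel across multiple heads.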

Tags: transformer attention foundational

Cited in:
