People

Noam Shazeer

1976–, Computer scientist

Noam Shazeer is an American computer scientist who has been one of the most consistently influential individual contributors to large-scale machine learning. A long-time Google researcher, he co-authored Attention Is All You Need (the Transformer paper, 2017) and was lead author of the Mixture-of-Experts paper Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (2017), which introduced sparse routing as an alternative scaling strategy.

He left Google in 2021 to co-found Character.AI with Daniel De Freitas, building on Google's LaMDA work to create a consumer-facing dialogue product. In 2024 he and De Freitas returned to Google as part of a major licensing deal, becoming co-leads of the Gemini large-language-model effort. He is one of the few researchers who has been at the cutting edge of large language models continuously from 2015 to the present.

Related people: Ashish Vaswani, Jeff Dean

This site is currently in Beta. Contact: Chris Paton
