People

Yann LeCun

1960–, Computer scientist

Yann LeCun is a French-American computer scientist whose late-1980s work at AT&T Bell Labs introduced the convolutional neural network (CNN) for image processing. His 1989 paper Backpropagation Applied to Handwritten Zip Code Recognition trained a CNN to read US Postal Service ZIP codes, and the 1998 paper Gradient-Based Learning Applied to Document Recognition (with Léon Bottou, Yoshua Bengio and Patrick Haffner) introduced LeNet-5, whose template of alternating convolutional and pooling layers followed by fully connected layers shaped nearly every CNN built between then and AlexNet's 2012 result.
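The LeNet-5 template described above can be sketched by tracing feature-map sizes through the stack. A minimal illustration, using the layer sizes from the 1998 paper; the helper functions are this sketch's own, not from any library:

```python
# Feature-map sizes through a LeNet-5-style stack:
# alternating convolution and pooling, then fully connected layers.

def conv_out(size, kernel):
    """Spatial size after a 'valid' convolution (stride 1, no padding)."""
    return size - kernel + 1

def pool_out(size, window):
    """Spatial size after non-overlapping pooling."""
    return size // window

size = 32                  # LeNet-5 input: a 32x32 image
size = conv_out(size, 5)   # C1: 5x5 conv -> 28x28, 6 maps
size = pool_out(size, 2)   # S2: 2x2 pool -> 14x14
size = conv_out(size, 5)   # C3: 5x5 conv -> 10x10, 16 maps
size = pool_out(size, 2)   # S4: 2x2 pool -> 5x5
flat = size * size * 16    # 400 inputs to the fully connected stage
print(flat)                # 400, then FC layers of 120, 84 and 10 units
```

Each convolution shrinks the spatial extent while adding feature maps, and each pooling step halves it, which is the alternation the entry refers to.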

LeCun also contributed to the early backpropagation literature (his mid-1980s doctoral work in Paris independently developed a form of backprop), to convolutional and recurrent network theory, to modern energy-based models, and to self-supervised learning (his post-2020 advocacy for joint embedding architectures has become an influential alternative to autoregressive LLM scaling). He created the MNIST dataset (1998) and many other standard benchmarks.
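The backpropagation idea mentioned above, applying the chain rule layer by layer to compute weight gradients, can be shown in a few lines. A minimal sketch for one hidden layer; the network sizes, learning rate and toy data are invented for illustration:

```python
# Backpropagation on a tiny one-hidden-layer network (toy example).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))         # 4 toy samples, 3 features
y = rng.normal(size=(4, 1))         # toy regression targets
W1 = rng.normal(size=(3, 5)) * 0.1  # input -> hidden weights
W2 = rng.normal(size=(5, 1)) * 0.1  # hidden -> output weights

def mse():
    return float(((np.tanh(x @ W1) @ W2 - y) ** 2).mean())

loss_before = mse()
for _ in range(200):
    h = np.tanh(x @ W1)             # forward pass: hidden activations
    err = h @ W2 - y                # dLoss/dprediction for squared error
    gW2 = h.T @ err                 # chain rule back to W2
    gh = err @ W2.T * (1 - h**2)    # back through the tanh nonlinearity
    gW1 = x.T @ gh                  # chain rule back to W1
    W2 -= 0.01 * gW2                # gradient-descent update
    W1 -= 0.01 * gW1
loss_after = mse()
```

The two `mse()` calls bracket the training loop so the loss reduction from the gradient steps is visible.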

LeCun moved to NYU in 2003 and joined Facebook (now Meta) in 2013 to lead its AI research lab, later serving as Chief AI Scientist. He left Meta in November 2025 to launch a new laboratory focused on world-model architectures. He shared the 2018 Turing Award with Geoffrey Hinton and Yoshua Bengio for their joint work on deep learning. He has been a vocal sceptic of pure-LLM approaches to AGI and an advocate for grounded, world-model-based architectures.

Related people: Geoffrey Hinton, Yoshua Bengio

Works cited in this book:

Discussed in:

This site is currently in Beta. Contact: Chris Paton
