People

John Hopfield

1933–, Physicist, biologist

Also known as: John J. Hopfield

John Joseph Hopfield is an American physicist whose 1982 PNAS paper "Neural Networks and Physical Systems with Emergent Collective Computational Abilities" introduced the Hopfield network: a fully connected recurrent network of binary threshold units with symmetric weights that functions as an associative memory, retrieving stored patterns from partial or noisy cues. The paper showed that the network's dynamics minimise an explicit energy function borrowed from spin-glass physics, putting neural-network research on a quantitative mathematical footing.
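The mechanics can be illustrated in a few lines of NumPy: patterns are stored with a Hebbian outer-product rule, and asynchronous threshold updates then descend the energy function until a stored pattern is recovered. This is a minimal sketch, not Hopfield's original notation; the function names and the choice of one stored pattern with 20 corrupted bits are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hebbian(patterns):
    """Hebbian (outer-product) weights: symmetric, zero self-connections."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Hopfield energy E = -(1/2) s^T W s (no bias terms)."""
    return -0.5 * s @ W @ s

def recall(W, s, sweeps=10):
    """Asynchronous updates; each flip can only lower (or keep) the energy."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

# Store one random ±1 pattern of 100 units, then recover it from a noisy cue.
n = 100
p = rng.choice([-1.0, 1.0], size=(1, n))
W = train_hebbian(p)
cue = p[0].copy()
cue[:20] *= -1          # corrupt 20 of the 100 bits
out = recall(W, cue)    # converges back to the stored pattern
```

Because the weights are symmetric with zero diagonal, each asynchronous update is a step of energy descent, which is exactly why the stored patterns sit at local minima and act as attractors for nearby cues.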

A 1984 follow-up extended the analysis to continuous-valued units. The network's storage capacity (roughly 0.14 N patterns for a network of N units) was characterised by Amit, Gutfreund and Sompolinsky using the methods of statistical mechanics. The model is the direct ancestor of the Boltzmann machine (a Hopfield network with stochastic units; Ackley, Hinton and Sejnowski, 1985), of every subsequent energy-based model in deep learning, and of revivals such as the "modern Hopfield networks" of Ramsauer et al. (2020), which connect the model to attention mechanisms.

Hopfield's earlier scientific career was in molecular biophysics, where his 1974 paper on kinetic proofreading explained how biological copying processes achieve error rates far lower than equilibrium discrimination allows, at the cost of free energy. He shared the 2024 Nobel Prize in Physics with Geoffrey Hinton for foundational contributions to neural-network theory.

Related people: Geoffrey Hinton, Terry Sejnowski
