Glossary

Parallel Distributed Processing

Parallel Distributed Processing (PDP) is the framework set out in the 1986 two-volume work Parallel Distributed Processing: Explorations in the Microstructure of Cognition, edited by David Rumelhart, James McClelland and the PDP Research Group (a fluctuating roster including Geoffrey Hinton, Terry Sejnowski, Paul Smolensky, Michael Jordan, Jeffrey Elman and others). The two volumes, Foundations and Psychological and Biological Models, became the foundational text of the modern connectionist programme.

Core tenets

The PDP framework articulated three commitments:

  1. Cognition is parallel propagation of activation through networks of simple interconnected units. No central executive, no sequential program counter, just many simple processors operating concurrently.
  2. Representations are distributed. Concepts and percepts are encoded by patterns of activity across many units, not by individual symbol-bearing nodes. The representation of "dog" is a vector, not a single neuron.
  3. Learning is local modification of connection weights. No homunculus inside the head writing rules; learning emerges from local updates that propagate global behaviour.
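The three commitments above can be made concrete in a few lines. The following is an illustrative sketch only, not code from the volumes: a tiny one-layer pattern associator in which "dog" is a distributed activity vector across many units, and learning is a purely local Hebbian weight change.

```python
import numpy as np

# Illustrative sketch (not code from the 1986 volumes): a one-layer
# pattern associator in the PDP spirit. The concept "dog" is a
# distributed pattern of activity over many units, not one node.
dog_in  = np.array([1., 0., 1., 0., 1., 0., 1., 0.])   # input pattern
dog_out = np.array([0., 1., 1., 0., 0., 1., 0., 1.])   # target pattern

# Learning is local: each weight W[i, j] changes using only the
# activities of the two units it connects (a Hebbian outer product).
# No central executive writes rules anywhere.
lr = 0.5
W = lr * np.outer(dog_out, dog_in)

# Recall: activation propagates in parallel through the weights,
# and thresholding recovers the stored output pattern.
recall = (W @ dog_in > 0).astype(float)
print(recall)   # matches dog_out
```

The point of the sketch is the locality: the weight matrix is built entirely from pairwise unit activities, yet the global behaviour (pattern completion) emerges from the parallel matrix-vector propagation.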

These commitments stood in deliberate contrast to the reigning symbolic paradigm of cognitive science, which treated mental representations as language-like discrete symbols manipulated by rules.

Contents of the volumes

The two volumes brought together:

  • McClelland and Rumelhart's interactive-activation model of letter perception (1981), in which letters and words mutually constrain each other through bidirectional activation flow, explaining the word-superiority effect.
  • Sejnowski and Hinton's Boltzmann machine work and stochastic constraint satisfaction.
  • Rumelhart, Hinton and Williams's backpropagation chapter (companion to their 1986 Nature paper), the contribution that, more than any other, would prove transformative, introducing efficient gradient-based training of multi-layer networks to a wide audience.
  • Comprehensive expositions of pattern association, autoassociation, competitive learning, and sequence learning.
  • The famous (and controversial) past-tense morphology model (Rumelhart and McClelland), arguing that English past-tense formation could be learned without explicit rules, provoking Pinker and Prince's lengthy 1988 critique and a debate that raged through the 1990s.
  • Smolensky's harmony theory chapter and a sustained methodological case for neural-network models of cognition.
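As a hedged sketch of the technique the backpropagation chapter introduced (the layer sizes, random seed, learning rate and iteration count below are illustrative choices, not taken from the text), here is a one-hidden-layer network trained on XOR by propagating error gradients backwards through the weights:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Training data: XOR, the classic task a single-layer network cannot learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)   # output layer
lr = 1.0

for _ in range(10000):
    # Forward pass: activation propagates layer by layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: error gradients propagate back to every weight.
    d_out = (out - y) * out * (1 - out)      # delta at output units
    d_h = (d_out @ W2.T) * h * (1 - h)       # delta at hidden units
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

mse = float(np.mean((out - y) ** 2))
```

Each weight update still uses only locally available quantities (the activities and deltas of the two units a connection joins), which is why backpropagation fit so naturally into the PDP framework's commitment to local learning.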

Reception and impact

PDP was a watershed in cognitive science. It legitimised neural-network models in psychology after a generation in which symbolic and rule-based approaches had dominated, and it provided the conceptual and mathematical vocabulary that the modern deep-learning revival would build on twenty years later. The pre-AlexNet generation of deep-learning researchers (Hinton, LeCun, Bengio, Sejnowski) were direct intellectual heirs of the PDP movement; Hinton and Sejnowski were members of the original PDP Research Group, and McClelland's Stanford lab continued to produce influential connectionist cognitive models throughout the intervening decades.

The volumes also reignited the long-running connectionism vs symbolism debate, with Fodor and Pylyshyn's 1988 critique Connectionism and Cognitive Architecture responding directly to PDP and prompting decades of subsequent argument about whether distributed representations can support systematic, compositional thought, a debate that has acquired new urgency in the era of large language models.

Related terms: Connectionism, Backpropagation, Geoffrey Hinton, David Rumelhart, Terry Sejnowski, Boltzmann Machine
