1928–1971, Psychologist, neural-network pioneer
Also known as: F. Rosenblatt
Frank Rosenblatt was an American psychologist who, working at the Cornell Aeronautical Laboratory, invented the perceptron, the first practical learning algorithm for a neural network and an immediate ancestor of every supervised neural-network model in use today. His 1958 paper The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain introduced the model: a layer of weighted threshold units whose weights are adjusted by an iterative error-correction rule. He proved the perceptron convergence theorem: if the training data is linearly separable, the algorithm finds a separating hyperplane in a finite number of steps.
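The error-correction rule is simple enough to sketch in a few lines. The following is a minimal illustration, not Rosenblatt's original formulation: the toy data, learning rate, and ±1 label convention are choices made here for clarity.

```python
import numpy as np

def train_perceptron(X, y, epochs=100, lr=1.0):
    """Single-layer perceptron with labels in {+1, -1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            # Threshold unit: fire (+1) if the weighted sum crosses zero
            pred = 1 if xi @ w + b >= 0 else -1
            if pred != yi:
                # Error correction: nudge weights toward the correct side
                w += lr * yi * xi
                b += lr * yi
                errors += 1
        if errors == 0:
            break  # converged, as the theorem guarantees for separable data
    return w, b

# Linearly separable toy problem (logical AND)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
preds = [1 if xi @ w + b >= 0 else -1 for xi in X]
print(preds)  # [-1, -1, -1, 1] — all four points classified correctly
```

Because AND is linearly separable, the convergence theorem applies and the loop terminates with zero errors after a handful of passes.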
Rosenblatt first demonstrated the perceptron in simulation on an IBM 704 in 1958, then realised it in hardware as the Mark I Perceptron at the Cornell Aeronautical Laboratory: a machine with a 20×20 array of photocells and motor-driven potentiometers as adjustable weights. The New York Times covered the 1958 press conference (with characteristic restraint) as unveiling "the embryo of an electronic computer that … will be able to walk, talk, see, write, reproduce itself and be conscious of its existence." His 1962 book Principles of Neurodynamics set out the mathematical theory of single-layer and (less successfully) multi-layer perceptrons.
Marvin Minsky and Seymour Papert's 1969 book Perceptrons proved rigorous limits on what single-layer perceptrons can compute (notably that they cannot represent XOR), a result that, despite Rosenblatt's vigorous response, contributed to a decade-long pause in neural-network research. Rosenblatt drowned in a sailing accident on Chesapeake Bay in 1971 at the age of 43, more than a decade before backpropagation and multilayer perceptrons vindicated the broader research programme he had launched.
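The XOR limitation can be seen concretely: no single threshold unit w₁x₁ + w₂x₂ + b ≥ 0 reproduces XOR's outputs. The brute-force grid search below is a toy illustration of this fact, not Minsky and Papert's proof; the grid bounds are arbitrary choices.

```python
from itertools import product

def xor_separable():
    """Search a coarse weight grid for a single threshold unit computing XOR."""
    xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
    grid = [k / 2 for k in range(-8, 9)]  # weights and bias in [-4, 4], step 0.5
    for w1, w2, b in product(grid, repeat=3):
        # Does this unit's output match XOR on all four inputs?
        if all((w1 * x1 + w2 * x2 + b >= 0) == bool(t)
               for (x1, x2), t in xor.items()):
            return True
    return False

print(xor_separable())  # False — no unit on the grid computes XOR
```

The search fails for every weight setting, reflecting the geometric fact that XOR's positive and negative examples cannot be split by a single line; a hidden layer, as in later multilayer perceptrons, removes the restriction.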
Related people: Warren McCulloch, Walter Pitts, Marvin Minsky, Donald Hebb
Works cited in this book:
Discussed in:
- Chapter 1: What Is AI?, A Brief History of AI
- Chapter 9: Neural Networks, The Origins of Neural Networks