1963–, Computer scientist
Also known as: Robert E. Schapire
Robert Elias Schapire is an American computer scientist who, in 1990, proved the boosting theorem: any "weak" learning algorithm, one whose hypotheses perform only slightly better than random guessing, can be converted into a "strong" learner that achieves arbitrarily small error. The theorem answered a question Michael Kearns and Leslie Valiant had posed in 1988.
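In PAC-learning terms the result can be paraphrased roughly as follows (a loose restatement, not the paper's exact formulation), where c is the target concept, D the example distribution, h a weak hypothesis, and H the boosted hypothesis:

```latex
\Pr_{x \sim D}\!\left[h(x) \neq c(x)\right] \le \tfrac{1}{2} - \gamma
\quad\Longrightarrow\quad
\Pr_{x \sim D}\!\left[H(x) \neq c(x)\right] \le \varepsilon
\quad \text{for any } \varepsilon > 0,
```

for some fixed margin of advantage γ > 0 over random guessing, using polynomially many calls to the weak learner.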
With Yoav Freund he developed AdaBoost (Adaptive Boosting) in 1995, the first practical boosting algorithm. AdaBoost iteratively trains weak learners on reweighted versions of the training data, increasing the weight of the examples the previous learner misclassified, and combines their predictions into a weighted majority vote (a sketch follows). AdaBoost was an immediate practical success, and boosting remains central to later systems such as XGBoost and LightGBM, which dominate tabular-data competitions on Kaggle.
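A minimal sketch of that loop, using scikit-learn decision stumps as the weak learners; the function names fit_adaboost and predict_adaboost are illustrative, not from the 1997 paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_adaboost(X, y, n_rounds=50):
    """y: NumPy array of -1/+1 labels. Returns a list of (vote weight, stump) pairs."""
    y = np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                     # start with uniform example weights
    ensemble = []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)        # train on the reweighted data
        pred = stump.predict(X)
        err = np.sum(w[pred != y])              # weighted training error this round
        if err == 0 or err >= 0.5:              # stop if no longer a useful weak learner
            break
        alpha = 0.5 * np.log((1 - err) / err)   # this learner's vote weight
        w *= np.exp(-alpha * y * pred)          # raise weight of misclassified examples
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict_adaboost(ensemble, X):
    """Weighted majority vote of the weak learners."""
    scores = sum(alpha * stump.predict(X) for alpha, stump in ensemble)
    return np.sign(scores)
```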
Schapire and Freund won the 2003 Gödel Prize and the 2004 ACM Paris Kanellakis Prize for the boosting work. Schapire has spent his career at AT&T Labs, Princeton University, and Microsoft Research New York.
Related people: Yoav Freund, Leo Breiman
Works cited in this book:
- A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting (1997) (with Yoav Freund)
Discussed in:
- Chapter 7: Supervised Learning