1936–, Mathematician, statistician
Also known as: Vladimir Naumovich Vapnik
Vladimir Naumovich Vapnik is a Russian-American mathematician who founded statistical learning theory and co-invented the support vector machine (SVM). With Alexey Chervonenkis at the Moscow Institute of Control Sciences in the 1960s and 1970s, he developed the Vapnik-Chervonenkis (VC) theory of generalisation, including the VC dimension, a measure of a hypothesis class's capacity, and the related uniform convergence bounds that justify learning from finite samples.
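As a concrete illustration (an added sketch, not drawn from this entry), one textbook form of the resulting bound is given below; here R(h) is the true risk of a hypothesis h, R̂_n(h) its empirical risk on n i.i.d. samples, d the VC dimension of the class H, and the exact constants vary between sources.

```latex
% One common form of the VC generalisation bound (constants vary by source).
% With probability at least 1 - \delta over the draw of n i.i.d. samples:
\[
  \forall h \in H:\quad
  R(h) \;\le\; \hat{R}_n(h)
  \;+\; \sqrt{\frac{d\left(\ln\frac{2n}{d} + 1\right) + \ln\frac{4}{\delta}}{n}} .
\]
```

The qualitative point is that the penalty term grows with the capacity d and shrinks as n grows, which is what licenses learning from a finite sample.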
After emigrating to the US in 1990, Vapnik joined AT&T Bell Labs, where with Bernhard Boser, Isabelle Guyon and Corinna Cortes he extended his earlier ideas into the modern SVM. The 1992 Boser-Guyon-Vapnik paper introduced the kernel trick for maximum-margin classifiers, and the 1995 Cortes-Vapnik paper "Support-Vector Networks" added the soft margin, yielding a method that was immediately useful in practice. Through the late 1990s and early 2000s SVMs were among the dominant supervised-learning methods, displacing neural networks until the deep-learning revival of the 2010s.
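To make the practical appeal concrete, here is a minimal soft-margin kernel SVM sketch using scikit-learn (a modern library, not the original Bell Labs implementation); the toy dataset and the C and kernel settings are illustrative choices, not values from the 1995 paper.

```python
# Minimal soft-margin kernel SVM sketch (scikit-learn, illustrative settings).
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# A toy problem that is not linearly separable in the input space.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# C is the soft-margin penalty from Cortes-Vapnik (1995): small C tolerates
# margin violations, large C penalises them. kernel="rbf" applies the kernel
# trick, fitting a linear separator in an implicit feature space.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
print(f"support vectors: {clf.n_support_.sum()} of {len(X_train)} points")
```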
Vapnik's 1995 book The Nature of Statistical Learning Theory and his 1998 Statistical Learning Theory gave the field its mathematical foundations. He has since worked at NEC Labs and Facebook AI Research, and has been notably critical of the deep-learning era's reliance on heuristics rather than provable generalisation guarantees.
Related people: Corinna Cortes, Bernhard Schölkopf, Leo Breiman
Works cited in this book:
- Support-vector networks (1995) (with Corinna Cortes)
Discussed in:
- Chapter 6: Machine Learning Fundamentals