Vladimir N. Vapnik
- Associated organizations
- AT&T, NEC Laboratories
- Fields of study
- Artificial intelligence
Biography
Vladimir N. Vapnik’s pioneering work became the foundation of a new research field known as “statistical learning theory” that has transformed how computers learn to tackle complex problems. Working with Alexey Chervonenkis in Moscow during the late 1960s and early 1970s, Dr. Vapnik developed Vapnik-Chervonenkis (VC) learning theory. This theory established a fundamental quantity, now known as the VC dimension, that characterizes the capacity and limitations of learning machines. Dr. Vapnik later developed structural risk minimization, a principle for controlling the generalization behavior that VC theory describes. Dr. Vapnik’s research was largely unknown in the West until his arrival in the United States shortly before the collapse of the Soviet Union. Working with AT&T Laboratories in Holmdel, NJ, during the 1990s, he put his theories into practice with support vector machine (SVM) algorithms for recognizing complex patterns in data in classification and regression tasks. SVMs became one of the most widely used methods in machine learning.
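The SVM algorithms Dr. Vapnik helped develop at AT&T are now available in standard libraries. Below is a minimal sketch of training an SVM classifier, assuming the scikit-learn library and its bundled iris dataset (neither of which is part of this biography); it is illustrative only, not Dr. Vapnik’s original implementation.

```python
# Minimal sketch of SVM classification, assuming scikit-learn is installed.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Load a small, well-known dataset and split it into train/test sets.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Fit a soft-margin SVM with an RBF kernel; the parameter C trades off
# margin width against training errors.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
```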
A member of the U.S. National Academy of Engineering and a Fellow of NEC Laboratories America, Dr. Vapnik is currently a professor at Columbia University in New York.