Vladimir Vapnik Explained

Vladimir N. Vapnik
Birth Date: 6 December 1936
Birth Place: Tashkent, Uzbek SSR
Fields: Machine learning
Statistics
Workplaces: Facebook Artificial Intelligence Research
Vencore Labs
NEC Laboratories America
Adaptive Systems Research Department, AT&T Bell Laboratories
Royal Holloway, University of London
Columbia University
Alma Mater: Institute of Control Sciences, Russian Academy of Sciences
Uzbek State University
Doctoral Advisor: Alexander Lerner
Known For: Vapnik–Chervonenkis theory
Vapnik–Chervonenkis dimension
Support-vector machine
Support-vector clustering algorithm
Statistical learning theory
Structural risk minimization
Awards: Kolmogorov Medal (2018)
IEEE John von Neumann Medal (2017)
Kampé de Fériet Award (2014)
C&C Prize (2013)
Benjamin Franklin Medal (2012)
IEEE Frank Rosenblatt Award (2012)
IEEE Neural Networks Pioneer Award (2010)
Paris Kanellakis Award (2008)
Fellow of the U.S. National Academy of Engineering (2006)
Gabor Award, International Neural Network Society (2005)
Alexander Humboldt Research Award (2003)

Vladimir Naumovich Vapnik (Russian: Владимир Наумович Вапник; born 6 December 1936) is a computer scientist, researcher, and academic. He is one of the main developers of the Vapnik–Chervonenkis theory of statistical learning[1] and the co-inventor of the support-vector machine method and support-vector clustering algorithms.[2]

Early life and education

Vladimir Vapnik was born to a Jewish family[3] in the Soviet Union. He received his master's degree in mathematics from Uzbek State University, Samarkand, Uzbek SSR, in 1958 and his Ph.D. in statistics from the Institute of Control Sciences, Moscow, in 1964. He worked at that institute from 1961 to 1990 and became Head of the Computer Science Research Department.

Academic career

At the end of 1990, Vladimir Vapnik moved to the USA and joined the Adaptive Systems Research Department at AT&T Bell Labs in Holmdel, New Jersey. While at AT&T, Vapnik and his colleagues developed the support-vector machine (SVM), building on ideas he had first explored before moving to the USA. They demonstrated its performance on a number of problems of interest to the machine learning community, including handwriting recognition. The group later became the Image Processing Research Department of AT&T Laboratories when AT&T spun off Lucent Technologies in 1996. In 2000, Vapnik and Hava Siegelmann developed support-vector clustering, which extended the approach to categorizing unlabeled inputs and became one of the most widely used data clustering algorithms.

Vapnik left AT&T in 2002 and joined NEC Laboratories in Princeton, New Jersey, where he worked in the Machine Learning group. He has also held a position as Professor of Computer Science and Statistics at Royal Holloway, University of London, since 1995, and as Professor of Computer Science at Columbia University, New York City, since 2003.[4] As of February 1, 2021, he has an h-index of 86 and his publications have been cited 226,597 times in total.[5] His book The Nature of Statistical Learning Theory alone has been cited 91,650 times.
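As an illustration (not code from the original AT&T work), the soft-margin SVM formulation that Vapnik co-developed is available in many libraries today; a minimal sketch using scikit-learn's `SVC` class, with an arbitrary toy dataset and parameter choices, might look like this:

```python
# Illustrative sketch only: a linear soft-margin SVM on a toy 2-D dataset.
# scikit-learn's SVC implements the Cortes & Vapnik (1995) formulation;
# the data and the C value below are arbitrary choices for this example.
from sklearn.svm import SVC

# Two well-separated clusters with binary labels.
X = [[0.0, 0.0], [0.2, 0.3], [1.0, 1.0], [1.2, 0.9]]
y = [0, 0, 1, 1]

clf = SVC(kernel="linear", C=1.0)  # C controls the soft-margin trade-off
clf.fit(X, y)

print(clf.predict([[0.1, 0.1], [1.1, 1.0]]))  # one point near each cluster
```

The fitted model classifies new points by which side of the maximum-margin hyperplane they fall on; swapping the `kernel` argument (e.g. to `"rbf"`) gives the nonlinear variants described in the same line of work.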

On November 25, 2014, Vapnik joined Facebook AI Research,[6] where he is working alongside his longtime collaborators Jason Weston, Léon Bottou, Ronan Collobert, and Yann LeCun.[7] In 2016, he also joined Peraton Labs.

Honors and awards

Vladimir Vapnik was inducted into the U.S. National Academy of Engineering in 2006. He received the 2005 Gabor Award from the International Neural Network Society,[8] the 2008 Paris Kanellakis Award, the 2010 IEEE Neural Networks Pioneer Award,[9] the 2012 IEEE Frank Rosenblatt Award, the 2012 Benjamin Franklin Medal in Computer and Cognitive Science from the Franklin Institute,[10] the 2013 C&C Prize from the NEC C&C Foundation,[11] the 2014 Kampé de Fériet Award, and the 2017 IEEE John von Neumann Medal.[12] In 2018, he received the Kolmogorov Medal[13] from the University of London and delivered the Kolmogorov Lecture. In 2019, he received the BBVA Foundation Frontiers of Knowledge Award.

Notes and References

  1. Vapnik, Vladimir N. (2000). The Nature of Statistical Learning Theory. Springer. doi:10.1007/978-1-4757-3264-1. ISBN 978-1-4419-3160-3.
  2. Cortes, Corinna; Vapnik, Vladimir (1995). "Support-vector networks". Machine Learning. 20 (3): 273–297. doi:10.1007/BF00994018. ISSN 0885-6125.
  3. Vapnik, V. (2006). Estimation of Dependences Based on Empirical Data. Springer Science & Business Media. p. 424.
  4. Schölkopf, Bernhard (2013). Empirical Inference: Festschrift in Honor of Vladimir N. Vapnik. Springer. Preface. ISBN 978-3-642-41136-6. https://www.springer.com/computer/ai/book/978-3-642-41135-9
  5. "Google Scholar record of Vladimir Vapnik."
  6. "Facebook AI Research (FAIR)." Retrieved 2016-09-20. See also: "Vladimir Vapnik," Facebook Research "People" entry. Retrieved 2017-09-06.
  7. "Facebook's AI team hires Vladimir Vapnik, father of the popular support vector machine algorithm." VentureBeat. 2014.
  8. "INNS awards recipients." International Neural Network Society. 2005.
  9. "Awards recipients." IEEE Computational Intelligence Society. http://ieee-cis.org/awards/recipients/
  10. "Benjamin Franklin Medal in Computer and Cognitive Science." Franklin Institute. 2012.
  11. "NEC C&C Foundation Awards 2013 C&C Prize." NEC. 2013.
  12. "IEEE John von Neumann Medal Recipients." Institute of Electrical and Electronics Engineers (IEEE). Archived June 19, 2010: https://web.archive.org/web/20100619223921/http://ieee.org/documents/von_neumann_rl.pdf
  13. "Kolmogorov Lecture and Medal."