Peter Richtarik Explained
Peter Richtarik is a Slovak mathematician and computer scientist[1] working in big data optimization and machine learning, known for his work on randomized coordinate descent algorithms, stochastic gradient descent, and federated learning. He is currently a Professor of Computer Science at King Abdullah University of Science and Technology (KAUST).
Education
Richtarik earned a master's degree in mathematics, summa cum laude, from Comenius University, Slovakia, in 2001.[2] In 2007, he obtained a PhD in operations research from Cornell University, advised by Michael Jeremy Todd.[3][4]
Career
Between 2007 and 2009, he was a postdoctoral scholar in the Center for Operations Research and Econometrics (CORE) and the Department of Mathematical Engineering at Universite catholique de Louvain, Belgium, working with Yurii Nesterov.[5][6] Between 2009 and 2019, Richtarik was a Lecturer and later a Reader in the School of Mathematics at the University of Edinburgh. He is a Turing Fellow.[7] Richtarik founded and organizes a conference series entitled "Optimization and Big Data".[8][9]
Academic work
Richtarik's early research concerned gradient-type methods, optimization in relative scale, sparse principal component analysis, and algorithms for optimal design. Since his appointment at Edinburgh, he has worked extensively on the algorithmic foundations of randomized methods in convex optimization, especially randomized coordinate descent and stochastic gradient descent methods. These methods are well suited to optimization problems arising from big data and have applications in fields such as machine learning, signal processing, and data science.[10][11] Richtarik is the co-inventor of an algorithm generalizing the randomized Kaczmarz method for solving systems of linear equations, contributed to the invention of federated learning, and co-developed a stochastic variant of Newton's method.
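To illustrate the flavor of these randomized methods, the following is a minimal sketch of the classical randomized Kaczmarz iteration for a consistent linear system, with rows sampled in proportion to their squared norms. This is a textbook special case, not the full sketch-and-project framework of Gower and Richtarik; the function name and parameters are illustrative only.

```python
import numpy as np

def randomized_kaczmarz(A, b, iterations=2000, seed=0):
    """Approximately solve a consistent system Ax = b.

    Each step samples a row i (with probability proportional to
    ||a_i||^2) and projects the current iterate onto the hyperplane
    a_i . x = b_i. This is the basic randomized Kaczmarz method; the
    generalization co-invented by Richtarik recovers it as a special
    case of a broader sketch-and-project scheme.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum("ij,ij->i", A, A)
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(iterations):
        i = rng.choice(m, p=probs)
        a_i = A[i]
        # Orthogonal projection onto the i-th hyperplane.
        x += (b[i] - a_i @ x) / row_norms_sq[i] * a_i
    return x

# Usage: a small consistent 3x2 system with a known solution.
A = np.array([[2.0, 1.0], [1.0, 3.0], [1.0, -1.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true
x = randomized_kaczmarz(A, b)
```

Because each step touches only one row of A, the per-iteration cost is O(n) rather than O(mn), which is the property that makes this family of methods attractive for very large systems.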
Awards and distinctions
Bibliography
- Peter Richtarik and Martin Takac (2012). "Efficient serial and parallel coordinate descent methods for huge-scale truss topology design". In Operations Research Proceedings 2011, pp. 27–32. Springer-Verlag. doi:10.1007/978-3-642-29210-1_5. ISBN 978-3-642-29209-5.
- Peter Richtarik and Martin Takac (2014). "Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function". Mathematical Programming, 144(1), 1–38. Springer. doi:10.1007/s10107-012-0614-z. arXiv:1107.2848.
- Olivier Fercoq and Peter Richtarik (2015). "Accelerated, parallel and proximal coordinate descent". SIAM Journal on Optimization, 25(4), 1997–2023. doi:10.1137/130949993. arXiv:1312.5799.
- Dominik Csiba, Zheng Qu, and Peter Richtarik (2015). "Stochastic Dual Coordinate Ascent with Adaptive Probabilities". Proceedings of the 32nd International Conference on Machine Learning, pp. 674–683.
- Robert M. Gower and Peter Richtarik (2015). "Randomized Iterative Methods for Linear Systems". SIAM Journal on Matrix Analysis and Applications, 36(4), 1660–1690. doi:10.1137/15M1025487.
- Peter Richtarik and Martin Takac (2016). "Parallel coordinate descent methods for big data optimization". Mathematical Programming, 156(1), 433–484. doi:10.1007/s10107-015-0901-6.
- Zheng Qu and Peter Richtarik (2016). "Coordinate descent with arbitrary sampling I: algorithms and complexity". Optimization Methods and Software, 31(5), 829–857. doi:10.1080/10556788.2016.1190360. arXiv:1412.8060.
- Zheng Qu and Peter Richtarik (2016). "Coordinate descent with arbitrary sampling II: expected separable overapproximation". Optimization Methods and Software, 31(5), 858–884. doi:10.1080/10556788.2016.1190361. arXiv:1412.8063.
- Zheng Qu, Peter Richtarik, Martin Takac, and Olivier Fercoq (2016). "SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization". Proceedings of the 33rd International Conference on Machine Learning, pp. 1823–1832.
- Zeyuan Allen-Zhu, Zheng Qu, Peter Richtarik, and Yang Yuan (2016). "Even faster accelerated coordinate descent using non-uniform sampling". Proceedings of the 33rd International Conference on Machine Learning, pp. 1110–1119.
- Dominik Csiba and Peter Richtarik (2016). "Importance sampling for minibatches". arXiv:1602.02283 [cs.LG].
- Dominik Csiba and Peter Richtarik (2016). "Coordinate descent face-off: primal or dual?". arXiv:1605.08982 [math.OC].
External links
Notes and References
1. Richtarik's DBLP profile. Retrieved December 23, 2020.
2. Richtarik's CV. Retrieved August 21, 2016.
3. Mathematics Genealogy Project. Retrieved August 20, 2016.
4. Cornell PhD thesis. Retrieved August 22, 2016.
5. Postdoctoral Fellows at CORE. Retrieved August 22, 2016.
6. Simons Institute for the Theory of Computing, UC Berkeley. Retrieved August 22, 2016.
7. Alan Turing Institute Faculty Fellows. Retrieved August 22, 2016.
8. Optimization and Big Data 2012. Retrieved August 20, 2016.
9. Optimization and Big Data 2015. Retrieved August 20, 2016.
10. Cathy O'Neil and Rachel Schutt (2013). "Modeling and Algorithms at Scale". In Doing Data Science: Straight Talk from the Frontline. O'Reilly. ISBN 9781449358655. Retrieved August 21, 2016.
11. Sebastien Bubeck (2015). Convex Optimization: Algorithms and Complexity. Foundations and Trends in Machine Learning. Now Publishers. ISBN 978-1601988607.
12. Google Scholar. Retrieved December 28, 2020.
13. The h Index for Computer Science. Retrieved December 28, 2020.
14. SIGEST Award. Retrieved August 20, 2016.
15. EPSRC Fellowship. Retrieved August 21, 2016.
16. EUSA Awards 2015. Retrieved August 20, 2016.
17. 46th Conference of Slovak Mathematicians. Retrieved August 22, 2016.