Yee-Whye Teh
Alma mater: University of Waterloo (BMath); University of Toronto (PhD)
Thesis title: Bethe free energy and contrastive divergence approximations for undirected graphical models
Thesis year: 2003
Thesis URL: https://hdl.handle.net/1807/122253
Known for: Hierarchical Dirichlet process; deep belief networks
Fields: Machine learning; artificial intelligence; statistics; computer science
Institutions: University of Oxford; DeepMind; University College London; University of California, Berkeley; National University of Singapore
Yee-Whye Teh is a professor of statistical machine learning in the Department of Statistics at the University of Oxford.[1] Prior to 2012, he was a reader at the Gatsby Computational Neuroscience Unit at University College London.[2] His work is primarily in machine learning, artificial intelligence, statistics, and computer science.
Teh was educated at the University of Waterloo and the University of Toronto, where he was awarded a PhD in 2003 for research supervised by Geoffrey Hinton.[3]
Teh was a postdoctoral fellow at the University of California, Berkeley, and at the National University of Singapore before joining University College London as a lecturer.[4]
Teh was one of the original developers of deep belief networks and of hierarchical Dirichlet processes.
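For readers unfamiliar with the model, the hierarchical Dirichlet process is usually presented as a global random measure drawn from a Dirichlet process, with group-specific measures drawn from it in turn. The sketch below uses generic textbook notation (H is a base measure, γ and α₀ are concentration parameters, F is an observation likelihood) rather than the notation of any particular paper.

```latex
% Schematic generative description of a hierarchical Dirichlet process
% mixture model; notation is generic, not tied to a specific source.
\begin{align*}
  G_0 &\sim \mathrm{DP}(\gamma, H)
      && \text{global random measure shared by all groups} \\
  G_j \mid G_0 &\sim \mathrm{DP}(\alpha_0, G_0)
      && \text{measure for group } j = 1, \dots, J \\
  \theta_{ji} \mid G_j &\sim G_j
      && \text{component parameter for observation } i \text{ in group } j \\
  x_{ji} \mid \theta_{ji} &\sim F(\theta_{ji})
      && \text{observed data point}
\end{align*}
```

Because the global measure G_0 is discrete with probability one, the group-level measures G_j reuse its atoms, so mixture components are shared across groups while each group keeps its own mixing proportions.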
Teh was a keynote speaker at the Conference on Uncertainty in Artificial Intelligence (UAI) in 2019 and was invited to give the Breiman lecture at the Conference on Neural Information Processing Systems (NeurIPS) in 2017.[5] In 2017 he also served as program co-chair of the International Conference on Machine Learning (ICML), one of the premier conferences in machine learning.