| Thomas M. Cover | |
| --- | --- |
| Born | August 7, 1938, San Bernardino, California, U.S. |
| Died | March 26, 2012, Palo Alto, California, U.S. |
| Fields | Information theory, electrical engineering, statistics, pattern recognition |
| Institution | Stanford University |
| Alma mater | Massachusetts Institute of Technology (BS); Stanford University (MS, PhD) |
| Doctoral students | Joy A. Thomas, Mohammad Reza Aref, Martin Hellman, Peter E. Hart, Abbas El Gamal |
| Known for | Information theory, nearest neighbors algorithm, Cover's theorem |
| Awards | IEEE Fellow (1974), IMS Fellow (1981), Claude E. Shannon Award (1990), AAAS Fellow (1991), Member of the National Academy of Engineering (1995) |
| Thesis | Geometrical and Statistical Properties of Linear Threshold Devices (1964) |
| Thesis URL | https://isl.stanford.edu/~cover/papers/paper1.pdf |
| Doctoral advisor | Norman Abramson |
Thomas M. Cover [ˈkoʊvər] (August 7, 1938 – March 26, 2012) was an American information theorist and professor jointly in the Departments of Electrical Engineering and Statistics at Stanford University. He devoted almost his entire career to developing the relationship between information theory and statistics.
He received his B.S. in physics from MIT in 1960 and his Ph.D. in electrical engineering from Stanford University in 1964. His doctoral studies were supervised by Norman Abramson.[1]
Cover was President of the IEEE Information Theory Society and a Fellow of both the Institute of Mathematical Statistics and the Institute of Electrical and Electronics Engineers. He received the Outstanding Paper Award in Information Theory for his 1972 paper "Broadcast Channels"; in 1990 he was selected as the Shannon Lecturer, regarded as the highest honor in information theory; in 1997 he received the IEEE Richard W. Hamming Medal;[2] and in 2003 he was elected to the American Academy of Arts and Sciences.
During his 48-year career as a professor of electrical engineering and statistics at Stanford University, he supervised 64 PhD students and authored over 120 journal papers on learning, information theory, statistical complexity, pattern recognition, and portfolio theory. With Joy A. Thomas he coauthored the book Elements of Information Theory,[3] which has been the most widely used introductory textbook on the subject since its first edition appeared in 1991.[4] He was also coeditor of the book Open Problems in Communication and Computation.