| H. T. Kung | Chinese: 孔祥重[1] |
| Birth date: | 9 November 1945 |
| Field: | Computer science |
| Work institutions: | Carnegie Mellon University, Harvard University |
| Alma mater: | National Tsing Hua University, Carnegie Mellon University |
| Doctoral advisor: | Joseph F. Traub |
| Thesis title: | Topics in Analytic Computational Complexity |
| Thesis year: | 1974 |
| Doctoral students: | Brad Karp, Monica S. Lam, Charles E. Leiserson, Robert T. Morris |
| Prizes: | Member of the National Academy of Engineering; Academician of Academia Sinica; Guggenheim Fellowship; IEEE Computer Society Charles Babbage Award |
Hsiang-Tsung Kung (Chinese: 孔祥重; born November 9, 1945) is a Taiwanese-born American computer scientist. He is the William H. Gates Professor of Computer Science at Harvard University.[2] His early research in parallel computing produced the systolic array in 1979, which has since become a core computational component of hardware accelerators for artificial intelligence, including Google's Tensor Processing Unit (TPU).[3] He also proposed optimistic concurrency control in 1981, now a key principle in memory and database transaction systems, including MySQL, Apache CouchDB, Google App Engine, and Ruby on Rails. He remains an active researcher, with ongoing contributions to computational complexity theory, hardware design, parallel computing, routing, wireless communication, signal processing, and artificial intelligence.[4]
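In a systolic array, operands are pumped rhythmically through a grid of simple multiply-accumulate cells, so each value fetched from memory is reused by many cells. The Python sketch below is only an illustration of that idea for matrix multiplication, assuming an output-stationary dataflow with time-skewed edge inputs; it is not Kung's original design or the TPU's, and all names in it are invented for the example.

```python
import numpy as np

def systolic_matmul(A, B):
    """Cycle-by-cycle simulation of an output-stationary systolic array computing A @ B.

    Each cell (i, j) keeps a running partial sum of C[i, j]; A values flow
    rightwards along rows, B values flow downwards along columns, and the
    edge inputs are skewed in time so matching operands meet in the right cell.
    """
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = np.zeros((n, m))
    a_reg = np.zeros((n, m))            # operand travelling right through each cell
    b_reg = np.zeros((n, m))            # operand travelling down through each cell
    for t in range(n + m + k - 2):      # enough cycles to drain the pipeline
        a_reg = np.roll(a_reg, 1, axis=1)   # every cell passes its A value rightwards
        b_reg = np.roll(b_reg, 1, axis=0)   # every cell passes its B value downwards
        for i in range(n):                  # skewed feed at the left edge
            a_reg[i, 0] = A[i, t - i] if 0 <= t - i < k else 0.0
        for j in range(m):                  # skewed feed at the top edge
            b_reg[0, j] = B[t - j, j] if 0 <= t - j < k else 0.0
        C += a_reg * b_reg                  # all cells multiply-accumulate in parallel
    return C

A = np.arange(6, dtype=float).reshape(2, 3)
B = np.arange(12, dtype=float).reshape(3, 4)
assert np.allclose(systolic_matmul(A, B), A @ B)
```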
Kung is also known as an influential mentor. His 1987 advice on Ph.D. research remains widely cited. Throughout his career, he has been regarded as much for his own research as for the legacy of his students, who have gone on to prominent roles at Y Combinator, Google Brain, IBM, Intel, Akamai, MediaTek, Stanford, and MIT.
He was elected a member of the US National Academy of Engineering in 1993 for introducing the idea of systolic computation, contributions to parallel computing, and applying complexity analysis to very-large-scale integrated (VLSI) computation.[5] Kung is also a Guggenheim Fellow,[6] a member of the Academia Sinica in Taiwan,[7] and president of the Taiwan AI Academy.[8] He has received the IEEE Computer Society Charles Babbage Award, was named Inventor of the Year by the Pittsburgh Intellectual Property Law Association in 1991, and received the ACM SIGOPS Hall of Fame Award in 2015.[9]
Kung was born in Shanghai on November 9, 1945, and grew up in Taiwan. He received his bachelor's degree in mathematics from National Tsing Hua University in 1968 before moving to the United States. In 1971, he moved from the University of Washington to Carnegie Mellon with Joseph F. Traub, when the latter was appointed head of CMU's computer science department.[10] Kung's graduate research at Carnegie Mellon focused on computational complexity and parallel computation, and he completed his thesis, "Topics in Analytic Computational Complexity", in 1973.[11]
In 1974, Kung and Traub published the Kung–Traub algorithm for solving non-linear equations,[12] relying on a key insight that Isaac Newton had overlooked when working on the same problem. His students at Carnegie Mellon included Charles E. Leiserson, with whom he published early work on the systolic array, Monica Lam, and Feng-hsiung Hsu. Leiserson went on to become an MIT professor of computer science and artificial intelligence and a co-author of the widely used textbook Introduction to Algorithms; Lam became a Stanford professor and an early member of Tensilica Inc.; and Hsu was the principal designer of IBM's Deep Blue, the first computer to defeat a reigning world chess champion under standard tournament conditions. Kung's work during this time is cited in Donald Knuth's The Art of Computer Programming, cementing its fundamental importance to the early development of computer science. Kung's other research contributions during this period include the iWarp system architecture, optimistic concurrency control, read-copy-update (a synchronization mechanism used in the Linux kernel), and a communication-avoiding, optimal distributed matrix multiplication algorithm.[13]
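For reference, a fourth-order two-step iteration commonly attributed to Kung and Traub in the numerical-analysis literature pairs a Newton step with a correction step, so that three evaluations per iteration (f(x), f'(x), f(y)) give order-four convergence. The Python sketch below is a hedged illustration based on those commonly stated formulas, not a transcription of the original paper; the test equation and all names are chosen only for the example.

```python
def kung_traub_4(f, df, x0, tol=1e-12, max_iter=50):
    """Solve f(x) = 0 with a two-step, fourth-order iteration in the Kung-Traub family."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        dfx = df(x)
        y = x - fx / dfx                               # Newton half-step
        fy = f(y)
        x = y - fy * fx**2 / (dfx * (fx - fy) ** 2)    # fourth-order correction
    return x

# Example: the classic test equation x**3 - 2*x - 5 = 0, with a root near 2.0946
root = kung_traub_4(lambda x: x**3 - 2 * x - 5, lambda x: 3 * x**2 - 2, x0=2.0)
print(root)
```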
In 1992, Kung was appointed McKay Professor of Electrical Engineering and Computer Science at Harvard. He was later named the William H. Gates Professor of Computer Science. Kung became an advisor to Robert T. Morris after Morris released one of the first internet worms. In 1995, while both were graduate students of Kung's, Morris and Trevor Blackwell teamed with Paul Graham to found Viaweb, which they sold to Yahoo! for $45 million. This windfall seeded Y Combinator, making the three among the most influential figures in Silicon Valley. Morris and Blackwell also worked alongside another of Kung's students, Cliff Young, who went on to become a chief architect of Google's Tensor Processing Unit. The TPU is one of the first neural-network hardware accelerators and implements Kung's systolic array, now a cornerstone technology of the artificial intelligence boom of the 2010s.
Kung's research during this period was also influential in the standards battle over WiMAX wireless technology. His work with Brad Karp on geographic wireless data routing produced the Greedy Perimeter Stateless Routing (GPSR) algorithm, a technology underlying ad-hoc and vehicular networks.[14] From 1999 to 2006, Kung co-chaired a joint Ph.D. program with colleagues at Harvard Business School. Renewed interest in systolic arrays for deep learning has led Kung to contribute again to hardware for artificial intelligence, including distributed and embedded low-precision neural networks.
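GPSR's central step is greedy geographic forwarding: a node hands the packet to whichever neighbour is geographically closest to the destination, falling back to perimeter-mode routing around voids when no neighbour is closer. The sketch below illustrates only the greedy step; the node positions, neighbour table, and all names are assumptions made for this example rather than details taken from the GPSR specification.

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y) positions."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def greedy_next_hop(self_pos, neighbours, dest_pos):
    """Pick the neighbour geographically closest to the destination.

    neighbours: dict mapping node id -> (x, y) position, learned from beacons.
    Returns a node id, or None when no neighbour is closer than this node
    (a local maximum, where real GPSR would switch to perimeter mode).
    """
    best_id, best_d = None, dist(self_pos, dest_pos)
    for node_id, pos in neighbours.items():
        d = dist(pos, dest_pos)
        if d < best_d:
            best_id, best_d = node_id, d
    return best_id

# Example: a node at (0, 0) choosing among three neighbours for a packet bound for (10, 10).
print(greedy_next_hop((0, 0), {"a": (3, 4), "b": (-2, 1), "c": (5, 5)}, (10, 10)))  # -> "c"
```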