Dana Angluin
Field: Computer science
Work Institution: Yale University
Alma Mater: University of California, Berkeley
Doctoral Advisor: Manuel Blum
Known For: L* algorithm; computational learning theory; population protocols
Thesis Title: An Application of the Theory of Computational Complexity to the Study of Inductive Inference
Thesis Year: 1976
Doctoral Students: Ehud Shapiro
Dana Angluin is a professor emeritus of computer science at Yale University.[1] She is known for foundational work in computational learning theory[2] [3] [4] and distributed computing.[5]
Angluin received her B.A. (1969) and Ph.D. (1976) from the University of California, Berkeley.[6] Her thesis, "An Application of the Theory of Computational Complexity to the Study of Inductive Inference",[7] was one of the first works to apply complexity theory to the field of inductive inference. Angluin joined the faculty at Yale in 1979.
Angluin's work helped establish the theoretical foundations of machine learning.
L* Algorithm
Angluin has written highly cited papers on computational learning theory, particularly on learning regular languages from membership and equivalence queries using the L* algorithm.[8] The algorithm addresses the problem of exactly identifying an unknown regular set: a Learner poses questions to a minimally adequate Teacher (MAT), which answers membership queries (is a given string in the unknown set?) and equivalence queries (does a proposed description of the set match it exactly, and if not, what is a counterexample?). Using these answers, the Learner progressively refines its hypothesis until it identifies the set, with the total work bounded by a polynomial.[9] Though Angluin's paper was published in 1987, a 2017 article by computer science professor Frits Vaandrager notes that "the most efficient learning algorithms that are being used today all follow Angluin's approach of a minimally adequate teacher".[9]
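The query-based loop described above can be sketched in miniature. This is not Angluin's original pseudocode; it is a simplified observation-table learner in which the Teacher is simulated by a known target language (strings over {a, b} with an even number of a's), equivalence queries are approximated by brute-force search up to a length bound, and counterexamples are handled by adding their suffixes as experiments (a later variant of the original prefix-based handling):

```python
from itertools import product

ALPHABET = "ab"

def member(w):
    """Membership query: the 'unknown' set is strings with an even number of a's."""
    return w.count("a") % 2 == 0

def find_counterexample(hyp, max_len=6):
    """Equivalence query, simulated by brute-force search up to max_len."""
    for n in range(max_len + 1):
        for tup in product(ALPHABET, repeat=n):
            w = "".join(tup)
            if hyp(w) != member(w):
                return w
    return None  # hypothesis agrees with the target on all tested strings

def lstar():
    S, E = [""], [""]   # access strings (table rows) and experiments (columns)
    T = {}              # observation table: T[u + e] = member(u + e)

    def row(u):
        return tuple(T[u + e] for e in E)

    def fill():
        for u in S + [s + a for s in S for a in ALPHABET]:
            for e in E:
                if u + e not in T:
                    T[u + e] = member(u + e)

    while True:
        fill()
        # Close the table: every one-letter extension must match some row of S.
        closed = False
        while not closed:
            closed = True
            rows_of_S = {row(s) for s in S}
            for s in list(S):
                for a in ALPHABET:
                    if row(s + a) not in rows_of_S:
                        S.append(s + a)
                        fill()
                        rows_of_S = {row(t) for t in S}
                        closed = False
        # Build a hypothesis DFA whose states are the distinct rows of S.
        reps = {row(s): s for s in S}

        def hyp(w):
            cur = ""
            for ch in w:
                cur = reps[row(cur + ch)]
            return T[cur]  # acceptance = the table entry at the empty experiment

        cex = find_counterexample(hyp)
        if cex is None:
            return hyp
        for i in range(len(cex) + 1):   # add all suffixes of the
            if cex[i:] not in E:        # counterexample as experiments
                E.append(cex[i:])

learned = lstar()
```

Because the simulated equivalence query exhausts all strings up to the length bound, the returned hypothesis here coincides with the two-state DFA for the target language.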
Angluin's work on learning from noisy examples[10] has also been influential in machine learning.[11] It addresses the problem of adapting learning algorithms to cope with incorrect training examples (noisy data), and shows that learning algorithms can still succeed in the presence of such errors.
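One idea from this line of work is that, over a finite hypothesis class, outputting the hypothesis that disagrees with the fewest noisy labels can still recover the target concept. The toy sketch below illustrates this disagreement-minimization strategy on a tiny class of monotone conjunctions; the class, target, noise rate, and sample size are all illustrative choices, not taken from the paper:

```python
import random
from itertools import combinations

random.seed(0)
N = 4
TARGET = frozenset({0, 2})  # hidden concept: x0 AND x2 (illustrative)

# The finite hypothesis class: all monotone conjunctions over N variables.
HYPOTHESES = [frozenset(c) for r in range(N + 1)
              for c in combinations(range(N), r)]

def true_label(x):
    return all(x[i] for i in TARGET)

def noisy_sample(m, eta=0.2):
    """Draw m uniform examples, flipping each label with probability eta."""
    data = []
    for _ in range(m):
        x = tuple(random.randint(0, 1) for _ in range(N))
        y = true_label(x)
        if random.random() < eta:   # classification noise
            y = not y
        data.append((x, y))
    return data

def disagreements(h, data):
    """Number of noisy examples the conjunction h mislabels."""
    return sum(1 for x, y in data if all(x[i] for i in h) != y)

def learn(data):
    # Output the hypothesis with the fewest disagreements on the noisy sample.
    return min(HYPOTHESES, key=lambda h: disagreements(h, data))

data = noisy_sample(2000, eta=0.2)
best = learn(data)
```

With enough samples and a noise rate below one half, the minimizer disagrees with the noisy labels no more often than the true target does, which is what makes recovery possible.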
In distributed computing, she co-invented the population protocol model and studied the problem of consensus.[12] In probabilistic algorithms, she has studied randomized algorithms for Hamiltonian circuits and matchings.[13] [14] [15]
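The population protocol model can be illustrated with a small simulation of a three-state approximate-majority protocol from this line of work: anonymous finite-state agents interact in random pairs, and when opposite opinions meet, one agent is blanked, while blank agents adopt the opinion of the agent they meet. The population size, initial split, and random seed below are illustrative:

```python
import random

random.seed(1)
X, Y, B = "x", "y", "b"

# Approximate-majority transitions for an ordered pair (initiator, responder);
# pairs not listed leave both agents unchanged.
RULES = {
    (X, Y): (X, B),  # opposite opinions meet: the responder is blanked
    (Y, X): (Y, B),
    (X, B): (X, X),  # a blank responder adopts the initiator's opinion
    (Y, B): (Y, Y),
}

def run(pop, max_steps=100_000):
    pop = list(pop)
    for _ in range(max_steps):
        if len(set(pop)) == 1:   # consensus reached
            break
        i, j = random.sample(range(len(pop)), 2)  # random interacting pair
        new = RULES.get((pop[i], pop[j]))
        if new is not None:
            pop[i], pop[j] = new
    return pop

final = run([X] * 30 + [Y] * 10)  # x starts with a clear majority
```

The population converges to a uniform opinion; with a clear initial majority it is, with high probability, the majority opinion.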
Angluin helped found the Computational Learning Theory (COLT) conference, and has served on program committees and steering committees for COLT.[16] [17] [18] She served as an area editor for Information and Computation from 1989 to 1992.[19] [20] She organized Yale's Computer Science Department's Perlis Symposium in April 2001: "From Statistics to Chat: Trends in Machine Learning".[21] She is a member of the Association for Computing Machinery and the Association for Women in Mathematics.
Angluin is highly celebrated as an educator, having won "three of the most distinguished teaching prizes Yale College has to offer": the Dylan Hixon Prize for Teaching Excellence in the Sciences, the Bryne/Sewall Prize for distinguished undergraduate teaching, and the Phi Beta Kappa DeVane Medal.[22] [11]
Angluin has also published works on Ada Lovelace and her involvement with the Analytical Engine.[23]