Robert M. French

Robert M. French is a research director at the French National Centre for Scientific Research. He is currently at the University of Burgundy in Dijon. He holds a Ph.D. from the University of Michigan, where he worked with Douglas Hofstadter on the Tabletop computational cognitive model. He specializes in cognitive science and has made an extensive study of the process of analogy-making.[1]

French is the inventor of Tabletop, a computer program that forms analogies in a microdomain consisting of everyday objects placed on a table.

He has done extensive research in artificial intelligence and written several articles about the Turing test, which was proposed by Alan Turing in 1950 as a means of determining whether an advanced computer can be said to be intelligent. French was for a long time an outspoken critic of the test, which, he suggested, no computer might ever be able to pass. French now believes that the way forward in AI lies not in attempting to flawlessly simulate human cognition (i.e., to pass a Turing test) but rather in designing computers capable of developing their own abilities to understand the world, and in interacting with those machines in a meaningful manner.[2]

He has published work on catastrophic forgetting in neural networks, the Turing test and foundations of cognitive science, the evolution of sex, and categorization and learning in infants, among other topics.

Early life and education

French attended Miami University from 1969 to 1972, earning a B.S. in mathematics in three years. From 1972 to 1974 he studied at Indiana University, from which he received an M.A. in mathematics.[3]

Career

Early career and doctoral studies

From 1972 to 1974, French worked as a teaching assistant in mathematics at Indiana University. For several months in 1975, he taught mathematics at Hanover College in Hanover, Indiana.

He then moved to France, where from 1976 to 1985 he lived in Paris, working as a freelance translator and interpreter.[3] During his years there, he collaborated with a colleague, Jacqueline Henry, on the French translation of Douglas Hofstadter's bestseller Gödel, Escher, Bach.[1][3]

French returned to the U.S. in 1985 to become a graduate student in computer science at the University of Michigan, Ann Arbor, where he pursued a Ph.D. under Hofstadter in artificial intelligence/cognitive science. He completed his doctoral work in 1992, receiving a degree in computer science.

His Ph.D. dissertation was entitled Tabletop: An Emergent, Stochastic Computer Model of Analogy-Making. His thesis committee consisted of Hofstadter, John Holland, Daniel Dennett, Arthur Burks, John Laird, and Steve Lytinen.[3]

“The key notion underlying the research presented in this dissertation,” wrote French in his summary of the dissertation, “is my conviction that the cognitive mechanisms giving rise to human analogy-making form the very basis of intelligence. Our ability to perceive and create analogies is made possible by the same mechanisms that drive our ability to categorize, to generalize, and to compare different situations.”[4]

From 1985 to 1992 he was a research assistant in Computer Science at the University of Michigan, Ann Arbor. During this period he was also a Visiting Researcher at CREA, École Polytechnique, Paris (1988), and a Visiting Lecturer in Computer Science at Earlham College in Richmond, Indiana (1991).

He spent several months in 1992 as a postdoctoral fellow at the Center for Research on Concepts and Cognition at Indiana University. From 1992 to 1994, he was Visiting Assistant Professor of Computer Science at Willamette University in Salem, Oregon. From 1994 to 1995, he was a Postdoctoral Fellow at the Department of Psychology at the University of Wisconsin, Madison, and a Lecturer in Cognitive Science in the Department of Educational Psychology at the same institution.[3]

Later career

From 1995 to 1998, French was a Research Scientist in the Department of Psychology at the University of Liège. In the same department, he served as an Associate Professor of Quantitative Psychology and Cognitive Science from 1998 to 2000, and as a Professor of Quantitative Psychology and Cognitive Science from 2001 to 2004.

Since 2004, he has been a research director at the French National Centre for Scientific Research (CNRS).[3]

Selected publications

Books

In his foreword to French's 1995 book The Subtlety of Sameness: A Theory and Computer Model of Analogy-making (MIT Press), Daniel Dennett wrote that French “has created a model of human analogy-making that attempts to bridge the gap between classical top-down AI and more recent bottom-up approaches.” French's research, Dennett explained, “is based on the premise that human analogy-making is an extension of our constant background process of perceiving—in other words, that analogy-making and the perception of sameness are two sides of the same coin. At the heart of the author's theory and computer model of analogy-making is the idea that the building-up and the manipulation of representations are inseparable aspects of mental functioning, in contrast to traditional AI models of high-level cognitive processes, which have almost always depended on a clean separation.” Dennett maintained that “French's work is exciting not only because it reveals analogy-making to be an extension of our complex and subtle ability to perceive sameness but also because it offers a computational model of mechanisms underlying these processes. This model makes significant strides in putting into practice microlevel stochastic processing, distributed processing, simulated parallelism, and the integration of representation-building and representation-processing.”[5] Arthur B. Markman of Columbia University, in a review for the International Journal of Neural Systems, described The Subtlety of Sameness as “fascinating.”[6]

A review in Choice said that “French reveals analogy-making to be an extension of our complex and subtle ability to perceive sameness. His computer program, Tabletop, forms analogies in a microdomain consisting of objects (utensils, cups, drinking glasses, etc.) on a table set for a meal. The theory and the program rely on the idea that stochastic choices made on the microlevel can add up to human-like robustness on a macrolevel. Thousands of program runs attempt to verify this on dozens of interrelated analogy problems in the Tabletop microworld.”[7]

Articles

Notes and References

  1. Gödel, Escher, Bach: Les Brins d'une Guirlande Éternelle.
  2. Communications of the ACM.
  3. "Robert M. French." University of Burgundy.
  4. Tabletop: An Emergent, Stochastic Computer Model of Analogy-Making (Intelligence Modeling). OCLC 68794705.
  5. French, Robert Matthew (1995). The Subtlety of Sameness: A Theory and Computer Model of Analogy-making. MIT Press. ISBN 9780262061803.
  6. Markman, Arthur B. (1996). "Extended Book Review: The Subtlety of Sameness by Robert M. French." International Journal of Neural Systems 7 (5): 665–670. doi:10.1142/S0129065796000737.
  7. "The Subtlety of Sameness: A Theory and Computer Model of Analogy-making." Villanova University.
  8. Yildiz. "Moving Beyond the Turing Test."
  9. "Artificial Intelligence Could Be on Brink of Passing Turing Test." Wired, 12 April 2012.
  10. "Computational Biology and 'Dusting Off the Turing Test'." Computing Community Consortium, 15 April 2012.
  11. French, R. M. (2000). "The Turing Test: The First Fifty Years." Trends in Cognitive Sciences 4 (3): 115–122. doi:10.1016/S1364-6613(00)01453-4. PMID 10689346.