Thomas Gilovich
Birth Date: 16 January 1954
Nationality: American
Field: Psychology
Work Institutions: Cornell University
Alma Mater: University of California, Santa Barbara (BA); Stanford University (PhD)
Known For: Research in heuristics and cognitive biases
Doctoral Advisors: Lee Ross, Mark Lepper
Thesis Title: Biased evaluation and persistence in gambling
Thesis URL: http://www.worldcat.org/oclc/38618951
Thesis Year: 1981
Doctoral Students: Justin Kruger
Thomas Dashiff Gilovich (born January 16, 1954) is an American psychologist who is the Irene Blecker Rosenfeld Professor of Psychology at Cornell University. He has conducted research in social psychology, decision making, and behavioral economics, and has written popular books on these subjects. Gilovich has collaborated with Daniel Kahneman, Richard Nisbett, Lee Ross and Amos Tversky. His articles in peer-reviewed journals on subjects such as cognitive biases have been widely cited. In addition, Gilovich has been quoted in the media on subjects ranging from the effect of purchases on happiness[1] to people's most common regrets, to perceptions of people and social groups.[2] Gilovich is a fellow of the Committee for Skeptical Inquiry.
Gilovich earned his B.A. from the University of California, Santa Barbara. After hearing Amos Tversky and Daniel Kahneman give a lecture about judgment and decision making in his very first classroom experience at Stanford University, Gilovich changed his program of research to focus on the intersection of social psychology and judgment and decision making.[3] He went on to earn his Ph.D. in psychology from Stanford in 1981.
Gilovich is best known for his research in heuristics and biases in the field of social psychology. He describes his research as dealing with "how people evaluate the evidence of their everyday experience to make judgments, form beliefs, and decide on courses of action, and how they sometimes misevaluate that evidence and make faulty judgments, form dubious beliefs, and embark on counterproductive courses of action."[4] According to Google Scholar, he has an h-index of 77 for all his published academic papers, which is considered exceptional.[5][6] The focus of Gilovich's work is reflected in two influential texts, Heuristics and Biases: The Psychology of Intuitive Judgment[7] (with Dale Griffin and Daniel Kahneman) and Social Psychology[8] (with Serena Chen, Dacher Keltner and Richard Nisbett), both of which are used as textbooks in academic courses in psychology and social psychology throughout the United States. Summarizing the research in an interview when asked about its benefits, he responded, "I think that field has an enormous amount to offer, because we make consequential decisions all the time, and they aren't always easy, we don't always do them well," and that his research program is about trying to figure out how the mind works so we "understand why some decisions are easy, and we tend to do certain things very well, and why some decisions are difficult, and we tend to do them poorly." He further explained that his hope is that he and his colleagues are "providing lots of information to help us understand those difficult decisions, and give people the tools so that they can make better decisions so they less often in life are going down paths that don't serve them well."[9]
Gilovich condensed his academic research in judgment and decision making into a popular book, How We Know What Isn't So. Writing in Skeptical Inquirer, Carl Sagan called it "a most illuminating book" that "shows how people systematically err in understanding numbers, in rejecting unpleasant evidence, in being influenced by the opinions of others. We're good in some things, but not in everything. Wisdom lies in understanding our limitations."[10] Reviewing the book for The New York Times, George Johnson wrote, "Over time, the ability to infer rules about the way the world works from skimpy evidence confers a survival advantage, even if much of the time the lessons are wrong. From evolution's standpoint, it is better to be safe than sorry."[11] In an interview, Gilovich summarized the thesis of How We Know What Isn't So as people "thinking we really have the evidence for things, [that] the world is telling us something, but in fact the world is telling us something a little more complicated, and how is it that we can misread the evidence of our everyday experience, and be convinced that something is true when it really isn't." He further elaborated on some of the erroneous beliefs his book discusses, including the sophomore jinx, the idea that things such as natural disasters come in threes, and the belief that the lines we are in slow down but the lines we leave speed up.[12] In the same interview he called confirmation bias the "mother of all biases."
Through his published work on biases and heuristics, Gilovich has made notable contributions to the field, including the following concepts:
Gilovich's research in the alleged "hot hand" effect, or the belief that success in a particular endeavor, usually sports, will likely be followed by further success, has been particularly influential. A paper he wrote with Amos Tversky in 1985 became the benchmark on the subject for years.[13] Some of the research from the 1985 paper has been contested recently, with a new journal article arguing that Gilovich and his coauthors themselves fell victim to a cognitive bias in interpreting the data from the original study. Specifically, it argued that in a truly random finite sequence, a hit would be expected to be followed by another hit less than 50 percent of the time, so an observed rate of 50 percent would itself be evidence for the hot hand.[14]
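The statistical claim in the recent critique can be checked with a short simulation (an illustrative sketch, not code from either paper; the sequence length, trial count, and function names are arbitrary choices): across many short random sequences of fair 50/50 "shots", the average within-sequence proportion of hits immediately following a hit comes out below 50 percent.

```python
import random

def hit_after_hit_rate(seq):
    """Within one sequence, the proportion of shots taken right after
    a hit that are themselves hits (None if no hit precedes a shot)."""
    follow_ups = [b for a, b in zip(seq, seq[1:]) if a == 1]
    if not follow_ups:
        return None
    return sum(follow_ups) / len(follow_ups)

def average_rate(n_shots=10, n_trials=100_000, seed=0):
    """Average the per-sequence rate over many random sequences of
    fair shots, skipping sequences with no hit-then-shot pair."""
    rng = random.Random(seed)
    rates = [hit_after_hit_rate([rng.randint(0, 1) for _ in range(n_shots)])
             for _ in range(n_trials)]
    rates = [r for r in rates if r is not None]
    return sum(rates) / len(rates)

print(average_rate())  # noticeably below 0.5, despite truly random shooting
```

A shooter whose hits followed hits a full 50 percent of the time would beat this random baseline, which is the sense in which the critique argues the original data actually pointed toward a hot hand.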
The spotlight effect, the phenomenon where people tend to believe that they're noticed more than they really are, is a term Gilovich coined. In a paper he wrote with two graduate students in 1999, he explained that "because we are so focused on our own behavior, it can be difficult to arrive at an accurate assessment of how much–or how little–our behavior is noticed by others. Indeed, close inspection reveals frequent disparities between the way we view our performance (and think others will view it) and the way it is actually seen by others."[15] For the paper, Gilovich and his coauthors conducted an experiment asking college students to put on a Barry Manilow T-shirt and walk into a room of strangers seated facing the door. The researchers predicted that the students would assume more people would notice their T-shirt than actually did. The results were as predicted, with participants thinking that roughly half the strangers would have recognized the Barry Manilow shirt, when in fact the number was closer to 20 percent.[16]
Gilovich has contributed to an understanding of the bias blind spot, or the tendency to recognize biases in other people, but not in ourselves. Several studies he coauthored found that people tend to believe that their own personal connection to a given issue is a source of accuracy and enlightenment, but that such personal connections in the case of others who hold different views are a source of bias.[17] Similarly, he has found that people look to external behavior in evaluating biases in others, but engage in introspection when evaluating their own.[18] Two examples he gave in a talk are that both older and younger siblings felt the other was held to a higher standard, and that Democrats and Republicans both felt that the electoral college helped the other side more than their own party.[19]
Gilovich was an early author on the clustering illusion, which is closely related to the "hot hand" fallacy: the tendency to see "clusters" in a random sequence of data as nonrandom. In How We Know What Isn't So, Gilovich explains how people want to see a random sequence as planned, even though it was arbitrary. In addition, he stated that people tend to misjudge randomness, thinking that rolling the same number on dice four times in a row cannot be truly random, when in fact it can be.[20]
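How easily random data produces apparent "clusters" can be demonstrated with a quick simulation (an illustrative sketch, not an analysis from the book; the flip count and streak length are arbitrary choices): truly random coin flips contain runs of four or more identical outcomes far more often than intuition suggests.

```python
import random

def has_streak(seq, length=4):
    """True if seq contains a run of `length` identical consecutive outcomes."""
    run = 1
    for prev, cur in zip(seq, seq[1:]):
        run = run + 1 if cur == prev else 1
        if run >= length:
            return True
    return False

def streak_probability(n_flips=20, length=4, n_trials=100_000, seed=0):
    """Estimate how often a random sequence of fair coin flips
    contains at least one streak of the given length."""
    rng = random.Random(seed)
    hits = sum(has_streak([rng.randint(0, 1) for _ in range(n_flips)], length)
               for _ in range(n_trials))
    return hits / n_trials

print(streak_probability())  # roughly three chances in four
```

In other words, a 20-flip sequence with no "cluster" of four would be the unusual outcome, which helps explain why streaks in random data are so easily read as meaningful.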
Building on his research on the spotlight effect, Gilovich helped to discover the illusion of transparency, or the tendency to overestimate the extent to which people telegraph their inner thoughts and emotions. In a study he conducted with two coauthors in 1998, individuals read questions from index cards and answered them out loud, either lying or telling the truth according to a label on the card that only they could see. Half of the liars thought they had been caught, but in fact only a quarter were, hence the illusion of transparency. In the same study, the researchers found that in an emergency situation, people assumed their alarm and concern would show in their expressions and behavior, but it did not, which the authors believe partially explains the bystander effect: "When confronted with a potential emergency, people typically play it cool, adopt a look of nonchalance, and monitor the reactions of others to determine if a crisis is really at hand. No one wants to overreact, after all, if it might not be a true emergency. However, because each individual holds back, looks nonchalant, and monitors the reactions of others, sometimes everyone concludes (perhaps erroneously) that the situation is not an emergency and hence does not require intervention."[21]
Gilovich has researched the causes of regret. A study he conducted in 1994 found that people regret specific actions more in the short run, but regret inactions more in the long run. He has continued to emphasize that people tend to regret the things they didn't do more than the things they did.[22][23]
Following Amos Tversky and Daniel Kahneman, Gilovich and his colleagues have conducted research in anchoring, the tendency when making decisions to anchor on information that comes to mind and adjust until a plausible estimate is reached. A study he co-authored with Nicholas Epley found that anchoring is actually several different effects, with multiple causes at play.[24] Another study that Gilovich and Nicholas Epley coauthored found that once an anchor is set, people adjust away from it, though their adjustments tend to be insufficient, so their final guess is close to the initial anchor.[25]
In his social psychology research, Gilovich discovered the phenomenon of self-handicapping, which he described as "attempts to manage how others perceive us by controlling the attributions they make for our performance." An example, according to Gilovich, would be drawing attention to elements that inhibit performance, so that failure is discounted in others' eyes and success looks like the result of overcoming long odds. The self-handicapping can be either real (failing to study or drinking excessively) or faked (merely claiming that difficult obstacles were present). Gilovich has stated that the strategy is most common in sports and undergraduate academics, but that it often backfires.[20]
Besides his contributions to the field of social psychology, Gilovich's research in cognitive psychology has influenced the field of behavioral economics. He has condensed his academic research in the field into a popular book, The Wisest One in the Room: How You Can Benefit from Social Psychology's Most Powerful Insights, which touches on many of the topics in How We Know What Isn't So. In an interview with Brian Lehrer, Gilovich discussed the book and the subjects it touches on, such as the difference between intelligence and wisdom (the latter being knowledge of other people and how to connect with them), the negative impact of income inequality on happiness, motivation, and what can create "virtuous cycles" in a university environment.[26] Kirkus Reviews gave it a positive review, writing, "The authors leap from personal behavior and motivation in the first half into societal, cultural, and even international change in the second, offering suggestions, if not necessarily a working blueprint, for how to achieve goals such as global environmental responsibility. None of this is riveting reading, but it rarely lapses into academic jargon."[27]
A major recurring theme in Gilovich's work in behavioral economics is the importance of experience over ownership of material things. For instance, a paper he co-authored with Leaf Van Boven found that people overwhelmingly preferred "experiential purchases" to "material purchases."[28] Writing for The Atlantic, James Hamblin noted the growing body of research, pioneered by Gilovich, showing that experiences tend to bring people more happiness than possessions: "It's the fleetingness of experiential purchases that endears us to them. Either they're not around long enough to become imperfect, or they are imperfect, but our memories and stories of them get sweet with time. Even a bad experience becomes a good story."[29] In a talk about barriers to gratitude, Gilovich further noted that a survey of his students at Cornell found that they enjoyed conversations about their experiences more than conversations about their material purchases, and that happiness from experiential purchases is more enduring than that from material purchases. The reason, he argued, is that experiences make for better stories, cultivate personal identity more, and connect people to each other. Gilovich explained that the implication is that experiential purchases lead to more gratitude and thus to more pro-social behavior.[30] In addition, Gilovich has emphasized the importance of being active and seeking goals: "We evolved to be goal-striving creatures. You’ll regret more the things that you didn’t do rather than the things you did." Along similar lines, in one talk he urged his audience, "mind your peaks and ends. You won’t remember the length of your vacation experience, but you’ll remember the intensity. And do something special at the end."[22]
Thomas Gilovich is married to Karen Dashiff Gilovich, with whom he has two daughters, Ilana and Rebecca.[20] Gilovich stated in an interview that the best part about being a scientist is going to work every day asking "what do I want to do today?" and not so often "what do I have to do today?" and that the best quality of a scientist is knowing how to respond to failure.