The illusory truth effect (also known as the illusion of truth effect, validity effect, truth effect, or the reiteration effect) is the tendency to believe false information to be correct after repeated exposure.[1] This phenomenon was first identified in a 1977 study at Villanova University and Temple University.[2] When truth is assessed, people rely on whether the information is in line with their understanding or if it feels familiar. The first condition is logical, as people compare new information with what they already know to be true. Repetition makes statements easier to process relative to new, unrepeated statements, leading people to believe that repeated statements are more truthful. The illusory truth effect has also been linked to hindsight bias, in which the recollection of confidence is skewed after the truth or falsity has been received.
In a 2012 study, researchers discovered that familiarity can overpower rationality and that repetitively hearing that a certain statement is wrong can paradoxically cause it to feel right.[3] Researchers have attributed the illusory truth effect's impact even on participants who knew the correct answer to begin with, but who were persuaded to believe otherwise through the repetition of a falsehood, to "processing fluency", the ease with which a statement is processed.
The illusory truth effect plays a significant role in fields such as advertising, news media, and political propaganda.
The effect was first named and defined following the results of a 1977 study at Villanova University and Temple University in which participants were asked to rate a series of trivia statements as true or false.[4][5] On three occasions, Lynn Hasher, David Goldstein, and Thomas Toppino presented the same group of college students with lists of sixty plausible statements, some of them true and some of them false. The second list was distributed two weeks after the first, and the third two weeks after that. Twenty statements appeared on all three lists; the other forty items on each list were unique to that list. Participants were asked how confident they were of the truth or falsity of the statements, which concerned matters about which they were unlikely to know anything. (For example, "The first air force base was launched in New Mexico" or "Basketball became an Olympic discipline in 1925.") Specifically, the participants were asked to grade their belief in the truth of each statement on a scale of one to seven. While the participants' confidence in the truth of the non-repeated statements remained steady, their confidence in the truth of the repeated statements increased from the first to the second and second to third sessions, with the average score for those items rising from 4.2 to 4.6 to 4.7. The researchers concluded that repeating a statement makes it more likely to appear factual.[1]
In 1989, Hal R. Arkes, Catherine Hackett, and Larry Boehm replicated the original study, with similar results showing that exposure to false information changes the perceived truthfulness and plausibility of that information.[6]
The effect works because when people assess truth, they rely on whether the information agrees with their understanding or whether it feels familiar. The first condition is logical, as people compare new information with what they already know to be true and consider the credibility of both sources. However, researchers discovered that familiarity can overpower rationality, so much so that repetitively hearing that a certain fact is wrong can paradoxically cause it to feel right.[3]
At first, the illusory truth effect was believed to occur only when individuals are highly uncertain about a given statement.[1] Psychologists also assumed that "outlandish" headlines would not produce this effect; however, recent research shows that the illusory truth effect is indeed at play with false news.[5] The first assumption was challenged by the results of a 2015 study by Lisa K. Fazio, Nadia M. Brashier, B. Keith Payne, and Elizabeth J. Marsh. Published in the Journal of Experimental Psychology, the study suggested that the effect can influence participants who actually knew the correct answer to begin with, but who were swayed to believe otherwise through the repetition of a falsehood. For example, when participants encountered on multiple occasions the statement "A sari is the name of the short plaid skirt worn by Scots," some of them were likely to come to believe it was true, even though these same people were able to correctly answer the question "What is the name of the short pleated skirt worn by Scots?"
After replicating these results in another experiment, Fazio and her team attributed this curious phenomenon to processing fluency, the facility with which people comprehend statements. "Repetition," explained the researchers, "makes statements easier to process (i.e. fluent) relative to new statements, leading people to the (sometimes) false conclusion that they are more truthful."[7][8] When an individual hears something for a second or third time, their brain responds faster to it and misattributes that fluency as a signal for truth.[9]
In a 1997 study, Ralph Hertwig, Gerd Gigerenzer, and Ulrich Hoffrage linked the illusory truth effect to the phenomenon known as "hindsight bias", described as a situation in which the recollection of confidence is skewed after the truth or falsity has been received. They have described the effect (which they call "the reiteration effect") as a subset of hindsight bias.[10]
In a 1979 study, participants were told that repeated statements were no more likely to be true than unrepeated ones. Despite this warning, the participants perceived repeated statements as being more true than unrepeated ones.[6]
Studies in 1981 and 1983 showed that information deriving from recent experience tends to be viewed as "more fluent and familiar" than new experience. A 2011 study by Jason D. Ozubko and Jonathan Fugelsang built on this finding by demonstrating that, generally speaking, information retrieved from memory is "more fluent or familiar than when it was first learned" and thus produces an illusion of truth. The effect grew even more pronounced when statements were repeated twice and yet more pronounced when they were repeated four times. The researchers thus concluded that memory retrieval is a powerful method for increasing the so-called validity of statements and that the illusion of truth is an effect that can be observed without directly polling the factual statements in question.[11]
A 1992 study by Ian Maynard Begg, Ann Anas, and Suzanne Farinacci suggested that a statement will seem true if the information seems familiar.[6]
A 2012 experiment by Danielle C. Polage showed that some participants exposed to false news stories would go on to have false memories. The conclusion was that repetitive false claims increase believability and may also result in errors.[6][5]
In a 2014 study, Eryn J. Newman, Mevagh Sanson, Emily K. Miller, Adele Quigley-McBride, Jeffrey L. Foster, Daniel M. Bernstein, and Maryanne Garry asked participants to judge the truth of statements attributed to various people, some of whose names were easier to pronounce than others. Consistently, statements by persons with easily pronounced names were viewed as being more truthful than those with names that were harder to pronounce. The researchers' conclusion was that subjective, tangential properties such as ease of processing can matter when people evaluate sourced information.[2]