Illusion of explanatory depth

The illusion of explanatory depth (IOED) is a cognitive bias whereby people believe they understand a topic better than they actually do.[1] The term was coined by Yale researchers Leonid Rozenblit and Frank Keil in 2002.[1][2] The effect has been observed only in explanatory knowledge, defined in this context as "knowledge that involves complex causal patterns" (see causal reasoning); it has not been observed in procedural, narrative, or factual (descriptive) knowledge.[3][4] Evidence of the IOED has been found for everyday mechanical and electrical devices such as bicycles, as well as for mental disorders, natural phenomena, folk theories, and politics, with its most studied manifestation being political polarization.[5]

The illusion is related to the Dunning–Kruger effect, but differs in that the IOED concerns explanatory knowledge rather than ability.[6] Limited evidence suggests that the IOED is weaker in subject-matter experts,[7] but it is believed to affect almost everyone, whereas the Dunning–Kruger effect is usually defined to apply only to those of low to moderate competence.[8] The IOED is more pronounced for historical knowledge when knowing about the topic is perceived as socially desirable.[9]

Another description of the IOED is that "we mistake our familiarity with a situation for an understanding of how it works".[10] The IOED has also been suggested as an explanation for the perception that psychology as a field is "simple" or "obvious".

In politics

There is evidence that the IOED contributes to increased political polarization in the United States.[11] A 2018 study, with participants recruited in the context of the 2016 United States presidential election, found that higher levels of the IOED for political topics were associated with greater endorsement of conspiracy theories.[12]

Management

It is thought that the effects of the IOED, especially in politics, can be reduced by asking people to explain a topic rather than merely to provide reasons for their beliefs. How people are asked matters, because some prompts backfire: research has found that when people are asked to "justify their position", their beliefs become more extreme. Asking for "reasons" may lead people to strengthen their beliefs by selectively recalling support for their position, whereas asking for "explanations" may lead them to confront their lack of knowledge.

Original experiment

The term IOED was coined by Yale researchers Leonid Rozenblit and Frank Keil in 2002. One inspiration for the concept was research on change blindness suggesting that people grossly overestimate their own ability to detect visual changes.[13]

In their experiment with 16 Yale undergraduate students, Rozenblit and Keil asked participants to rate their understanding of devices and simple items, then to generate a detailed explanation of how each item worked, and finally to rerate their understanding. Ratings were consistently lower after generating an explanation, suggesting that the attempt to explain made participants realize how little they understood. Rozenblit and Keil concluded that having to explain basic concepts or mechanisms confronts people with the reality that they may not understand the subject as well as they think they do.

Notes and References

  1. Waytz, Adam (2017). "2017: What scientific term or concept ought to be more widely known?". Retrieved 26 January 2022.
  2. "The Illusion of Explanatory Depth". The Decision Lab. Retrieved 26 January 2022.
  3. Rozenblit, Leonid; Keil, Frank (2002). "The misunderstood limits of folk science: an illusion of explanatory depth". Cognitive Science. 26 (5): 521–562. doi:10.1207/s15516709cog2605_1.
  4. Mills, Candice M.; Keil, Frank C. (2004). "Knowing the limits of one's understanding: The development of an awareness of an illusion of explanatory depth". Journal of Experimental Child Psychology. 87 (1): 1–32. doi:10.1016/j.jecp.2003.09.003.
  5. Zeveney, Andrew; Marsh, Jessecae (2016). "The Illusion of Explanatory Depth in a Misunderstood Field: The IOED in Mental Disorders". Cognitive Science Society: 1020.
  6. Chromik, Michael; Eiband, Malin; Buchner, Felicitas; Krüger, Adrian; Butz, Andreas (2021). "I Think I Get Your Point, AI! The Illusion of Explanatory Depth in Explainable AI". 26th International Conference on Intelligent User Interfaces. New York: ACM. pp. 307–317. doi:10.1145/3397481.3450644.
  7. Lawson, Rebecca (2006). "The science of cycology: Failures to understand how everyday objects work". Memory & Cognition. 34 (8): 1667–1675. doi:10.3758/bf03195929.
  8. McIntosh, Robert D.; Fowler, Elizabeth A.; Lyu, Tianjiao; Della Sala, Sergio (2019). "Wise up: Clarifying the role of metacognition in the Dunning-Kruger effect". Journal of Experimental Psychology: General. 148 (11): 1882–1897. doi:10.1037/xge0000579.
  9. Gaviria, Christian; Corredor, Javier (2021). "Illusion of explanatory depth and social desirability of historical knowledge". Metacognition and Learning. 16 (3): 801–832. doi:10.1007/s11409-021-09267-7.
  10. Stafford, Tom (February 2007). "Isn't it all just obvious?". The Psychologist. Retrieved 28 January 2022.
  11. Fernbach, Philip M.; Rogers, Todd; Fox, Craig R.; Sloman, Steven A. (2013). "Political Extremism Is Supported by an Illusion of Understanding". Psychological Science. 24 (6): 939–946. doi:10.1177/0956797612464058.
  12. Vitriol, Joseph A.; Marsh, Jessecae K. (2018). "The illusion of explanatory depth and endorsement of conspiracy beliefs". European Journal of Social Psychology. 48 (7): 955–969. doi:10.1002/ejsp.2504.
  13. Levin, Daniel T.; Momen, Nausheen; Drivdahl, Sarah B.; Simons, Daniel J. (2000). "Change Blindness Blindness: The Metacognitive Error of Overestimating Change-detection Ability". Visual Cognition. 7 (1–3): 397–412. doi:10.1080/135062800394865.