Illusion of explanatory depth






From Wikipedia, the free encyclopedia
 


The illusion of explanatory depth (IOED) is a cognitive bias whereby people tend to believe they understand a topic better than they actually do.[1][2][3] The term was coined by Yale researchers Leonid Rozenblit and Frank Keil in 2002.[1][4] The effect has been observed in only one type of knowledge, explanatory knowledge, defined in this case as "knowledge that involves complex causal patterns" (see causal reasoning); it has not been observed in procedural, narrative, or factual (descriptive) knowledge.[2][5] Evidence of the IOED has been found for everyday mechanical and electrical devices such as bicycles, as well as for mental disorders, natural phenomena, folk theories, and politics; the most studied effect of the IOED is in politics, in the form of political polarization.[6][2]

The illusion is related to the Dunning–Kruger effect, differing in that the IOED concerns explanatory knowledge rather than ability.[1][3] Limited evidence suggests that the effects of the IOED are less significant in subject-matter experts,[7] but the illusion is believed to affect almost everyone, whereas the Dunning–Kruger effect is usually defined as applying only to those of low to moderate competence.[3][8] For historical knowledge, the IOED is more significant when knowing about the topic is perceived as socially desirable.[9]

Another description of the IOED is that "we mistake our familiarity with a situation for an understanding of how it works".[10] The IOED has also been suggested as an explanation for the perception that psychology as a field is "simple" or "obvious".[10][non-primary source needed]

In politics


There is evidence that the IOED is a contributing factor to increased political polarization in the United States.[11] A 2018 study with participants recruited in the context of the 2016 United States presidential election found that higher levels of the IOED about political topics were associated with greater endorsement of conspiracy theories.[12]

Management


It is thought that the effects of the IOED, especially in politics, can be reduced by asking people to explain a topic rather than only asking them to provide reasons for their beliefs.[1][11] The specific way in which people are asked to explain the topic matters, as some phrasings can backfire: research has shown that when people are asked to "justify their position", their beliefs become more extreme. Asking for "reasons" may lead people to strengthen their beliefs by selectively thinking of support for their position, whereas asking for "explanations" may lead them to confront their lack of knowledge.[11]

Original experiment


The term IOED was coined by Yale researchers Leonid Rozenblit and Frank Keil in 2002.[2] One inspiration for the concept was research on change blindness, which at the time suggested that people grossly overestimate their own spatial memory.[13]

In their experiment, Rozenblit and Keil asked 16 Yale undergraduate students to rate their understanding of everyday devices and simple items, then to generate a detailed explanation of how each item worked, and finally to rerate their understanding of that item. Ratings were consistently lower after participants generated an explanation, suggesting that the attempt to explain made them realize how limited their understanding actually was. Rozenblit and Keil concluded that having to explain basic concepts or mechanisms confronts people with the reality that they may not understand the subject as much as they think they do.

References

  1. Waytz, Adam (26 January 2022). "2017 : What scientific term or concept ought to be more widely known?". Edge.org. Retrieved 26 January 2022.
  2. Rozenblit, Leonid; Keil, Frank (2002). "The misunderstood limits of folk science: an illusion of explanatory depth". Cognitive Science. 26 (5). Wiley: 521–562. doi:10.1207/s15516709cog2605_1. ISSN 0364-0213. PMC 3062901. PMID 21442007.
  3. Chromik, Michael; Eiband, Malin; Buchner, Felicitas; Krüger, Adrian; Butz, Andreas (13 April 2021). "I Think I Get Your Point, AI! The Illusion of Explanatory Depth in Explainable AI". 26th International Conference on Intelligent User Interfaces. New York, NY, USA: ACM. pp. 307–317. doi:10.1145/3397481.3450644. ISBN 9781450380171.
  4. "The Illusion of Explanatory Depth". The Decision Lab. Retrieved 26 January 2022.
  5. Mills, Candice M.; Keil, Frank C. (2004). "Knowing the limits of one's understanding: The development of an awareness of an illusion of explanatory depth". Journal of Experimental Child Psychology. 87 (1). Elsevier BV: 1–32. doi:10.1016/j.jecp.2003.09.003. ISSN 0022-0965. PMID 14698687.
  6. Zeveney, Andrew; Marsh, Jessecae (2016). "The Illusion of Explanatory Depth in a Misunderstood Field: The IOED in Mental Disorders" (PDF). Cognitive Science Society: 1020.
  7. Lawson, Rebecca (2006). "The science of cycology: Failures to understand how everyday objects work". Memory & Cognition. 34 (8). Springer Science and Business Media LLC: 1667–1675. doi:10.3758/bf03195929. ISSN 0090-502X. PMID 17489293. S2CID 4998257.
  8. McIntosh, Robert D.; Fowler, Elizabeth A.; Lyu, Tianjiao; Della Sala, Sergio (November 2019). "Wise up: Clarifying the role of metacognition in the Dunning-Kruger effect". Journal of Experimental Psychology: General. 148 (11): 1882–1897. doi:10.1037/xge0000579. hdl:20.500.11820/b5c09c5f-d2f2-4f46-b533-9e826ab85585. ISSN 1939-2222. PMID 30802096. S2CID 73460013.
  9. Gaviria, Christian; Corredor, Javier (23 June 2021). "Illusion of explanatory depth and social desirability of historical knowledge". Metacognition and Learning. 16 (3). Springer Science and Business Media LLC: 801–832. doi:10.1007/s11409-021-09267-7. ISSN 1556-1623. S2CID 237878736.
  10. Stafford, Tom (February 2007). "Isn't it all just obvious?". The Psychologist. Retrieved 28 January 2022.
  11. Fernbach, Philip M.; Rogers, Todd; Fox, Craig R.; Sloman, Steven A. (25 April 2013). "Political Extremism Is Supported by an Illusion of Understanding". Psychological Science. 24 (6). SAGE Publications: 939–946. doi:10.1177/0956797612464058. ISSN 0956-7976. PMID 23620547. S2CID 6173291.
  12. Vitriol, Joseph A.; Marsh, Jessecae K. (15 June 2018). "The illusion of explanatory depth and endorsement of conspiracy beliefs". European Journal of Social Psychology. 48 (7). Wiley: 955–969. doi:10.1002/ejsp.2504. ISSN 0046-2772. S2CID 149811872.
  13. Levin, Daniel T.; Momen, Nausheen; Drivdahl, Sarah B.; Simons, Daniel J. (January 2000). "Change Blindness Blindness: The Metacognitive Error of Overestimating Change-detection Ability". Visual Cognition. 7 (1–3): 397–412. doi:10.1080/135062800394865. ISSN 1350-6285. S2CID 14623812.
