Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 22, Issue 38; September 28, 2022: The Illusion of Explanatory Depth

The Illusion of Explanatory Depth

by Rick Brenner

The illusion of explanatory depth is the tendency of humans to believe they understand something better than they actually do. Discovering the illusion when you're explaining something is worse than embarrassing. It can be career-ending.
Roger Boisjoly of Morton Thiokol, who tried to halt the launch of Challenger in 1986

Roger Boisjoly was the Morton Thiokol engineer who, in 1985, one year before the catastrophic failure of the Space Shuttle Challenger, wrote a memorandum outlining the safety risks of cold-weather launches. He raised the issue then, and many times subsequently, including the evening prior to the launch. In 1988, he was awarded the Prize for Scientific Freedom and Responsibility by the American Association for the Advancement of Science, for "…his exemplary and repeated efforts to fulfill his professional responsibilities as an engineer by alerting others to life-threatening design problems of the Challenger space shuttle and for steadfastly recommending against the tragic launch of January 1986."

Photo courtesy the Online Ethics Center at the National Academy of Engineering.

If you've ever had the embarrassing experience of suddenly realizing, while trying to explain something to someone else, that you don't know what you're talking about, you might have first-hand knowledge of what psychologists call the illusion of explanatory depth. I say might because there are many ways to not know what you're talking about. The illusion of explanatory depth is just one of those ways. It occurs only with respect to explanatory knowledge — the kind of knowledge that involves causal patterns.

Taking care not to fall victim to this illusion is important in the modern workplace, where so many of us must explain to others why we do so much of what we do. To manage that risk, we must understand when the illusion is most likely to form and how the different kinds of explanatory knowledge are affected.

When the illusion is most likely to appear

The illusion has been observed only in self-assessment with respect to "knowing why" (explanatory knowledge). For example, most of us know that hundreds of human-made satellites orbit the Earth. But few of us can explain why they don't all immediately fall into the oceans or crash into the land.

The illusion hasn't been observed experimentally with respect to every kind of knowledge. For example, the illusion doesn't occur with respect to procedural knowledge. Procedural knowledge is the kind of knowledge that pertains to how we perform a particular task, such as administering a COVID-19 vaccination to a patient, or deleting a file from a computer, or gaining approval, in your organization, for a capital purchase of more than $50,000.

Nor have we observed the illusion of explanatory depth with respect to descriptive knowledge, which is knowledge of specific facts or propositions. Descriptive knowledge includes, for example, the names of the bones of the human hand, or where to find the Sort command on the ribbon of Microsoft Word, or the names of the signers of the U.S. Declaration of Independence.

To say that the illusion of explanatory depth hasn't been observed with respect to procedural knowledge or descriptive knowledge isn't to say that acquiring or retaining those kinds of knowledge is easy. It does mean that we're less likely to be mistaken in self-assessments of "knowing how" (procedural knowledge) or "knowing that" (descriptive knowledge) than in self-assessments of "knowing why" (explanatory knowledge).

Kinds of explanatory knowledge

Researchers have identified four categories of explanatory knowledge.

Knowledge relating to causal patterns
Explanatory knowledge of the first category relates to causal patterns among the entities whose behavior is being explained. And there are four types of causal patterns: common cause, common effect, linear causal chains, and causal homeostasis. Common-cause patterns appear frequently in diagnosing the misbehavior of systems. Debugging code is a fine example, in which multiple forms of misbehavior can be traced to a single cause.
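To make the debugging example concrete, here's a minimal Python sketch of a common-cause pattern. All of the names are hypothetical, invented for illustration: one off-by-one defect in a shared helper produces two apparently unrelated symptoms.

```python
# A minimal sketch of a common-cause pattern in debugging.
# All names are hypothetical, invented for illustration.

def last_index(items):
    # The single root cause: an off-by-one error.
    # Correct version: return len(items) - 1
    return len(items)

def latest_order(orders):
    # Symptom 1: a crash (IndexError) when displaying
    # the most recent order.
    return orders[last_index(orders)]

def drop_last(entries):
    # Symptom 2: a silent failure. The slice keeps every
    # element, so the last entry is never dropped.
    return entries[: last_index(entries)]

# A crash and a silent failure look unrelated, but both
# trace back to the single bug in last_index.
```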
Common-effect explanations appear when we try to explain the behavior of complex systems, in which many causes converge on a single effect. For example, the causes of the Chernobyl nuclear accident include both human error and a reactor design that made the plant inherently difficult to manage under low-power conditions.
Linear causal chains are both common-cause and common-effect explanations: a single cause leads to a single effect through a chain of intermediate causes. An example is the explosion of Space Shuttle Challenger, in which one might identify a linear causal chain that includes the failure to notice O-ring erosion in previous launches, the decision to launch in cold weather, and the design of the O-rings. [Rogers 1986]
Causal homeostatic explanations focus on reasons why a system state, or a given set of system attributes, might persist over time. For example, if a system software module is repeatedly implicated in system failures, even when those failures are otherwise unrelated, a causal homeostatic explanation might point to the generally disorganized state of the code, or to its lack of modular design.
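As a sketch of how such a causal homeostatic explanation might look in code (again with hypothetical names of my own), consider a grab-bag module that holds shared mutable state for unrelated features. Failures in any of those features implicate the module, and its disorganization persists because every feature now depends on it.

```python
# A minimal sketch of causal homeostasis in code.
# All names are hypothetical, invented for illustration.

# One disorganized module accumulates shared mutable state
# for features that have nothing to do with one another.
_state = {}

def record_invoice(invoice):
    _state["last_invoice"] = invoice   # billing writes here

def monthly_report():
    # Reporting silently depends on billing's leftovers.
    return _state.get("last_invoice")

def set_current_user(user):
    _state["user"] = user              # auth writes here too

# Failures in billing, reporting, or auth all implicate this
# module, and the tangle persists: cleaning up _state would
# require touching every feature at once.
```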
Awareness of these four types of causal patterns can be enormously useful as a framework for seeking causal patterns in new explanations.
Knowledge relating to explanatory stances
Keil surveys the literature on another way of categorizing explanations, which he refers to as stances or modes. [Keil 2006] Three such stances are the mechanical stance, the design stance, and the intentional stance. In the mechanical stance, we focus on how mechanical objects interact. For example, in the game of tennis, two keys to imparting topspin to the ball are keeping the racket face slightly closed, and brushing up on the back of the ball.
In the design stance, our explanations focus on purpose. For example, the counterweight of an elevator reduces the torque required of the motor that lifts the elevator cab from the first floor to the second.
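A back-of-the-envelope calculation shows the purpose at work. The numbers below are illustrative, my own, and not drawn from any real elevator.

```python
# Why a counterweight reduces the load on an elevator motor.
# The numbers are illustrative, not from any real elevator.

g = 9.8                       # gravitational acceleration, m/s^2
cab_with_passengers = 1200.0  # kg (hypothetical)
counterweight = 1000.0        # kg (hypothetical)

net_with = (cab_with_passengers - counterweight) * g
net_without = cab_with_passengers * g

print(f"With counterweight:    {net_with:.0f} N")     # ~1960 N
print(f"Without counterweight: {net_without:.0f} N")  # ~11760 N
```

The motor lifts only the 200 kg difference rather than the full 1200 kg. That difference is the purpose the design-stance explanation invokes.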
In the intentional stance, we attribute beliefs and desires to the (usually inanimate) entities whose behavior we're explaining. For example, the reason why we cannot load into Microsoft Excel two workbooks with the same name is that Excel uses the filename to distinguish the workbooks. If two workbooks had the same name, Excel would get confused.
Any given explanation might have properties of more than one stance. But to my mind, choosing a stance and adhering to it offers the best chance of achieving clarity.
Knowledge relating to domains of phenomena
The causal patterns that are relevant for a given domain of phenomena vary with the domain. For example, when explaining why people might not respond truthfully to workplace surveys, we must understand what kinds of survey questions are likely to be affected by the prevailing degree of psychological safety. But psychological safety is unrelated to how a bicycle works.
When it comes to explanations, different domains of phenomena require different knowledge. And knowing what knowledge is relevant can be the hard part of the explanation. For example, in assessing the progress of an Agile Transformation by fielding a survey, expertise in Agile processes can be less important than expertise in psychological safety.
Knowledge relating to value-laden or emotion-laden situations
Explaining the behavior of others can involve attributing it to complex networks of values, social norms, and emotions. These factors can shift the "threshold for acceptance" of an explanation. [Rozenblit 2002]
For example, in explaining why a team member felt insulted when omitted from a special meeting, we would need to invoke an understanding of the team norms about invitation lists for meetings. An explanation that fails to invoke that understanding would likely be unacceptable to many team members.

Last words

Watch carefully for examples of this illusion in action. One way to learn recovery techniques is by watching how other people recover from realizing they don't actually know as much as they thought they did. Next issue: Downscoping Under Pressure: I

Is every other day a tense, anxious, angry misery as you watch people around you, who couldn't even think their way through a game of Jacks, win at workplace politics and steal the credit and glory for just about everyone's best work including yours? Read 303 Secrets of Workplace Politics, filled with tips and techniques for succeeding in workplace politics. More info

Footnotes

[Rogers 1986]
William P. Rogers. "Report of the Presidential Commission on the Space Shuttle Challenger Accident," Chapter 4, 1986. Available here. Retrieved 12 September 2022.
[Keil 2006]
Frank C. Keil. "Explanation and understanding." Annual Review of Psychology 57 (2006), pp. 227-254. Available here. Retrieved 12 September 2022.
[Rozenblit 2002]
Leonid Rozenblit and Frank Keil. "The misunderstood limits of folk science: An illusion of explanatory depth." Cognitive Science 26:5 (2002), pp. 521-562. Available here. Retrieved 12 September 2022.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Cognitive Biases at Work:

The Trap of Beautiful Language
As we assess the validity of others' statements, we risk making a characteristically human error — we confuse the beauty of their language with the reliability of its meaning. We're easily thrown off by alliteration, anaphora, epistrophe, and chiasmus.
Bullet Point Madness: II
Decision makers in many organizations commonly demand briefings in the form of a series of bullet points or a series of series of bullet points. Briefers who combine this format with a variety of persuasion techniques can mislead decision makers, guiding them into making poor decisions.
Seven Planning Pitfalls: I
Whether in war or in projects, plans rarely work out as, umm well, as planned. In part, this is due to our limited ability to foretell the future, or to know what we don't know. But some of the problem arises from the way we think. And if we understand this we can make better plans.
Seven More Planning Pitfalls: II
Planning teams, like all teams, are susceptible to several patterns of interaction that can lead to counter-productive results. Three of these most relevant to planners are False Consensus, Groupthink, and Shared Information Bias.
Seven More Planning Pitfalls: III
Planning teams, like all teams, are vulnerable to several patterns of interaction that can lead to counter-productive results. Two of these relevant to planners are a cognitive bias called the IKEA Effect, and a systemic bias against realistic estimates of cost and schedule.

See also Cognitive Biases at Work and Workplace Politics for more related articles.

Forthcoming issues of Point Lookout

Coming May 1: Antipatterns for Time-Constrained Communication: 2
Recognizing just a few patterns that can lead to miscommunication can reduce the incidence of miscommunications. Here's Part 2 of a collection of antipatterns that arise in communication under time pressure, emphasizing those that depend on content. Available here and by RSS on May 1.
And on May 8: Antipatterns for Time-Constrained Communication: 3
Recognizing just a few patterns that can lead to miscommunication can reduce the incidence of problems. Here is Part 3 of a collection of antipatterns that arise in technical communication under time pressure, emphasizing past experiences of participants. Available here and by RSS on May 8.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks:

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Follow Rick

Send email or subscribe to one of my newsletters. Follow me at LinkedIn. Follow me at X, or share a post. Subscribe to RSS feeds.
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.
My blog, Technical Debt for Policymakers, offers resources, insights, and conversations of interest to policymakers who are concerned with managing technical debt within their organizations. Get the millstone of technical debt off the neck of your organization!
Go For It: Sometimes It's Easier If You Run. Bad boss, long commute, troubling ethical questions, hateful colleague? Learn what we can do when we love the work but not the job.
303 Tips for Virtual and Global Teams. Learn how to make your virtual global team sing.
101 Tips for Managing Change. Are you managing a change effort that faces rampant cynicism, passive non-cooperation, or maybe even outright revolt?
101 Tips for Effective Meetings. Learn how to make meetings more productive — and more rare.
Exchange your "personal trade secrets" — the tips, tricks and techniques that make you an ace — with other aces, anonymously. Visit the Library of Personal Trade Secrets.
If your teams don't yet consistently achieve state-of-the-art teamwork, check out this catalog. Help is just a few clicks/taps away!
Ebooks, booklets and tip books on project management, conflict, writing email, effective meetings and more.