Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 20, Issue 34;   August 19, 2020: Motivated Reasoning

Motivated Reasoning

by Rick Brenner

When we prefer a certain outcome of a decision process, we risk falling into a pattern of motivated reasoning. That can cause us to gather data and construct arguments that erroneously lead to the outcome we prefer, often outside our awareness. And it can happen even when the outcome we prefer is known to threaten our safety and security.

Unripe grapes that are probably sour. The expression "sour grapes" is often used to characterize the comments of a person who is disparaging a prize that he or she failed to win, but only after failing to win it. The expression comes from one of Aesop's fables, "The Fox and the Grapes." In the tale, a fox notices some grapes high on a vine, and attempts to reach them. Failing, he says, "They were probably sour anyway," hence the expression, "sour grapes."

Manufacturing a narrative in this way is an example of motivated reasoning. The fox convinces itself that failure to secure the grapes isn't a reflection on its skill as a grape harvester because those particular grapes weren't worth harvesting. In this way it can maintain a positive evaluation of itself despite failing to secure the grapes. This particular instance of motivated reasoning supports the cognitive bias known as self-serving bias.

Cognitive biases are patterns of thinking that deviate systematically from evidence-based reasoning. Scientists have uncovered nearly 200 cognitive biases experimentally, though many might be closely related to each other. In the workplace, in any activity that requires careful judgment based on reliable situational assessment, cognitive biases create significant risk of compromising decision quality, communication reliability, or both.

The risk of poor decisions or unreliable communications is elevated when the time scale of the decision process is short, as in high-stakes discussions, heated debate, and emergencies. Methods for mitigating that risk are therefore valuable. But how can we possibly protect ourselves against the effects of 200 cognitive biases?

On a one-by-one basis — cognitive bias by cognitive bias — risk mitigation is difficult. There are too many cognitive biases. But we can become familiar with patterns of working and thinking that increase our vulnerability to cognitive biases. When we notice the indicators of these patterns, we can be alert to the risk of cognitive biases. Motivated reasoning [Kunda 1990] [Molden 2005] is one of those patterns.

Motivated reasoning, also known as motivated thinking, is a pattern of thinking we use to reach conclusions we prefer by means of what appears to be evidence-based reasoning, but which might actually be nothing of the kind. We also use it to avoid or defer reaching conclusions we'd rather not reach. Consider this example:

Suppose you have an outdoor activity planned for tomorrow, and you really can't do it in rainy conditions. As you make your plans today, how often do you check tomorrow's weather forecast during the afternoon and evening?

Research shows that in such a scenario, people who regard the activity as desirable will, on average, check the forecast more often if the forecast is unfavorable. And people who regard the activity as undesirable will, on average, check the forecast more often if the forecast is favorable. In other words, data gathering is less intensive when the data indicates a favored result, and more intensive when the data indicates a disfavored result.

In this example, the existence of a preference affects the decision process. The preference provides a motivation that biases the decision, even though the actions undertaken seem consistent with unbiased, evidence-based reasoning.

In the workplace, motivated reasoning masquerades as reasonableness and probity. In that disguise motivated reasoning enables us to fool ourselves into believing that we're thinking and reasoning objectively when we are not. [Noval 2019] [Boiney 1997] Motivated reasoning is effective — counter-effective, actually — because it facilitates departure from evidence-based reasoning in two important ways.

Deceiving discourse participants
Because motivated reasoning is so superficially similar to evidence-based reasoning, it can conceal the effects of cognitive biases. Subjected to motivated reasoning, people come to believe that they're following an evidence-based argument. This leads them to the impression that they're engaged in critical thinking when they are not. Motivated reasoning suppresses our sensitivity to sources of bias in our decision processes.
Consuming resources
Any discourse requires two resources: the time of the participants, and the energy of the participants. The specious arguments of motivated reasoning consume both of these resources. For example, most meetings have a defined duration. Time is limited for the meeting and for its agenda items. Participant energy is also limited. Participants who want to attend to an evidence-based argument must share the time and energy resources with those who put forth arguments based on motivated reasoning. And when they do have the floor, they must spend some of their time dealing with the effects and distractions of motivated reasoning. Thus motivated reasoning both establishes specious conclusions and obstructs the establishment of more legitimate conclusions. Meetings can be extended, and deadlines relaxed, up to a point. But time lost to motivated reasoning is lost nevertheless.

All this would be enough to qualify motivated reasoning as one of the more destructive cognitive processes, but there is more. The most devious among us use motivated reasoning on people who are, or might become, clear-thinking opponents capable of refuting their deceptive arguments. Using motivated reasoning, these devious individuals convert such people into unwitting accomplices who then help to propagate the deception.

In the context of motivated reasoning, some cognitive biases are more likely to arise than others. These biases include Attribute Substitution, Hindsight Bias, Confirmation Bias, Self-serving Bias, and the Pseudocertainty Effect. In the next post of this series, I'll discuss how Motivated Reasoning can increase our vulnerability to the Pseudocertainty Effect.

Is every other day a tense, anxious, angry misery as you watch people around you, who couldn't even think their way through a game of Jacks, win at workplace politics and steal the credit and glory for just about everyone's best work, including yours? Read 303 Secrets of Workplace Politics, filled with tips and techniques for succeeding in workplace politics. More info

Footnotes

Comprehensive list of all citations from all editions of Point Lookout
[Kunda 1990]
Ziva Kunda. "The case for motivated reasoning," Psychological Bulletin 108:3 (1990), 480-498.
[Molden 2005]
Daniel C. Molden and E. Tory Higgins. "Motivated thinking," in Keith James Holyoak and Robert G. Morrison, eds., The Cambridge Handbook of Thinking and Reasoning. Cambridge: Cambridge University Press, 2005, 295-321.
[Noval 2019]
Laura J. Noval and Morela Hernandez. "The unwitting accomplice: How organizations enable motivated reasoning and self-serving behavior," Journal of Business Ethics 157:3 (2019), 699-713.
[Boiney 1997]
Lindsley G. Boiney, Jane Kennedy, and Pete Nye. "Instrumental bias in motivated reasoning: More when more is needed," Organizational Behavior and Human Decision Processes 72:1 (1997), 1-24.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Cognitive Biases at Work:

Self-Serving Bias in Organizations
We all want to believe that we can rely on the good judgment of decision makers when they make decisions that affect organizational performance. But they're human, and they are therefore subject to a cognitive bias known as self-serving bias. Here's a look at what can happen.
Scope Creep, Hot Hands, and the Illusion of Control
Despite our awareness of scope creep's dangerous effects on projects and other efforts, we seem unable to prevent it. Two cognitive biases — the "hot hand fallacy" and "the illusion of control" — might provide explanations.
Scope Creep and Confirmation Bias
As we've seen, some cognitive biases can contribute to the incidence of scope creep in projects and other efforts. Confirmation bias, which causes us to prefer evidence that bolsters our preconceptions, is one of these.
Seven More Planning Pitfalls: I
Planners and members of planning teams are susceptible to patterns of thinking that lead to unworkable plans. But planning teams also suffer vulnerabilities. Two of these are Group Polarization and Trips to Abilene.
Risk Acceptance: One Path
When a project team decides to accept a risk, and when their project eventually experiences that risk, a natural question arises: What were they thinking? Cognitive biases, other psychological phenomena, and organizational dysfunction all can play roles.

See also Cognitive Biases at Work and Critical Thinking at Work for more related articles.

Forthcoming issues of Point Lookout

Coming April 24: Antipatterns for Time-Constrained Communication: 1
Knowing how to recognize just a few patterns that can lead to miscommunication can be helpful in reducing the incidence of problems. Here is Part 1 of a collection of communication antipatterns that arise in technical communication under time pressure. Available here and by RSS on April 24.
And on May 1: Antipatterns for Time-Constrained Communication: 2
Recognizing just a few patterns that can lead to miscommunication can reduce the incidence of problems. Here is Part 2 of a collection of antipatterns that arise in technical communication under time pressure, emphasizing those that depend on content. Available here and by RSS on May 1.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks:

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Follow Rick

Send email or subscribe to one of my newsletters. Follow me at LinkedIn. Follow me at X, or share a post. Subscribe to RSS feeds.
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.
My blog, Technical Debt for Policymakers, offers resources, insights, and conversations of interest to policymakers who are concerned with managing technical debt within their organizations. Get the millstone of technical debt off the neck of your organization!
Go For It: Sometimes It's Easier If You Run. Bad boss, long commute, troubling ethical questions, hateful colleague? Learn what we can do when we love the work but not the job.
303 Tips for Virtual and Global Teams. Learn how to make your virtual global team sing.
101 Tips for Managing Change. Are you managing a change effort that faces rampant cynicism, passive non-cooperation, or maybe even outright revolt?
101 Tips for Effective Meetings. Learn how to make meetings more productive — and more rare.
Exchange your "personal trade secrets" — the tips, tricks and techniques that make you an ace — with other aces, anonymously. Visit the Library of Personal Trade Secrets.
If your teams don't yet consistently achieve state-of-the-art teamwork, check out this catalog. Help is just a few clicks/taps away!
Ebooks, booklets and tip books on project management, conflict, writing email, effective meetings and more.