Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 20, Issue 35; August 26, 2020: Motivated Reasoning and the Pseudocertainty Effect

Motivated Reasoning and the Pseudocertainty Effect

by Rick Brenner

When we have a preconceived notion of what conclusion a decision process should produce, we sometimes engage in "motivated reasoning" to ensure that we get the result we want. That's risky enough as it is. But when we do this in relation to a chain of decisions in the context of uncertainty, trouble looms.

As discussed in my last post, motivated reasoning is a pattern of thinking used in decision making (and elsewhere, I suppose). On its face, it appears to be evidence-based reasoning, but it departs from the evidence at important junctures to produce the conclusions we preferred from the outset. Those departures are possible, for example, when we weigh the strength of arguments or evidence; when we set priorities for investigations and evidence gathering; when we interpret the evidence we find; when we allocate meeting time for discussing particular points; or when we choose less capable or more capable individuals to present material relevant to pending decisions.

Motivated reasoning is thus a source of bias in workplace decision making. But in combination with certain specific cognitive biases, its effects can be profound. In this post I explore the synergistic effects of motivated reasoning in combination with a cognitive bias known as the Pseudocertainty Effect. [Tversky 1981]

The Pseudocertainty Effect is a cognitive bias that affects our ability to make good decisions in situations that involve projecting outcomes from a sequence of decisions under uncertainty. Because of the Pseudocertainty Effect, we humans have a tendency to focus only on the final decision in the sequence, ignoring earlier-stage uncertainties.

Explaining the Pseudocertainty Effect is a little easier if I begin with the Certainty Effect. [Kahneman 1979.1] The Certainty Effect is a cognitive bias that causes us to value too highly the utility of those outcomes that are certain, as compared to the utility of outcomes that are merely probable. The use of the word "highly" is a bit tricky here, because the Certainty Effect applies for both welcome and unwelcome outcomes — for both positive and negative utility. That is, the Certainty Effect also causes us to overestimate the "damage" of an unwelcome outcome when that outcome is certain, as compared to the damages of unwelcome outcomes that are merely probable.
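
To make the Certainty Effect concrete, here's a minimal sketch in Python of the classic choice Kahneman and Tversky studied in the paper cited above: a sure $3,000 versus an 80% chance of $4,000. Most subjects preferred the sure thing, even though the gamble has the higher expected value.

    # A minimal sketch of the Certainty Effect, using the classic choice
    # from Kahneman and Tversky's 1979 paper: a sure $3,000 versus an
    # 80% chance of $4,000.

    def expected_value(outcome: float, probability: float) -> float:
        """Expected value of a gamble with a single nonzero outcome."""
        return outcome * probability

    sure_thing = expected_value(3000, 1.00)  # $3,000
    gamble = expected_value(4000, 0.80)      # $3,200

    print(f"Sure thing: ${sure_thing:,.0f}")  # Sure thing: $3,000
    print(f"Gamble:     ${gamble:,.0f}")      # Gamble:     $3,200
    # Although the gamble has the higher expected value, most subjects
    # chose the certain $3,000: certain outcomes are overweighted.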

Let's return now to the Pseudocertainty Effect. Consider a two-stage decision process. In the first stage we must choose between two options, Opt11 and Opt12, which produce different outcomes, O11 and O12, with probabilities P11 and P12 respectively. The second stage likewise has two options, Opt21 and Opt22, which produce different outcomes, O21 and O22, with probabilities P21 and P22 respectively. However, in this scenario, we reach the second stage only if we choose Option Opt11 and we succeed in achieving Outcome O11. Since the probability of Outcome O11 is only P11, reaching the second stage isn't a sure thing.

Now it gets a little tricky.

In the second stage, the respective probabilities are P21 and P22. If P21 is 100%, that is, if the probability of Option Opt21 producing Outcome O21 is 100%, then the Certainty Effect would tend to cause us to overvalue O21.

But now, because we're in a two-stage decision process, the actual probability of Outcome O21 isn't 100%. It's only P11*P21, assuming that the option outcome distributions are probabilistically independent of each other. But Kahneman and Tversky demonstrated experimentally that people tend to overvalue Outcome O21 in a manner analogous to how they would have treated it if it actually were certain, that is, if P11*P21 were 100%. Hence the term Pseudocertainty Effect. That is, people tend to disregard the fact that the first stage of this two-stage scenario imposes a probability distribution that affects the final outcome. Instead, people focus only on the final stage of the two-stage scenario.

The experimental results suggest that people tend to "assume away" the uncertainties of the first stage of a two-stage decision string, and choose options only on the basis of the final stage. Or, at least, they give too much weight to the uncertainties of the final stage. This is the essence of the Pseudocertainty Effect.
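
Here's a minimal sketch in Python of the two-stage arithmetic described above. The specific probability values are illustrative assumptions, not figures from the experiments; only the structure comes from the scenario.

    # A sketch of the two-stage scenario. The numbers are illustrative
    # assumptions; only the structure comes from the discussion above.

    p11 = 0.25  # probability that Option Opt11 produces Outcome O11
    p21 = 1.00  # probability that Option Opt21 produces Outcome O21

    # What the Pseudocertainty Effect invites us to perceive: stage two,
    # taken alone, makes O21 look like a sure thing.
    perceived_p_o21 = p21

    # The actual probability of reaching O21, assuming the two stages
    # are probabilistically independent:
    actual_p_o21 = p11 * p21

    print(f"Perceived probability of O21: {perceived_p_o21:.0%}")  # 100%
    print(f"Actual probability of O21:    {actual_p_o21:.0%}")     # 25%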

Presumably, this phenomenon also applies to multi-stage scenarios, and to scenarios in which the probability distributions of the various stages aren't entirely independent.

Synergistic effects of Pseudocertainty Effect and Motivated Reasoning

The Pseudocertainty Effect has implications for workplace decision making in the context of motivated reasoning. Either phenomenon, acting alone, can be costly. But when both are acting they display a synergy that can be especially pernicious. For example, consider risk management.

In a typical risk management problem, we identify five attributes of a risk: the risk event, its probability, its impact, a response if it materializes, and a mitigation strategy. What makes risk analysis so interesting is that some risks cannot materialize unless other risks materialize first, forming a "risk string." For example, some neighborhoods in Houston, Texas, can flood only if (a) a hurricane passes over the area, and (b) a dam fails because rainfall is so extreme that the dam cannot withstand the pressure of the accumulated water. These two events combine to form a risk string of length 2. Longer strings are clearly possible.
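
Here's a small Python sketch of the arithmetic of risk strings, assuming each event's probability is conditioned on its predecessors having materialized. The numbers are illustrative assumptions, not actual flood statistics.

    # Probability that an entire risk string materializes, assuming each
    # probability is conditional on all earlier events in the string.
    # The numbers below are illustrative assumptions.

    def string_probability(conditional_probabilities):
        """Multiply conditional probabilities along a risk string."""
        p = 1.0
        for cp in conditional_probabilities:
            p *= cp
        return p

    # (a) hurricane passes over the area; (b) dam fails, given (a):
    p_flood = string_probability([0.10, 0.05])
    print(f"Probability of neighborhood flooding: {p_flood:.2%}")  # 0.50%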

Because analyzing risk strings inherently produces staged decision strings, risk strings provide a setting in which the Pseudocertainty Effect can take hold. But the effect will be even more significant if we can identify a preferred outcome. That is, if we're also at risk of engaging in motivated reasoning, then the Pseudocertainty Effect can cause some real trouble.

For example, consider a risk A' that can materialize only if risk A materializes and is successfully addressed by OptA, one of the options defined for risk A. The decisions about how much to invest in mitigating A or A' (or both) match the structure Kahneman and Tversky studied in their research into the Pseudocertainty Effect. Moreover, in risk analysis, decision makers have a clear preference for investing as little as possible in risk mitigation. They are thus at risk of engaging in motivated reasoning to justify low-cost options for risk management.

Probably the most dangerous case occurs when OptA', one of the options for dealing with risk A', is very low in cost but nearly certain to work. Risk managers will be tempted to implement that option, despite the fact that the probability of its succeeding also depends on the success of Option OptA. Because of the Pseudocertainty Effect, risk managers will tend to ignore the probabilities associated with Risk A. That tendency leads them to over-invest in preparing for OptA', an option they favor because of its low cost.
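
To see how that over-investment arises, here's a Python sketch comparing the expected benefit of OptA' with and without the first-stage probabilities. All of the numbers are illustrative assumptions.

    # How ignoring stage-one probabilities inflates the apparent value
    # of OptA'. All numbers are illustrative assumptions.

    p_a = 0.20             # probability that risk A materializes
    p_opta_works = 0.70    # probability that OptA successfully addresses A
    p_opta2_works = 0.98   # probability that OptA' works, if A' arises
    impact_a2 = 1_000_000  # cost if A' materializes and isn't addressed
    cost_opta2 = 5_000     # cost of implementing OptA'

    # Pseudocertainty view: treat A' as a given and OptA' as a near-sure
    # bargain, ignoring the path through risk A.
    naive_benefit = p_opta2_works * impact_a2

    # Compound view: A' can arise only if A materializes AND OptA succeeds.
    p_a2_can_arise = p_a * p_opta_works
    compound_benefit = p_a2_can_arise * p_opta2_works * impact_a2

    print(f"Expected benefit, ignoring stage one: ${naive_benefit:,.0f}")     # $980,000
    print(f"Expected benefit, compounded:         ${compound_benefit:,.0f}")  # $137,200
    print(f"Cost of OptA':                        ${cost_opta2:,.0f}")        # $5,000

In this sketch OptA' is still worth its cost, but its perceived benefit is about seven times its actual expected benefit, and that is the kind of distortion that invites over-investment.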

An example from history

The battleship USS Arizona, burning during the Japanese attack on the U.S. naval base at Pearl Harbor, Hawaii, December 7, 1941. Image source: National Archives and Records Administration, courtesy Wikipedia.

The image here shows the battleship USS Arizona, burning during the Japanese attack on the U.S. naval base at Pearl Harbor, Hawaii, December 7, 1941. Naval losses were extraordinary, but losses of aircraft were no less severe. Mark Parillo, a military historian, writes, "U.S. [aircraft] losses amounted to 188 aircraft destroyed and another 159 damaged of the 402 aircraft present when the raid began." [Parillo 2006] Among the factors contributing to the aircraft losses was the decision to position the aircraft for defense against sabotage, rather than defense against air attack. Aircraft at Wheeler Field, for example, "were taken out of the U-shaped earthen bunkers that had been built for their protection." [Correll 2007] Aircraft were also disarmed, and in some cases, rounds were removed from their ammunition belts to make storage more efficient. These actions delayed the mounting of an effective defense against air attack.

The strategic decision to defend against sabotage rather than air attack can be regarded as the first stage of a decision string of the kind Kahneman and Tversky studied. A second stage could be the deployment of personnel around the island. Personnel who would be needed for air defense on the day of the attack were either off duty or standing guard at facilities at some remove from the airfields. These decision strings would have produced a successful defense against sabotage, but as we now know, they produced an unsuccessful defense against air attack. The outcome suggests that the Pseudocertainty Effect might have played a role.

Last words

In situations that involve risk strings and motivated reasoning, trusting to intuition is likely to run afoul of the Pseudocertainty Effect. Careful mathematical analysis of all options under consideration offers a path with minimal exposure to the effect.


Footnotes

[Tversky 1981]
Amos Tversky and Daniel Kahneman. "The framing of decisions and the psychology of choice," Science 211:4481 (1981), 453-458.
[Kahneman 1979.1]
Daniel Kahneman and Amos Tversky. "Prospect Theory: An Analysis of Decision under Risk," Econometrica 47:2 (1979), 263-291.
[Parillo 2006]
Mark Parillo. "The United States in the Pacific," in Robin Higham and Stephen J. Harris, eds., Why Air Forces Fail: The Anatomy of Defeat. University Press of Kentucky, 2016.
[Correll 2007]
John T. Correll. "Caught on the Ground," Air Force Magazine, December 1, 2007. www.airforcemag.com/article/1207ground/.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Cognitive Biases at Work:

Wishful Significance: I
When things don't work out, and we investigate why, we sometimes attribute our misfortune to "wishful thinking." In this part of our exploration of wishful thinking we examine how we arrive at mistaken assessments of the significance of what we see, hear, or learn.
Cognitive Biases and Influence: I
The techniques of influence include inadvertent — and not-so-inadvertent — uses of cognitive biases. They are one way we lead each other to accept or decide things that rationality cannot support.
The Trap of Beautiful Language
As we assess the validity of others' statements, we risk making a characteristically human error — we confuse the beauty of their language with the reliability of its meaning. We're easily thrown off by alliteration, anaphora, epistrophe, and chiasmus.
Cognitive Biases at Work
Cognitive biases can lead us to misunderstand situations, overlook options, and make decisions we regret. The patterns of thinking that lead to cognitive biases provide speed and economy advantages, but we must manage the risks that come along with them.
Unrecognized Bullying: III
Much workplace bullying goes unrecognized because of cognitive biases that can cause targets, perpetrators, bystanders, and supervisors of perpetrators not to notice bullying. The Halo Effect and the Horn Effect are two of these biases.

See also Cognitive Biases at Work and Critical Thinking at Work for more related articles.

Forthcoming issues of Point Lookout

Coming April 3: Recapping Factioned Meetings
A factioned meeting is one in which participants identify more closely with their factions, rather than with the meeting as a whole. Agreements reached in such meetings are at risk of instability as participants maneuver for advantage after the meeting.
And on April 10: Managing Dunning-Kruger Risk
A cognitive bias called the Dunning-Kruger Effect can create risk for organizational missions that require expertise beyond the range of knowledge and experience of decision-makers. They might misjudge the organization's capacity to execute the mission successfully. They might even be unaware of the risk of so misjudging.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks.

Reprinting this article

Are you a writer, editor, or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site.
