Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 20, Issue 35; August 26, 2020: Motivated Reasoning and the Pseudocertainty Effect

Motivated Reasoning and the Pseudocertainty Effect

When we have a preconceived notion of what conclusion a decision process should produce, we sometimes engage in "motivated reasoning" to ensure that we get the result we want. That's risky enough as it is. But when we do this in relation to a chain of decisions in the context of uncertainty, trouble looms.

As discussed in my last post, motivated reasoning is a pattern of thinking used in decision making (and elsewhere, I suppose). On its face, it appears to be evidence-based reasoning, but it departs from the evidence at important junctures to produce the conclusions we preferred from the outset. Those departures can occur, for example, when we weigh the strength of arguments or evidence; when we set priorities for investigations and evidence gathering; when we interpret the evidence we find; when we allocate meeting time for discussing particular points; or when we choose less capable or more capable individuals to present material relevant to pending decisions.

Motivated reasoning is thus a source of bias in workplace decision making. But in combination with certain specific cognitive biases, its effects can be profound. In this post I explore the synergistic effects of motivated reasoning in combination with a cognitive bias known as the Pseudocertainty Effect. [Tversky 1981]

The Pseudocertainty Effect is a cognitive bias that affects our ability to make good decisions in situations that involve projecting outcomes from a sequence of decisions under uncertainty. Because of the Pseudocertainty Effect, we humans have a tendency to focus only on the final decision in the sequence, ignoring earlier-stage uncertainties.

Explaining the Pseudocertainty Effect is a little easier if I begin with the Certainty Effect. [Kahneman 1979.1] The Certainty Effect is a cognitive bias that causes us to value too highly the utility of those outcomes that are certain, as compared to the utility of outcomes that are merely probable. The use of the word "highly" is a bit tricky here, because the Certainty Effect applies for both welcome and unwelcome outcomes — for both positive and negative utility. That is, the Certainty Effect also causes us to overestimate the "damage" of an unwelcome outcome when that outcome is certain, as compared to the damages of unwelcome outcomes that are merely probable.
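A minimal numeric sketch makes the Certainty Effect concrete. The dollar amounts below are illustrative assumptions, similar in spirit to the gambles Kahneman and Tversky studied, but not taken from this article:

```python
# A sure gain of $30 versus an 80% chance of $45 (illustrative numbers).
p_sure, v_sure = 1.0, 30.0
p_risky, v_risky = 0.8, 45.0

ev_sure = p_sure * v_sure     # expected value of the sure option: 30.0
ev_risky = p_risky * v_risky  # expected value of the risky option: 36.0

# Expected value favors the risky option, yet the Certainty Effect
# predicts that many people will still choose the sure $30,
# overweighting the utility of the certain outcome.
print(ev_sure, ev_risky)
```

The same overweighting applies with signs reversed: a certain loss of $30 feels worse than an 80% chance of losing $45, even though the latter has the larger expected loss.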

Let's return now to the Pseudocertainty Effect. Consider a two-stage decision process. In the first stage we must choose between two options, Opt11 and Opt12, which produce outcomes O11 and O12 with probabilities P11 and P12, respectively. Similarly, the second stage has two options, Opt21 and Opt22, which produce outcomes O21 and O22 with probabilities P21 and P22, respectively. However, in this scenario, we reach the second stage only if we choose Option Opt11 and we achieve Outcome O11. Since the probability of Outcome O11 is only P11, reaching the second stage isn't a sure thing.

Now it gets a little tricky.

In the second stage, the respective probabilities are P21 and P22. If P21 is 100%, that is, if the probability of Option Opt21 producing Outcome O21 is 100%, then the Certainty Effect would tend to cause us to overvalue O21.

But now, because we're in a two-stage decision process, the actual probability of Outcome O21 isn't 100%. It's only P11*P21, assuming that the option outcome distributions are probabilistically independent of each other. But Kahneman and Tversky demonstrated experimentally that people tend to overvalue Outcome O21 in a manner analogous to how they would have treated it if it actually were certain, that is, if P11*P21 were 100%. Hence the term Pseudocertainty Effect. That is, people tend to disregard the fact that the first stage of this two-stage scenario imposes a probability distribution that affects the final outcome. Instead, people focus only on the final stage of the two-stage scenario.
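A quick sketch of this point in code, using the article's notation. The probability values are illustrative assumptions, not from the article:

```python
# Stage 1: probability that choosing Opt11 yields O11, letting us reach stage 2.
P11 = 0.25
# Stage 2: probability that Opt21 yields O21, given that we reached stage 2.
P21 = 1.00

# Viewed from within stage 2, Outcome O21 looks certain.
p_apparent = P21       # 1.00

# Viewed from the start of the two-stage process, it is not certain,
# assuming the two stages are probabilistically independent.
p_actual = P11 * P21   # 0.25

print(p_apparent, p_actual)
```

The Pseudocertainty Effect amounts to valuing O21 as if its probability were `p_apparent` rather than `p_actual`.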

The experimental results suggest that people tend to "assume away" the uncertainties of the first stage of a two-stage decision string, choosing options on the basis of the final stage alone. Or, at least, they weight the uncertainties of the final stage too heavily relative to those of the first stage. This is the essence of the Pseudocertainty Effect.

Presumably, this phenomenon also applies to multi-stage scenarios, and to scenarios in which the probability distributions of the various stages aren't entirely independent.

Synergistic effects of Pseudocertainty Effect and Motivated Reasoning

The Pseudocertainty Effect has implications for workplace decision making in the context of motivated reasoning. Either phenomenon, acting alone, can be costly. But when both are acting they display a synergy that can be especially pernicious. For example, consider risk management.

In a typical risk management problem, we identify five attributes of a risk: the risk event, its probability, its impact, a response if it materializes, and a mitigation strategy. What makes risk analysis so interesting is that some risks cannot materialize unless other risks materialize first, forming a "risk string." For example, some neighborhoods in Houston, Texas, can flood only if (a) a hurricane passes over the area and (b) a dam fails as a result of rainfall so extreme that the dam cannot withstand the pressure of the accumulated water. These two events form a risk string of length 2. Longer strings are clearly possible.

Because analyzing risk strings inherently produces staged decision strings, risk strings provide a setting in which the Pseudocertainty Effect can take hold. But the effect will be even more significant if we can identify a preferred outcome. That is, if we're also at risk of engaging in motivated reasoning, then the Pseudocertainty Effect can cause some real trouble.

For example, consider a risk A' that can materialize only if risk A materializes and is successfully addressed by OptA, one of the options defined for risk A. The decisions regarding how much to invest in mitigating either A or A' (or both) satisfy the structure Kahneman and Tversky studied in their research into the Pseudocertainty Effect. Moreover, in risk analysis, decision makers have a clear preference for investing a minimum amount in risk mitigation. They are thus at risk of engaging in motivated reasoning to justify low-cost options for risk management.

Probably the most dangerous case occurs when OptA', one of the options for dealing with risk A', is very low cost and nearly certain to work. Risk managers will be tempted to implement that option, despite the fact that its probability of succeeding also depends on the success of Option OptA. Because of the Pseudocertainty Effect, risk managers will tend to ignore the probabilities associated with Risk A. That tendency leads them to over-invest in preparing for OptA', which, because it is low cost, they regard as leading to a favored outcome.
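Here is a minimal sketch of that trap in code. All probabilities and costs are hypothetical values invented for illustration only; nothing here comes from the article:

```python
# Risk string: A' can materialize only if risk A materializes first.
# All numbers below are hypothetical, chosen only for illustration.
P_A = 0.10                 # probability that risk A materializes
P_Aprime_given_A = 0.90    # probability that A' then materializes
impact_Aprime = 1_000_000  # cost if A' materializes, in dollars

# Pseudocertainty view: treat A' in isolation, ignoring risk A.
expected_loss_apparent = P_Aprime_given_A * impact_Aprime  # 900,000

# Compound view: weight A' by the probability of ever reaching it,
# assuming the two stages are independent.
P_Aprime = P_A * P_Aprime_given_A                          # about 0.09
expected_loss_actual = P_Aprime * impact_Aprime            # about 90,000

print(expected_loss_apparent, expected_loss_actual)
```

The tenfold gap between the two expected losses shows how ignoring the first stage of the risk string can inflate the apparent payoff of a cheap, "nearly certain" option for risk A', and so distort the mitigation budget.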

An example from history

The battleship USS Arizona, burning during the Japanese attack on the U.S. naval base at Pearl Harbor, Hawaii, December 7, 1941. Image source: National Archives and Records Administration, courtesy Wikipedia.

The image here shows the battleship USS Arizona, burning during the Japanese attack on the U.S. naval base at Pearl Harbor, Hawaii, December 7, 1941. Naval losses were extraordinary, but losses of aircraft were no less severe. Mark Parillo, a military historian, writes, "U.S. [aircraft] losses amounted to 188 aircraft destroyed and another 159 damaged of the 402 aircraft present when the raid began." [Parillo 2006] Among factors contributing to the aircraft losses was the decision to position the aircraft for defense against sabotage, instead of defense against air attack. Aircraft at Wheeler Field, for example, "were taken out of the U-shaped earthen bunkers that had been built for their protection." [Correll 2007] Aircraft were also disarmed, and in some cases, rounds were removed from their belts to make storage more efficient. These actions led to delays in mounting effective defense against air attack.

The strategic decision to defend against sabotage instead of air attack can be regarded as the first stage in one of Kahneman and Tversky's experiments. A second stage could be the deployment of manpower around the island. Personnel that would be needed for air defense on the day of the attack were either off duty or standing guard at facilities at some remove from the airfield. These decision strings would have produced a successful defense against sabotage, but as we now know, they produced an unsuccessful defense against air attack. The outcome suggests that the Pseudocertainty Effect might have played a role.

Last words

In situations that involve risk strings and motivated reasoning, trusting to intuition is likely to run afoul of the Pseudocertainty Effect. Careful mathematical analysis of all options under consideration offers a path with minimal exposure to it.


Footnotes

[Tversky 1981]
Amos Tversky and Daniel Kahneman. "The framing of decisions and the psychology of choice," Science 211:4481 (1981), 453-458.
[Kahneman 1979.1]
Daniel Kahneman and Amos Tversky. "Prospect Theory: An Analysis of Decision under Risk," Econometrica 47:2 (1979), 263-291.
[Parillo 2006]
Mark Parillo. "The United States in the Pacific," in Robin Higham and Stephen J. Harris, eds., Why Air Forces Fail: The Anatomy of Defeat. University Press of Kentucky, 2016.
[Correll 2007]
John T. Correll. "Caught on the Ground," Air Force Magazine, December 1, 2007. www.airforcemag.com/article/1207ground/.


This article in its entirety was written by a human being. No machine intelligence was involved in any way.


