Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 21, Issue 9; March 3, 2021: Risk Acceptance: One Path

Risk Acceptance: One Path

by Rick Brenner
When a project team decides to accept a risk, and that risk later materializes, a natural question arises: What were they thinking? Cognitive biases, other psychological phenomena, and organizational dysfunction can all play roles.
Braided streams in Grewingk Glacier River, Kachemak Bay, Cook Inlet, Alaska, in June 2009. Rivers emanating from retreating glaciers carry large volumes of sediment, producing braided river patterns with multiple channels. Braided channels are variable and dynamic. The Alaska ShoreZone exhibition guide states: "Although the threshold between meandering (sinuous, single channel river pattern) and braiding is not clearly understood, three factors are probably necessary for braiding to occur: 1) an abundant bedload supply (portion of a river's sediment load supported by the channel bed), 2) erodible banks, and 3) high stream power (the potential energy for a given river channel length)."

Some classes of group interactions can produce similar results through multiple interacting paths. We can expect these phenomena when 1) multiple psychological phenomena are relevant, 2) those phenomena can produce behaviors that stimulate each other, and 3) the participants care deeply about the end result. As is often the case, the forces of Nature, by example and metaphor, provide insight into human behavior.

Photo courtesy U.S. National Oceanic and Atmospheric Administration. Photo credit: Alaska ShoreZone.

In last week's post, I noted the existence of a cognitive bias known as naïve realism, which is the human tendency to assume (usually incorrectly) that our perceptions of the world are accurate and objective. Naïve realism can contribute to our tendency to push ahead with projects even after we've noticed risks that present significant threats to success. I called these risks very bad news (VBN) risks.

Discussions about how to deal with VBN risks typically involve two sets of participants. Representing the business side are people from units such as Sales, Marketing, and Human Resources, along with users and executives. I call them, collectively, the Business. Representing the technology side are the people who will do the actual work of producing what the Business needs and wants. They include people from units such as product engineering, Information Technology (IT), and business technology. I call them, collectively, Engineering/IT.

In that same post, I described how the Business sometimes accepts VBN risks by rejecting the assertions of Engineering/IT, as a result of naïve realism. This pattern becomes more likely when the tactics available for addressing the VBN risk — avoidance, compensation, and mitigation — are so costly that they render the project unaffordable and a candidate for cancellation. So instead of abandoning projects because of their VBN risks, organizations tend to accept those risks more often than an objective assessment would warrant.
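To see why acceptance can look attractive on paper, consider a back-of-the-envelope comparison of risk responses. The sketch below is mine, not drawn from any particular case; every figure in it is hypothetical.

```python
# Back-of-the-envelope comparison of accepting vs. mitigating a VBN risk.
# All figures are hypothetical, invented for illustration.

project_value = 2_000_000    # expected benefit if the project succeeds
vbn_probability = 0.6        # Engineering/IT's estimate that the VBN risk materializes
vbn_impact = 2_500_000       # loss if it does (effort written off, recovery costs)
mitigation_cost = 1_800_000  # total cost of avoidance/compensation/mitigation tactics

# Expected loss if the risk is simply accepted
expected_loss = vbn_probability * vbn_impact        # 1,500,000

net_if_mitigated = project_value - mitigation_cost  # 200,000
net_if_accepted = project_value - expected_loss     # 500,000

print(f"Accept:   {net_if_accepted:>12,.0f}")
print(f"Mitigate: {net_if_mitigated:>12,.0f}")

# Acceptance wins on paper. But vbn_probability and vbn_impact are
# judgments, not measurements. A motivated reasoner who shades
# vbn_probability down to 0.3 halves the apparent expected loss,
# making acceptance look safer still.
```

The point of the arithmetic isn't the specific numbers; it's that when mitigation costs approach the project's entire value, acceptance becomes the only response that keeps the project alive on paper, and every input to that comparison is a judgment call.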

Last week I closed with a question: What patterns of thought and decision-making enable the group to accept a VBN risk, despite the stark disagreement between the Business and Engineering/IT?

The case

In what follows, I use the name Ben when referring to someone from the Business side of the decision, and Emma when referring to someone on the Engineering/IT side.

For concreteness, here's a description of a risk-acceptance situation. In a project I'll call Marigold, Emma's team identified a VBN risk and judged that it threatened the project's survival. Ben's team regarded the risk as acceptable, and the group decided to push ahead. Within a few months, the VBN risk materialized, and after several attempts to recover from it, Marigold was abandoned.

How did this happen? What enabled the team to set aside their disagreements about the VBN risk and push forward on a doomed project? What follows is an exploration of the phenomena that move such groups as they try to reach a go-no-go decision about a project for which they've identified a VBN risk.

In the remainder of this post, I describe one pathway through cognitive biases, other psychological phenomena, and organizational politics that leads to inappropriate risk acceptance. There are probably dozens of such paths, involving different psychological phenomena and organizational dysfunctions. This is only an example.

The Dunning-Kruger effect leads to overconfidence of the Business team

Justin Kruger and David Dunning, working at Cornell in 1999, discovered a phenomenon now called the Dunning-Kruger effect. [Kruger 1999] They performed experiments that yielded results consistent with the following four principles (paraphrasing):

  1. Incompetent individuals, compared to their more competent peers, dramatically overestimate their ability and performance.
  2. Incompetent individuals are less able than their more competent peers to recognize competence when they see it.
  3. Incompetent individuals are less able than their more competent peers to gain insight into the true level of their own performance.
  4. Incompetent individuals can gain insight about their shortcomings, but this comes (paradoxically) by gaining competence.

Because of the Dunning-Kruger effect, the members of the Business team tended to overestimate their own competence in handling the VBN risk, and they were unable to appreciate the competence of Emma's team in assessing the severity of the risk. The errors the Business team made in assessing its own competence manifested as overconfidence about how well they could deal with the VBN risk. This competence gap between the Business and Engineering/IT created a gap in understanding, and that gap forms the basis of the risk acceptance that occurred in this case.

Motivated reasoning creates a gap in understanding

In the case described above, Ben and the other members of the Business team were very motivated to continue with the Marigold project. This motivation led to a biased assessment of the consequences of the VBN risk. [Brenner 2020.4] [Kunda 1990] [Molden 2005] Because of motivated reasoning, the people representing the Business tended to underestimate the seriousness of the VBN risk. Emma and her team might also have engaged in motivated reasoning, but for the engineers, the bias led them to overestimate the seriousness of the risk. In this way, motivated reasoning can conspire with the Dunning-Kruger effect to cause people to confidently adopt positions that place the project at risk.

Motivated reasoning thus tends to create a sense of confidence among Ben's team that could become overconfidence. This situation is inherently asymmetric. The Business tends to adopt a risk-aggressive position; Engineering/IT tends to adopt a risk-averse position. Moreover, the two parties are subject to very different consequences if Marigold is cancelled. Engineering/IT almost certainly has a backlog of other work to do, which they are happy to tackle. But the Business will have to find another business opportunity. And since Marigold was probably the most attractive opportunity within reach, the Business will likely experience more disappointment following Marigold's cancellation than will Engineering/IT.

The power differential favors the Business

In many organizations, relative to questions of organizational direction, the power distance between the Business and Engineering/IT is significant. In organizations in which the Business is more powerful than Engineering/IT, the Business has greater influence over the organizational agenda. In such organizations, although it is the task of Engineering/IT to determine how to achieve the objectives of the Business, Engineering/IT has somewhat less influence relative to the question of whether to achieve those objectives.

The power differential thus accounts for some of the inability of Engineering/IT to stop Marigold on account of the VBN risk. But raw power, in the end, is costly to the organization. Its exercise is demoralizing for the people who must bow to it. For that reason, most groups search for reasons to support and comply with the preferences of the powerful. Such reasons provide a basis for the powerful to believe that they have considered all views and have led the group to a mutual accommodation; and they provide a basis for the less powerful to feel heard and to maintain professional dignity.

Overconfidence is contagious

When Enron entered bankruptcy on December 2, 2001, it was the seventh-largest company in the United States. Subsequent research determined that the organization had a "culture of arrogance": Enron employees regarded themselves as members of an elite organization, smarter than everyone else. These beliefs account in part for Enron's pattern of negotiating deals that others would have regarded as risky to the point of foolhardiness.

Joey T. Cheng and colleagues have conducted experiments that suggest that in social groups, overconfidence can be transmitted from one person to the next, eventually taking root in the culture of the group. [Cheng 2020] [Cheng 2021] The phenomenon of overconfidence contagion could explain why the people representing the Business so uniformly regard a VBN risk as acceptable, saying, for example, "We'll cross that bridge if we come to it." While that uniformity of overconfidence can be remarkable, it can also arise from other factors, such as similarity of experience, education, or interests.

More intriguing is the possibility that overconfidence can be transmitted from members of the Business team to members of the Engineering/IT team. During negotiations and working meetings, confronted with uniformly confident statements by Business team members, Engineering/IT team members "adjust" their views of the seriousness of the VBN risk. The risk of overconfidence contagion is especially elevated when the issue at hand is a VBN risk, because the significance of the issue depends so strongly on judgment and experience. Probabilities and consequences aren't physical things. Although we represent them as numbers, they aren't measurements. To assess them, we must rely on estimates and judgments, and those estimates and judgments are vulnerable to overconfidence contagion.
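One way to picture the mechanism is as confidence-weighted averaging of the group's risk estimates. The sketch below is a toy model of my own, not the experimental design of Cheng and colleagues; the names, estimates, weights, and update rule are all hypothetical.

```python
# Toy model of overconfidence contagion. Each round, every participant
# moves partway toward the confidence-weighted mean of the group's
# estimates of P(VBN risk materializes). All values are hypothetical.

participants = {
    # name: (initial estimate of P(risk), confidence weight)
    "Ben":   (0.10, 3.0),  # Business: low estimates, stated with high confidence
    "Bette": (0.15, 3.0),
    "Emma":  (0.60, 1.0),  # Engineering/IT: high estimates, stated less forcefully
    "Ed":    (0.55, 1.0),
}

estimates = {name: est for name, (est, _) in participants.items()}
weights = {name: wt for name, (_, wt) in participants.items()}

def one_round(estimates, weights, stubbornness=0.5):
    """Move each estimate partway toward the confidence-weighted group mean."""
    total = sum(weights.values())
    mean = sum(estimates[n] * weights[n] for n in estimates) / total
    return {n: stubbornness * estimates[n] + (1 - stubbornness) * mean
            for n in estimates}

for round_number in range(1, 6):
    estimates = one_round(estimates, weights)
    print(round_number, {n: round(e, 3) for n, e in estimates.items()})

# The group settles near 0.24, far below Emma's initial 0.60, because
# the confident voices dominate the weighted mean.
```

In this toy model, the outcome is driven almost entirely by the confidence weights: the consensus lands near the most confident parties' numbers, regardless of who is better calibrated.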

Last words

Let me review the path we've just taken.

  • The Dunning-Kruger effect leads to overconfidence of Business teams with respect to their ability to tolerate VBN risks.
  • Motivated reasoning creates a gap between the Business and Engineering/IT relative to their understanding of the nature and consequences of the VBN risk.
  • The power differential between the Business and Engineering/IT favors the Business as they seek to have the organization adopt their approach to the VBN risk.
  • The overconfidence of the Business team is contagious. It spreads through the Business team, unifying it, and it also affects Engineering/IT, weakening their reluctance to accept the VBN risk.

This is just one path that leads to risk acceptance, overcoming the cautious position of Engineering/IT and causing the organization to accept a VBN risk that should have been reason enough to halt the project. You might have encountered similar paths in your organization. Beware. A path once trodden is easier to follow a second time.

Next issue: On Repeatable Blunders


For more about the Dunning-Kruger Effect, see "The Paradox of Confidence," Point Lookout for January 7, 2009; "How to Reject Expert Opinion: II," Point Lookout for January 4, 2012; "Devious Political Tactics: More from the Field Manual," Point Lookout for August 29, 2012; "Overconfidence at Work," Point Lookout for April 15, 2015; "Wishful Thinking and Perception: II," Point Lookout for November 4, 2015; "Wishful Significance: II," Point Lookout for December 23, 2015; "Cognitive Biases and Influence: I," Point Lookout for July 6, 2016; and "The Paradox of Carefully Chosen Words," Point Lookout for November 16, 2016.

Footnotes

[Kruger 1999]
Justin Kruger and David Dunning. "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments," Journal of Personality and Social Psychology 77:6 (1999), 1121-1134.
[Brenner 2020.4]
Richard Brenner. "Motivated Reasoning," Point Lookout blog, August 19, 2020.
[Kunda 1990]
Ziva Kunda. "The Case for Motivated Reasoning," Psychological Bulletin 108:3 (1990), 480-498.
[Molden 2005]
Daniel C. Molden and E. Tory Higgins. "Motivated Thinking," in Keith James Holyoak and Robert G. Morrison, eds., The Cambridge Handbook of Thinking and Reasoning. Cambridge: Cambridge University Press (2005), 295-321.
[Cheng 2020]
Joey T. Cheng, Elizabeth R. Tenney, Don A. Moore, and Jennifer M. Logg. "Overconfidence Is Contagious," Harvard Business Review, November 17, 2020.
[Cheng 2021]
Joey T. Cheng, Cameron Anderson, Elizabeth R. Tenney, Sebastien Brion, Don A. Moore, and Jennifer M. Logg. "The Social Transmission of Overconfidence," Journal of Experimental Psychology: General 150:1 (2021), 157-186.



This article in its entirety was written by a human being. No machine intelligence was involved in any way.


Related articles

More articles on Cognitive Biases at Work:

Effects of Shared Information Bias: II
Shared information bias is widely recognized as a cause of bad decisions. But over time, it can also erode a group's ability to assess reality accurately. That can lead to a widening gap between reality and the group's perceptions of reality.
How Messages Get Mixed
Although most authors of mixed messages don't intend to be confusing, message mixing does happen. One of the most fascinating mixing mechanisms occurs in the mind of the recipient of the message.
Seven Planning Pitfalls: III
We usually attribute departures from plan to poor execution, or to "poor planning." But one cause of plan ineffectiveness is the way we think when we set about devising plans. Three cognitive biases that can play roles are the so-called Magical Number 7, the Ambiguity Effect, and the Planning Fallacy.
Risk Acceptance: Naïve Realism
When we suddenly notice a "project-killer" risk that hasn't yet materialized, we sometimes accept the risk even though we know how seriously it threatens the effort. A psychological phenomenon known as naïve realism plays a role in this behavior.
Choice-Supportive Bias
Choice-supportive bias is a cognitive bias that causes us to assess our past choices as more fitting than they actually were. The erroneous judgments it produces can be especially costly to organizations interested in improving decision processes.

See also Cognitive Biases at Work and Conflict Management for more related articles.

Forthcoming issues of Point Lookout

A dangerous curve in an icy roadComing May 1: Antipatterns for Time-Constrained Communication: 2
Recognizing just a few patterns that can lead to miscommunication can reduce the incidence of miscommunications. Here's Part 2 of a collection of antipatterns that arise in communication under time pressure, emphasizing those that depend on content. Available here and by RSS on May 1.
And on May 8: Antipatterns for Time-Constrained Communication: 3
Recognizing just a few patterns that can lead to miscommunication can reduce the incidence of problems. Here is Part 3 of a collection of antipatterns that arise in technical communication under time pressure, emphasizing past experiences of participants. Available here and by RSS on May 8.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details at rbrendPtoGuFOkTSMQOzxner@ChacEgGqaylUnkmwIkkwoCanyon.com or (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks:

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Follow Rick

Send email or subscribe to one of my newsletters Follow me at LinkedIn Follow me at X, or share a post Subscribe to RSS feeds Subscribe to RSS feeds
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.
Technical Debt for Policymakers BlogMy blog, Technical Debt for Policymakers, offers resources, insights, and conversations of interest to policymakers who are concerned with managing technical debt within their organizations. Get the millstone of technical debt off the neck of your organization!
Go For It: Sometimes It's Easier If You RunBad boss, long commute, troubling ethical questions, hateful colleague? Learn what we can do when we love the work but not the job.
303 Tips for Virtual and Global TeamsLearn how to make your virtual global team sing.
101 Tips for Managing ChangeAre you managing a change effort that faces rampant cynicism, passive non-cooperation, or maybe even outright revolt?
101 Tips for Effective MeetingsLearn how to make meetings more productive — and more rare.
Exchange your "personal trade secrets" — the tips, tricks and techniques that make you an ace — with other aces, anonymously. Visit the Library of Personal Trade Secrets.
If your teams don't yet consistently achieve state-of-the-art teamwork, check out this catalog. Help is just a few clicks/taps away!
Ebooks, booklets and tip books on project management, conflict, writing email, effective meetings and more.