Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 21, Issue 8; February 24, 2021

Risk Acceptance: Naïve Realism

by Rick Brenner

When we suddenly notice a "project-killer" risk that hasn't yet materialized, we sometimes accept the risk even though we know how seriously it threatens the effort. A psychological phenomenon known as naïve realism plays a role in this behavior.
Roger Boisjoly of Morton Thiokol, who tried to halt the launch of the Challenger space shuttle in 1986

Roger Boisjoly was the Morton Thiokol engineer who, in 1985, one year before the catastrophic failure of the U.S. Space Shuttle Challenger, wrote a memorandum outlining the safety risks of cold-weather launches. He raised the issue then, and many times subsequently, including the evening before the launch. In 1988, the American Association for the Advancement of Science awarded him its Prize for Scientific Freedom and Responsibility for "…his exemplary and repeated efforts to fulfill his professional responsibilities as an engineer by alerting others to life-threatening design problems of the Challenger space shuttle and for steadfastly recommending against the tragic launch of January 1986."

Mr. Boisjoly did not succeed in gaining acceptance of his objections to the Challenger launch. Moreover, he was shunned by colleagues, even after the investigation showed that his objections had been correct. He eventually resigned from Morton Thiokol. It is difficult to say from this distance, but this outcome could be an example of the excoriation and ejection of a dissenter as a consequence of naïve realism.

Photo courtesy the Online Ethics Center at the National Academy of Engineering.

When a project is underway, or even when we're still in discussion stages, we sometimes stumble upon an unanticipated risk. Some unanticipated risks can be serious — fatal or near fatal to the project if they materialize. For convenience, I call these risks Very Bad News (VBN) risks. After serious study, if we can't devise practical approaches to avoid, transfer, reduce, or compensate for the VBN risk, we cancel the project and move on. That's what happens if we're thinking clearly.
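The clear-thinking decision process described above can be sketched in a few lines. This is only an illustration of the logic, not any standard risk-management library; the names `RiskResponse` and `evaluate_vbn_risk` are mine:

```python
from enum import Enum, auto

class RiskResponse(Enum):
    """Classic treatments for a project risk, as named in the text."""
    AVOID = auto()       # change the plan so the risk can't occur
    TRANSFER = auto()    # shift the risk to another party
    REDUCE = auto()      # lower the risk's probability or impact
    COMPENSATE = auto()  # plan recovery measures if the risk materializes

def evaluate_vbn_risk(practical_responses: list[RiskResponse]) -> str:
    """Clear-thinking response to a Very Bad News (VBN) risk:
    proceed only if some practical treatment exists; otherwise cancel."""
    if practical_responses:
        return "proceed with mitigation"
    return "cancel the project"

print(evaluate_vbn_risk([]))                     # prints "cancel the project"
print(evaluate_vbn_risk([RiskResponse.REDUCE]))  # prints "proceed with mitigation"
```

The rest of this article is about what happens when, instead of following that last branch, we decide to "accept" the risk.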

Sometimes we don't think clearly enough. We decide to accept the VBN risk. Here are three scenarios that frequently lead to risk acceptance:

  • Someone who's politically very powerful insists that this project must succeed no matter what, and "you'll find a way around the problem somehow."
  • The team has found some approaches that might mitigate the VBN risk somewhat. The time, budget, or capabilities required are beyond what's available, but somebody says, "If we make a solid demo, they'll find the money."
  • There's division of opinion about whether the VBN risk is real. Most expert members of the project team acknowledge the risk, but one ambitious fellow, currying favor with Management, disagrees.

These situations are fertile ground for toxic conflict. When the VBN risk is fundamentally technological, a common pattern for this conflict is an oppositional debate between the "technologists" (IT or product engineers) on the one hand, and on the other hand the "business" (product managers, executives, or Sales or Marketing officials). When the conflict becomes toxic enough, it can leave lasting social scars that limit future organizational performance.

The conflict that arises in the context of debates about VBN risks isn't unique. We can understand its dynamics in terms of a psychological phenomenon known as naïve realism. Naïve realism is our human tendency to assume that our perceptions of the world are accurate and objective [Ross 1995]. Consequently, when our understanding of the world conflicts with that of others, we attribute the difference to distortions in others' perceptions, due to their ignorance, false beliefs, irrationality, or biases. As we do this, so do our partners in conflict attribute to us ignorance, false beliefs, irrationality, or biases. Naïve realism thus provides an elegantly symmetric setup for toxic conflict. It can transform a disagreement from one in which each party critiques the value of the other's approach to the issues to one in which each party attacks the other's value as a person.

But although a setup for toxic conflict is necessary, it isn't sufficient to seriously damage relationships. For truly destructive toxic conflict, the participants need to care deeply about the outcome of the conflict. VBN risks can provide the missing element. We know that conflict participants care deeply about the outcomes of debates about VBN risks, because they all regard VBN risks as "Very Bad News."

On the "business" side, conflict participants strongly desire project success. They want what the project promises to deliver, and they can get what they want only if the project goes forward. On the "technologist" side, conflict participants want to work on successful projects, and they want to avoid working on projects that are doomed from the start. Too often, in their experience, project failures have been unjustly attributed not to foolhardy decisions to accept VBN risks, but to the lack of professionalism and low work quality of project teams. In the context of the VBN risk, the technologists can get what they want only if the VBN risk is properly acknowledged and managed, the cost of which can be prohibitive. The "business" faction wants the project to go forward despite the VBN risk; the "technologist" faction wants the project reconfigured or cancelled because of it.

In many organizations, the "business" prevails. Because the people who represent the "business" want the project to proceed, and because they cannot afford to manage the VBN risk, they opt for "risk acceptance." In practice, the technologists are directed to execute the project with the VBN risk wholly or largely unmitigated.

But exactly how does this happen? What patterns of thought and decision making enable the group to proceed despite this stark disagreement? I'll explore this part of the story next time.

Next issue: Risk Acceptance: One Path


Footnotes

[Ross 1995]
Lee Ross and Andrew Ward. "Naive realism in everyday life: Implications for social conflict and misunderstanding." 1995.


