Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 23, Issue 36;   September 6, 2023: The Risk Planning Fallacy

The Risk Planning Fallacy


The planning fallacy is a cognitive bias that causes underestimates of cost, time required, and risks for projects. Analogously, I propose a risk planning fallacy that causes underestimates of probabilities and impacts of risk events.
A metaphor for preventing risk propagation. Preventing the falling dominos from disrupting dominos that are still standing is a metaphor for preventing risks propagating from one project to the next. Image by OleksandrPidvalnyi courtesy Pixabay.com.

In a 1977 paper, Kahneman and Tversky identified a cognitive bias that causes project planners to systematically underestimate the execution time, execution costs, and risks associated with their plans. [Kahneman 1977] They called this bias the planning fallacy. By analogy with the bias they found, we can reasonably expect to find a risk planning fallacy that causes risk planners to systematically underestimate the probabilities and impacts of the risks they identify. In an additional twist, the risk planning fallacy causes risk planners to overlook risks in their plans, even though they might easily notice those same risks in the plans of other risk planners.

As Kahneman and Tversky write, "The planning fallacy is a consequence of the tendency to neglect distributional data and to adopt what may be termed an internal approach to prediction, in which one focuses on the constituents of the specific problem rather than on the distribution of outcomes in similar cases." [Kahneman 1979] We can then ask how this behavior affects risk planning. It can affect three activities: identifying risks, estimating risk event probabilities, and estimating risk event impacts.

Identifying risks
Trying to identify all risks that could affect a specific project is an example of what Kahneman and Tversky call "focusing on the constituents of the specific problem." By contrast, to take a distributional approach, we would instead ask: in how many projects similar to this one did we encounter risks that weren't anticipated in their risk plans? Call this question IR-1.
In answering IR-1 we must include all cases of past projects in which an unanticipated risk event occurred. But other instances are of possibly greater interest. For example, with respect to an unanticipated risk event that did occur, we can ask how many past projects were exposed to that same risk but escaped unscathed because the risk didn't materialize, even though it could have. Call this question IR-2.
Risk planners who don't ask the two questions IR-1 and IR-2 are vulnerable to omitting risk event types from their plans, and to being unaware that they might be doing so.
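As a sketch only, with entirely made-up project records, the two questions can be tallied mechanically. The field names and risk labels below are illustrative assumptions, not part of any real risk register.

```python
# Hypothetical records of similar past projects. Each record lists the risks
# named in the project's risk plan and the risk events that actually occurred.
past_projects = [
    {"name": "A", "planned_risks": {"vendor delay"},
     "events": {"vendor delay", "key staff loss"}},
    {"name": "B", "planned_risks": {"scope creep"}, "events": set()},
    {"name": "C", "planned_risks": {"scope creep"}, "events": {"scope creep"}},
]

# IR-1: in how many similar past projects did an unanticipated risk event occur?
ir1 = sum(1 for p in past_projects if p["events"] - p["planned_risks"])

# IR-2: for each unanticipated event that occurred somewhere, how many projects
# were also exposed (didn't plan for it) yet escaped because it didn't occur?
unanticipated = set().union(*(p["events"] - p["planned_risks"]
                              for p in past_projects))
ir2 = {risk: sum(1 for p in past_projects
                 if risk not in p["planned_risks"] and risk not in p["events"])
       for risk in unanticipated}
```

Here project A suffered the unanticipated "key staff loss" event (IR-1 counts one such project), while projects B and C were equally unprepared for it but escaped (IR-2 counts two).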
Estimating risk event probabilities
In the singular-focused approach to risk planning, planners devise procedures for estimating, for each risk they've identified, the probability that its risk event occurs.
By contrast, in the distributional approach, planners survey past projects and compare the incidence of risk events to the probabilities those projects' planners estimated. The question to answer is how well the estimated probabilities match the actual incidence of events. Call this question EP-1. A related question is how many past risk plans show evidence of measuring risk event probabilities in the projects that preceded them. Call this question EP-2. Failure to measure risk event probabilities calls into question the procedures past risk planners used for estimating them.
Risk planners who don't research questions EP-1 and EP-2 are vulnerable to underestimating risk event probabilities, because they're unaware of how likely they are to do so.
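The EP-1 comparison can be made concrete in a small sketch. The probability-and-outcome pairs below are invented for illustration; the Brier score and the per-bucket calibration check are standard ways of scoring probability estimates against what actually happened, not anything prescribed by the article's sources.

```python
from collections import defaultdict

# (estimated probability, did the risk event occur?) pairs from past risk plans
history = [
    (0.10, False), (0.10, True), (0.10, False), (0.10, False),
    (0.50, True), (0.50, False),
    (0.90, True), (0.90, True),
]

# Brier score: mean squared gap between estimate and outcome; 0 is perfect
brier = sum((p - (1.0 if hit else 0.0)) ** 2 for p, hit in history) / len(history)

# Calibration: among risks estimated at probability p, how often did the event
# actually occur? Well-calibrated estimates make these two numbers match.
buckets = defaultdict(list)
for p, hit in history:
    buckets[p].append(hit)
calibration = {p: sum(hits) / len(hits) for p, hits in buckets.items()}
```

In this invented history, events estimated at 10% occurred 25% of the time, a hint that past planners underestimated, which is exactly the evidence EP-1 is meant to surface.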
Estimating risk event impacts
The impact of a risk event is its effect on business value, often expressed as a numeric value (currency) or a severity level (a number chosen from a discrete list). [Engert 1999] Impact can have multiple dimensions. We can experience impacts on finance, reputation, regulatory compliance, health, safety, security, environment, and more. It's possible for a risk planner to gather data from past projects about the impact values observed along these different axes. What do those data show? Call this question EI-1.
Risk planners who ignore EI-1 take a singular-focused approach. They try to estimate severity (or severities) for each type of risk event they have identified for their particular project. Planners who adopt a distributional approach will use the results of researching EI-1 to develop a risk profile from similar past projects, and use that as a basis for estimating the impact of all risks collectively on the current project.
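A minimal sketch of the distributional EI-1 profile, assuming invented impact figures: each past project's realized impacts are recorded along the dimensions the article names, and the baseline for the current project is the mean along each dimension.

```python
# Hypothetical realized impacts of risk events in similar past projects,
# by dimension, in thousands of dollars. All figures are invented.
past_impacts = [
    {"finance": 120, "reputation": 0,  "compliance": 40},
    {"finance": 0,   "reputation": 80, "compliance": 0},
    {"finance": 60,  "reputation": 20, "compliance": 0},
]

# Distributional baseline: mean impact per project along each dimension,
# and the expected total impact of all risks collectively
dimensions = past_impacts[0].keys()
profile = {d: sum(p[d] for p in past_impacts) / len(past_impacts)
           for d in dimensions}
expected_total = sum(profile.values())
```

The resulting profile is a collective baseline for the current project, rather than a severity guess for each individually identified risk.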

Last words

Researching the five questions IR-1, IR-2, EP-1, EP-2, and EI-1 for each project plan is a significant burden. Fortunately, much of this work is re-usable from project to project. Assembling and maintaining a library of these results can reduce the cost of the research well below the cost of performing it afresh for each project plan. And that can reduce the impact of the risk planning fallacy.

Next issue: Subject Lines for Intra-Team Messages


Footnotes

[Kahneman 1977]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977. Retrieved 19 September 2017.
[Kahneman 1979]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," Management Science 12 (1979), 313-327.
[Engert 1999]
Pamela Engert and Zachary Lansdowne. Risk Matrix User's Guide, The MITRE Corporation, Bedford, Massachusetts, 1999. Retrieved 23 August 2023.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Cognitive Biases at Work:

Seven Planning Pitfalls: III
We usually attribute departures from plan to poor execution, or to "poor planning." But one cause of plan ineffectiveness is the way we think when we set about devising plans. Three cognitive biases that can play roles are the so-called Magical Number 7, the Ambiguity Effect, and the Planning Fallacy.
Risk Acceptance: Naïve Realism
When we suddenly notice a "project-killer" risk that hasn't yet materialized, we sometimes accept the risk even though we know how seriously it threatens the effort. A psychological phenomenon known as naïve realism plays a role in this behavior.
Some Perils of Reverse Scheduling
Especially when time is tight, project sponsors sometimes ask their project managers to produce "reverse schedules." They want to know what would have to be done by when to complete their projects "on time." It's a risky process that produces aggressive schedules.
The Illusion of Explanatory Depth
The illusion of explanatory depth is the tendency of humans to believe they understand something better than they actually do. Discovering the illusion when you're explaining something is worse than embarrassing. It can be career ending.
Clouted Thinking
When we say that people have "clout" we mean that they have more organizational power or social influence than most others do. But when people with clout try to use it in realms beyond those in which they've earned it, trouble looms.

See also Cognitive Biases at Work for more related articles.

Forthcoming issues of Point Lookout

Coming September 4: Beating the Layoffs: I
If you work in an organization likely to conduct layoffs soon, keep in mind that exiting voluntarily before the layoffs can carry significant advantages. Here are some that relate to self-esteem, financial anxiety, and future employment. Available here and by RSS on September 4.
And on September 11: Beating the Layoffs: II
If you work in an organization likely to conduct layoffs soon, keep in mind that exiting voluntarily can carry advantages. Here are some advantages that relate to collegial relationships, future interviews, health, and severance packages. Available here and by RSS on September 11.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks:

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Follow Rick

Send email or subscribe to one of my newsletters. Follow me at LinkedIn. Follow me at X, or share a post. Subscribe to RSS feeds.
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.
My blog, Technical Debt for Policymakers, offers resources, insights, and conversations of interest to policymakers who are concerned with managing technical debt within their organizations. Get the millstone of technical debt off the neck of your organization!
Go For It: Sometimes It's Easier If You Run. Bad boss, long commute, troubling ethical questions, hateful colleague? Learn what we can do when we love the work but not the job.
303 Tips for Virtual and Global Teams. Learn how to make your virtual global team sing.
101 Tips for Managing Change. Are you managing a change effort that faces rampant cynicism, passive non-cooperation, or maybe even outright revolt?
101 Tips for Effective Meetings. Learn how to make meetings more productive — and more rare.
Exchange your "personal trade secrets" — the tips, tricks and techniques that make you an ace — with other aces, anonymously. Visit the Library of Personal Trade Secrets.
If your teams don't yet consistently achieve state-of-the-art teamwork, check out this catalog. Help is just a few clicks/taps away!
Ebooks, booklets and tip books on project management, conflict, writing email, effective meetings and more.