Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 23, Issue 36; September 6, 2023: The Risk Planning Fallacy

The Risk Planning Fallacy


The planning fallacy is a cognitive bias that causes underestimates of cost, time required, and risks for projects. Analogously, I propose a risk planning fallacy that causes underestimates of probabilities and impacts of risk events.
A metaphor for preventing risk propagation. Preventing the falling dominoes from disrupting dominoes that are still standing is a metaphor for preventing risks from propagating from one project to the next. Image by OleksandrPidvalnyi courtesy Pixabay.com.

In a 1977 paper, Kahneman and Tversky identified a cognitive bias that causes project planners to systematically underestimate the execution time, execution costs, and risks associated with their plans. [Kahneman 1977] They called this bias the planning fallacy. By analogy with the bias they found, we can reasonably expect to find a risk planning fallacy that causes risk planners to systematically underestimate the probabilities and impacts of the risks they identify. In an additional twist, the risk planning fallacy causes risk planners to overlook risks in their plans, even though they might easily notice those same risks in the plans of other risk planners.

As Kahneman and Tversky write, "The planning fallacy is a consequence of the tendency to neglect distributional data and to adopt what may be termed an internal approach to prediction, in which one focuses on the constituents of the specific problem rather than on the distribution of outcomes in similar cases." [Kahneman 1979] How does this behavior affect risk planning? It can affect three activities: identifying risks, estimating risk event probabilities, and estimating risk event impacts.

Identifying risks
Trying to identify all risks that could affect a specific project is an example of what Kahneman and Tversky call "focusing on the constituents of the specific problem." By contrast, to take a distributional approach, we would instead ask: in how many projects similar to this one did we encounter risks that weren't anticipated in their risk plans? Call this question IR-1.
In answering IR-1 we must include all cases of past projects in which an unanticipated risk event occurred. But there are other instances of possibly greater interest. For example, with respect to an unanticipated risk event that did occur, we can ask how many past projects could have been affected by that same risk, but escaped unscathed because the risk didn't materialize, even though it could have. Call this question IR-2.
Risk planners who don't ask the two questions IR-1 and IR-2 are vulnerable to omitting risk event types from their plans, and to being unaware that they might be doing so.
Estimating risk event probabilities
In the singular-focused approach to risk planning, planners devise procedures for estimating the probability of risk events for each risk they've identified.
By contrast, in the distributional approach, planners survey past projects and compare the incidence of risk events to the probabilities their planners estimated. How well do the estimated probabilities match the actual incidence of events? (Call this question EP-1.) A related question: how many past risk plans show evidence of measurement of risk event probabilities in the projects that preceded them? (Call this question EP-2.) Failure to measure risk event probabilities calls into question the procedures past risk planners used for devising their estimates.
Risk planners who don't research questions EP-1 and EP-2 are vulnerable to underestimating risk event probabilities, because they're unaware of how likely they are to do so.
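As a sketch of what researching EP-1 might look like, the following Python fragment groups past risk estimates into buckets and compares each bucket's estimated probability with the observed frequency of the corresponding risk events. The history data is invented for illustration.

```python
from collections import defaultdict

# Hypothetical history: each entry pairs the probability a past risk plan
# estimated for a risk event with whether that event actually occurred.
history = [
    (0.1, False), (0.1, False), (0.1, True),  (0.1, False), (0.1, False),
    (0.3, True),  (0.3, False), (0.3, True),  (0.3, False),
    (0.5, True),  (0.5, True),  (0.5, False), (0.5, True),
]

# EP-1: group outcomes by estimated probability, then compare each
# estimate with the observed frequency of occurrence.
buckets = defaultdict(list)
for estimate, occurred in history:
    buckets[estimate].append(occurred)

for estimate in sorted(buckets):
    outcomes = buckets[estimate]
    observed = sum(outcomes) / len(outcomes)
    print(f"estimated {estimate:.0%}: "
          f"observed {observed:.0%} over {len(outcomes)} cases")
```

In this invented history, events estimated at 10% occurred 20% of the time, and events estimated at 30% occurred 50% of the time. That's the pattern of systematic underestimation the distributional approach is designed to reveal.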
Estimating risk event impacts
The impact of a risk event is its effect on business value, often expressed as a numeric value (currency) or a severity level (a number chosen from a discrete list). [Engert 1999] Impact can have multiple dimensions. We can experience impacts on finance, reputation, regulatory compliance, health, safety, security, environment, and more. A risk planner can gather data from past projects about the impact values experienced along these different axes. What were those values? Call this question EI-1.
Risk planners who ignore EI-1 take a singular-focused approach. They try to estimate severity (or severities) for each type of risk event they've identified for their particular project. Planners who adopt a distributional approach instead use the results of researching EI-1 to develop a risk profile from similar past projects, and use that profile as a basis for estimating the collective impact of all risks on the current project.
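One way such a risk profile might be built is sketched below in Python, assuming impact data in currency units gathered from similar past projects along three of the dimensions mentioned above. All figures and dimension names here are invented for illustration.

```python
from statistics import mean, quantiles

# Hypothetical EI-1 data: per-dimension impact values (currency units)
# experienced by five similar past projects.
impacts = {
    "finance":    [12_000, 40_000, 5_000, 90_000, 25_000],
    "reputation": [0, 15_000, 0, 60_000, 8_000],
    "compliance": [0, 0, 30_000, 0, 10_000],
}

# A simple distributional risk profile: per-dimension mean and an
# estimated 90th percentile, plus the total impact per project.
for dimension, values in impacts.items():
    p90 = quantiles(values, n=10)[-1]  # last cut point ≈ 90th percentile
    print(f"{dimension}: mean {mean(values):,.0f}, "
          f"~90th percentile {p90:,.0f}")

# Collective impact across all dimensions, project by project.
totals = [sum(per_project) for per_project in zip(*impacts.values())]
print(f"total impact per project: mean {mean(totals):,.0f}")
```

The point of the profile is the last line: rather than summing per-risk severity guesses for the current project, the planner anchors the estimate of collective impact in what similar projects actually experienced.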

Last words

Researching the five questions IR-1, IR-2, EP-1, EP-2, and EI-1 for each project plan is a significant burden. Fortunately, much of this work is re-usable from project to project. Assembling and maintaining a library of these results can reduce the cost of the research below the cost of performing it anew for each project plan. And that can reduce the impact of the risk planning fallacy.

Next issue: Subject Lines for Intra-Team Messages



References
[Kahneman 1977]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977. Retrieved 19 September 2017.
[Kahneman 1979]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," TIMS Studies in Management Science 12 (1979), 313-327.
[Engert 1999]
Pamela Engert and Zachary Lansdowne. Risk Matrix User's Guide. The MITRE Corporation, Bedford, Massachusetts, 1999. Retrieved 23 August 2023.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.

