Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 23, Issue 36;   September 6, 2023: The Risk Planning Fallacy

The Risk Planning Fallacy

The planning fallacy is a cognitive bias that causes underestimates of cost, time required, and risks for projects. Analogously, I propose a risk planning fallacy that causes underestimates of probabilities and impacts of risk events.
A metaphor for preventing risk propagation. Preventing the falling dominos from disrupting dominos that are still standing is a metaphor for preventing risks from propagating from one project to the next. Image by OleksandrPidvalnyi courtesy Pixabay.com.

In a 1977 paper, Kahneman and Tversky identified a cognitive bias that causes project planners to systematically underestimate the execution time, execution costs, and risks associated with their plans. [Kahneman 1977] They called this bias the planning fallacy. By analogy with the bias they found, we can reasonably expect to find a risk planning fallacy that causes risk planners to systematically underestimate the probabilities and impacts of the risks they identify. In an additional twist, the risk planning fallacy causes risk planners to overlook risks in their plans, even though they might easily notice those same risks in the plans of other risk planners.

As Kahneman and Tversky write, "The planning fallacy is a consequence of the tendency to neglect distributional data and to adopt what may be termed an internal approach to prediction, in which one focuses on the constituents of the specific problem rather than on the distribution of outcomes in similar cases." [Kahneman 1979] We can then ask how this behavior affects risk planning. It affects three activities: identifying risks, estimating risk probabilities, and estimating risk impacts.

Identifying risks
Trying to identify all risks that could affect a specific project is an example of what Kahneman and Tversky call "focusing on the constituents of the specific problem." By contrast, to take a distributional approach, we would instead ask: in how many projects similar to this one did we encounter risks that weren't anticipated in their risk plans? Call this question IR-1.
In answering IR-1 we must include all past projects in which an unanticipated risk event occurred. But other instances are of possibly greater interest. For example, with respect to an unanticipated risk event that did occur, we can ask how many past projects could have been affected by that same risk, but escaped unscathed because the risk didn't materialize, even though it could have. Call this question IR-2.
Risk planners who don't ask the two questions IR-1 and IR-2 are vulnerable to omitting risk event types from their plans, and not being aware that they might be doing so.
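The article doesn't prescribe a procedure for answering IR-1 and IR-2, but the two questions reduce to computing base rates over a collection of past-project records. Here's a minimal sketch; the record structure and field names are hypothetical, invented for illustration.

```python
# Hedged sketch of the distributional questions IR-1 and IR-2.
# The project records and field names below are hypothetical.

past_projects = [
    # Each record notes whether an unanticipated risk event occurred (IR-1),
    # and whether the project was exposed to such a risk that happened
    # not to materialize (IR-2).
    {"name": "A", "unanticipated_event": True,  "exposed_no_event": False},
    {"name": "B", "unanticipated_event": False, "exposed_no_event": True},
    {"name": "C", "unanticipated_event": True,  "exposed_no_event": True},
    {"name": "D", "unanticipated_event": False, "exposed_no_event": False},
]

n = len(past_projects)
ir1_rate = sum(p["unanticipated_event"] for p in past_projects) / n
ir2_rate = sum(p["exposed_no_event"] for p in past_projects) / n

print(f"IR-1: {ir1_rate:.0%} of similar projects met an unanticipated risk")
print(f"IR-2: {ir2_rate:.0%} were exposed to such a risk but escaped")
```

Even this crude tally surfaces the distributional signal: if half of similar past projects met risks their plans never mentioned, the current plan probably omits some too.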
Estimating risk event probabilities
In the singular-focused approach to risk planning, planners devise procedures for estimating the probability of risk events for each risk they've identified.
By contrast, in the distributional approach, planners survey past projects and compare the incidence of risk events to the probabilities their planners estimated. The question to answer is how well the estimated probabilities match the actual incidence of risk events. (Call this question EP-1.) A related question is how many past risk plans show evidence of measurement of risk event probabilities in projects that preceded them. (Call this question EP-2.) Failure to measure risk event probabilities calls into question the procedures past risk planners used to estimate them.
Risk planners who don't research questions EP-1 and EP-2 are vulnerable to underestimating risk event probabilities, because they have no way of knowing how often such underestimates have occurred before.
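One concrete way to research EP-1 is a simple calibration check: group past risks by their estimated probabilities and compare each group's estimates with the frequency at which those risks actually occurred. The sketch below assumes hypothetical records of (estimated probability, outcome) pairs; the band boundaries are arbitrary choices, not anything the source prescribes.

```python
# Hedged sketch of EP-1: compare past planners' estimated probabilities
# with observed outcomes, grouped into coarse probability bands.
# The records and band boundaries below are invented for illustration.

records = [
    # (estimated probability of the risk event, did it actually occur?)
    (0.10, False), (0.10, True), (0.15, False),
    (0.40, True),  (0.45, True), (0.50, False),
    (0.80, True),  (0.85, True), (0.90, True),
]

bands = {"low (<0.33)": [], "mid (0.33-0.66)": [], "high (>0.66)": []}
for est, occurred in records:
    if est < 0.33:
        bands["low (<0.33)"].append(occurred)
    elif est <= 0.66:
        bands["mid (0.33-0.66)"].append(occurred)
    else:
        bands["high (>0.66)"].append(occurred)

for band, outcomes in bands.items():
    observed = sum(outcomes) / len(outcomes)
    print(f"{band}: observed frequency {observed:.2f} over {len(outcomes)} risks")
```

If the observed frequency in a band sits well above the estimates in that band, past planners were systematically underestimating, which is exactly the evidence EP-1 is meant to surface.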
Estimating risk event impacts
The impact of a risk event is its effect on business value, often expressed as a numeric value (currency) or a severity level (a number chosen from a discrete list). [Engert 1999] Impact can have multiple dimensions: we can experience impacts on finance, reputation, regulatory compliance, health, safety, security, environment, and more. A risk planner can ask what impact values past projects experienced along each of these axes. Call this question EI-1.
Risk planners who ignore EI-1 take a singular-focused approach. They try to estimate severity (or severities) for each type of risk event they have identified for their particular project. Planners who adopt a distributional approach instead use the results of researching EI-1 to develop a risk profile from similar past projects, and use that profile as a basis for estimating the collective impact of all risks on the current project.
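A risk profile of the kind just described can be as simple as the average severity experienced along each impact dimension across similar past projects. The sketch below uses hypothetical severity levels (1 = minor through 5 = severe) on three of the dimensions named above; the data and scale are invented, not drawn from any real project archive.

```python
# Hedged sketch of EI-1: build a rough impact profile from past projects.
# Impact dimensions and severity values (1 = minor .. 5 = severe) are
# hypothetical, chosen only to illustrate the distributional approach.

past_impacts = [
    {"finance": 3, "reputation": 2, "compliance": 1},
    {"finance": 4, "reputation": 1, "compliance": 2},
    {"finance": 2, "reputation": 3, "compliance": 1},
]

dimensions = past_impacts[0].keys()
profile = {d: sum(p[d] for p in past_impacts) / len(past_impacts)
           for d in dimensions}

for dim, avg in profile.items():
    print(f"{dim}: mean severity {avg:.1f}")
```

The resulting profile estimates the impact of all risks collectively, per dimension, rather than severity risk-by-risk, which is the shift from the singular-focused to the distributional approach.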

Last words

Researching the five questions IR-1, IR-2, EP-1, EP-2, and EI-1 for each project plan is a significant burden. Fortunately, much of this work is re-usable from project to project. Assembling and maintaining a library of these results can reduce the cost of this research below the cost of performing it anew for each project plan. And that can reduce the impact of the risk planning fallacy.

Next issue: Subject Lines for Intra-Team Messages


Footnotes

[Kahneman 1977]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977.
[Kahneman 1979]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," Management Science 12 (1979), 313-327.
[Engert 1999]
Pamela Engert and Zachary Lansdowne. Risk Matrix User's Guide. The MITRE Corporation, Bedford, Massachusetts, 1999.


About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.


Related articles

More articles on Cognitive Biases at Work:

Confirmation Bias: Workplace Consequences Part II
We continue our exploration of confirmation bias. In this Part II, we explore its effects in management processes.
Wishful Significance: II
When we're beset by seemingly unresolvable problems, we sometimes conclude that "wishful thinking" was the cause. Wishful thinking can result from errors in assessing the significance of our observations. Here's a second group of causes of erroneous assessment of significance.
The Ultimate Attribution Error at Work
When we attribute the behavior of members of groups to some cause, either personal or situational, we tend to make systematic errors. Those errors can be expensive and avoidable.
Bullet Point Madness: II
Decision makers in many organizations commonly demand briefings in the form of a series of bullet points or a series of series of bullet points. Briefers who combine this format with a variety of persuasion techniques can mislead decision makers, guiding them into making poor decisions.
Confirmation Bias and Myside Bias
Although we regard ourselves as rational, a well-established body of knowledge shows that rationality plays a less-than-central role in our decision-making process. Confirmation Bias and Myside Bias are two cognitive biases that influence our decisions.

See also Cognitive Biases at Work and Project Management for more related articles.

Forthcoming issues of Point Lookout

Coming February 28: Checklists: Conventional or Auditable
Checklists help us remember the steps of complex procedures, and the order in which we must execute them. The simplest form is the conventional checklist. But when we need a record of what we've done, we need an auditable checklist.
And on March 6: Six More Insights About Workplace Bullying
Some of the lore about dealing with bullies at work isn't just wrong; it's harmful. It's harmful in the sense that applying it intensifies the bullying. Here are six insights that might help when devising strategies for dealing with bullies at work. Example: Letting yourself be bullied is not a thing.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

