Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 23, Issue 33; August 16, 2023: Lessons Not Learned: I

Lessons Not Learned: I

by Rick Brenner

The planning fallacy is a cognitive bias that causes us to underestimate the cost and effort involved in projects large and small. Mitigating its effects requires understanding how we go wrong when we plan projects by referencing our own past experience.
Opera house, Sydney, Australia. Originally estimated in 1957 to be completed in six years for a cost of $7M. It was finally opened after 16 years at a cost of $102M. [Hall 1980] Image by Natasa Pavic courtesy Pixabay.

Some call them retrospectives, some call them post mortems, some call them lessons-learned meetings. By whatever name, the goal is to identify what worked well, so we can do more of it, and what worked not so well, so we can do less of that. And although we do benefit from these efforts, the result — it's fair to say — is most often less than impressive.

A lot less than impressive. We know this because our projects still take much longer than we thought they would and cost a lot more than we thought they should. A reasonable question arises: If we underestimated how long it would take (or how much it would cost) to do a project like this in the past, why is it so difficult to avoid underestimating how long it will take (or how much it will cost) to do this project now?

If we have access to all the data from past projects, why is that not enough information to help us repeat past successes and avoid repeating past mistakes?

The planning fallacy

In 1977 and again in 1979, Kahneman and Tversky provided the answer: a cognitive bias they called the planning fallacy. [Kahneman 1977] [Kahneman 1979] They observed that planners have available two categories of data when they develop their plans: "The singular information describes the specific features of the problem that distinguish it from others, while the distributional information characterizes the outcomes that have been observed in cases of the same general class."

They identified the planning fallacy as "a consequence of the tendency to neglect distributional data, and to adopt what may be termed an 'internal approach' to prediction, where one focuses on the constituents of the specific problem rather than on the distribution of outcomes in similar cases."

But "neglecting" If we have access to all the data
from past projects, why is that
information not enough to help us
repeat past successes and avoid
repeating past mistakes?
distributional data is only one way the planning fallacy can lead to faulty plans. Ineffective or faulty incorporation of distributional data can present threats too. And it is at that point that additional cognitive biases can play a role in the planning fallacy.
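
To make the distinction concrete, here's a minimal sketch in Python of the kind of corrective procedure Kahneman and Tversky's distinction suggests: temper an inside-view estimate with the distribution of overrun ratios observed in a reference class of similar past projects. The function and all the numbers are illustrative assumptions, not data from any real portfolio.

    # A minimal sketch of one way to use distributional data: temper an
    # inside-view estimate with the distribution of overrun ratios
    # observed in a reference class of similar past projects. All
    # numbers below are hypothetical.

    from statistics import median, quantiles

    def reference_class_adjust(inside_estimate, past_planned, past_actual):
        """Scale an inside-view estimate by historical overrun ratios."""
        overruns = [actual / planned
                    for planned, actual in zip(past_planned, past_actual)]
        p25, _, p75 = quantiles(overruns, n=4)  # spread of past outcomes
        return {
            "median_forecast": inside_estimate * median(overruns),
            "likely_range": (inside_estimate * p25, inside_estimate * p75),
        }

    # Hypothetical reference class: planned vs. actual costs ($M) of
    # comparable past projects, Sydney Opera House-scale surprises included.
    planned = [7, 12, 30, 9]
    actual = [102, 15, 41, 18]

    print(reference_class_adjust(10, planned, actual))

The point isn't the arithmetic. It's that the forecast now reflects what actually happened in similar cases, not only the features of the case at hand.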

In the almost five decades since the work of Kahneman and Tversky, researchers have identified hundreds of cognitive biases. With this work in mind, it's useful to consider how some of the cognitive biases identified more recently can contribute to the planning fallacy's tendency to undervalue distributional data. In this post and the next, I suggest how five of these cognitive biases can contribute to the planning fallacy in the special case of conducting a retrospective study of a past project, focusing on the project's failings. I begin with the Fundamental Attribution Error and Choice-Supportive Bias.

The Fundamental Attribution Error

The Fundamental Attribution Error is the tendency to explain the behavior of others with too much emphasis on their dispositions or character traits, and too little emphasis on the context in which they worked, or the actions of third parties.

So when we devise explanations for a project's disappointing past performance, we tend to attribute too much of the cause to the dispositions or character traits of project team members, and too little to other factors, such as the actions of people outside the team, the organization's management, general market conditions, or the state of the world's knowledge of pertinent subject matter.

But it gets worse. In some organizations, there is a taboo associated with critiquing the work of others. Because of the Fundamental Attribution Error, we tend to formulate explanations for failures that focus on the character of a past project's team members. This sets us up for a violation of the criticism taboo. In this way, the taboo and the Fundamental Attribution Error conspire to severely restrict examinations of failures. Under these conditions, harvesting much of value from a retrospective can be challenging.

Choice-supportive bias

Choice-supportive bias affects our ability to assess the fitness of our own past decisions. It inclines us to assess positively the options we chose, and negatively the options we rejected. Because of this bias, we tend to conclude that our decisions to adopt or reject various options were in every instance correct.

To some degree, this bias enables us to "rewrite history" for decision processes. That's one reason why mitigating the effects of choice-supportive bias is of special interest to organizations that have recognized the need to monitor and continuously improve the quality of the results of their decision-making processes.

And mitigating the effects of choice-supportive bias is no less important for planners. Choice-supportive bias distorts memories to make the choices we made in the past appear to be the best that we could have made. In project retrospectives, this bias has greatest effect when the participants are assessing their own team's performance. Presumably, the effects of this bias are less significant when the assessors played no role in the project being assessed.

For planners whose focus is exposing opportunities for improvement over their own past performance, choice-supportive bias can be a source of confusion, because it causes us to generate "data" that is essentially fictitious. Because valid conclusions about the fitness of past decisions must be founded on facts, choice-supportive bias contaminates investigations.
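
One plausible mitigation, offered here as a sketch of an idea rather than a prescription from the research cited above, is to rely on records instead of recollections: capture the options considered, the choice made, and the expected outcome at decision time, so that a later retrospective compares expectations against facts rather than against choice-supportive memories. In Python, such a decision journal might look like the following; the record fields and the example decision are assumptions for illustration.

    # A minimal sketch of a decision journal. Recording options, choices,
    # and expected outcomes at decision time lets a retrospective compare
    # records against facts, rather than recollections against facts.

    from dataclasses import dataclass
    from datetime import date
    from typing import List, Optional

    @dataclass
    class DecisionRecord:
        decided_on: date
        question: str
        options_considered: List[str]
        chosen: str
        expected_outcome: str                 # written before results are known
        actual_outcome: Optional[str] = None  # filled in at the retrospective

    # Hypothetical example decision
    journal: List[DecisionRecord] = [
        DecisionRecord(
            decided_on=date(2023, 3, 1),
            question="Build the parser in-house or license one?",
            options_considered=["build in-house", "license"],
            chosen="build in-house",
            expected_outcome="working parser in 6 weeks with one engineer",
        )
    ]

    # At the retrospective, consult the record, not the recollection.
    journal[0].actual_outcome = "working parser in 14 weeks with two engineers"
    for rec in journal:
        print(f"{rec.chosen}: expected {rec.expected_outcome}; "
              f"actual {rec.actual_outcome}")

The design choice that matters is writing the expected outcome down before the results are known. That's what denies choice-supportive bias the chance to rewrite history.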

Last words

The arguments above are essentially plausible speculations. But they do suggest points in the estimation process where planners must be especially careful. Next week, I'll explore the effects of three more cognitive biases: Confirmation Bias, the Overconfidence Effect, and Optimism Bias.

Next issue: Lessons Not Learned: II

Great Teams Workshop. Occasionally we have the experience of belonging to a great team. Thrilling as it is, the experience is rare. In part, it's rare because we usually strive only for adequacy, not for greatness. We do this because we don't fully appreciate the returns on greatness. Not only does it feel good to be part of a great team — it pays off. Check out my Great Teams Workshop to lead your team onto the path toward greatness. More info

Footnotes

Comprehensive list of all citations from all editions of Point Lookout
[Hall 1980]
Peter Hall. Great planning disasters: With a new introduction, Vol. 1, University of California Press, 1982. Order from Amazon.com.
[Kahneman 1977]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977. Available here. Retrieved 19 September 2017.
[Kahneman 1979]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," Management Science 12 (1979), 313-327.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Project Management:

Shining Some Light on "Going Dark"
If you're a project manager, and a team member "goes dark" — disappears or refuses to report how things are going — project risks escalate dramatically. Getting current status becomes a top priority problem. What can you do?
Personnel-Sensitive Risks: II
Personnel-sensitive risks are risks that are difficult to discuss openly. Open discussion could infringe on someone's privacy, or lead to hurt feelings, or to toxic politics or toxic conflict. If we can't discuss them openly, how can we deal with them?
On the Risk of Undetected Issues: II
When things go wrong and remain undetected, trouble looms. We continue our efforts, increasing investment on a path that possibly leads nowhere. Worse, time — that irreplaceable asset — passes. How can we improve our ability to detect undetected issues?
Risk Creep: II
When risk events occur, and they're of a kind we never considered before, it's possible that we've somehow invited those risks without realizing we have. This is one way for risk to creep into our efforts. Here's Part II of an exploration of risk creep.
The Risk Planning Fallacy
The planning fallacy is a cognitive bias that causes underestimates of cost, time required, and risks for projects. Analogously, I propose a risk planning fallacy that causes underestimates of probabilities and impacts of risk events.

See also Project Management for more related articles.

Forthcoming issues of Point Lookout

Coming December 11: White Water Rafting as a Metaphor for Group Development
Tuckman's model of small group development, best known as "Forming-Storming-Norming-Performing," applies better to development of some groups than to others. We can use a metaphor to explore how the model applies to Storming in task-oriented work groups. Available here and by RSS on December 11.
And on December 18: Subgrouping and Conway's Law
When task-oriented work groups address complex tasks, they might form subgroups to address subtasks. The structure of the subgroups and the order in which they form depend on the structure of the group's task and the sequencing of the subtasks. Available here and by RSS on December 18.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks.

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Follow Rick

Send email or subscribe to one of my newsletters. Follow me at LinkedIn. Follow me at X, or share a post. Subscribe to RSS feeds.
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.
Technical Debt for Policymakers Blog. My blog, Technical Debt for Policymakers, offers resources, insights, and conversations of interest to policymakers who are concerned with managing technical debt within their organizations. Get the millstone of technical debt off the neck of your organization!
Go For It: Sometimes It's Easier If You Run. Bad boss, long commute, troubling ethical questions, hateful colleague? Learn what we can do when we love the work but not the job.
303 Tips for Virtual and Global Teams. Learn how to make your virtual global team sing.
101 Tips for Managing Change. Are you managing a change effort that faces rampant cynicism, passive non-cooperation, or maybe even outright revolt?
101 Tips for Effective Meetings. Learn how to make meetings more productive — and more rare.
Exchange your "personal trade secrets" — the tips, tricks and techniques that make you an ace — with other aces, anonymously. Visit the Library of Personal Trade Secrets.
If your teams don't yet consistently achieve state-of-the-art teamwork, check out this catalog. Help is just a few clicks/taps away!
Ebooks, booklets and tip books on project management, conflict, writing email, effective meetings and more.