Some call them retrospectives, some call them post mortems, some call them lessons-learned meetings. By whatever name, the goal is to identify what worked well, so we can do more of it, and what worked not so well, so we can do less of that. And although we do benefit from these efforts, the result — it's fair to say — is most often less than impressive.
A lot less than impressive. We know this because our projects still take much longer than we thought they would and cost a lot more than we thought they should. A reasonable question arises: If we underestimated how long it would take (or how much it would cost) to do a project like this in the past, why is it so difficult to avoid underestimating how long it will take (or how much it will cost) to do this project now?
If we have access to all the data from past projects, why is that not enough information to help us repeat past successes and avoid repeating past mistakes?
The planning fallacy
In 1977 and again in 1979, Kahneman and Tversky provided the answer: a cognitive bias they called the planning fallacy. [Kahneman 1977] [Kahneman 1979] They observed that planners have available two categories of data when they develop their plans: "The singular information describes the specific features of the problem that distinguish it from others, while the distributional information characterizes the outcomes that have been observed in cases of the same general class."
They identified the planning fallacy as "a consequence of the tendency to neglect distributional data, and to adopt what may be termed an 'internal approach' to prediction, where one focuses on the constituents of the specific problem rather than on the distribution of outcomes in similar cases."
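To make the distinction concrete, here is a minimal sketch of how distributional data can temper a purely "internal" estimate, in the spirit of what is now often called reference class forecasting. The numbers and the adjustment rule are hypothetical, invented only to illustrate the idea.

```python
from statistics import median

# Singular information: the bottom-up, inside-view estimate for this project.
# (Hypothetical figure, for illustration only.)
inside_view_estimate_weeks = 20

# Distributional information: how similar past projects actually turned out,
# expressed as the ratio of actual duration to estimated duration.
# (Hypothetical reference class.)
past_overrun_ratios = [1.4, 1.1, 1.9, 1.3, 1.6]

# The planning fallacy: report the inside view and neglect the distribution.
print(f"Inside view only: {inside_view_estimate_weeks} weeks")

# The outside view: temper the singular estimate with the distribution of
# outcomes observed in cases of the same general class.
typical_overrun = median(past_overrun_ratios)
outside_view_estimate_weeks = inside_view_estimate_weeks * typical_overrun
print(f"Adjusted with distributional data: {outside_view_estimate_weeks:.0f} weeks")
```

The point is not the particular adjustment rule; it's that the second estimate uses information the first one ignores.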
But "neglecting" If we have access to all the data
from past projects, why is that
information not enough to help us
repeat past successes and avoid
repeating past mistakes?distributional data is only one way the planning fallacy can lead to faulty plans. Ineffective or faulty incorporation of distributional data can present threats too. And it is at that point that additional cognitive biases can play a role in the planning fallacy.
In the almost five decades since the work of Kahneman and Tversky, researchers have identified hundreds of cognitive biases. With this work in mind, it's useful to consider how some of the cognitive biases identified more recently can contribute to the planning fallacy's tendency to undervalue distributional data. In this post and the next, I suggest how five of these cognitive biases can contribute to the planning fallacy in the special case of conducting a retrospective study of a past project, focusing on the project's failings. I begin with the Fundamental Attribution Error and Choice-Supportive Bias.
The Fundamental Attribution Error
The Fundamental Attribution Error is the tendency to explain the behavior of others with too much emphasis on their dispositions or character traits, and too little emphasis on the context in which they worked, or the actions of third parties.
So when we devise explanations for a past project's disappointing performance, we tend to attribute too much of the cause to the dispositions or character traits of project team members, and too little to other factors, such as the actions of people outside the team, the organization's management, general market conditions, or the state of the world's knowledge of pertinent subject matter.
But it gets worse. In some organizations, there is a taboo associated with critiquing the work of others. Because of the Fundamental Attribution Error, we tend to formulate explanations for failures that focus on the character of a past project's team members. This sets us up for a violation of the criticism taboo. In this way, the taboo and the Fundamental Attribution Error conspire to severely restrict examinations of failures. Under these conditions, harvesting much of value from a retrospective can be challenging.
Choice-supportive bias
Choice-supportive bias affects our ability to assess the fitness of our own past decisions. It inclines us to assess positively the options we chose, and negatively the options we rejected. Because of this bias, we tend to conclude that our decisions to adopt or reject various options were in every instance correct.
To some degree, this bias enables us to "rewrite history" for decision processes. That's one reason why mitigating the effects of choice-supportive bias is of special interest to organizations that have recognized the need to monitor and continuously improve the quality of the results of their decision-making processes.
And mitigating the effects of choice-supportive bias is no less important for planners. Choice-supportive bias distorts memories to make the choices we made in the past appear to be the best we could have made. In project retrospectives, this bias has its greatest effect when the participants are assessing their own team's performance. Presumably, its effects are less significant when the assessors played no role in the project being assessed.
For planners whose focus is exposing opportunities for improvement over their own past performance, choice-supportive bias can be a source of confusion, because it causes us to generate "data" that is essentially fictitious. Because valid conclusions about the fitness of past decisions must be founded on facts, choice-supportive bias contaminates investigations.
Last words
The arguments above are essentially plausible speculations. But they do suggest points in the estimation process where planners must be especially careful. Next week, I'll explore the effects of three more cognitive biases: Confirmation Bias, the Overconfidence Effect, and Optimism Bias.
Footnotes
[Kahneman 1977] Kahneman, D., and A. Tversky. "Intuitive Prediction: Biases and Corrective Procedures." Technical report, Decision Research, Eugene, Oregon, 1977.
[Kahneman 1979] Kahneman, D., and A. Tversky. "Intuitive prediction: Biases and corrective procedures." TIMS Studies in Management Science 12 (1979): 313-327.
Related articles
More articles on Project Management:
- Shining Some Light on "Going Dark"
- If you're a project manager, and a team member "goes dark" — disappears or refuses to report how things are going — project risks escalate dramatically. Getting current status becomes a top priority problem. What can you do?
- Personnel-Sensitive Risks: II
- Personnel-sensitive risks are risks that are difficult to discuss openly. Open discussion could infringe on someone's privacy, or lead to hurt feelings, or to toxic politics or toxic conflict. If we can't discuss them openly, how can we deal with them?
- On the Risk of Undetected Issues: II
- When things go wrong and remain undetected, trouble looms. We continue our efforts, increasing investment on a path that possibly leads nowhere. Worse, time — that irreplaceable asset — passes. How can we improve our ability to detect undetected issues?
- Risk Creep: II
- When risk events occur, and they're of a kind we never considered before, it's possible that we've somehow invited those risks without realizing we have. This is one way for risk to creep into our efforts. Here's Part II of an exploration of risk creep.
- The Risk Planning Fallacy
- The planning fallacy is a cognitive bias that causes underestimates of cost, time required, and risks for projects. Analogously, I propose a risk planning fallacy that causes underestimates of probabilities and impacts of risk events.
See also Project Management for more related articles.
Forthcoming issues of Point Lookout
- Coming December 11: White Water Rafting as a Metaphor for Group Development
- Tuckman's model of small group development, best known as "Forming-Storming-Norming-Performing," applies better to development of some groups than to others. We can use a metaphor to explore how the model applies to Storming in task-oriented work groups. Available here and by RSS on December 11.
- And on December 18: Subgrouping and Conway's Law
- When task-oriented work groups address complex tasks, they might form subgroups to address subtasks. The structure of the subgroups and the order in which they form depend on the structure of the group's task and the sequencing of the subtasks. Available here and by RSS on December 18.