Some call them retrospectives, some call them post mortems, some call them lessons-learned meetings. By whatever name, the goal is to identify what worked well, so we can do more of it, and what worked not so well, so we can do less of that. And although we do benefit from these efforts, the result — it's fair to say — is most often less than impressive.
A lot less than impressive. We know this because our projects still take much longer than we thought they would and cost a lot more than we thought they should. A reasonable question arises: If we underestimated how long it would take (or how much it would cost) to do a project like this in the past, why is it so difficult to avoid underestimating how long it will take (or how much it will cost) to do this project now?
If we have access to all the data from past projects, why is that not enough information to help us repeat past successes and avoid repeating past mistakes?
The planning fallacy
In 1977 and again in 1979, Kahneman and Tversky provided the answer: a cognitive bias they called the planning fallacy. [Kahneman 1977] [Kahneman 1979] They observed that planners have available two categories of data when they develop their plans: "The singular information describes the specific features of the problem that distinguish it from others, while the distributional information characterizes the outcomes that have been observed in cases of the same general class."
They identified the planning fallacy as "a consequence of the tendency to neglect distributional data, and to adopt what may be termed an 'internal approach' to prediction, where one focuses on the constituents of the specific problem rather than on the distribution of outcomes in similar cases."
But "neglecting" distributional data is only one way the planning fallacy can lead to faulty plans. Ineffective or faulty incorporation of distributional data can present threats too. And it is at that point that additional cognitive biases can play a role in the planning fallacy.
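To make the distinction concrete, here is a minimal sketch of what incorporating distributional data might look like. The data, function, and adjustment rule are my illustration of a simple form of reference-class adjustment, not anything prescribed by Kahneman and Tversky:

```python
# Illustrative sketch (hypothetical data): tempering a "singular"
# inside-view estimate with "distributional" data from past projects.
from statistics import median

# Hypothetical history: ratio of actual duration to estimated duration
# for completed projects of the same general class.
past_overrun_ratios = [1.4, 1.8, 1.1, 2.0, 1.5]

def reference_class_adjust(singular_estimate_weeks, past_ratios):
    """Scale a new inside-view estimate by the median historical overrun."""
    return singular_estimate_weeks * median(past_ratios)

# A 10-week inside-view estimate, adjusted by the distributional record:
print(reference_class_adjust(10, past_overrun_ratios))  # prints 15.0
```

The point of the sketch is only that the adjustment comes from the outcomes of similar past cases, not from the features of the project at hand.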
In the almost five decades since the work of Kahneman and Tversky, researchers have identified hundreds of cognitive biases. With this work in mind, it's useful to consider how some of the cognitive biases identified more recently can contribute to the planning fallacy's tendency to undervalue distributional data. In this post and the next, I suggest how five of these cognitive biases can contribute to the planning fallacy in the special case of conducting a retrospective study of a past project, focusing on the project's failings. I begin with the Fundamental Attribution Error and Choice-Supportive Bias.
The Fundamental Attribution Error
The Fundamental Attribution Error is the tendency to explain the behavior of others with too much emphasis on their dispositions or character traits, and too little emphasis on the context in which they worked, or the actions of third parties.
So when we devise explanations for past disappointing performance of a project, we tend to attribute too much of the cause to the dispositions or character traits of project team members, and too little to other factors, such as the actions of people outside the team, or the organization's management, or general market conditions, or the state of the world's knowledge of pertinent subject matter.
But it gets worse. In some organizations, there is a taboo associated with critiquing the work of others. Because of the Fundamental Attribution Error, we tend to formulate explanations for failures that focus on the character of a past project's team members. This sets us up for a violation of the criticism taboo. In this way, the taboo and the Fundamental Attribution Error conspire to severely restrict examinations of failures. Under these conditions, harvesting much of value from a retrospective can be challenging.
Choice-Supportive Bias
Choice-supportive bias affects our ability to assess the fitness of our own past decisions. It causes us to assess positively the options we chose, and to assess negatively the options we rejected. Because of this bias, we tend to conclude that our decisions to adopt or reject various options were in every instance correct.
To some degree, this bias enables us to "rewrite history" for decision processes. That's one reason why mitigating the effects of choice-supportive bias is of special interest to organizations that have recognized the need to monitor and continuously improve the quality of the results of their decision-making processes.
And mitigating the effects of choice-supportive bias is no less important for planners. Choice-supportive bias distorts memories to make the choices we made in the past appear to be the best that we could have made. In project retrospectives, this bias has greatest effect when the participants are assessing their own team's performance. Presumably, the effects of this bias are less significant when the assessors played no role in the project being assessed.
For planners whose focus is exposing opportunities for improvement over their own past performance, choice-supportive bias can be a source of confusion, because it causes us to generate "data" that is essentially fictitious. Because valid conclusions about the fitness of past decisions must be founded on facts, choice-supportive bias contaminates investigations.
The arguments above are essentially plausible speculations. But they do suggest points in the estimation process where planners must be especially careful. Next week, I'll explore the effects of three more cognitive biases: Confirmation Bias, the Overconfidence Effect, and Optimism Bias.
Occasionally we have the experience of belonging to a great team. Thrilling as it is, the experience is rare. In part, it's rare because we usually strive only for adequacy, not for greatness. We do this because we don't fully appreciate the returns on greatness. Not only does it feel good to be part of a great team — it pays off. Check out my Great Teams Workshop to lead your team onto the path toward greatness.
Your comments are welcome. Would you like to see your comments posted here? Send me your comments by email, or by Web form.
About Point Lookout
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
More articles on Project Management:
- Make a Project Family Album
- Like a traditional family album, a project family album has pictures of people, places, and events.
It builds connections, helps tie the team together, and it can be as much fun to look through as it
is to create.
- Who Would You Take With You to Mars?
- What makes a great team? What traits do you value in teammates? Project teams can learn a lot from the
latest thinking about designing teams for extended space exploration.
- Risk Management Risk: II
- Risk Management Risk is the risk that a particular risk management plan is deficient. Here are some
guidelines for reducing risk management risk arising from risk interactions and change.
- Mitigating Risk Resistance Risk
- Project managers are responsible for managing risks, but they're often stymied by insufficient resources.
Here's a proposal for making risk management more effective at an organizational scale.
- The Planning Fallacy and Self-Interest
- A well-known cognitive bias, the planning fallacy, accounts for many unrealistic estimates of project
cost and schedule. Overruns are common. But another cognitive bias, and organizational politics, combine
with the planning fallacy to make a bad situation even worse.
Forthcoming issues of Point Lookout
- Coming September 27: On Working Breaks in Meetings
- When we convene a meeting to work a problem, we sometimes find that progress is stalled. Taking a break to allow a subgroup to work part of the problem can be key to finding simple, elegant solutions rapidly. Choosing the subgroup is only the first step. Available here and by RSS on September 27.
- And on October 4: Self-Importance and Conversational Narcissism at Work: I
- Conversational narcissism is a set of behaviors that participants use to focus the exchange on their own self-interest rather than the shared objective. This post emphasizes the role of these behaviors in advancing a narcissist's sense of self-importance. Available here and by RSS on October 4.
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site.