Plans, as the saying goes, never survive first contact with Reality. As strong a statement as this is, it might be an understatement, because the first contact between plans and Reality occurs in their conception. That is, they come into contact with Reality while they're still being developed. The people who develop plans are people. And as people, they're vulnerable to a range of cognitive biases that affect their ability to make workable plans. For many plans, workability dies in conception.
Some of these biases are specific to groups. Three of these are False Consensus, Groupthink, and Shared Information Bias. They have much in common, in that they all work in different ways to limit the group's access to diversity of perspective. That limitation leads groups to develop plans that focus too much attention on some things, and not enough on others.
- False consensus
- False consensus is a phenomenon defined relative to a group and its circumstances. [Ross 1977] The False Consensus Effect is at work in a group when, with respect to the group's circumstances, members of the group assume that their own views, behaviors, and feelings are relatively common in the group. That is, the group members believe — incorrectly and often without evidence — that the group is in general agreement with respect to the issues at hand.
- The consequences of false consensus for plan development can be costly and severe. For example, consider a scenario in which one member of the planning team (I'll call him Mike) considers a particular approach, Approach A, workable for part of the plan. Approach A is technically sound, but it would take too long to execute. Mike likes Approach A, and doesn't know enough about it to realize that it would take too long. In a series of discussions about this part of the plan, because no member of the team has said anything about schedule, all participants come to believe that Approach A is the leading candidate for this part of the plan. Other elements of the plan are then developed on the assumption that Approach A will be selected. Only late in the process do the deficiencies of Approach A become evident, and much of the plan must be reworked. This could have been avoided if Mike had said, earlier in the process, "I like Approach A, but I don't know much about how long it would take, so I'm relying on the rest of you to assess whether it fits into our schedule."
- Planning teams wishing to avoid false consensus would do well to openly inventory their assumptions periodically.
- Groupthink
- Groupthink is a pattern of group behavior that leads to a group's adopting a position or undertaking a project that conflicts with the group's stated objectives or values. [Janis 1982] Although groupthink is widely oversimplified as "premature unanimity," it actually has a number of components that contribute to the problem.
- Among groupthink's critical elements is a high degree of group cohesiveness, which exposes the group to risk by limiting diversity of perspective, by limiting members' ability to offer alternative perspectives, and by limiting their receptivity to the perspectives others offer. A second element, insulation from external perspectives, also limits the group's exposure to alternative views of the problems it addresses. A third element, biased and closed leadership, can prevent the group from accessing diverse perspectives that might be present within the group, or which might come to the attention of some group members, or which some members might recall from past experiences. A fourth element, lack of diversity in the social backgrounds of group members, further limits the group's access to alternative perspectives.
- These factors, and others, have more striking effects when the group must grapple with complex, unfamiliar problems under extreme time pressure, as often happens when planning complex projects. The probability of this problem occurring is elevated during the re-planning that occurs in response to unanticipated difficulties. The result is similar to premature unanimity, but a more fitting description might be incongruent unanimity — unanimity that doesn't fit the situation.
- Mitigating groupthink requires opening the group's social system to diverse perspectives, life experiences, agendas, and professions. In the planning context, one way to accomplish this is to ensure involvement of all stakeholders in the planning process.
- Shared information bias
- Shared information bias is the tendency of groups to spend time and energy discussing information that most group members already know. [Forsyth 2010] Groups seem to prefer such discussions to discussions of information held by only a few members. Consequently, they have less time and energy to devote to that less widely held information. See "Effects of Shared Information Bias: I," Point Lookout for December 5, 2018, for more.
- In the planning context, shared information bias can lead to plans that are complete and thorough with respect to some sets of issues, and inadequate with respect to others. Moreover, when misadventures do occur, they tend to occur in areas in which the planning team lacks the depth and breadth of knowledge needed to support effective and timely re-planning efforts.
- Subjecting plans to thorough review is an effective mitigation for risk of shared information bias, but only to the extent that the reviewers specifically seek the unevenness and imbalances that are the hallmarks of shared information bias.
Just as biodiversity brings stability to biological systems, diversity of perspectives provides a sound foundation for planning efforts. How diverse is your planning team?
More articles on Cognitive Biases at Work:
- Effects of Shared Information Bias: I
- Shared information bias is the tendency for group discussions to emphasize what everyone already knows. It's widely believed to lead to bad decisions. But it can do much more damage than that.
- Seven Planning Pitfalls: II
- Plans are well known for working out differently from what we intended. Sometimes, the unintended outcome is due to external factors over which the planning team has little control. Two examples are priming effects and widely held but inapplicable beliefs.
- Seven Planning Pitfalls: III
- We usually attribute departures from plan to poor execution, or to "poor planning." But one cause of plan ineffectiveness is the way we think when we set about devising plans. Three cognitive biases that can play roles are the so-called Magical Number 7, the Ambiguity Effect, and the Planning Fallacy.
- Seven More Planning Pitfalls: III
- Planning teams, like all teams, are vulnerable to several patterns of interaction that can lead to counter-productive results. Two of these relevant to planners are a cognitive bias called the IKEA Effect, and a systemic bias against realistic estimates of cost and schedule.
- Some Perils of Reverse Scheduling
- Especially when time is tight, project sponsors sometimes ask their project managers to produce "reverse schedules." They want to know what would have to be done by when to complete their projects "on time." It's a risky process that produces aggressive schedules.