In two previous posts, I examined four reasons why planning complex projects is so difficult. In the first of these two posts, I explored two limits: our ability to imagine what we're planning, and our access to knowledge of past failures. In the second post, I considered external influences. One set of external influences is associated with a group of cognitive biases known as priming effects. A second set includes fads, dogma, regulations, and traditions.
Continuing this exploration of difficulties encountered in planning complex projects, I turn now to internal causes of trouble, namely, patterns in the way we think when we're constructing plans.
- Magical number 7
- In one of the most-cited papers in the entire psychology literature, George Miller reported experimental results regarding numerical limits to human cognition. [Miller 1956] There are limits to the number of "chunks" people can hold in short-term memory. And there are limits to our ability to distinguish among a defined set of stimuli so as to choose among a defined set of associated responses. These limits, coincidentally, appear to lie in the range of 7±2 items.
- The effects of these limits on planning activities are both subtle and profound. For example, a widely believed maxim in the presentation architecture community is known as the 6x6 rule. It states that presentation slides should contain no more than six bullet points of not more than six words each. Although it is motivated by Miller's observations, experimental confirmation of its value has been elusive.
- However, the limits Miller described are real, and they can affect planning for complex projects with more than 7±2 parallel work streams. For example, a project with 19 parallel streams of tasks would tend to become unwieldy, because project managers might not be able to keep the attributes of so many work streams in mind at once.
- The usual approach for dealing with this is to "chunk" the effort into pieces that are more manageable in number, and then, if necessary, to chunk each piece in turn. This approach is usually called analysis and synthesis; a minimal sketch of it appears after this list.
- Analysis and synthesis can be problematic, because the chunks sometimes interact with each other in unexpected ways, outside the descriptions we use for the synthesis. Some of what we attribute to "poor planning" is perhaps better regarded as an inevitable result of how humans think.
- Ambiguity effect
- The ambiguity effect is a cognitive bias that affects how we make decisions under uncertainty. [Ellsberg 1961] When choosing among options that have favorable outcomes, we tend to favor those options for which the outcome is more certain, even if less favorable. And we tend to avoid options for which the probability of a given favorable outcome is unknown, even if all possible outcomes of that option are favorable.
- When devising plans for projects, the ambiguity effect can be costly indeed. For example, when considering a novel approach that offers great savings in cost and schedule, we might compare it unfavorably to a more familiar approach that's slower and more costly. The ambiguity effect causes us to favor the conventional approach more than might be justified by the uncertainties of using an unconventional approach for the first time.
- Mitigating the ambiguity effect requires careful estimation and objective computation of expected outcomes; see the expected-value sketch after this list.
- The planning fallacy
- In a 1977 report, Daniel Kahneman and Amos Tversky identified a particular cognitive bias, the planning fallacy, which afflicts planners. [Kahneman 1977] [Kahneman 1979] They distinguished two types of information used by planners. Singular information is specific to the project at hand; distributional information is drawn from similar past efforts. The planning fallacy is the tendency of planners to pay too little attention to distributional information and too much attention to singular information, even when the singular information is scanty or questionable. Planners tend to underestimate cost and schedule because they fail to harvest lessons from the distributional information, which is inherently more diverse and reliable than singular information. One corrective, sketched after this list, is to adjust singular estimates using distributional data.
- The tendency to attend too little to distributional information afflicts us all as people, but it can afflict organizations as well. For example, many organizations conduct retrospectives or "lessons learned" exercises in connection with projects. But the information they collect, valuable though it might be to subsequent projects, isn't always archived in ways that facilitate its use by the leaders of those subsequent projects. It might be scattered, or stored within the project that generated it, rather than collected with other similar volumes into an organized library. In some organizations, it is actually classified and its use is restricted.
- Such practices intensify the effects of the planning fallacy.
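To make the chunking approach concrete, here is a minimal Python sketch. The function name, the fixed chunk size, and the flat list of work streams are all hypothetical; a real decomposition would group related streams by affinity rather than by position.

```python
# A minimal sketch of recursive chunking. Hypothetical names throughout.
MAX_CHUNK = 7  # the middle of Miller's 7±2 range

def chunk(items, size=MAX_CHUNK):
    """Group items into a hierarchy with at most `size` entries per level."""
    if len(items) <= size:
        return items
    # Split into consecutive groups of at most `size` items each.
    groups = [items[i:i + size] for i in range(0, len(items), size)]
    # If there are still too many groups, chunk the groups themselves.
    return chunk(groups, size)

streams = [f"work stream {n}" for n in range(1, 20)]  # the 19 streams above
plan = chunk(streams)
print(len(plan))  # 3 top-level chunks, each holding at most 7 streams
```

Note that splitting by position is exactly where the trouble described above can creep in: chunk boundaries drawn for convenience can hide interactions between the pieces.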
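To illustrate what an objective computation might look like for the ambiguity effect, here is a hedged sketch comparing options by expected cost rather than by certainty. All figures are invented for illustration.

```python
# Comparing a familiar option to a novel one by expected cost.
# All numbers are hypothetical.

conventional_cost = 1_000_000  # familiar approach: cost nearly certain

# Novel approach: uncertain outcome, but cheaper in every scenario.
novel_scenarios = [
    (0.6, 700_000),   # 60% chance it goes smoothly
    (0.4, 950_000),   # 40% chance it hits snags, yet still cheaper
]
novel_expected = sum(p * cost for p, cost in novel_scenarios)

print(f"conventional:     {conventional_cost:>9,}")    # 1,000,000
print(f"novel (expected): {novel_expected:>9,.0f}")    #   800,000
# The ambiguity effect pulls us toward the conventional option anyway,
# because its outcome feels more certain.
```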
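Finally, here is a small sketch of one way to fold distributional information into a singular estimate, in the spirit of the corrective procedure Kahneman and Tversky described (an idea later developed as reference class forecasting). The overrun ratios and the estimate are hypothetical.

```python
# Adjusting a singular estimate with distributional information.
# All numbers are hypothetical.
import statistics

singular_estimate_weeks = 20  # bottom-up estimate for this project

# Ratios of actual to estimated duration from comparable past projects.
past_overrun_ratios = [1.3, 1.8, 1.1, 1.6, 1.4]

typical_overrun = statistics.median(past_overrun_ratios)  # 1.4
adjusted = singular_estimate_weeks * typical_overrun
print(f"adjusted estimate: {adjusted:.0f} weeks")  # 28 weeks
```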
Knowing these patterns, and others like them, provides enormous advantages to planners. They can check their plans for these effects, and when they find indications of their presence, they can revise those plans to mitigate the effects. Of course, planners must commit to taking these steps from the outset. And that plan is itself subject to these same effects.
Footnotes
[Miller 1956] George A. Miller. "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information," Psychological Review 63:2 (1956), 81-97.
[Ellsberg 1961] Daniel Ellsberg. "Risk, Ambiguity, and the Savage Axioms," Quarterly Journal of Economics 75:4 (1961), 643-669.
[Kahneman 1977] Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," technical report, 1977.
[Kahneman 1979] Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," TIMS Studies in Management Science 12 (1979), 313-327.