Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 20, Issue 38; September 16, 2020: Seven Planning Pitfalls: III

Seven Planning Pitfalls: III

by Rick Brenner

We usually attribute departures from plan to poor execution, or to "poor planning." But one cause of plan ineffectiveness is the way we think when we set about devising plans. Three cognitive biases that can play roles are the so-called Magical Number 7, the Ambiguity Effect, and the Planning Fallacy.
The Leonard P. Zakim Bunker Hill Bridge

The Leonard P. Zakim Bunker Hill Bridge, part of Boston's "Big Dig." One of the many problems contributing to cost overruns in the project was a sequence of delays in determining the design of the Charles River Crossing. Once this decision-making overran its schedule, very serious problems developed later in the project, in management, engineering, and construction. For projects that take years to complete, inflation can be a significant cause of cost overruns. In the case of the Big Dig, the original 1982 cost estimate was 2.8 billion dollars, with completion expected in 1998. The project was completed in December 2007 at a cost of over 8 billion dollars (in 1982 dollars). It has been estimated that over half of the overrun was due to inflation, though that figure is somewhat controversial. The lesson for long-lived projects is that initial cost estimates should be calculated in year-of-expenditure dollars.

A failure to allow for inflation might be traceable to the ambiguity of inflation itself. We have difficulty assessing the validity of cost projections that anticipate future inflation, because inflation is so difficult to predict. Buyers don't demand such projections, and financial experts have difficulty producing them.

Pictured is the Leonard P. Zakim Bunker Hill Bridge in a later construction phase, as it emerges from the Thomas P. "Tip" O'Neill Jr. Tunnel to cross the Charles River. This particular crossing is very near the one referred to as "two if by sea" in Longfellow's 1860 poem "Paul Revere's Ride." Photo courtesy Massachusetts Turnpike Authority.

In two previous posts, I examined four reasons why planning complex projects is so difficult. In the first of these two posts, I explored how we can be limited by our ability to imagine what we're planning, and by our access to knowledge of past failures. In the second post, I considered external influences. One set of external influences is associated with a group of cognitive biases known as priming effects. A second set of external influences includes fads, dogma, regulations, and traditions.

Continuing this exploration of difficulties encountered in planning complex projects, I turn now to internal causes of trouble, namely, patterns in the way we think when we're constructing plans.

Magical number 7
In one of the most-cited papers in the psychology literature, George Miller reported experimental results regarding numerical limits to human cognition. [Miller 1956] There are limits to the number of "chunks" people can hold in short-term memory. And there are limits to our ability to distinguish among a defined set of stimuli so as to govern a choice among a defined set of associated responses. These limits, coincidentally, appear to lie in the range of 7±2 items.
The effects of these limits on planning activities are both subtle and profound. For example, a widely believed maxim in the presentation architecture community is known as the 6x6 rule. It states that presentation slides should contain no more than six bullet points of not more than six words each. Although it is motivated by Miller's observations, experimental confirmation of its value has been elusive.
However, the limits Miller described are real. They can play a role in planning complex projects whose parallel efforts number more than 7±2. For example, managing a project with 19 parallel streams of tasks would tend to become unwieldy, because project managers might not be able to keep the attributes of so many work streams in mind.
The usual approach for dealing with this is to "chunk" the effort into pieces that are more manageable in number. Then, if necessary, chunk each piece in turn. This approach is usually called analysis and synthesis.
Some of what we attribute to "poor planning" is perhaps better regarded as an inevitable result of how humans think.
Analysis and synthesis can be problematic, because the chunks sometimes interact with each other in unexpected ways, outside the descriptions we use for the synthesis.
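
To make chunking concrete, here's a minimal sketch in Python. The function name, the grouping strategy, and the choice of seven as the limit are all assumptions for illustration, not a prescription.

    # A minimal sketch of recursive chunking (names and limits are
    # illustrative assumptions, not part of this article's method).
    MAX_CHUNK = 7  # the upper end of Miller's 7±2 range

    def chunk_streams(streams, max_chunk=MAX_CHUNK):
        """Group work streams into nested chunks so that no level of
        the plan asks anyone to track more than max_chunk items."""
        if len(streams) <= max_chunk:
            return streams
        group_count = -(-len(streams) // max_chunk)  # ceiling division
        size = -(-len(streams) // group_count)
        groups = [streams[i:i + size]
                  for i in range(0, len(streams), size)]
        return [chunk_streams(group, max_chunk) for group in groups]

    # The 19 parallel work streams from the example above become
    # three groups of at most seven streams each.
    streams = [f"stream-{n:02d}" for n in range(1, 20)]
    print(chunk_streams(streams))

Run on 19 streams, the sketch produces groups of seven, seven, and five, each small enough to hold in mind at once. The synthesis step, reassembling the chunks into a whole plan, is where the unexpected interactions mentioned above can hide.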
Ambiguity effect
The ambiguity effect is a cognitive bias that affects how we make decisions under uncertainty. [Ellsberg 1961] When choosing among options that have favorable outcomes, we tend to favor those options for which the outcome is more certain, even if less favorable. And we tend to avoid options for which the probability of a given favorable outcome is unknown, even if all possible outcomes of that option are favorable.
When devising plans for projects, the ambiguity effect can be costly indeed. For example, when considering a novel approach that offers great savings in cost and schedule, we might compare it unfavorably to a more familiar approach that's slower and more costly. The ambiguity effect causes us to favor the conventional approach more than might be justified by the uncertainties of using an unconventional approach for the first time.
Mitigating the ambiguity effect requires careful estimation of the uncertain option's outcomes, and objective comparison of the alternatives, rather than reliance on our comfort with the familiar.
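
For example, here's a minimal sketch of such a computation. All probabilities and costs are invented for illustration; in practice they would come from careful estimation.

    # Comparing a familiar approach whose cost is well known against
    # a novel approach whose cost is uncertain. Numbers are invented.
    familiar_cost = 1_000_000  # known with high confidence

    # Rough scenarios for the novel approach: (probability, cost)
    novel_scenarios = [
        (0.5, 400_000),    # it works as hoped
        (0.3, 700_000),    # partial success, some rework
        (0.2, 1_200_000),  # it fails and we fall back
    ]

    expected_novel_cost = sum(p * cost for p, cost in novel_scenarios)
    print(f"Familiar approach: {familiar_cost:,}")
    print(f"Novel approach (expected): {expected_novel_cost:,.0f}")

In this invented example, the novel approach has the lower expected cost (650,000 versus 1,000,000), even though there's a 20% chance it costs more than the familiar approach. The ambiguity effect pulls us toward the familiar option anyway; the computation pushes back.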
The planning fallacy
In a 1977 report, Daniel Kahneman and Amos Tversky identify a particular cognitive bias, the planning fallacy, which afflicts planners. [Kahneman 1977] [Kahneman 1979] They discuss two types of information used by planners. Singular information is specific to the project at hand; distributional information is drawn from similar past efforts. The planning fallacy is the tendency of planners to pay too little attention to distributional information and too much attention to singular information, even when the singular information is scanty or questionable. Planners tend to underestimate cost and schedule by failing to harvest lessons from the distributional information, which is inherently more diverse and reliable than singular information.
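
One way to give distributional information its due weight is to adjust the project's own inside-view estimate by the overrun ratios observed in similar past efforts, in the spirit of what later came to be called reference-class forecasting. Here's a minimal sketch; the ratios are invented for illustration.

    # A minimal sketch of blending singular and distributional
    # information. All numbers are invented for illustration.
    inside_view_estimate = 12.0  # months, from this project's own plan

    # Distributional information: actual/estimated duration ratios
    # from similar past projects (hypothetical data).
    past_overrun_ratios = [1.1, 1.4, 1.25, 1.6, 1.3, 1.45, 1.2]

    mean_overrun = sum(past_overrun_ratios) / len(past_overrun_ratios)
    outside_view_estimate = inside_view_estimate * mean_overrun

    print(f"Inside view:  {inside_view_estimate:.1f} months")
    print(f"Outside view: {outside_view_estimate:.1f} months "
          f"(mean overrun {mean_overrun:.2f}x)")

With these invented ratios, a 12-month inside-view plan becomes a 15.9-month outside-view forecast. Whether to use the mean, the median, or a percentile of the distribution is itself a judgment call.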
The tendency to attend too little to distributional information afflicts us all as people, but it can afflict organizations as well. For example, many organizations conduct retrospectives or "lessons learned" exercises in connection with projects. But the information they collect, valuable though it might be to subsequent projects, isn't always archived in ways that facilitate its use by the leaders of those subsequent projects. It might be scattered, or stored within the project that generated it, rather than collected with other similar volumes into an organized library. In some organizations, it is actually classified and its use is restricted.
Such practices intensify the effects of the planning fallacy.

Knowing these patterns, and others like them, provides enormous advantages to planners. They can check their plans for these effects, and when they find indications of their presence, they can revise those plans to mitigate the effects. Of course, you have to plan on taking these steps from the outset. And that plan is itself subject to these same effects.

First issue in this series: Seven Planning Pitfalls: I
Next issue: Seven More Planning Pitfalls: I

52 Tips for Leaders of Project-Oriented Organizations: Are your projects always (or almost always) late and over budget? Are your project teams plagued by turnover, burnout, and high defect rates? Turn your culture around. Read 52 Tips for Leaders of Project-Oriented Organizations, filled with tips and techniques for organizational leaders. Order Now!

Footnotes

[Miller 1956]
George A. Miller. "The magical number seven, plus or minus two: Some limits on our capacity for processing information," Psychological Review 63:2 (1956), 81-97.
[Ellsberg 1961]
Daniel Ellsberg. "Risk, Ambiguity and the Savage Axioms," The Quarterly Journal of Economics (1961), 643-669.
[Kahneman 1977]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," Technical Report PTR-1042-7746, Defense Advanced Research Projects Agency, June 1977.
[Kahneman 1979]
Daniel Kahneman and Amos Tversky. "Intuitive Prediction: Biases and Corrective Procedures," TIMS Studies in Management Science 12 (1979), 313-327.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Cognitive Biases at Work:

The Stupidity Attribution Error
In workplace debates, we sometimes conclude erroneously that only stupidity can explain why our debate partners fail to grasp the elegance or importance of our arguments. There are many other possibilities.
Risk Acceptance: One Path
When a project team decides to accept a risk, and when their project eventually experiences that risk, a natural question arises: What were they thinking? Cognitive biases, other psychological phenomena, and organizational dysfunction all can play roles.
The Risk of Astonishing Success
When we experience success, we're more likely to develop overconfidence. And when the success is so extreme as to induce astonishment, we become even more vulnerable to overconfidence. It's a real risk of success that must be managed.
Confirmation Bias and Myside Bias
Although we regard ourselves as rational, a well-established body of knowledge shows that rationality plays a less-than-central role in our decision-making process. Confirmation Bias and Myside Bias are two cognitive biases that influence our decisions.
Additive bias…or Not: II
Additive bias is a cognitive bias that many believe contributes to bloat of commercial products. When we change products to make them more capable, additive bias might not play a role, because economic considerations sometimes favor additive approaches.

See also Cognitive Biases at Work for more related articles.

Forthcoming issues of Point Lookout

Coming September 4: Beating the Layoffs: I
If you work in an organization likely to conduct layoffs soon, keep in mind that exiting voluntarily before the layoffs can carry significant advantages. Here are some that relate to self-esteem, financial anxiety, and future employment. Available here and by RSS on September 4.
And on September 11: Beating the Layoffs: II
If you work in an organization likely to conduct layoffs soon, keep in mind that exiting voluntarily can carry advantages. Here are some advantages that relate to collegial relationships, future interviews, health, and severance packages. Available here and by RSS on September 11.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks:

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Follow Rick

Send email or subscribe to one of my newsletters. Follow me at LinkedIn. Follow me at X, or share a post. Subscribe to RSS feeds.
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.
Technical Debt for Policymakers Blog: My blog, Technical Debt for Policymakers, offers resources, insights, and conversations of interest to policymakers who are concerned with managing technical debt within their organizations. Get the millstone of technical debt off the neck of your organization!
Go For It: Sometimes It's Easier If You Run: Bad boss, long commute, troubling ethical questions, hateful colleague? Learn what we can do when we love the work but not the job.
303 Tips for Virtual and Global Teams: Learn how to make your virtual global team sing.
101 Tips for Managing Change: Are you managing a change effort that faces rampant cynicism, passive non-cooperation, or maybe even outright revolt?
101 Tips for Effective Meetings: Learn how to make meetings more productive — and more rare.
Exchange your "personal trade secrets" — the tips, tricks and techniques that make you an ace — with other aces, anonymously. Visit the Library of Personal Trade Secrets.
If your teams don't yet consistently achieve state-of-the-art teamwork, check out this catalog. Help is just a few clicks/taps away!
Ebooks, booklets and tip books on project management, conflict, writing email, effective meetings and more.