Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 20, Issue 36; September 2, 2020:

Seven Planning Pitfalls: I


Whether in war or in projects, plans rarely work out as, umm well, as planned. In part, this is due to our limited ability to foretell the future, or to know what we don't know. But some of the problem arises from the way we think. And if we understand this, we can make better plans.

A fly caught in a carnivorous plant known as a Venus flytrap (Dionaea muscipula). The plant's strategy depends for success on exploiting the behavior of its prey. It lures prey by means of a reddish lining of the trap's leaves and the smell of sweet nectar the plant secretes. Once triggered, the plant's jaws close, but loosely enough for smaller insects to escape. Smaller insects are of little interest to the Venus flytrap. The jaws close more tightly only if the plant senses additional movement within a defined time interval. This prevents the plant from wasting energy, material, and time trying to digest nonprey items such as litter or raindrops. And carnivorous plants have another problem: they must discriminate between their prey and their pollinators. Because the Venus flytrap's prey are mostly crawlers and its pollinators mostly fliers, the Venus flytrap can discriminate between prey and pollinators by positioning its flowers at a higher altitude than its traps.

We can think of planning activities as preying on planners. Planners can fall prey to aspects of the planning activity that exploit how planners think about plans, and how planners behave within the organizations that host them.

Making plans for projects, campaigns, developments, complex events, or other collaborative activities is a collaborative activity in itself. A common experience of plans, sadly, is their inadequacy, even when we invest significant resources in developing them. Indeed, it's often said that "…no plan survives first contact…" with the enemy in a military context, or with reality in a project management context.

But the planning activity has a reputation perhaps worse than it deserves. Planning a complex undertaking is difficult to do well because of unpredictable changes in the context, or incomplete or inaccurate information about that context, or incomplete or inaccurate information about the undertaking itself. That much is understood and expected. But much of what's wrong with our plans is a direct result of the way we think about making plans. Our human limitations manifest themselves in plan deficiencies.

In these next posts, I "plan" to construct a short catalog of seven observations about how plan deficiencies arise from the way we think, or the way we go about developing plans. This first part addresses how we use our experience, our preferences, and our knowledge of past mistakes to develop plans.

We make better plans for things we know, or favor, or can imagine
When we make plans, we rely on experience, preference, and imagination. That's why our plans exhibit three kinds of biases. First, plans tend to anticipate better those events or conditions that have occurred in the past. As military conventional wisdom has it, we tend to plan to re-fight the last battle, or the last war.
A second source of bias in our plans arises from our preferences or preconceptions. We tend to search for reasons that justify or coincide with our preferences or preconceived notions. We tend also to avoid searching for reasons why our preferences or preconceived notions about our plan might be mistaken. In this way, the data generated by our research tends to confirm our preferences and preconceptions. The pattern is so prevalent that psychologists have given it a name: confirmation bias. [Nickerson 1998]
The third source of bias in our plans is a cognitive bias known as the Availability Heuristic. [Tversky 1973] We're using the Availability Heuristic when we determine the relevance of a phenomenon by sensing the difficulty of imagining or understanding the string of events that contribute to its development. So if we have difficulty imagining a phenomenon, or if we have difficulty imagining the conditions that bring it about, we regard it as less likely, and we tend not to address it effectively in our plans.
We lack adequate information about failures
When we make plans and choose approaches, we tend to focus on what has worked for us or for others in the past. We invest effort in understanding why a particular method is reliable, or why an approach is recommended. We try to be knowledgeable about "best practices." Usually, whatever we draw upon does need tailoring, but we use it as guidance nevertheless.
We pay much less attention to failures. Finding information about "worst practices" or "less-than-best practices" or even "OK practices" is next to impossible. Failures are often buried quietly. We have difficulty consulting people in our own organizations who led past efforts that failed, because they have often been terminated, blocked from promotion, reassigned, or have departed from the organization. Other organizations rarely publish results of investigations into their own failures, even when they do publish stories of their successes. These are some of the reasons why our understanding of failures is much less thorough than our understanding of successes. In some cases, there are past failures of which current planners are completely unaware, even when the causes of those failures might be relevant to the planning task at hand.
Our relative ignorance about failures might be a contributing factor when we repeat our own errors or the errors of others. We follow this pattern so often that psychologists have given it a name: survivorship bias. [Elton 1996] Survivorship bias is our tendency, when making plans or decisions, to pay too much attention to past events that we regard as successes, and too little attention to past events that we regard as failures.
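The distortion survivorship bias introduces can be made concrete with a small simulation (an illustrative sketch, not drawn from the article or from the Elton study; the numbers and model are hypothetical). If project outcomes are in truth symmetric around zero, but we study only the projects that "survived" — the failures having been buried quietly — the average outcome we observe looks strongly positive:

```python
import random

random.seed(0)

# Hypothetical model: 10,000 projects whose true average outcome is 0
# (successes and failures are equally likely and symmetric).
outcomes = [random.gauss(0.0, 1.0) for _ in range(10_000)]

# Survivorship bias: we examine only the projects that "survived"
# (outcome > 0), because records of the failures are unavailable.
survivors = [x for x in outcomes if x > 0]

true_mean = sum(outcomes) / len(outcomes)
survivor_mean = sum(survivors) / len(survivors)

print(f"True mean outcome:    {true_mean:+.2f}")   # close to zero
print(f"Mean among survivors: {survivor_mean:+.2f}")  # strongly positive
```

A planner who calibrates estimates against only the visible (surviving) projects would conclude that the typical outcome is much better than it really is — the same effect Elton and colleagues documented in mutual fund performance data.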
But failures have more to offer than mere patterns to avoid. If we truly understand a particular failure, sometimes we can identify those attributes of a failed approach that account for the failure and which could be adjusted. Occasionally these insights can lead to solutions for new problems that might be very valuable indeed.

In Part II, I'll examine the effects of factors external to the planning process, which are therefore beyond the control of the planning team. Next issue: Seven Planning Pitfalls: II

52 Tips for Leaders of Project-Oriented Organizations
Are your projects always (or almost always) late and over budget? Are your project teams plagued by turnover, burnout, and high defect rates? Turn your culture around. Read 52 Tips for Leaders of Project-Oriented Organizations, filled with tips and techniques for organizational leaders. Order Now!


[Nickerson 1998]
Raymond S. Nickerson. "Confirmation bias: A ubiquitous phenomenon in many guises," Review of General Psychology 2:2 (1998), 175-220. Retrieved 22 April 2021.
[Tversky 1973]
Amos Tversky and Daniel Kahneman. "Availability: a heuristic for judging frequency and probability," Cognitive Psychology 5 (1973), 207-232. Retrieved 23 April 2021.
[Elton 1996]
Edwin J. Elton, Martin J. Gruber, and Christopher R. Blake. "Survivor bias and mutual fund performance," The Review of Financial Studies 9:4 (1996), 1097-1120.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Cognitive Biases at Work:

Confirmation Bias: Workplace Consequences Part II
We continue our exploration of confirmation bias. In this Part II, we explore its effects in management processes.
Scope Creep, Hot Hands, and the Illusion of Control
Despite our awareness of scope creep's dangerous effects on projects and other efforts, we seem unable to prevent it. Two cognitive biases — the "hot hand fallacy" and "the illusion of control" — might provide explanations.
The Stupidity Attribution Error
In workplace debates, we sometimes conclude erroneously that only stupidity can explain why our debate partners fail to grasp the elegance or importance of our arguments. There are many other possibilities.
The Planning Fallacy and Self-Interest
A well-known cognitive bias, the planning fallacy, accounts for many unrealistic estimates of project cost and schedule. Overruns are common. But another cognitive bias, and organizational politics, combine with the planning fallacy to make a bad situation even worse.
Motivated Reasoning
When we prefer a certain outcome of a decision process, we risk falling into a pattern of motivated reasoning. That can cause us to gather data and construct arguments that erroneously lead to the outcome we prefer, often outside our awareness. And it can happen even when the outcome we prefer is known to threaten our safety and security.

See also Cognitive Biases at Work and Project Management for more related articles.

Forthcoming issues of Point Lookout

Coming October 20: On Ineffectual Leaders
When the leader of an important business unit is ineffectual, we need to make a change to protect the organization. Because termination can seem daunting, people often turn to one or more of a variety of other options. Those options have risks. Available here and by RSS on October 20.
And on October 27: Five Guidelines for Choices
Each day we make dozens or hundreds of choices — maybe more. We make many of those choices outside our awareness. But we can make better choices if we can recognize choice patterns that often lead to trouble. Here are five guidelines for making choices. Available here and by RSS on October 27.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks:

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Public seminars

The Power Affect: How We Express Our Personal Power

Many people who possess real organizational power have a characteristic demeanor. It's the way they project their presence. I call this the power affect. Some people — call them power pretenders — adopt the power affect well before they attain significant organizational power. Unfortunately for their colleagues, and for their organizations, power pretenders can attain organizational power out of proportion to their merit or abilities. Understanding the power affect is therefore important for anyone who aims to attain power, or anyone who works with power pretenders. Read more about this program.

Bullet Points: Mastery or Madness?

Decision makers in modern organizations commonly demand briefings in the form of bullet points or a series of series of bullet points. But this form of presentation has limited value for complex decisions. We need something more. We actually need to think. Briefers who combine the bullet-point format with a variety of persuasion techniques can mislead decision makers, guiding them into making poor decisions. Read more about this program.

Follow Rick

Send email or subscribe to one of my newsletters. Follow me at LinkedIn. Follow me at Twitter, or share a tweet. Subscribe to RSS feeds.
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.
My blog, Technical Debt for Policymakers, offers resources, insights, and conversations of interest to policymakers who are concerned with managing technical debt within their organizations. Get the millstone of technical debt off the neck of your organization!
Go For It: Sometimes It's Easier If You Run: Bad boss, long commute, troubling ethical questions, hateful colleague? Learn what we can do when we love the work but not the job.
303 Tips for Virtual and Global Teams: Learn how to make your virtual global team sing.
101 Tips for Managing Change: Are you managing a change effort that faces rampant cynicism, passive non-cooperation, or maybe even outright revolt?
101 Tips for Effective Meetings: Learn how to make meetings more productive — and more rare.
Exchange your "personal trade secrets" — the tips, tricks and techniques that make you an ace — with other aces, anonymously. Visit the Library of Personal Trade Secrets.
If your teams don't yet consistently achieve state-of-the-art teamwork, check out this catalog. Help is just a few clicks/taps away!
Ebooks, booklets and tip books on project management, conflict, writing email, effective meetings and more.