Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 20, Issue 36; September 2, 2020

Seven Planning Pitfalls: I

by

Whether in war or in projects, plans rarely work out as, umm, well, as planned. In part, this is due to our limited ability to foretell the future, or to know what we don't know. But some of the problem arises from the way we think. And if we understand this, we can make better plans.

A fly caught in a carnivorous plant known as the Venus flytrap (Dionaea muscipula). The plant's strategy depends on exploiting the behavior of its prey. It lures prey with the reddish lining of its trap leaves and the scent of the sweet nectar it secretes. Once triggered, the plant's jaws close, but loosely enough for smaller insects to escape. Smaller insects are of little interest to the Venus flytrap. The jaws close more tightly only if the plant senses additional movement within a defined time interval. This prevents the plant from wasting energy, material, and time trying to digest non-prey items such as litter or raindrops. And carnivorous plants have another problem: they must discriminate between their prey and their pollinators. Because the Venus flytrap's prey are mostly crawlers and its pollinators mostly fliers, the plant can discriminate between the two by positioning its flowers well above its traps.

We can think of planning activities as preying on planners. Planners can fall prey to aspects of the planning activity that exploit how planners think about plans, and how they behave within the organizations that host them.

Making plans for projects, campaigns, developments, complex events, or other collaborative activities is a collaborative activity in itself. A common experience of plans, sadly, is their inadequacy, even when we invest significant resources in developing them. Indeed, it's often said that "…no plan survives first contact…", whether with the enemy in a military context or with reality in a project management context.

But the planning activity has a reputation perhaps worse than it deserves. Planning a complex undertaking is difficult to do well because of unpredictable changes in the context, or incomplete or inaccurate information about that context, or incomplete or inaccurate information about the undertaking itself. That much is understood and expected. But much of what's wrong with our plans is a direct result of the way we think about making plans. Our human limitations manifest themselves in plan deficiencies.

In these next posts, I "plan" to construct a short catalog of seven observations about how plan deficiencies arise from the way we think, or from the way we go about developing plans. This first part addresses how we use our experience, our preferences, and our knowledge of past mistakes to develop plans.

We make better plans for things we know, or favor, or can imagine
When we make plans, we rely on experience, preference, and imagination. That's why our plans exhibit three kinds of bias. First, plans tend to anticipate better those events or conditions that have occurred in the past. As military conventional wisdom has it, we tend to plan to re-fight the last battle, or the last war.
A second source of bias in our plans arises from our preferences or preconceptions. We tend to search for reasons that justify or coincide with our preferences or preconceived notions, and we tend to avoid searching for reasons why those notions might be mistaken. In this way, the data generated by our research tends to confirm our preferences and preconceptions. The pattern is so prevalent that psychologists have given it a name: confirmation bias [Nickerson 1998].
The third source of bias in our plans is a cognitive bias known as the Availability Heuristic [Tversky 1973]. We're using the Availability Heuristic when we judge the relevance of a phenomenon by sensing how difficult it is to imagine or understand the string of events that bring it about. So if we have difficulty imagining a phenomenon, or the conditions that produce it, we regard it as less likely, and we tend not to address it effectively in our plans.
We lack adequate information about failures
When we make plans and choose approaches, we tend to focus on what has worked for us or for others in the past. We invest effort in understanding why a particular method is reliable, or why an approach is recommended. We try to be knowledgeable about "best practices." Usually, whatever we draw upon does need tailoring, but we use it as guidance nevertheless.
We pay much less attention to failures. Finding information about "worst practices" or "less-than-best practices" or even "OK practices" is next to impossible. Failures are often buried quietly. We have difficulty consulting the people in our own organizations who led past efforts that failed, because they have often been terminated, blocked from promotion, or reassigned, or they have left the organization. Other organizations rarely publish the results of investigations into their own failures, even when they do publish stories of their successes. These are some of the reasons why our understanding of failures is much less thorough than our understanding of successes. In some cases, current planners are completely unaware of past failures, even when the causes of those failures might be relevant to the planning task at hand.
Our relative ignorance about failures might be a contributing factor when we repeat our own errors or the errors of others. We follow this pattern so often that psychologists have given it a name: survivorship bias [Elton 1996]. Survivorship bias is our tendency, when making plans or decisions, to pay too much attention to past events that we regard as successes, and too little attention to past events that we regard as failures.
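To see the statistical core of survivorship bias, consider a small simulation. The sketch below is a minimal illustration in Python, using invented numbers (the overrun distribution and the survival rule are assumptions made up for this example, not data from any real organization). It compares the average schedule overrun across all projects with the average computed only from the projects whose records survived to be consulted.

```python
# A minimal sketch of survivorship bias, with made-up numbers.
# Failed or badly overrun projects are the ones least likely to leave
# records that future planners can consult, so estimates built only from
# "surviving" records understate the real risk.

import random

random.seed(42)

N_PROJECTS = 10_000

all_overruns = []
surviving_overruns = []

for _ in range(N_PROJECTS):
    # Hypothetical "true" schedule overrun, as a fraction of the planned
    # duration (0.0 = on time, 1.0 = took twice as long as planned).
    overrun = max(0.0, random.gauss(mu=0.4, sigma=0.5))
    all_overruns.append(overrun)

    # Assumed rule: the worse the overrun, the less likely the project
    # leaves a record planners ever see (cancelled, buried, leader gone).
    chance_record_survives = max(0.05, 1.0 - overrun)
    if random.random() < chance_record_survives:
        surviving_overruns.append(overrun)

true_mean = sum(all_overruns) / len(all_overruns)
seen_mean = sum(surviving_overruns) / len(surviving_overruns)

print(f"Mean overrun, all projects:           {true_mean:.0%}")
print(f"Mean overrun, surviving records only: {seen_mean:.0%}")
```

Because the most troubled projects are the ones most likely to disappear from the record, the surviving records suggest a smaller typical overrun than actually occurred, and plans calibrated against those records inherit that optimism.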
But failures have more to offer than mere patterns to avoid. If we truly understand a particular failure, we can sometimes identify the attributes of the failed approach that account for the failure and that could be adjusted. Occasionally these insights lead to solutions for new problems, and those solutions can be very valuable indeed.

In Part II, I'll examine the effects of factors external to the planning process, factors that are therefore beyond the control of the planning team. Next issue: Seven Planning Pitfalls: II

Are your projects always (or almost always) late and over budget? Are your project teams plagued by turnover, burnout, and high defect rates? Turn your culture around. Read 52 Tips for Leaders of Project-Oriented Organizations, filled with tips and techniques for organizational leaders. Order Now!

Footnotes

[Nickerson 1998]
Raymond S. Nickerson, "Confirmation bias: A ubiquitous phenomenon in many guises," Review of General Psychology 2:2 (1998), 175-220.
[Tversky 1973]
Amos Tversky and Daniel Kahneman, "Availability: A heuristic for judging frequency and probability," Cognitive Psychology 5 (1973), 207-232.
[Elton 1996]
Edwin J. Elton, Martin J. Gruber, and Christopher R. Blake, "Survivor bias and mutual fund performance," The Review of Financial Studies 9:4 (1996), 1097-1120.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Cognitive Biases at Work:

Confirmation Bias: Workplace Consequences Part II
We continue our exploration of confirmation bias. In this Part II, we explore its effects in management processes.
Cognitive Biases and Influence: II
Most advice about influencing others offers intentional tactics. Yet, the techniques we actually use are often unintentional, and we're therefore unaware of them. Among these are tactics exploiting cognitive biases.
How Messages Get Mixed
Although most authors of mixed messages don't intend to be confusing, message mixing does happen. One of the most fascinating mixing mechanisms occurs in the mind of the recipient of the message.
Bullet Point Madness: II
Decision-makers in many organizations commonly demand briefings in the form of a series of bullet points or a series of series of bullet points. Briefers who combine this format with a variety of persuasion techniques can mislead decision-makers, guiding them into making poor decisions.
Seven Planning Pitfalls: III
We usually attribute departures from plan to poor execution, or to "poor planning." But one cause of plan ineffectiveness is the way we think when we set about devising plans. Three cognitive biases that can play roles are the so-called Magical Number 7, the Ambiguity Effect, and the Planning Fallacy.

See also Cognitive Biases at Work and Project Management for more related articles.

Forthcoming issues of Point Lookout

Coming March 10: On Repeatable Blunders
When organizations make mistakes, they sometimes acknowledge them and learn how to avoid repeating them. And sometimes they conceal them or even deny they happened. When they conceal mistakes or deny they occurred, repetition is more likely. Available here and by RSS on March 10.
And on March 17: Facts, Opinions, Estimates, and Desires
One reason why resource allocation debates can become so difficult is confusion about the differences among facts, opinions, estimates, and desires. Clarifying their differences can reduce the length and intensity of resource allocation debates. Available here and by RSS on March 17.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks.

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Public seminars

The Power Affect: How We Express Our Personal Power

Many people who possess real organizational power have a characteristic demeanor. It's the way they project their presence. I call this the power affect. Some people — call them power pretenders — adopt the power affect well before they attain significant organizational power. Unfortunately for their colleagues, and for their organizations, power pretenders can attain organizational power out of proportion to their merit or abilities. Understanding the power affect is therefore important for anyone who aims to attain power, or anyone who works with power pretenders. Read more about this program.

Bullet Points: Mastery or Madness?

Decision-makers in modern organizations commonly demand briefings in the form of bullet points or a series of series of bullet points. But this form of presentation has limited value for complex decisions. We need something more. We actually need to think. Briefers who combine the bullet-point format with a variety of persuasion techniques can mislead decision-makers, guiding them into making poor decisions. Read more about this program.

Follow Rick

Send email or subscribe to one of my newsletters. Follow me at LinkedIn. Follow me at Twitter, or share a tweet. Subscribe to RSS feeds.
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.
My blog, Technical Debt for Policymakers, offers resources, insights, and conversations of interest to policymakers who are concerned with managing technical debt within their organizations. Get the millstone of technical debt off the neck of your organization!
Go For It: Sometimes It's Easier If You Run. Bad boss, long commute, troubling ethical questions, hateful colleague? Learn what we can do when we love the work but not the job.
303 Tips for Virtual and Global Teams. Learn how to make your virtual global team sing.
101 Tips for Managing Change. Are you managing a change effort that faces rampant cynicism, passive non-cooperation, or maybe even outright revolt?
101 Tips for Effective Meetings. Learn how to make meetings more productive — and more rare.
Exchange your "personal trade secrets" — the tips, tricks and techniques that make you an ace — with other aces, anonymously. Visit the Library of Personal Trade Secrets.
If your teams don't yet consistently achieve state-of-the-art teamwork, check out this catalog. Help is just a few clicks/taps away!
Ebooks, booklets and tip books on project management, conflict, writing email, effective meetings and more.