In two previous posts, I noted five different phenomena that can lead planning teams to devise unworkable plans. They included Group Polarization, Trips to Abilene, False Consensus, Groupthink, and Shared Information Bias. In this final post of the series, I explore effects that cause planning teams to adopt or accept approaches for reasons other than their merits.
- The IKEA effect
- A cognitive bias known as the IKEA Effect causes individuals to place an inordinately high value on products they assembled themselves. [Norton 2012] One might speculate that an analogous bias occurs with respect to organizations. If this speculation is valid, organizations would tend to place inordinately high value on assets and processes that they created or helped to create, compared to similar assets or processes that they could acquire elsewhere. This phenomenon, if confirmed experimentally, might be related to what is sometimes called the not-invented-here syndrome. [Katz 1982]
- Planning teams would be affected by the "organizational IKEA Effect" in that they would assess approaches as more valuable or effective when those approaches exploit products or technology developed in part or in toto by in-house efforts. Internal political forces might also compel them to use such assets, even when those assets are inferior to commercial alternatives.
- Competition bias
- When internal experts provide estimates of cost and schedule, they're vulnerable to a number of cognitive biases that cause them to underestimate both. I've noted some of these, such as priming effects and Shared Information Bias, in previous posts. But even if the members of the planning team weren't vulnerable to these biases, another problem, potentially even more significant, causes them to produce underestimates. The forces that create this problem are traceable to competition, both internal and external. I call this phenomenon Competition Bias.
- Boehm et al. observe that because organizational resources are finite, project advocates compete with each other for resources. [Boehm 2016] This competition compels them to be unrealistically optimistic about their objectives, costs, and schedules. Although the authors call this mechanism the "Conspiracy of Optimism," possibly facetiously, it isn't actually a conspiracy. Rather, it's a variant of the N-Person Prisoner's Dilemma. [Hamburger 1973]
- Market dynamics provide a second illustration of the effects of competition. Those who advocate marketing strategies based on the so-called "first mover advantage" believe that the organization that first delivers an offering to a marketplace can gain advantages by arriving early. The strategy is somewhat controversial [Suarez 2005], but it is believed widely enough that it leads to pressure on project planning teams to reduce their estimates of cost and schedule.
- Estimates of cost and schedule are more likely to be realistic if the estimators aren't subjected to pressure to produce low estimates.
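To see why honest estimators lose this competition even though everyone would be better off if all estimates were honest, the N-Person Prisoner's Dilemma structure can be sketched as a toy payoff model. Every number in this sketch is an invented illustration, not something drawn from [Boehm 2016] or [Hamburger 1973]: five advocates compete for two funded budget slots, a funded project is worth 30 to its advocate, and each lowball-funded project overruns by 30, a cost the whole organization absorbs.

```python
# A toy model of Competition Bias as an N-Person Prisoner's Dilemma.
# All numbers (project value, overrun size, slot counts) are illustrative
# assumptions, not taken from [Boehm 2016] or [Hamburger 1973].

def expected_payoff(i_bid_low: bool, others_low: int, n: int = 5,
                    slots: int = 2) -> float:
    """Expected payoff to one advocate among n competing for `slots` budgets.

    Each advocate bids either an honest estimate or a low (optimistic) one.
    Low bids win funding ahead of honest bids; ties share slots evenly.
    A funded project is worth 30 to its advocate, but each funded low-bid
    project overruns by 30, a cost shared equally across all n advocates.
    """
    low_count = others_low + (1 if i_bid_low else 0)
    if i_bid_low:
        p_funded = min(1.0, slots / low_count)
    else:
        remaining = max(0, slots - low_count)
        honest_count = n - low_count
        p_funded = min(1.0, remaining / honest_count) if honest_count else 0.0
    benefit = p_funded * 30.0
    overrun_share = min(low_count, slots) * 30.0 / n  # shared org-wide cost
    return benefit - overrun_share

# Lowballing dominates: whatever the other four advocates do,
# you do better by submitting the optimistic estimate...
for k in range(5):
    assert expected_payoff(True, k) > expected_payoff(False, k)

# ...yet if everyone lowballs, everyone ends up worse off
# than if everyone had estimated honestly.
assert expected_payoff(True, 4) < expected_payoff(False, 0)
```

In this sketch no individual advocate can escape the trap by estimating honestly; only a change to the funding process itself, such as evaluating estimates independently of the competition for slots, changes the payoffs.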
In these last six posts, I've inventoried 14 different phenomena that can lead to unworkable plans.
But there is a trap here. When a plan goes awry and we see evidence that, say, the IKEA effect played a role, some might feel that the people who devised the plan are at fault for not recognizing the problem and doing something about it. That would be a mistake. Replacing those people, or disciplining them in some way, is unlikely to substantially affect the probability of a recurrence.
The root cause of the problem lies not in the people who devised the unworkable plan, but in the processes they used to devise it. To reduce the probability of a recurrence of the IKEA effect, for example, we would need to add new steps to the planning process, steps that ensure that decision makers remain objective about the origins of the assets they plan to use. For each of the 14 phenomena I've been exploring, we would need analogous measures.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.