There are at least two ways to interpret the title of this post. One interpretation implies that the post is about the effects of cognitive biases in the workplace, and maybe, how to manage those effects. A rewording of the title, consistent with this interpretation, is "Cognitive Biases in the Workplace." A second interpretation is that the post is about noticing when cognitive biases are playing a role in a given situation, and maybe what to do about that or how to prevent it. A rewording of the title, consistent with this second interpretation, might be "When Cognitive Biases Are Playing a Role."
When you saw the title and read it for the first time, the interpretation that came to mind first was determined, in part, by cognitive biases. Cognitive biases are powerful. They determine, in part, how we make decisions, how we interact with one another, and how well those decisions and interactions serve us in the workplace — or anywhere else for that matter.
But the focus of this post is cognitive biases in the workplace.
Let's begin by clearing away some baggage related to the term cognitive bias. The word cognitive isn't (or until recently hasn't been) a common element of workplace vocabulary. In the context of this discussion, it just means "of or related to thinking." The real problem with the term cognitive bias is the word bias, which has some very negative connotations. In lay language, bias relates to prejudice and unfairness. That isn't the sense we need for this context. For this context, the bias in question is a systematic skew of our thinking away from evidence-based reasoning.
And that's where the problems arise. At work, we tend to think of ourselves as making decisions and analyzing problems using only the tools of evidence-based reasoning. Although that is what we believe, science tells us another story. When we think and when we make decisions, we use a number of patterns of thinking that transcend — and sometimes exclude — evidence-based reasoning.
Some view these alternate patterns of thinking as "less than" or "subordinate to" or "of lesser value than" evidence-based reasoning. In this view, decisions or analysis performed on the basis of anything other than evidence-based reasoning are questionable and not to be relied upon. In this view, unless we can offer an evidence-based chain of reasoning to justify a decision or analysis, that decision or analysis is near worthless.
I disagree. But first let me offer support for the critics of alternate patterns of thinking.
An example of a cognitive bias in real life
There is abundant experimental evidence that these alternate patterns of thinking do often lead to inferior results. One such alternate pattern is the Availability Heuristic. [Tversky 1973] Rather than illustrate the Availability Heuristic with a description of a sterile psychology experiment, let me offer a plausible speculation about the role of the Availability Heuristic in a historical situation.
Pictured here is a so-called "Paris Gun" of World War I. It was enormous. Its barrel, at 110 feet (almost 34 meters), was so long that it needed an external truss to keep it from drooping. Only a handful of these weapons were constructed. In late March of 1918, just seven months before the armistice that ended the fighting, an explosion occurred in northeast Paris. It was mysterious. No German aircraft had been seen operating in the vicinity, and German ground forces at that time were over 60 miles (almost 100 kilometers) away, placing Paris out of range of artillery — or so it was believed at first. The mystery was resolved when two more explosions occurred nearby, 15 minutes and 30 minutes later. The source had to be artillery.
At the time, no German artillery piece known to the Allies had sufficient range. But Germany had developed a weapon that became known as the Paris Gun, with a range of 80 miles (more than 125 kilometers). These guns would eventually fire 367 rounds, of which 183 struck within Paris city limits.
But the Paris Guns had little impact on the war. Because their accuracy wasn't sufficient to strike any particular urban target, such as a palace or government building, the guns were being used essentially as terror weapons. According to Major General David T. Zabecki, U.S. Army (Ret.), the Paris Guns could have had significant effect if they had targeted elements of the supply lines of the British Expeditionary Force, such as port facilities or rail facilities. But German strategists chose instead to target population centers. Zabecki regards this choice as a strategic error, which Germany repeated in World War II during the Blitz, targeting London and other population centers. [Zabecki 2015]
The decision by Germany in World War I to target cities instead of logistics assets could have been influenced by the cognitive bias known as the Availability Heuristic. Because it's much easier to imagine destruction of parts of a city than it is to imagine the widely dispersed and unspectacular consequences of disabling a port or railhead, targeting Paris instead of Dover or Calais might have seemed to be more advantageous to the German cause than it actually could have been — or would have been. The Availability Heuristic may have led German strategists astray.
How we benefit from cognitive biases
If cognitive biases lead us to such disadvantageous conclusions, why then do we have cognitive biases? What good are they?
Cognitive biases are not something we have. What we have are patterns of thinking that result in cognitive biases. We have ways of making decisions and analyzing situations that are far more economical and much faster than evidence-based reasoning. And much of the time, the results we achieve with these alternate patterns of thinking are close to what we could have achieved with evidence-based reasoning. In many situations those results are close enough.
The patterns of thinking that exhibit cognitive biases aren't defects. They aren't shortcomings in the design of humans that need to be rooted out and destroyed. On the contrary, they're actually wonderful tools — alternatives to evidence-based reasoning — that get us "pretty fair" results quickly and cheaply much of the time. The defect, if there is one, is our habit of relying on one or more of these alternate patterns of thinking when their results aren't close enough to what we could achieve if we had the time and resources to apply evidence-based reasoning. Or the defect is our habit of relying on them when their results aren't close enough often enough.
But there are hundreds of cognitive biases. Wikipedia lists 196 of them as of this writing. Now it's unlikely that every single one of these identified cognitive biases arises from a single unique alternate pattern of thinking that produces that bias and only that bias. My own guess is that there are many fewer of these alternate patterns of thinking — often called heuristics — and that they exhibit different cognitive biases in different situations.
Even so, we can't possibly manage the risks associated with these alternate patterns of thinking by considering all of the known cognitive biases all the time. We need a way of focusing our risk management efforts to address only the cognitive biases that are most likely to affect particular kinds of decisions or analyses, depending on what we're doing at the moment. In short, we need a heuristic to help us manage the risks of using heuristics. And we'll make a start on that project next time.