Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 20, Issue 33;   August 12, 2020: Cognitive Biases at Work

Cognitive Biases at Work

Cognitive biases can lead us to misunderstand situations, overlook options, and make decisions we regret. The patterns of thinking that lead to cognitive biases provide speed and economy advantages, but we must manage the risks that come along with them.

There are at least two ways to interpret the title of this post. One interpretation implies that the post is about the effects of cognitive biases in the workplace, and maybe how to manage those effects. A rewording of the title, consistent with this interpretation, is "Cognitive Biases in the Workplace." A second interpretation is that the post is about noticing when cognitive biases are playing a role in a given situation, and maybe what to do about that or how to prevent it. A rewording of the title, consistent with this second interpretation, might be "When Cognitive Biases Are Playing a Role."

When you saw the title and read it for the first time, the interpretation that came to mind first was determined, in part, by cognitive biases. Cognitive biases are powerful. They determine, in part, how we make decisions, how we interact with one another, and how well those decisions and interactions serve us in the workplace — or anywhere else for that matter.

But the focus of this post is cognitive biases in the workplace.

Let's begin by clearing away some baggage related to the term cognitive bias. The word cognitive isn't (or until recently has not been) a common element of workplace vocabulary. In the context of this discussion, it just means, "of or related to thinking." The real problem with the term cognitive bias is the word bias, which has some very negative connotations. In lay language, bias relates to prejudice and unfairness. That isn't the sense we need for this context. For this context, the bias in question is a systematic skew of our thinking away from evidence-based reasoning.

And that's where the problems arise. At work, we tend to think of ourselves as making decisions and analyzing problems using only the tools of evidence-based reasoning. Although that is what we believe, science tells us another story. When we think and when we make decisions, we use a number of patterns of thinking that transcend — and sometimes exclude — evidence-based reasoning.

Some view these alternate patterns of thinking as "less than" or "subordinate to" or "of lesser value than" evidence-based reasoning. In this view, decisions or analysis performed on the basis of anything other than evidence-based reasoning are questionable and not to be relied upon. In this view, unless we can offer an evidence-based chain of reasoning to justify a decision or analysis, that decision or analysis is near worthless.

I disagree. But first let me offer support for the critics of alternate patterns of thinking.

An example of a cognitive bias in real life

A so-called "Paris Gun" of World War I. Photo courtesy Wikipedia.

There is abundant experimental evidence that these alternate patterns of thinking do often lead to inferior results. One such alternate pattern is the Availability Heuristic. [Tversky 1973] Rather than illustrate the Availability Heuristic with a description of a sterile psychology experiment, let me offer a plausible speculation about the role of the Availability Heuristic in a historical situation.

Pictured here is a so-called "Paris Gun" of World War I. It was enormous. Its barrel, 110 feet (almost 34 meters) long, needed an external truss to keep it from drooping. Only a handful of these weapons were constructed. In late March of 1918, just seven months before the armistice that ended the fighting, an explosion occurred in northeast Paris. It was mysterious. No German aircraft had been seen operating in the vicinity, and German ground forces at that time were more than 60 miles (almost 100 kilometers) away, placing Paris out of artillery range — or so it was believed at first. The mystery was resolved when two more explosions occurred nearby, 15 and 30 minutes later. The source had to be artillery.

At the time, no German artillery piece known to the Allies had sufficient range. But Germany had developed a weapon that became known as the Paris Gun, with a range of 80 miles (more than 125 kilometers). These guns would eventually fire 367 rounds, of which 183 struck within Paris city limits.

But the Paris Guns had little impact on the war. Because their accuracy wasn't sufficient to strike any particular urban target, such as a palace or government building, the guns were being used essentially as terror weapons. According to Major General David T. Zabecki, U.S. Army (Ret.), the Paris Guns could have had significant effect if they had targeted elements of the supply lines of the British Expeditionary Force, such as port facilities or rail facilities. But German strategists chose instead to target population centers. Zabecki regards this choice as a strategic error, which Germany repeated in World War II during the Blitz, targeting London and other population centers. [Zabecki 2015]

The decision by Germany in World War I to target cities instead of logistics assets could have been influenced by the cognitive bias known as the Availability Heuristic. Because it's much easier to imagine the destruction of parts of a city than it is to imagine the widely dispersed and unspectacular consequences of disabling a port or railhead, targeting Paris instead of Dover or Calais might have seemed more advantageous to the German cause than it actually was. The Availability Heuristic may have led German strategists astray.

How we benefit from cognitive biases

If cognitive biases lead us to such disadvantageous conclusions, why then do we have cognitive biases? What good are they?

Cognitive biases are not something we have. What we have are patterns of thinking that result in cognitive biases. We have ways of making decisions and analyzing situations that are far more economical and much faster than evidence-based reasoning. And much of the time, the results we achieve with these alternate patterns of thinking are close to what we could have achieved with evidence-based reasoning. In many situations those results are close enough.

The patterns of thinking that exhibit cognitive biases aren't defects. They aren't shortcomings in the design of humans that need to be rooted out and destroyed. On the contrary, they're actually wonderful tools — alternatives to evidence-based reasoning — that get us "pretty fair" results quickly and cheaply much of the time. The defect, if there is one, is our habit of relying on one or more of these alternate patterns of thinking when their results aren't close enough to what we could achieve if we had the time and resources to apply evidence-based reasoning. Or the defect is our habit of relying on them when their results aren't close enough often enough.

But there are hundreds of cognitive biases. Wikipedia lists 196 of them as of this writing. Now it's unlikely that every single one of these identified cognitive biases arises from a single unique alternate pattern of thinking that produces that bias and only that bias. My own guess is that there are many fewer of these alternate patterns of thinking — often called heuristics — and that they exhibit different cognitive biases in different situations.

Even so, we can't possibly manage the risks associated with these alternate patterns of thinking by considering all of the known cognitive biases all the time. We need a way of focusing our risk management efforts to address only the cognitive biases that are most likely to affect particular kinds of decisions or analyses, depending on what we're doing at the moment. In short, we need a heuristic to help us manage the risks of using heuristics. And we'll make a start on that project next time.

Next issue: Motivated Reasoning

Footnotes

[Tversky 1973]
Amos Tversky and Daniel Kahneman. "Availability: a heuristic for judging frequency and probability," Cognitive Psychology 5 (1973), 207-232.
[Zabecki 2015]
David T. Zabecki. "Paris Under the Gun," Military History, May 2015.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Cognitive Biases at Work:

The Focusing Illusion in Organizations
The judgments we make at work, like the judgments we make elsewhere in life, are subject to human fallibility in the form of cognitive biases. One of these is the Focusing Illusion. Here are some examples to watch for.
Wishful Significance: I
When things don't work out, and we investigate why, we sometimes attribute our misfortune to "wishful thinking." In this part of our exploration of wishful thinking we examine how we arrive at mistaken assessments of the significance of what we see, hear, or learn.
On Standing Aside
Occasionally we're asked to participate in deliberations about issues relating to our work responsibilities. Usually we respond in good faith. And sometimes we — or those around us — can't be certain that we're responding in good faith. In those situations, we must stand aside.
Motivated Reasoning and the Pseudocertainty Effect
When we have a preconceived notion of what conclusion a decision process should produce, we sometimes engage in "motivated reasoning" to ensure that we get the result we want. That's risky enough as it is. But when we do this in relation to a chain of decisions in the context of uncertainty, trouble looms.
Risk Acceptance: One Path
When a project team decides to accept a risk, and when their project eventually experiences that risk, a natural question arises: What were they thinking? Cognitive biases, other psychological phenomena, and organizational dysfunction all can play roles.

See also Cognitive Biases at Work and Critical Thinking at Work for more related articles.

Forthcoming issues of Point Lookout

Coming April 24: Antipatterns for Time-Constrained Communication: 1
Knowing how to recognize just a few patterns that can lead to miscommunication can be helpful in reducing the incidence of problems. Here is Part 1 of a collection of communication antipatterns that arise in technical communication under time pressure. Available here and by RSS on April 24.
And on May 1: Antipatterns for Time-Constrained Communication: 2
Recognizing just a few patterns that can lead to miscommunication can reduce the incidence of problems. Here is Part 2 of a collection of antipatterns that arise in technical communication under time pressure, emphasizing those that depend on content. Available here and by RSS on May 1.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks:

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Follow Rick

Send email or subscribe to one of my newsletters. Follow me at LinkedIn. Follow me at X, or share a post. Subscribe to RSS feeds.
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.