Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 20, Issue 33; August 12, 2020: Cognitive Biases at Work

Cognitive Biases at Work

by

Last updated: August 14, 2020

Cognitive biases can lead us to misunderstand situations, overlook options, and make decisions we regret. The patterns of thinking that lead to cognitive biases provide speed and economy advantages, but we must manage the risks that come along with them.

There are at least two ways to interpret the title of this post. One interpretation implies that the post is about the effects of cognitive biases in the workplace, and maybe how to manage those effects. A rewording of the title, consistent with this interpretation, is "Cognitive Biases in the Workplace." A second interpretation is that the post is about noticing when cognitive biases are playing a role in a given situation, and maybe what to do about that or how to prevent it. A rewording of the title, consistent with this second interpretation, might be "When Cognitive Biases Are Playing a Role."

When you saw the title and read it for the first time, the interpretation that came to mind first was determined, in part, by cognitive biases. Cognitive biases are powerful. They determine, in part, how we make decisions, how we interact with one another, and how well those decisions and interactions serve us in the workplace — or anywhere else for that matter.

But the focus of this post is cognitive biases in the workplace.

Let's begin by clearing away some baggage related to the term cognitive bias. The word cognitive isn't (or until recently has not been) a common element of workplace vocabulary. In the context of this discussion, it just means, "of or related to thinking." The real problem with the term cognitive bias is the word bias, which has some very negative connotations. In lay language, bias relates to prejudice and unfairness. That isn't the sense we need for this context. For this context, the bias in question is a systematic skew of our thinking away from evidence-based reasoning.

And that's where the problems arise. At work, we tend to think of ourselves as making decisions and analyzing problems using only the tools of evidence-based reasoning. Although that is what we believe, science tells us another story. When we think and when we make decisions, we use a number of patterns of thinking that transcend — and sometimes exclude — evidence-based reasoning.

Some view these alternate patterns of thinking as "less than" or "subordinate to" or "of lesser value than" evidence-based reasoning. In this view, decisions or analysis performed on the basis of anything other than evidence-based reasoning are questionable and not to be relied upon. In this view, unless we can offer an evidence-based chain of reasoning to justify a decision or analysis, that decision or analysis is near worthless.

I disagree. But first let me offer support for the critics of alternate patterns of thinking.

An example of a cognitive bias in real life

A so-called "Paris Gun" of World War I. Photo courtesy Wikipedia.

There is abundant experimental evidence that these alternate patterns of thinking do often lead to inferior results. One such alternate pattern is the Availability Heuristic [Tversky 1973]. Rather than illustrate the Availability Heuristic with a description of a sterile psychology experiment, let me offer a plausible speculation about the role of the Availability Heuristic in a historical situation.

Pictured here is a so-called "Paris Gun" of World War I. It was enormous. Its barrel, 110 feet (almost 34 meters) long, needed an external truss to keep it from drooping. Only a handful of these weapons were constructed. In late March of 1918, just seven months before the armistice that ended the fighting, an explosion occurred in northeast Paris. It was mysterious. No German aircraft had been seen operating in the vicinity, and German ground forces at that time were over 60 miles (almost 100 kilometers) away, placing Paris out of range of artillery — or so it was believed at first. The mystery was resolved when two more explosions occurred nearby, 15 minutes and 30 minutes later. The source had to be artillery.

At the time, no German artillery piece known to the Allies had sufficient range. But Germany had developed a weapon that became known as the Paris Gun, with a range of 80 miles (more than 125 kilometers). These guns would eventually fire 367 rounds, of which 183 struck within Paris city limits.

But the Paris Guns had little impact on the war. Because their accuracy wasn't sufficient to strike any particular urban target, such as a palace or government building, the guns were used essentially as terror weapons. According to Major General David T. Zabecki, U.S. Army (Ret.), the Paris Guns could have had significant effect if they had targeted elements of the supply lines of the British Expeditionary Force, such as port facilities or rail facilities. But German strategists chose instead to target population centers. Zabecki regards this choice as a strategic error, which Germany repeated in World War II during the Blitz, targeting London and other population centers [Zabecki 2015].

The decision by Germany in World War I to target cities instead of logistics assets could have been influenced by the cognitive bias known as the Availability Heuristic. Because it's much easier to imagine the destruction of parts of a city than it is to imagine the widely dispersed and unspectacular consequences of disabling a port or railhead, targeting Paris instead of Dover or Calais might have seemed more advantageous to the German cause than it actually was — or ever could have been. The Availability Heuristic may have led German strategists astray.

How we benefit from cognitive biases

If cognitive biases lead us to such disadvantageous conclusions, why then do we have cognitive biases? What good are they?

Cognitive biases are not something we have. What we have are patterns of thinking that result in cognitive biases. We have ways of making decisions and analyzing situations that are far more economical and much faster than evidence-based reasoning. And much of the time, the results we achieve with these alternate patterns of thinking are close to what we could have achieved with evidence-based reasoning. In many situations those results are close enough.
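By way of analogy — and this is my illustration, not anything from the psychology literature the article cites — the tradeoff between fast heuristics and exhaustive evidence-based reasoning shows up in computing too. The sketch below compares a greedy heuristic with exhaustive search on a toy knapsack problem. The item weights, values, and capacity are arbitrary assumptions; the point is only that the cheap method usually lands close to the expensive one.

```python
# A computational analogy: a greedy heuristic for the 0/1 knapsack problem
# usually gets close to the exhaustive-search optimum at a tiny fraction of
# the cost, much as fast patterns of thinking usually get "close enough"
# to what evidence-based reasoning would produce.
import itertools
import random

def brute_force(items, capacity):
    """Evidence-based analog: examine every subset (exponential cost)."""
    best = 0
    for r in range(len(items) + 1):
        for subset in itertools.combinations(items, r):
            weight = sum(w for w, v in subset)
            value = sum(v for w, v in subset)
            if weight <= capacity:
                best = max(best, value)
    return best

def greedy(items, capacity):
    """Heuristic analog: grab the best value-per-weight items first."""
    total_value = 0
    remaining = capacity
    for w, v in sorted(items, key=lambda item: item[1] / item[0], reverse=True):
        if w <= remaining:
            remaining -= w
            total_value += v
    return total_value

if __name__ == "__main__":
    random.seed(1)
    for trial in range(5):
        items = [(random.randint(1, 10), random.randint(1, 10)) for _ in range(12)]
        exact = brute_force(items, capacity=25)
        approx = greedy(items, capacity=25)
        print(f"optimum={exact}, heuristic={approx}, ratio={approx / exact:.2f}")
```

The greedy pass can be fooled by particular item mixes, just as our heuristics can be fooled by particular situations. But across many random instances it typically lands within a few percent of the optimum while doing a vanishingly small fraction of the work — which is why "close enough, much cheaper" is often a perfectly sound bargain.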

The patterns of thinking that exhibit cognitive biases aren't defects. They aren't shortcomings in the design of humans that need to be rooted out and destroyed. On the contrary, they're actually wonderful tools — alternatives to evidence-based reasoning — that get us "pretty fair" results quickly and cheaply much of the time. The defect, if there is one, is our habit of relying on one or more of these alternate patterns of thinking when their results aren't close enough to what we could achieve if we had the time and resources to apply evidence-based reasoning. Or the defect is our habit of relying on them when their results aren't close enough often enough.

But there are hundreds of cognitive biases. Wikipedia lists 196 of them as of this writing. Now it's unlikely that every single one of these identified cognitive biases arises from a single unique alternate pattern of thinking that produces that bias and only that bias. My own guess is that there are many fewer of these alternate patterns of thinking — often called heuristics — and that they exhibit different cognitive biases in different situations.

Even so, we can't possibly manage the risks associated with these alternate patterns of thinking by considering all of the known cognitive biases all the time. We need a way of focusing our risk management efforts to address only the cognitive biases that are most likely to affect particular kinds of decisions or analyses, depending on what we're doing at the moment. In short, we need a heuristic to help us manage the risks of using heuristics. And we'll make a start on that project next time.

Next issue: Motivated Reasoning

Footnotes

[Tversky 1973]
Amos Tversky and Daniel Kahneman. "Availability: a heuristic for judging frequency and probability," Cognitive Psychology 5 (1973), 207-232.
[Zabecki 2015]
David T. Zabecki. "Paris Under the Gun," Military History, May 2015.
