When modern societies and their organizations address complex problems, they aspire to do so rationally. We rely on experts and specialists for advice, wisdom, and approval of the approaches we adopt. We do this in every field: software user interface design, urban planning, the safety and effectiveness of drugs and medical procedures, and more. Or at least, we believe we do. But belief and reality differ from each other, sometimes dramatically, in part because of a phenomenon that has come to be known as myside bias.
Myside bias in brief
It's fair to say that researchers are still working out a consensus view of what myside bias is. Some regard it as interchangeable with confirmation bias [Nickerson 1998]; others hold that it's a type of confirmation bias; still others regard myside bias and confirmation bias as distinct.
As Wolfe puts it, "Although some authors use the terms interchangeably, confirmation bias typically refers to a biased search for or weighing of evidence, whereas myside bias refers to biases in generating reasons or arguments." [Wolfe 2011] That is, confirmation bias appears in the gathering and weighing of evidence, while myside bias appears in the way we use evidence in reasoned arguments. It is this view that I favor. Myside bias is our tendency to overlook or even dismiss flaws in our own rational arguments that we easily notice in the arguments of others.
The effects of myside bias
Along the spectrum of the effects of myside bias, at the less-damaging end, we find the skepticism with which we treat the rational arguments of domain experts. At the more damaging end of that spectrum is our tendency to reject experts' recommendations outright. We might even relieve them of their positions and authority, or strip them of their credentials and certifications, or treat them with such disdain or disrespect that they voluntarily withdraw from the debate, or even resign their positions.
Myside bias is probably more likely to occur when the rational argument in question leads to conclusions different from our preconceptions. But "more likely" is the operative phrase. Myside bias can occur whether or not the conclusions of the argument in question align with our preferred positions, because it serves as a debate-rigging device. It helps us win arguments, whether we're on the "right" side or not.
It's probably for this reason that "red teams" are so effective in cyberdefense, business plan evaluation, military planning, and intelligence analysis. Red teaming might work, in part, by providing a check on myside bias. The scientific method, too, could provide a similar check through its requirement that independent researchers replicate results before the community grants acceptance. [Kolbert 2017]
In my own view, the term myside bias connotes binary polarity. That is, it suggests that there are only two sides: mine and not-mine. In actual interactions, though, we often find multiple sides, factions, alliances, schools of thought, or political parties. I wish we had settled on the term my-school bias, but oh well. Myside bias applies in any case: we tend to be better at noticing the shortcomings of the arguments others use than we are at noticing the shortcomings in the arguments we use for our own positions.
Is every other day a tense, anxious, angry misery as you watch people around you, who couldn't even think their way through a game of Jacks, win at workplace politics and steal the credit and glory for just about everyone's best work including yours? Read 303 Secrets of Workplace Politics, filled with tips and techniques for succeeding in workplace politics. More info
Your comments are welcome. Would you like to see your comments posted here? Send me your comments by email, or by Web form.
About Point Lookout
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
More articles on Cognitive Biases at Work:
- Confirmation Bias: Workplace Consequences Part I
- We continue our exploration of confirmation bias, paying special attention to the consequences it causes
in the workplace. In this part, we explore its effects on our thinking.
- Perfectionism and Avoidance
- Avoiding tasks we regard as unpleasant, boring, or intimidating is a pattern known as procrastination.
Perfectionism is another pattern. The interplay between the two makes intervention a bit tricky.
- Seven Planning Pitfalls: II
- Plans are well known for working out differently from what we intended. Sometimes, the unintended outcome
is due to external factors over which the planning team has little control. Two examples are priming
effects and widely held but inapplicable beliefs.
- Downscoping Under Pressure: II
- We sometimes "downscope" projects to bring them back on budget and schedule when they're headed
for overruns. Downscoping doesn't always work. Cognitive biases like the sunk cost effect and confirmation
bias can distort decisions about how to downscope.
- Clouted Thinking
- When we say that people have "clout" we mean that they have more organizational power or social
influence than most others do. But when people with clout try to use it in realms beyond those in which
they've earned it, trouble looms.
Forthcoming issues of Point Lookout
- Coming December 13: Contrary Indicators of Psychological Safety: I
- To take the risks that learning and practicing new ways require, we all need a sense that trial-and-error approaches are safe. Organizations seeking to improve processes would do well to begin by assessing their level of psychological safety. Available here and by RSS on December 13.
- And on December 20: Contrary Indicators of Psychological Safety: II
- When we begin using new tools or processes, we make mistakes. Practice is the cure, but practice can be scary if the grace period for early mistakes is too short. For teams adopting new methods, psychological safety is a fundamental component of success. Available here and by RSS on December 20.
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor, or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info