When modern societies and their organizations address complex problems, they aspire to do so rationally. We rely on experts and specialists for advice, wisdom, and approval of the approaches we adopt. We do this in every field, including software user interface design, urban planning, the safety and effectiveness of drugs and medical procedures, and more. Or at least, we believe we do. But belief and reality differ from each other, sometimes dramatically, in part because of a phenomenon that has come to be known as myside bias.
Myside bias in brief
It's fair to say that researchers are still working out a consensus view of what myside bias is. Some regard it as interchangeable with confirmation bias [Nickerson 1998]; others hold that it's a type of confirmation bias; still others regard myside bias and confirmation bias as distinct.
As Wolfe puts it, "Although some authors use the terms interchangeably, confirmation bias typically refers to a biased search for or weighing of evidence, whereas myside bias refers to biases in generating reasons or arguments." [Wolfe 2011] That is, confirmation bias appears in the gathering and weighing of evidence, while myside bias appears in the way we use evidence in reasoned arguments. It is this view that I favor. Myside bias is our tendency to overlook or even dismiss flaws in our own rational arguments that we easily notice in the arguments of others.
The effects of myside bias
Along the spectrum of the effects of myside bias, at the less-damaging end, we find the skepticism with which we treat the rational arguments of domain experts. At the more damaging end of that spectrum is our tendency to reject experts' recommendations outright. We might even relieve them of their positions and authority, or strip them of their credentials and certifications, or treat them with such disdain or disrespect that they voluntarily withdraw from the debate, or even resign their positions.
Myside bias is probably more likely to occur when the rational argument in question leads to conclusions different from our preconceptions. But "more likely" is the operative phrase. Myside bias can occur whether or not the conclusions of the argument in question align with our preferred positions, because it serves as a debate-rigging device. It helps us win arguments, whether we're on the "right" side or not.
It's probably for this reason that "red teams" are so effective in cyberdefense, business plan evaluation, military planning, and intelligence analysis. Red teaming might be providing a check on myside bias. And the scientific method, too, could be providing a check on myside bias through its requirement that independent researchers replicate results before the community grants acceptance. [Kolbert 2017]
Last words
In my own view, the term myside bias connotes binary polarity. That is, it suggests that there are only two sides: mine and not-mine. In actual interactions, though, we often find multiple sides, factions, alliances, schools of thought, or political parties. I wish we had settled on the term my-school bias, but oh well. Myside bias applies in any case: we tend to be better at noticing the shortcomings of the arguments others use than we are at noticing the shortcomings in the arguments we use for our own positions.
Related articles
More articles on Cognitive Biases at Work:
- The Rhyme-as-Reason Effect
- When we speak or write, the phrases we use have both form and meaning. Although we usually think of
form and meaning as distinct, humans tend to assess as more meaningful and valid those phrases that
are more beautifully formed. The rhyme-as-reason effect causes us to confuse the validity of a phrase
with its aesthetics.
- Motivated Reasoning and the Pseudocertainty Effect
- When we have a preconceived notion of what conclusion a decision process should produce, we sometimes
engage in "motivated reasoning" to ensure that we get the result we want. That's risky enough
as it is. But when we do this in relation to a chain of decisions in the context of uncertainty, trouble
looms.
- Mental Accounting and Technical Debt
- In many organizations, technical debt has resisted efforts to control it. We've made important technical
advances, but full control might require applying some results of the behavioral economics community,
including a concept they call mental accounting.
- The Illusion of Explanatory Depth
- The illusion of explanatory depth is the tendency of humans to believe they understand something better
than they actually do. Discovering the illusion when you're explaining something is worse than embarrassing.
It can be career ending.
- Clouted Thinking
- When we say that people have "clout" we mean that they have more organizational power or social
influence than most others do. But when people with clout try to use it in realms beyond those in which
they've earned it, trouble looms.
See also Cognitive Biases at Work for more related articles.