When modern societies and their organizations address complex problems, they aspire to do so rationally. We rely on experts and specialists for advice, wisdom, and approval of the approaches we adopt. We do this in every field, from software user interface design and urban planning to the safety and effectiveness of drugs and medical procedures. Or at least, we believe we do. But belief and reality can differ, sometimes dramatically, in part because of a phenomenon that has come to be known as myside bias.
Myside bias in brief
It's fair to say that researchers are still working out a consensus view of what myside bias is. Some regard it as interchangeable with confirmation bias [Nickerson 1998]; others hold that it's a type of confirmation bias; still others regard myside bias and confirmation bias as distinct.
As Wolfe puts it, "Although some authors use the terms interchangeably, confirmation bias typically refers to a biased search for or weighing of evidence, whereas myside bias refers to biases in generating reasons or arguments." [Wolfe 2011] That is, confirmation bias appears in the gathering and weighing of evidence, while myside bias appears in the way we use evidence in reasoned arguments. It is this view that I favor. Myside bias is our tendency to overlook or even dismiss flaws in our own rational arguments that we easily notice in the arguments of others.
The effects of myside bias
Along the spectrum of the effects of myside bias, at the less damaging end, we find the skepticism with which we treat the rational arguments of domain experts. At the more damaging end of that spectrum is our tendency to reject experts' recommendations outright. We might even relieve them of their positions and authority, strip them of their credentials and certifications, or treat them with such disdain or disrespect that they voluntarily withdraw from the debate, or even resign their positions.
Myside bias is probably more likely to occur when the rational argument in question leads to conclusions different from our preconceptions. But "more likely" is the operative phrase. Myside bias can occur whether or not the conclusions of the argument in question align with our preferred positions, because it serves as a debate-rigging device. It helps us win arguments, whether we're on the "right" side or not.
It's probably for this reason that "red teams" are so effective in cyberdefense, business plan evaluation, military planning, and intelligence analysis. Red teaming may provide a check on myside bias. And the scientific method, too, may check myside bias through its requirement that independent researchers replicate results before the community grants acceptance. [Kolbert 2017]
Last words
In my own view, the term myside bias connotes binary polarity. That is, it suggests that there are only two sides: mine and not-mine. In actual interactions, though, we often find multiple sides, factions, alliances, schools of thought, or political parties. I wish we had settled on the term my-school bias, but oh well. Myside bias applies in any case: we tend to be better at noticing the shortcomings of the arguments others use than we are at noticing the shortcomings in the arguments we use for our own positions.
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Related articles
More articles on Cognitive Biases at Work:
- Confirmation Bias: Workplace Consequences Part II
- We continue our exploration of confirmation bias. In this Part II, we explore its effects in management
processes.
- Perfectionism and Avoidance
- Avoiding tasks we regard as unpleasant, boring, or intimidating is a pattern known as procrastination.
Perfectionism is another pattern. The interplay between the two makes intervention a bit tricky.
- Seven Planning Pitfalls: I
- Whether in war or in projects, plans rarely work out as, umm well, as planned. In part, this is due
to our limited ability to foretell the future, or to know what we don't know. But some of the problem
arises from the way we think. And if we understand this we can make better plans.
- Seven More Planning Pitfalls: III
- Planning teams, like all teams, are vulnerable to several patterns of interaction that can lead to counter-productive
results. Two of these relevant to planners are a cognitive bias called the IKEA Effect, and a systemic
bias against realistic estimates of cost and schedule.
- Clouted Thinking
- When we say that people have "clout" we mean that they have more organizational power or social
influence than most others do. But when people with clout try to use it in realms beyond those in which
they've earned it, trouble looms.
See also Cognitive Biases at Work and Workplace Politics for more related articles.
Forthcoming issues of Point Lookout
- Coming April 3: Recapping Factioned Meetings
- A factioned meeting is one in which participants identify more closely with their factions, rather than with the meeting as a whole. Agreements reached in such meetings are at risk of instability as participants maneuver for advantage after the meeting.
- And on April 10: Managing Dunning-Kruger Risk
- A cognitive bias called the Dunning-Kruger Effect can create risk for organizational missions that require expertise beyond the range of knowledge and experience of decision-makers. They might misjudge the organization's capacity to execute the mission successfully. They might even be unaware of the risk of so misjudging.