When we try to understand why others do what they do, we engage in a kind of thought process called attribution. Briefly, attribution is the process of identifying a cause (or, more rarely, causes) for the behavior of another. For example, when several people laugh together, we typically attribute that behavior to a cause such as their having heard something funny, as when someone has just told them a joke. Other causes are possible, of course. For example, they might be engaged in a conspiracy to make another person feel excluded.
Attribution is a thought process that is both frequently used and essential for social interaction. Mistakes are common, too. People who work together solving difficult problems are especially vulnerable to one particular kind of attribution error that I like to call the Stupidity Attribution Error. It's a special case of the Fundamental Attribution Error. (See "The Fundamental Attribution Error," Point Lookout for May 5, 2004, for more.)
Here's how it works. In an extended debate about potential solutions to a difficult problem, one of the participants — I'll call her Jordan — proposes an elegant candidate solution. It happens that understanding why Jordan's proposed solution is worthy of detailed consideration requires some background that many of the other participants lack. After struggling for almost an hour to understand Jordan's proposal, the group decides to set it aside and consider alternatives that are less controversial. By that point in the discussion, Jordan is frustrated and angry. She thinks to herself, "These people are idiots. They rejected my idea because they're too stupid to understand it." Her conclusion is incorrect because she doesn't realize that many of the other participants lack the background needed to understand her idea.
In knowledge work, to commit the Stupidity Attribution Error is to choose incorrectly to attribute to stupidity the decision of others to adopt or fail to adopt a proposed solution to a problem, or to adopt or fail to adopt a concept as part of their ongoing deliberations. In choosing to attribute their decision to stupidity in preference to any number of alternative possible attributions, we expose ourselves to risk of error when we fail to consider those alternatives.
Before listing some of those alternatives, we must distinguish them from "stupidity." In everyday parlance, stupidity is slow-wittedness. It is a lack of intelligence that limits the ability to think clearly and to reason to conclusions on the basis of evidence. With that definition of stupidity in mind, let's consider some alternative reasons why someone might make what we consider a wrong-headed decision. In what follows, I'll use the name Albert to refer to the person whom Jordan regards as "stupid."
- Ignorance is the state of being unaware. It's possible that Albert fails to find Jordan's arguments persuasive because he lacks the knowledge required to understand them. Or possibly Albert cannot weigh Jordan's points appropriately because he lacks the knowledge needed to do so.
  If we could provide Albert with the missing knowledge, he might be fully capable of grasping Jordan's ideas. But we don't always know what knowledge is missing. And even if we do know, Albert might not be willing to accept it, especially if it's forced upon him in a disrespectful or condescending manner.
- In terms of its effects, being misinformed is similar to being ignorant. Misinformation can lead Albert to evaluate Jordan's arguments incorrectly. But in some ways, misinformation can be more destructive of group interaction, because providing the correct information isn't always sufficient to motivate the misinformed to alter their decisions or judgments.
  As with ignorance, we don't always know what misinformation needs correcting. And providing correct information might not be sufficient; invalidating the misinformation might also be necessary. In the moment, that can be difficult.
- Some people are unable to deal with certain issues rationally. For example, race bias, gender bias, or other social biases can distort assessments of performance or capability. (See "The Ultimate Attribution Error at Work," Point Lookout for February 21, 2018, for more.) Some feel compelled to bully or harass subordinates. Worse, perhaps, are those who feel compelled to abuse organizational authority to bully or harass others, or to gain personal advantage, economic or otherwise.
  Using reasoned argument to adjust the attitudes or behavior of such people is rarely effective. But it's plainly incorrect to conclude that they are "stupid" solely because their compulsions make them immune to rational argument.
- In some debates, the person who actually decides the issue in question isn't a participant. For example, Albert's supervisor, Super-Albert, might have instructed him not to agree to any of Jordan's proposals. Super-Albert might even have told Albert to "be creative" about the reasons for the positions he takes, so as to conceal the fact that Albert has been directed to reject Jordan's work. "Directed" might actually be too mild a term if Albert understands Super-Albert to be threatening him with termination or disciplinary action should he deviate from Super-Albert's plan.
  Even when coercion is at the root of what seems to be irrational behavior, we don't always have evidence that coercion is a factor. And frequently, coerced individuals are reluctant to disclose the coercion. Some feel shame about being coerced. Others fear, reasonably, that disclosing the coercion could have even more severe consequences than violating the original directive would have.
There are many more possible alternatives to stupidity as explanations for someone's apparent inability to grasp the truth in an argument. One alternative worthy of special mention is a lack of critical thinking skills. (See "Critical Thinking and Midnight Pizza," Point Lookout for April 23, 2003.) Critical thinking is the process of drawing sound inferences from evidence and principles. To think critically requires discipline. More important, it requires strict avoidance of the traps and tricks so common in casual debate, such as rhetorical fallacies, deception, self-deception, entrapment, and withholding information. Someone who lacks critical thinking skills, or who chooses not to apply them, might appear to be slow-witted. But to repeatedly attribute to stupidity a failure to apply critical thinking skills might be the best example of the Stupidity Attribution Error of all.