Some people who have organizational power use it wisely. Some do not. This post is about the latter. Specifically, it explores one way people with organizational power can go wrong because of the Dunning-Kruger Effect. I begin by describing the effect. Next I describe how it creates risks for organizational missions. I then offer three recommendations for managing that risk.
The Dunning-Kruger Effect
A cognitive bias is a tendency to make systematic errors of judgment based on how we think rather than on the evidence at hand. For example, the self-serving bias causes us to attribute our successes to our own capabilities, and our failures to situational factors. In 1999, Justin Kruger and David Dunning demonstrated the effects of a cognitive bias that has become known as the Dunning-Kruger Effect. They found that when we assess our own competence or abilities in a particular field, either in an absolute sense or relative to others, we tend to commit systematic errors. [Kruger 1999] Four of their principal findings are:
- The less competent tend to overestimate their own competence
- The less competent don't recognize the superior competence of the more competent
- The more competent tend to underestimate their own relative competence
- The more competent tend to estimate accurately the incompetence of the less competent
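The first and third of these findings can be illustrated with a simple miscalibration model: if self-estimates are compressed toward the average and blurred by noise, the least competent overestimate themselves and the most competent underestimate themselves. The sketch below is a toy simulation of that account in Python. It's my illustration, not Kruger and Dunning's procedure, and every parameter in it (the compression weight, the noise level) is an arbitrary assumption.

```python
import random
import statistics

# Toy model (illustrative assumptions only, not Kruger and Dunning's method):
# each person's true skill is a percentile; the self-estimate is compressed
# toward the mean (weight 0.4) and perturbed by Gaussian noise.
random.seed(42)
N = 10_000

people = []
for _ in range(N):
    actual = random.uniform(0, 100)  # true skill percentile
    estimate = 50 + 0.4 * (actual - 50) + random.gauss(0, 10)
    people.append((actual, estimate))

# Group people into quartiles by actual skill, then compare each quartile's
# mean actual percentile with its mean self-estimate.
people.sort(key=lambda p: p[0])
for i in range(4):
    quartile = people[i * N // 4 : (i + 1) * N // 4]
    mean_actual = statistics.mean(a for a, _ in quartile)
    mean_estimate = statistics.mean(e for _, e in quartile)
    print(f"Quartile {i + 1}: actual {mean_actual:5.1f}, self-estimate {mean_estimate:5.1f}")
```

In this toy model the bottom quartile's mean self-estimate comes out near 35 against an actual mean near 12.5, and the top quartile's near 65 against an actual mean near 87.5, mirroring the overestimation and underestimation pattern above.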
Consequences for people with power
Because of the Dunning-Kruger Effect, people with high levels of organizational power are at risk of demanding that the organization achieve goals that are not in fact achievable. Everyone is subject to the effect. With respect to knowledge domains outside our areas of expertise, any of us can mistakenly regard as achievable an objective that isn't, or one that can be achieved only at such high cost as to be truly impractical. An organizational leader who steps beyond his or her domain of expertise to devise and advocate an organizational mission is therefore at risk of sending the organization on a fool's errand.
When people in organizations receive commands from those with power, there is always the possibility that those with power have assessed themselves and their organizations as more capable than they actually are. Decision-makers risk requiring others to carry out impossible missions if they lack some of the expertise needed to assess those missions accurately. Before charging the organization with achieving a mission, decision-makers would be wise to consult experts in all domains relevant to it. The Dunning-Kruger Effect implies that relying on organizational leaders alone for these decisions is risky.
Three recommendations
To mitigate these risks, decision-makers can rely on domain experts who can assess the organization and its leaders with respect to three criteria:
- 1. The mission is within the reach of the organization
- Missions require financial resources. They also require people with skills, knowledge, and experience that completely cover the mission's needs. A mission is within the reach of the organization if the necessary resources and people are available or can be acquired within the necessary time frames.
- 2. The decision-maker is competent to make future mission-relevant decisions
- During mission execution, the decision-maker must be available and competent to address any issues the mission presents. If issues beyond the decision-maker's range of competence arise, the decision-maker must have access to others who can identify those issues and provide or obtain the needed expertise. Taking the Dunning-Kruger Effect into account, the experts recognize that the decision-maker is not a reliable source for assessing compliance with this criterion.
- 3. A process is in place to maintain compliance with Criterion 1 and Criterion 2
- Ensuring ongoing compliance with Criteria 1 and 2 requires continuing access to consulting capacity equivalent to what was available when the decision to undertake the mission was approved. A correction process takes effect if those consultants find that the organization no longer complies with Criterion 1 or Criterion 2.
Last words
Managing the risk of the Dunning-Kruger Effect requires people with organizational power, the decision-makers, to acknowledge limits to that power. That acknowledgment will be a difficult challenge for many. But the choice is clear: either acknowledge the limits of organizational power or accept the risks of the Dunning-Kruger Effect.
Footnotes
[Kruger 1999] Justin Kruger and David Dunning. "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments." Journal of Personality and Social Psychology 77:6 (1999), pp. 1121-1134.