Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 24, Issue 15;   April 10, 2024: Managing Dunning-Kruger Risk

Managing Dunning-Kruger Risk

A cognitive bias called the Dunning-Kruger Effect can create risk for organizational missions that require expertise beyond the range of knowledge and experience of decision-makers. They might misjudge the organization's capacity to execute the mission successfully. They might even be unaware of the risk of so misjudging.
Franz Halder, German general and the chief of staff of the Army High Command (OKH) in Nazi Germany from 1938 until September 1942. As chief of staff, Halder was compelled to deal with Adolf Hitler's penchant for involving himself in military strategy and tactics. In 1941, Hitler took over direct command of the field army. Tensions between Hitler and Halder steadily intensified until Halder was reassigned in September 1942 and replaced by a more junior and compliant officer. In reassigning Halder, Hitler exposed Germany to the full effects of Dunning-Kruger Risk, because Hitler himself had stepped outside the range of his own expertise, which, it's fair to say, was politics and oratory. Image from 1938, from Bundesarchiv, Bild 146-1970-052-08 / CC-BY-SA 3.0. Courtesy Wikipedia.

Some people who have organizational power use it wisely. Some do not. This post is about the latter. Specifically, it explores one way people with organizational power can go wrong, due to the Dunning-Kruger Effect. I begin by describing the Dunning-Kruger Effect. Next I describe how the effect generates risks for organizational missions. I then offer three recommendations for managing that risk.

The Dunning-Kruger Effect

A cognitive bias is a tendency to make systematic errors of judgment based on thought-related factors rather than evidence. For example, a bias known as self-serving bias causes us to tend to attribute our successes to our own capabilities, and our failures to situational factors beyond our control.

In 1999, Justin Kruger and David Dunning demonstrated the effects of a cognitive bias that has become known as the Dunning-Kruger Effect. They found that when we assess our own competence or abilities in a particular field, either in an absolute sense, or relative to others, we tend to commit systematic errors. [Kruger 1999] Four of their principal findings are:

  • The less competent tend to overestimate their own competence
  • The less competent don't recognize the superior competence of the more competent
  • The more competent tend to underestimate their own relative competence
  • The more competent tend to estimate accurately the incompetence of the less competent
There is some controversy about how Dunning and Kruger interpreted their experimental data. The behaviors they described do occur, but alternative explanations have drawn adherents. Still, at least for the purposes of navigating situations at work, Dunning and Kruger have provided a useful tool.

Consequences for people with power

Because of the Dunning-Kruger Effect, people with high levels of organizational power are at risk of demanding that the organization achieve goals that are not in fact achievable.

Everyone is subject to the Dunning-Kruger Effect. With respect to knowledge domains outside our areas of expertise, any of us can mistakenly regard as achievable an objective that isn't achievable. Or we can regard as achievable an objective that can be achieved only at such high cost as to be truly impractical.

An organizational leader who steps beyond his or her domain of expertise to devise and advocate an organizational mission is at risk of sending the organization on a fool's errand.

Because of the Dunning-Kruger Effect, when people in organizations receive commands from those with power, there is always the possibility that those with power have assessed themselves and their organizations as more capable than they actually are. The Dunning-Kruger Effect holds that anyone, including decision-makers with organizational power, is at risk of making these errors of judgment. Decision-makers who lack some of the expertise required to assess a mission accurately are therefore at risk of requiring others to carry out impossible missions.

Decision-makers would be wise to consult experts in all domains relevant to a given mission before charging the organization with achieving that mission. The Dunning-Kruger Effect implies that relying on organizational leaders alone for these decisions is risky.

Three recommendations

To mitigate these risks, decision-makers can rely on domain experts who can assess the organization and its leaders with respect to three criteria:

1. The mission is within the reach of the organization
Missions require financial resources. They also require people with skills, knowledge, and experience that completely cover the mission's needs. A mission is within the reach of the organization if the necessary resources and people are available or can be acquired within the necessary time frames.
2. The decision-maker is competent to make future mission-relevant decisions
During mission execution, the decision-maker must be available and competent to address any issues that arise. If issues beyond the decision-maker's range of competence arise, the decision-maker must have access to others who can identify those issues and provide or obtain the needed expertise. Taking the Dunning-Kruger Effect into account, the experts recognize that the decision-maker is not a reliable judge of compliance with this criterion.
3. A process is in place to maintain compliance with Criterion 1 and Criterion 2
Ensuring ongoing compliance with Criteria 1 and 2 requires access to consultant capacity equivalent to what was available when the decision to undertake the mission was approved. A correction process takes effect if the consultant finds that the organization has fallen out of compliance with Criterion 1 or Criterion 2.

Last words

Managing the risk of the Dunning-Kruger Effect requires people with organizational power — decision-makers — to acknowledge limits to that power. That acknowledgment will be a difficult challenge for many. But the choice is clear: either acknowledge the limits of organizational power or accept the risks of the Dunning-Kruger Effect.

Next issue: How to Answer When You Don't Know How to Answer

52 Tips for Leaders of Project-Oriented Organizations. Are your projects always (or almost always) late and over budget? Are your project teams plagued by turnover, burnout, and high defect rates? Turn your culture around. Read 52 Tips for Leaders of Project-Oriented Organizations, filled with tips and techniques for organizational leaders. Order Now!

Footnotes

[Kruger 1999]
Justin Kruger and David Dunning. "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments," Journal of Personality and Social Psychology, 77:6 (1999), 1121-1134. Retrieved 17 December 2008.

Your comments are welcome

Would you like to see your comments posted here? Send me your comments by email (rbrendPtoGuFOkTSMQOzxner@ChacEgGqaylUnkmwIkkwoCanyon.com), or by Web form.

About Point Lookout

Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.

This article in its entirety was written by a human being. No machine intelligence was involved in any way.

Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.

Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.

Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.

Related articles

More articles on Cognitive Biases at Work:

Scope Creep and the Planning Fallacy
Much is known about scope creep, but it nevertheless occurs with such alarming frequency that in some organizations, it's a certainty. Perhaps what keeps us from controlling it better is that its causes can't be addressed with management methodology. Its causes might be, in part, psychological.
Wishful Significance: I
When things don't work out, and we investigate why, we sometimes attribute our misfortune to "wishful thinking." In this part of our exploration of wishful thinking we examine how we arrive at mistaken assessments of the significance of what we see, hear, or learn.
Motivated Reasoning and the Pseudocertainty Effect
When we have a preconceived notion of what conclusion a decision process should produce, we sometimes engage in "motivated reasoning" to ensure that we get the result we want. That's risky enough as it is. But when we do this in relation to a chain of decisions in the context of uncertainty, trouble looms.
Seven More Planning Pitfalls: II
Planning teams, like all teams, are susceptible to several patterns of interaction that can lead to counter-productive results. Three of these most relevant to planners are False Consensus, Groupthink, and Shared Information Bias.
Illusory Management: I
Many believe that managers control organizational performance, but a puzzle emerges when we consider the phenomena managers clearly cannot control. Why do we believe in Management control when the phenomena Management cannot control are so many and powerful?

See also Cognitive Biases at Work and Managing Your Boss for more related articles.

Forthcoming issues of Point Lookout

Coming July 3: Additive bias…or Not: II
Additive bias is a cognitive bias that many believe contributes to bloat of commercial products. When we change products to make them more capable, additive bias might not play a role, because economic considerations sometimes favor additive approaches. Available here and by RSS on July 3.
And on July 10: On Delegating Accountability: I
As the saying goes, "You can't delegate your own accountability." Despite wide knowledge of this aphorism, people try it from time to time, especially when overcome by the temptation of a high-risk decision. What can you delegate, and how can you do it? Available here and by RSS on July 10.

Coaching services

I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details at rbrendPtoGuFOkTSMQOzxner@ChacEgGqaylUnkmwIkkwoCanyon.com or (650) 787-6475, or toll-free in the continental US at (866) 378-5470.

Get the ebook!

Past issues of Point Lookout are available in six ebooks:

Reprinting this article

Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info

Follow Rick

Send email or subscribe to one of my newsletters. Follow me at LinkedIn. Follow me at X, or share a post. Subscribe to RSS feeds.
The message of Point Lookout is unique. Help get the message out. Please donate to help keep Point Lookout available for free to everyone.
My blog, Technical Debt for Policymakers, offers resources, insights, and conversations of interest to policymakers who are concerned with managing technical debt within their organizations. Get the millstone of technical debt off the neck of your organization!
Go For It: Sometimes It's Easier If You Run. Bad boss, long commute, troubling ethical questions, hateful colleague? Learn what we can do when we love the work but not the job.
303 Tips for Virtual and Global Teams. Learn how to make your virtual global team sing.
101 Tips for Managing Change. Are you managing a change effort that faces rampant cynicism, passive non-cooperation, or maybe even outright revolt?
101 Tips for Effective Meetings. Learn how to make meetings more productive — and more rare.
Exchange your "personal trade secrets" — the tips, tricks and techniques that make you an ace — with other aces, anonymously. Visit the Library of Personal Trade Secrets.
If your teams don't yet consistently achieve state-of-the-art teamwork, check out this catalog. Help is just a few clicks/taps away!
Ebooks, booklets and tip books on project management, conflict, writing email, effective meetings and more.