
XP-80 prototype Lulu-Belle on the ground, in a photo probably taken in 1944. Although this aircraft wasn't the first US jet-powered aircraft, its operational version, the P-80 (later designated the F-80), was the first operational jet aircraft to have its engine integrated into the fuselage. It arrived too late for combat in World War II, but it did see service in Korea.
The XP-80 was developed as the first project of the Lockheed Skunk Works, a facility Lockheed established to enable development of advanced aircraft. The concept of a skunk works, now widely used in business, is that by giving innovators autonomy and protection from bureaucracy, we can accelerate development and facilitate innovation. A skunk works is the organizational analog of what idea generators need when groups solve problems. Photo courtesy United States Air Force.
An embedded technology group is a group that operates within a larger organization (the Host), and which provides critical supportive technology services to the Host. Critical supportive technology is technology that's necessary for accomplishing the Host's mission, but which isn't central to that mission. An example of an embedded technology group is the cybersecurity function within a consumer cosmetics manufacturing company. Cybersecurity is critically necessary, but not central to cosmetics manufacturing. Another example: the IT department within a regional supermarket chain.
Such embedded technology groups face special challenges as they meet the needs of the Host. For example, the IT department of the supermarket chain might encounter difficulty explaining to senior management why it isn't economical to continue using a perfectly operational software product that's within six months of the end of vendor support.
Most challenges like this seem mundane and predictable. But their frequency of occurrence, and the intensity of their consequences, could be due, in part, to a cognitive bias known as the Dunning-Kruger Effect [Kruger & Dunning 1999]. In this post I explain the Effect and why it intensifies the consequences of otherwise-mundane decisions.
Brief summary of the Dunning-Kruger Effect
The Dunning-Kruger Effect is a limitation with respect to achieving mastery of a knowledge domain. It derives from one fundamental principle: with respect to any particular knowledge domain, accurately assessing our own or others' relative competence requires competence in that domain.
As Kruger and Dunning put it, "…the skills that engender competence in a particular domain are often the very same skills necessary to evaluate competence in that domain — one's own or anyone else's."
Dunning and Kruger express the consequences of this principle as four predictions, paraphrased as follows:
- The less competent tend to overestimate their own competence
- The less competent tend to be less able to recognize superior competence
- The less competent tend to be less able to use information about the performances of others to assess their own competence
- The incompetent can gain insight into their own incompetence, but only by becoming more competent
By implication, we can also predict that:
- The more competent tend to underestimate their own competence
- The more competent tend to gauge accurately the incompetence of the less competent
Consequences of Dunning-Kruger for embedded technology groups
The Dunning-Kruger Effect has consequences for the relationship between an embedded technology group and its Host. A bit of notation will simplify the discussion.
Let H-Manager denote a Senior Manager of the Host organization; H-Domain denote the knowledge domain central to the Host's operations; and H-Competence denote competence in the H-Domain. Let E-Manager denote a manager of the embedded technology group; E-Domain denote the knowledge domain central to the embedded technology group's operations; and E-Competence denote competence in the E-Domain.
Now consider a situation in which an H-Manager must make decisions that affect the strategy, staffing, and/or resources of an embedded technology group. In what follows, I suggest that this arrangement is vulnerable to the Dunning-Kruger Effect. When the H-Manager is less E-Competent than any of the E-Managers, Dunning and Kruger predict that the H-Manager will tend to overestimate his/her own E-Competence. Moreover, the H-Manager will tend to be less able to recognize the superior E-Competence of E-Managers.
The consequences of these tendencies are negative and severe for the Host. If a situation arises in which the judgment of the H-Manager suggests a decision contrary to the recommendations of the E-Managers, then according to the Dunning-Kruger Effect, the H-Manager will tend to choose a path more closely aligned with their own judgment than a rational assessment would warrant.
Workarounds for the Dunning-Kruger Effect
We might have been compensating for the Dunning-Kruger Effect, in ways outside our awareness, for some time. Consider, for example, the practice of establishing what we call "skunk works." The term "Skunk Works" is the official pseudonym for Lockheed Martin's Advanced Development Programs (ADP), formerly Lockheed Advanced Development Projects. But the term is now used generically to denote a unit that's insulated — physically and culturally — from the host organization's procedures and customs. One way of understanding the insulation is that it provides freedom to break out of psychological constraints on ideation and experimentation. But it might also reduce the risk that the Dunning-Kruger Effect will enable bureaucratic meddling in the operations of the protected unit. In that sense, a skunk works is a workaround for the Dunning-Kruger Effect.
Last words
When described as "tendencies," the consequences of the Dunning-Kruger Effect sound mild — even tolerable. But they are not. In organizations, the Dunning-Kruger Effect causes people in responsible roles with wide spans of control to make erroneous decisions with organizational impact broad enough to threaten the organization's existence. Even more damaging is the inability of H-Managers to recognize that undesirable outcomes of past decisions could be the result of the Dunning-Kruger Effect.
Are you fed up with tense, explosive meetings? Are you or a colleague the target of a bully? Destructive conflict can ruin organizations. But if we believe that all conflict is destructive, and that we can somehow eliminate conflict, or that conflict is an enemy of productivity, then we're in conflict with Conflict itself. Read 101 Tips for Managing Conflict to learn how to make peace with conflict and make it an organizational asset. Order Now!
More about the Dunning-Kruger Effect
How to Reject Expert Opinion: II [January 4, 2012]
- When groups of decision makers confront complex problems, and they receive opinions from recognized experts, those opinions sometimes conflict with the group's own preferences. What tactics do groups use to reject the opinions of people with relevant expertise?
Devious Political Tactics: More from the Field Manual [August 29, 2012]
- Careful observation of workplace politics reveals an assortment of devious tactics that the ruthless use to gain advantage. Here are some of their techniques, with suggestions for effective responses.
Overconfidence at Work [April 15, 2015]
- Confidence in our judgments and ourselves is essential to success. Confidence misplaced — overconfidence — leads to trouble and failure. Understanding the causes and consequences of overconfidence can be most useful.
Wishful Thinking and Perception: II [November 4, 2015]
- Continuing our exploration of causes of wishful thinking and what we can do about it, here's Part II of a little catalog of ways our preferences and wishes affect our perceptions.
Wishful Significance: II [December 23, 2015]
- When we're beset by seemingly unresolvable problems, we sometimes conclude that "wishful thinking" was the cause. Wishful thinking can result from errors in assessing the significance of our observations. Here's a second group of causes of erroneous assessment of significance.
Cognitive Biases and Influence: I [July 6, 2016]
- The techniques of influence include inadvertent — and not-so-inadvertent — uses of cognitive biases. They are one way we lead each other to accept or decide things that rationality cannot support.
The Paradox of Carefully Chosen Words [November 16, 2016]
- When we take special care in choosing our words, so as to avoid creating misimpressions, something strange often happens: we create a misimpression of ignorance or deceitfulness. Why does this happen?
Risk Acceptance: One Path [March 3, 2021]
- When a project team decides to accept a risk, and when their project eventually experiences that risk, a natural question arises: What were they thinking? Cognitive biases, other psychological phenomena, and organizational dysfunction all can play roles.
Cassandra at Work [April 13, 2022]
- When a team makes a wrong choice, and only a tiny minority advocated for what turned out to have been the right choice, trouble can arise when the error at last becomes evident. Maintaining team cohesion can be a difficult challenge for team leaders.
Embedded Technology Groups and the Dunning-Kruger Effect [March 12, 2025]
- Groups of technical specialists in fields that differ markedly from the main business of the enterprise that hosts them must sometimes deal with wrong-headed decisions made by people who think they know more about the technology than they actually do.
Footnotes
[Kruger & Dunning 1999] Justin Kruger and David Dunning. "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments," Journal of Personality and Social Psychology 77:6 (1999), 1121-1134.
Your comments are welcome
Would you like to see your comments posted here? Send me your comments by email, or by Web form.
About Point Lookout
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
Related articles
More articles on Cognitive Biases at Work:
Effects of Shared Information Bias: II
- Shared information bias is widely recognized as a cause of bad decisions. But over time, it can also erode a group's ability to assess reality accurately. That can lead to a widening gap between reality and the group's perceptions of reality.
Neglect of Probability
- Neglect of Probability is a cognitive bias that leads to poor decisions. The risk of poor decisions is elevated when we must select an option from a set in which some have outstandingly preferable possible outcomes with low probabilities of occurring.
Seven Planning Pitfalls: II
- Plans are well known for working out differently from what we intended. Sometimes, the unintended outcome is due to external factors over which the planning team has little control. Two examples are priming effects and widely held but inapplicable beliefs.
Choice-Supportive Bias
- Choice-supportive bias is a cognitive bias that causes us to assess our past choices as more fitting than they actually were. The erroneous judgments it produces can be especially costly to organizations interested in improving decision processes.
Evaluability Bias
- Evaluability Bias is a cognitive bias. Like many other cognitive biases, it affects our ability to choose rationally. At work, biased choice can cause us to commit to courses of action that interfere with our achieving goals we claim to be pursuing.
See also Cognitive Biases at Work for more related articles.
Forthcoming issues of Point Lookout
Coming April 2: Mitigating the Trauma of Being Laid Off
- Trauma is an emotional response to horrible events — accidents, crimes, disasters, physical abuse, emotional abuse, gross injustices — and layoffs. Layoff trauma is real. Employers know how to execute layoffs with compassion, but some act out of cruelty. Know how to defend yourself. Available here and by RSS on April 2.
And on April 9: Defining Workplace Bullying
- When we set out to control the incidence of workplace bullying, problem number one is defining bullying behavior. We know much more about bullying in children than we do about adult bullying, and more about adult bullying than we know about workplace bullying. Available here and by RSS on April 9.
Coaching services
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info
Follow Rick
Recommend this issue to a friend
Send an email message to a friend
Send a message to Rick
A Tip A Day feed
Point Lookout weekly feed
