Overconfidence is the state of having too much confidence — confidence beyond levels justified by evidence. One trouble with that definition is that it provides little useful insight: how much is too much? A second trouble is that it exemplifies itself, in that it presumes that levels of confidence can be assigned with, um, confidence. Often we can do no such thing. Situations affected by overconfidence include hiring, making strategic choices, chartering projects, cancelling projects — indeed, most workplace decision-making. Despite the vagueness of the concept of overconfidence, we can draw useful conclusions if we examine the concept more closely.
That's what Don Moore and Paul Healy did in a 2008 paper — cited in over 800 other works (according to Google Scholar), a goodly number for such a short time. The authors note that conflicting results in overconfidence research can be resolved once one realizes that the term overconfidence has been used to denote three different classes of judgment errors:
- Overestimation: assessing one's actual ability, performance, level of control, or chance of success as higher than it is.
- Overplacement: the belief that one is better than others, such as when a majority of people rate themselves "better than average."
- Overprecision: excessive certainty regarding the accuracy of one's beliefs.
These tendencies are not character flaws. Rather, they arise from the state of being human — not in the sense of "to err is human," but as a direct consequence of human psychology.
What is surprising is how little we do in organizations to protect ourselves and the organization from the effects of overconfidence. Indeed, some of our behaviors and policies actually induce overconfidence. Here are three examples.
- Unrealistic assessments of the capabilities of others
- A phenomenon known as the Dunning-Kruger Effect causes us to confuse competence with confidence. That is, we assess people as more capable when they project confidence, and as less capable when they project uncertainty. This can lead to decision-making errors in hiring, and in evaluating the advice we receive from subordinates, consultants, experts, and the media.
- Unrealistic standards of precision
- When we evaluate projected performance for projects or business units, we require alignment between projections and actuals. The standards we apply when we assess performance typically exceed by far any reasonable expectations of the precision of those projections. This behavior encourages those making projections to commit the overprecision error.
- Unrealistic risk appetite
- Assessments of success in the context of risk, and our ability to mitigate risk, are subject to overestimation errors. By overestimating our chances of success, and our ability to deal with adversity, we repeatedly subject ourselves to higher levels of risk than we realize.
Cognitive biases that contribute to overconfidence in its various forms include, among others, the planning fallacy, optimism bias, illusory superiority, and, of course, the overconfidence effect. Most important, the bias blind spot causes us to be overconfident about the question of whether we ourselves are ever overconfident. We surely are. At least, I think so.
For an extensive investigation of the role of overconfidence in governmental policies that lead to war, see Dominic D. P. Johnson, Overconfidence and War: The Havoc and Glory of Positive Illusions, Cambridge, Massachusetts: Harvard University Press, 2004. Order from Amazon.com.
For more about the Dunning-Kruger Effect, see "The Paradox of Confidence," Point Lookout for January 7, 2009; "How to Reject Expert Opinion: II," Point Lookout for January 4, 2012; "Devious Political Tactics: More from the Field Manual," Point Lookout for August 29, 2012; "Wishful Thinking and Perception: II," Point Lookout for November 4, 2015; "Wishful Significance: II," Point Lookout for December 23, 2015; "Cognitive Biases and Influence: I," Point Lookout for July 6, 2016; and "The Paradox of Carefully Chosen Words," Point Lookout for November 16, 2016.