To be overconfident is to have a greater sense of confidence in one's talents, ability, or judgment than is justified. Overconfidence can be a general condition, or it can be relative to particular circumstances. In any case, the overconfidence effect is a well-known cognitive bias that causes us to make errors when assessing our own abilities. Some innovative experiments have indicated that a string of recent successes can increase the degree of overconfidence one exhibits. [Hilary 2011]
That finding suggests that overconfidence and the overconfidence effect present real risks to organizations, where it is common practice to allocate increasingly important responsibilities to people who have had recent successes. That's why it struck me as important to contemplate a subset of those risks — the risks of astonishing successes.
Those risks arise from interactions among several cognitive biases. Self-serving bias causes us to attribute success to our own abilities, and to attribute failures to external factors. Hindsight bias is our tendency to perceive past events as having been more predictable than they were. Confirmation bias causes us to seek out information that confirms our preconceptions, rejecting information that would tend to disconfirm those preconceptions. [Nickerson 1998]
And it's reasonable to suppose that rational decision-making is most at risk from these biases when emotions are strong and the situation is relatively chaotic. The central enabler of interaction among these cognitive biases is astonishment, which leads to the emotions and chaos to which rational thought can be so vulnerable.
In what follows I use the term actor to refer to the person (or organization) that has experienced a string of astonishing successes. The story below sketches how self-serving bias, the overconfidence effect, hindsight bias, and confirmation bias can conspire to expose the actor to elevated risk of failure following such a string of successes. We begin with self-serving bias.
Self-serving bias distorts interpretations of success and failure. It causes us to tend to interpret successes as direct results of our talents and abilities. And it causes us to tend to interpret failures and troubles as results of external factors.
Self-serving bias thus establishes a framework in which the search for explanations of astonishing success tends to focus on the actor. That focus reduces the likelihood that the actor will appreciate the importance of external factors in producing the success.
The overconfidence effect

The overconfidence effect causes us to commit errors of three kinds. First, we tend to overestimate the quality of our own performance. Second, we tend to rate our own performance as higher than it actually is relative to the performance of others. And finally, we tend to underestimate the uncertainty of our beliefs; that is, we have excessive confidence that we know the truth.
Experimental evidence for these effects is strong — the overconfidence effect has been observed in a wide range of circumstances. We don't need to place ourselves under special circumstances of stress or challenge or misinformation to induce overconfidence. Just as walking consumes calories, being human occasionally leads to overconfidence. And a string of successes increases the likelihood that we will exhibit overconfidence.
The overconfidence effect thus establishes tension between the feeling of superior capability and the astonishment arising from the success. In other words, the question arises, "If I am as capable as I believe I am, why am I so surprised at my success?" The desire to resolve that tension causes us to search for an explanation. And that search makes us vulnerable to the next cognitive bias of the story: hindsight bias.
Hindsight bias, also known as the knew-it-all-along phenomenon, is the human tendency to perceive past events as having been more predictable than they actually were. [Roese 2012] In a real sense, hindsight bias is a form of overconfidence in one's ability to predict how a situation will evolve over time. Hindsight bias might arise because we recall the chain of events that occurred far more readily than we recall the sometimes-complex and changing contexts in which that chain of events occurred.
When we experience a string of successes, we gain a sense of confidence. But occasionally the successes are so unusual that they create a sense of astonishment. That emotion can disrupt not only our view of the circumstances in which the success occurred, but also our view of ourselves. An unsettling question arises: How did this happen? Finding answers can be challenging when we recall so little about the effects of then-contemporary circumstances.
Self-serving bias causes us to tend to discount the effects of those circumstances, leading us to interpret the successes as direct results of our talents and abilities. Given that understanding of what happened, hindsight bias causes us to observe, "I knew I was good at this!" But then a problem arises.
The effects of astonishment
Surprise, the emotion we experience when something unexpected occurs, has a three-fold role in hindsight bias. [Müller 2007] First, we use surprise to estimate the degree of conflict between what we would have predicted and what actually occurred. Second, surprise triggers the processes we use to make sense of what did happen. Sense-making is a step necessary for adjusting our understanding of the world. Finally, surprise biases the sense-making process to help us pick out the factors that we missed or undervalued in the process of predicting what would have happened. When time and emotion and resources permit, these three roles collaborate to help us make predictions more accurately.
Astonishment is an extreme form of surprise. Astonishment is more likely than mere surprise to instill emotions like fear, wonder, or awe. An example of a surprise is the emotion you experience upon finding an extra bag of fries in your fast food order, or upon finding that you won $25 in a lottery. An example of astonishment is the feeling you might experience upon discovering that your home has been burglarized, or that a distant relative, recently deceased, has bequeathed you the equivalent of two years of earnings.
From astonishment to doubt
Astonishment at a success causes us to ask, "How did I not know I was this good?" Or, "If I didn't know I was this good, am I really good?" And that is a most discomfiting question. Doubt enters the scene.
The doubts raise questions about the validity of the superiority premise: the premise that the actor has superior talent, capability, or judgment.
To resolve the doubts, a search of records and recollections begins. If the search is formal, it's called a retrospective, an after-action review, a lessons-learned session or something similar. But formal or not, the search is vulnerable to another cognitive bias — confirmation bias. Confirmation bias tends to focus the search on items that confirm the superiority premise. And the search will likely be successful, because, after all, the actor's effort was a success.
Risk becomes elevated after astonishing success because the impression created by the search for explanations can produce a severely distorted assessment of the superiority of the talent, capability, or judgment of the actor. The search becomes a search for overlooked clues to superior capability instead of a search for overlooked external reasons for success. The result can be a form of overconfidence as exaggerated as the success was astonishing.
About Point Lookout
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.