
Matt Schaub, quarterback for the Houston Texans, about to deliver a pass.

American professional football provides another example of neglect of probability. Suppose that as I approach secondary school graduation, I receive two offers: a full athletic scholarship as a football player at an institution with a championship football team, and a full academic scholarship at another institution with a leading program in Computer Science. The athletic scholarship has the larger post-graduation upside if I can avoid injury and play well enough to be drafted by a professional team. But according to the U.S. National Collegiate Athletic Association, the probability of selection by a pro football team is 1.6%. That's low compared to the probability of a Computer Science graduate landing a high-paying job, which is 80% or more. Nevertheless, many choose the athletic scholarship over the academic scholarship. They neglect probability. In this example, the probabilities have a significant effect on the expected value of each choice.
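The effect of those probabilities on expected value can be made concrete with a quick calculation. The 1.6% and 80% figures come from the text above; the dollar payoffs below are hypothetical assumptions, chosen only to illustrate the arithmetic, not drawn from any source.

```python
# Expected-value comparison of the two scholarship offers.
# Probabilities are from the article; all payoff figures are
# hypothetical assumptions used only to illustrate the arithmetic.

P_DRAFTED = 0.016     # NCAA figure: probability of being drafted
P_CS_JOB = 0.80       # probability a CS graduate lands a high-paying job

NFL_PAYOFF = 2_000_000   # assumed value of a pro contract, if drafted
CS_PAYOFF = 120_000      # assumed salary of a high-paying CS job
FALLBACK = 40_000        # assumed payoff when the best outcome doesn't occur

# Expected value = (probability of best outcome) * (its payoff)
#                + (probability of missing it)   * (fallback payoff)
ev_athletic = P_DRAFTED * NFL_PAYOFF + (1 - P_DRAFTED) * FALLBACK
ev_academic = P_CS_JOB * CS_PAYOFF + (1 - P_CS_JOB) * FALLBACK

print(f"Athletic scholarship, expected value: ${ev_athletic:,.0f}")
print(f"Academic scholarship, expected value: ${ev_academic:,.0f}")
```

Under these assumed payoffs, the academic option's expected value is higher even though its best outcome is far smaller than the athletic option's best outcome. That comparison of expected values is exactly what Neglect of Probability suppresses.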
When we've made a decision that has led to serious trouble, it's possible that a cognitive bias known as neglect of probability played a role. This bias can distort decision making by causing us to choose among options based solely on the values of those options' respective "best outcomes." Using this criterion leads to trouble when the outcomes are uncertain, because it ignores the probabilities of actually achieving those outcomes.
Stated a bit more precisely, a rational decision would be based on considering both the value of an option's potential outcome and the probability of actually generating that outcome. Instead, when we're under the spell of Neglect of Probability, we tend to assess the goodness of an option by focusing solely (or excessively) on the value of its best outcome, while ignoring the probability of achieving that outcome.
An illustration might clarify this effect further.
The IT department at Dewey, Cheatham and Howe, LLP (a fictitious global law firm) is upgrading the operating systems of DCH's fleet of personal computers from PC-OS 11 to PC-OS 12 (a fictitious operating system). The task would be relatively straightforward were it not for the enormous number of commercial and custom applications running on those computers. DCH's experts expect the change from PC-OS 11 to PC-OS 12 to render many of those applications unusable. Of the 18,000 applications in all, most are expected to operate correctly, but many will not. The full list of questionable applications is unknown.
Testing all 18,000 applications would be an impractically large effort. But by working with the vendors of the commercial applications, and by collaborating with IT departments in other law firms, DCH has reduced the number of applications whose status is unknown to 1,500. That number is less daunting, but it's still too large to test exhaustively.
IT has therefore decided to let the users of each questionable application perform the testing and certification, in three phases. In Phase I, an application expert checks that the app operates under PC-OS 12. If it does, then in Phase II for that app, 10% of the app's users are authorized to use the app for up to 30 days. If they report no problems, the app is cleared for Phase III, general use. If the Phase II users do report problems, usage of that app is suspended and IT works with the vendor or internal author to resolve the issue. When the issue is resolved, the app is returned to Phase II for another 30 days, and the procedure repeats until the app is cleared.
In this way, IT can reduce the number of apps that need more thorough testing. And when an app functions properly, the cost of determining that it does so is very low. These cost-control features are very attractive to IT decision makers.
But there's a problem with this approach. In the 30 days during which Phase II users operate untested apps, those untested apps might corrupt existing data or documents, without the knowledge of the users of the apps. Based on prior experience with PC-OS 10, this corruption is almost certain to occur in many apps. Therefore, the probability of a successful outcome for IT's intended three-phase approach is very low. But the decision makers in IT who conceived of this plan are neglecting the probability of that good outcome. They're attracted by the low cost of the best outcome, and that has caused them to ignore the fact that some applications — IT knows not which ones — will almost certainly cause data corruption. In this example, IT's approach might have a very good outcome, but the probability of that outcome is small.
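How small that probability is can be sketched with a simple calculation. The app count (1,500) comes from the example above; the per-app safety probability below is an assumption for illustration, as is the assumption that the apps fail independently.

```python
# Probability of the plan's "best outcome": none of the 1,500
# questionable apps corrupts any data during Phase II.
# The per-app safety probability is an illustrative assumption,
# and the apps are assumed to behave independently.

p_app_safe = 0.98   # assumed chance a single untested app causes no corruption
n_apps = 1_500      # questionable apps, from the example

# Best outcome requires every app to be safe simultaneously.
p_best_outcome = p_app_safe ** n_apps
print(f"P(no corruption anywhere) = {p_best_outcome:.1e}")
```

Even when each app is 98% likely to be safe, the probability that all 1,500 are safe is vanishingly small. The plan's low cost belongs to an outcome that is, in effect, never going to happen.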
Neglect of Probability is most likely to play a role in decision making when the scenario involves choosing among options, some of which have outcomes far more attractive than others. In those scenarios, Neglect of Probability tends to cause decision makers to choose options with too little regard for (or without regard for) the probabilities of success of the various options.
And even when decision makers do consider probabilities, the risk of a poor decision can remain unacceptably high for some kinds of mission-critical decisions. Decision makers might, in some cases, estimate probabilities in a biased way that favors the options with the outcomes they find most appealing. There are indeed many ways to mess things up.
Your comments are welcome
Would you like to see your comments posted here? Send me your comments by email, or by Web form.

About Point Lookout
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
Related articles
More articles on Cognitive Biases at Work:
Confirmation Bias: Workplace Consequences Part I
- We continue our exploration of confirmation bias, paying special attention to the consequences it causes
in the workplace. In this part, we explore its effects on our thinking.
On Standing Aside
- Occasionally we're asked to participate in deliberations about issues relating to our work responsibilities.
Usually we respond in good faith. And sometimes we — or those around us — can't be certain
that we're responding in good faith. In those situations, we must stand aside.
Cognitive Biases at Work
- Cognitive biases can lead us to misunderstand situations, overlook options, and make decisions we regret.
The patterns of thinking that lead to cognitive biases provide speed and economy advantages, but we
must manage the risks that come along with them.
Risk Acceptance: Naïve Realism
- When we suddenly notice a "project-killer" risk that hasn't yet materialized, we sometimes
accept the risk even though we know how seriously it threatens the effort. A psychological phenomenon
known as naïve realism plays a role in this behavior.
Downscoping Under Pressure: II
- We sometimes "downscope" projects to bring them back on budget and schedule when they're headed
for overruns. Downscoping doesn't always work. Cognitive biases like the sunk cost effect and confirmation
bias can distort decisions about how to downscope.
See also Cognitive Biases at Work and Critical Thinking at Work for more related articles.
Forthcoming issues of Point Lookout
Coming June 14: Pseudo-Collaborations
- Most workplace collaborations produce results of value. But some collaborations — pseudo-collaborations — are inherently incapable of producing value, due to performance management systems, or lack of authority, or lack of access to information. Available here and by RSS on June 14.
And on June 21: Asking Burning Questions
- When we suddenly realize that an important question needs answering, directly asking that question in a meeting might not be an effective way to focus the attention of the group. There are risks. Fortunately, there are also ways to manage those risks. Available here and by RSS on June 21.
Coaching services
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info