
[Photo: Matt Schaub, as quarterback for the Houston Texans, about to deliver a pass.]

American professional football provides another example of neglect of probability. Suppose that as I approach secondary school graduation, I receive two offers: a full athletic scholarship to play football at an institution with a championship football team, and a full academic scholarship at an institution with a leading program in Computer Science. The athletic scholarship has the larger post-graduation upside, if I can avoid injury and play well enough to be drafted by a professional team. But according to the U.S. National Collegiate Athletic Association, the probability of selection by a pro football team is 1.6%. That's low compared to the probability of a Computer Science graduate landing a high-paying job, which is 80% or more. Nevertheless, many choose the athletic scholarship over the academic scholarship. They neglect probability. In this example, the probabilities have a significant effect on the expected value of each choice.
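The arithmetic behind that comparison is simple expected value: multiply each outcome's payoff by its probability. The sketch below uses the probabilities from the text (1.6% and 80%), but the dollar payoffs are illustrative assumptions, not figures from the article.

```python
# Expected-value comparison of the two scholarships.
# Probabilities (1.6% and 80%) come from the text above; the payoff
# figures are hypothetical assumptions for illustration only.

def expected_value(probability: float, payoff: float) -> float:
    """Expected value of an uncertain outcome: probability times payoff."""
    return probability * payoff

# Assumed lifetime earnings premiums (USD) for each "best outcome."
NFL_PAYOFF = 10_000_000   # assumed payoff if drafted by a pro team
CS_PAYOFF = 2_000_000     # assumed payoff of a high-paying CS career

ev_athletic = expected_value(0.016, NFL_PAYOFF)  # ~160,000
ev_academic = expected_value(0.80, CS_PAYOFF)    # ~1,600,000

print(f"Athletic: {ev_athletic:,.0f}  Academic: {ev_academic:,.0f}")
```

Even with a best outcome five times larger, the athletic option's expected value is an order of magnitude smaller once the probabilities are applied. That gap is exactly what the bias hides.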
When we've made a decision that has led to serious trouble, it's possible that a cognitive bias known as neglect of probability played a role. This bias can affect decision making by causing us to choose among options based solely on the values of their respective "best outcomes." Using this criterion leads to trouble when the outcomes of the options are uncertain, because it ignores the probabilities of actually achieving those outcomes.
Stated a bit more precisely, a rational decision would be based on considering both the value of an option's potential outcome and the probability of actually generating that outcome. Instead, when we're under the spell of Neglect of Probability, we tend to assess the goodness of an option by focusing solely (or excessively) on the value of its best outcome, while ignoring the probability of achieving that outcome.
An illustration might clarify this effect further.
The IT department at Dewey, Cheatham and Howe, LLP, (a fictitious global law firm), is upgrading the operating systems of DCH's fleet of personal computers from PC-OS 11 to PC-OS 12 (a fictitious operating system). The task would be relatively straightforward were it not for the enormous number of commercial and custom applications running on those computers. DCH's experts expect that the change from PC-OS 11 to PC-OS 12 will render many of those applications unusable. Among the 18,000 total applications, most are expected to operate correctly, but many will not. The full list of questionable applications is unknown.
Testing all 18,000 applications is an impractically large effort. But by working with the vendors of the commercial applications, and by collaborating with IT departments at other law firms, DCH has reduced the number of applications whose status is unknown to a mere 1,500. That number is less daunting, but testing all of them exhaustively is still impractical.
IT has therefore decided to let users of each of the questionable applications perform the testing and certification, in three phases. In Phase I, an application expert checks that the app operates in PC-OS 12. If it does, then in Phase II for that app, 10% of the app's users are authorized to use the app for up to 30 days. If they report no problems, then the app is cleared for Phase III, general use. If the Phase II users do report problems, usage of that app is suspended and IT works with the vendor or internal author to resolve the issue. When the issue is resolved, the app is returned to Phase II for another 30 days, and the procedure repeats until the app is cleared.
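The three-phase procedure is essentially a small state machine. Here's a minimal sketch of it; the class, method, and application names are illustrative assumptions, not part of DCH's actual (fictitious) system.

```python
# Minimal state-machine sketch of the three-phase certification
# workflow described above. All names here are hypothetical.

from enum import Enum

class Phase(Enum):
    PHASE_I = "expert check"
    PHASE_II = "10% pilot, up to 30 days"
    PHASE_III = "general use"
    SUSPENDED = "suspended pending fix"

class AppCertification:
    def __init__(self, name: str):
        self.name = name
        self.phase = Phase.PHASE_I

    def pass_expert_check(self) -> None:
        # Phase I: an application expert verifies the app runs on PC-OS 12.
        if self.phase is Phase.PHASE_I:
            self.phase = Phase.PHASE_II

    def report_pilot_result(self, problems_reported: bool) -> None:
        # Phase II: problems suspend the app; a clean pilot clears it.
        if self.phase is Phase.PHASE_II:
            self.phase = Phase.SUSPENDED if problems_reported else Phase.PHASE_III

    def issue_resolved(self) -> None:
        # A resolved app returns to Phase II for another 30-day pilot.
        if self.phase is Phase.SUSPENDED:
            self.phase = Phase.PHASE_II

app = AppCertification("TimeKeeper")          # hypothetical app name
app.pass_expert_check()
app.report_pilot_result(problems_reported=True)   # suspended
app.issue_resolved()                              # back to Phase II
app.report_pilot_result(problems_reported=False)  # cleared
print(app.phase)  # Phase.PHASE_III
```

The loop between Phase II and the suspended state is the "repeats until the app is cleared" step in the text.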
In this way, IT can reduce the number of apps that need more thorough testing. And when an app functions properly, the cost of determining that it does so is very low. These cost-control features are very attractive to IT decision makers.
But there's a problem with this approach. In the 30 days during which Phase II users operate untested apps, those untested apps might corrupt existing data or documents, without the knowledge of the users of the apps. Based on prior experience with PC-OS 10, this corruption is almost certain to occur in many apps. Therefore, the probability of a successful outcome for IT's intended three-phase approach is very low. But the decision makers in IT who conceived of this plan are neglecting the probability of that good outcome. They're attracted by the low cost of the best outcome, and that has caused them to ignore the fact that some applications — IT knows not which ones — will almost certainly cause data corruption. In this example, IT's approach might have a very good outcome, but the probability of that outcome is small.
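A quick calculation shows why the probability of a trouble-free rollout is so low: even a small per-app chance of corruption compounds across many apps. The 1,500-app count comes from the text; the per-app probability and the independence assumption are illustrative.

```python
# Probability that none of the 1,500 questionable apps corrupts data
# during Phase II, assuming each app independently has a small chance
# of causing corruption. The 1% figure is an assumption, not a number
# from the article.

n_apps = 1500
p_corrupt = 0.01  # assumed per-app probability of data corruption

p_all_clean = (1 - p_corrupt) ** n_apps
print(f"P(no app corrupts data) is about {p_all_clean:.2e}")
```

Under these assumptions the chance of a completely clean rollout is well under one in a million, which is the low probability the IT decision makers are neglecting.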
Neglect of Probability is most likely to play a role in decision making when the decision-making scenario involves choosing among a set of options, some of which have outcomes very much more attractive than others. In these decision-making scenarios, Neglect of Probability tends to cause the decision makers to choose options with too little regard for — or without regard for — the probabilities of success of the various options.
And even when decision makers do consider probabilities, the risk of a poor decision remains unacceptably high for some kinds of mission-critical decisions. Decision makers might in some cases estimate probabilities in a biased way that tends to favor the options with the outcomes they find most appealing. There are indeed many ways to mess things up.
