When a decision we've made leads to serious trouble, a cognitive bias known as Neglect of Probability might have played a role. This bias affects decision making by causing us to choose among options based solely on the values of their respective "best outcomes." Using that criterion leads to trouble when the outcomes of the options are uncertain, because it ignores the probabilities of actually achieving those outcomes.
Stated a bit more precisely, a rational decision would be based on considering both the value of an option's potential outcome and the probability of actually generating that outcome. Instead, when we're under the spell of Neglect of Probability, we tend to assess the goodness of an option by focusing solely (or excessively) on the value of its best outcome, while ignoring the probability of achieving that outcome.
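To make the contrast concrete, here's a minimal sketch in Python. The options, their payoff values, and their probabilities are invented for illustration only; nothing here comes from a real decision or from the scenario below.

```python
# Hypothetical options: (name, value of best outcome, probability of achieving it).
# All numbers are illustrative assumptions.
options = [
    ("Quick rollout", 1_000_000, 0.05),   # spectacular payoff, rarely achieved
    ("Careful rollout", 300_000, 0.80),   # modest payoff, usually achieved
]

# Neglect of Probability: rank options by the value of the best outcome alone.
by_best_outcome = max(options, key=lambda opt: opt[1])

# A more rational ranking weighs each value by the probability of achieving it.
by_expected_value = max(options, key=lambda opt: opt[1] * opt[2])

print("Chosen by best outcome alone:", by_best_outcome[0])    # Quick rollout
print("Chosen by expected value:", by_expected_value[0])      # Careful rollout
```

With these made-up numbers, the option with the more spectacular best outcome loses to the option that's far more likely to pay off.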
An illustration might clarify this effect further.
The IT department at Dewey, Cheatham and Howe, LLP (a fictitious global law firm) is upgrading the operating systems of DCH's fleet of personal computers from PC-OS 11 to PC-OS 12 (a fictitious operating system). The task would be relatively straightforward were it not for the enormous number of commercial and custom applications running on those computers. DCH's experts expect that the change from PC-OS 11 to PC-OS 12 will render many of those applications unusable. Among the 18,000 total applications, most are expected to operate correctly, but many will not. The full list of questionable applications is unknown.
Testing all 18,000 applications is an impractically large effort. But by working with the vendors of the commercial applications, and by collaborating with IT departments in other law firms, DCH has reduced the number of applications whose status is unknown to 1,500. That number is less daunting, but it's still far too large to test exhaustively.
IT has therefore decided to let the users of each questionable application perform the testing and certification, in three phases. In Phase I, an application expert verifies that the app operates under PC-OS 12. If it does, the app enters Phase II, in which 10% of its users are authorized to use it for up to 30 days. If they report no problems, the app is cleared for Phase III, general use. If the Phase II users do report problems, usage of that app is suspended and IT works with the vendor or internal author to resolve the issue. When the issue is resolved, the app returns to Phase II for another 30 days, and the procedure repeats until the app is cleared.
In this way, IT can reduce the number of apps that need more thorough testing. And when an app functions properly, the cost of determining that it does so is very low. These cost-control features are very attractive to IT decision makers.
But there's a problem with this approach. In the 30 days during which Phase II users operate untested apps, those untested apps might corrupt existing data or documents, without the knowledge of the users of the apps. Based on prior experience with PC-OS 10, this corruption is almost certain to occur in many apps. Therefore, the probability of a successful outcome for IT's intended three-phase approach is very low. But the decision makers in IT who conceived of this plan are neglecting the probability of that good outcome. They're attracted by the low cost of the best outcome, and that has caused them to ignore the fact that some applications — IT knows not which ones — will almost certainly cause data corruption. In this example, IT's approach might have a very good outcome, but the probability of that outcome is small.
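To see why that probability is so low, consider a rough back-of-the-envelope calculation, sketched here in Python. It assumes, purely for illustration, that each untested app has a small, independent chance of corrupting data during its 30-day trial; the 0.5% figure is an assumption, not a number from DCH's experience.

```python
# Chance that Phase II passes cleanly for the whole portfolio, assuming
# (hypothetically) each untested app independently has a small chance of
# corrupting data during its 30-day trial.
untested_apps = 1_500
p_corruption_per_app = 0.005   # assumed 0.5% per app, for illustration only

p_all_clean = (1 - p_corruption_per_app) ** untested_apps
print(f"Probability of no corruption anywhere: {p_all_clean:.4%}")
# With these assumptions, roughly 0.05% -- the best outcome is very unlikely.
```

Even with a per-app risk that small, the chance that all 1,500 untested apps come through their trials cleanly is well under one percent.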
Neglect of Probability is most likely to play a role when a decision involves choosing among a set of options, some of which have outcomes very much more attractive than others. In such scenarios, Neglect of Probability tends to cause decision makers to choose options with too little regard, or no regard at all, for the probabilities of success of the various options.
And even when decision makers do consider probabilities, the risk of a poor decision remains unacceptably high for some kinds of mission-critical decisions. Decision makers might in some cases estimate probabilities in a biased way that tends to favor the options with the outcomes they find most appealing. There are indeed many ways to mess things up.