Among the more recently described cognitive biases is one known as the Evaluability Bias. It is "the tendency to weight the importance of an attribute in proportion to its ease of evaluation, rather than based on criteria that are deemed as more relevant after reflection." [Caviola et al. 2014] Said differently, when assessing the value of an option, we assign importance to each of the option's attributes. Evaluability Bias leads us to assign too little importance to attributes that are relatively difficult to evaluate, compared to the importance we assign to attributes that are relatively easy to evaluate.
What Evaluability Bias is
Caviola et al. provide a careful study of this phenomenon in the domain of charitable giving. They find that when deciding which charities to support, donors tend to assign too much importance to the "overhead ratio," an easily measured attribute that corresponds to the ratio of administrative expenses to total donations. And donors assign too little importance to cost-effectiveness, which is a much more difficult-to-measure quantity that is, essentially, the value of good works done per unit value of donations.
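To make the contrast concrete, here's a minimal sketch with wholly hypothetical charities and figures (nothing here comes from the Caviola et al. study), showing how ranking by the easy-to-evaluate overhead ratio can point to a different choice than ranking by cost-effectiveness:

```python
# Illustrative sketch only: the charities and numbers below are hypothetical,
# invented to show how the easy-to-evaluate attribute can mislead.

charities = {
    "Charity A": {"admin_expenses": 5_000, "total_donations": 100_000, "lives_improved": 50},
    "Charity B": {"admin_expenses": 20_000, "total_donations": 100_000, "lives_improved": 400},
}

for name, c in charities.items():
    overhead_ratio = c["admin_expenses"] / c["total_donations"]       # easy to evaluate
    cost_effectiveness = c["lives_improved"] / c["total_donations"]   # hard to evaluate in practice
    print(f"{name}: overhead ratio {overhead_ratio:.0%}, "
          f"cost-effectiveness {cost_effectiveness:.4f} lives improved per dollar")

# Ranking by overhead ratio favors Charity A (5% vs. 20%), even though
# Charity B does eight times as much good per dollar donated.
```

In this made-up example, the attribute that's trivial to compute points to one charity, while the attribute that actually matters, and is far harder to measure, points to the other.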
We can understand other cognitive biases, described earlier in the field's history, in terms of Evaluability Bias. One example is a bias known as scope insensitivity, or scope neglect. [Kahneman 2000] Originally named extension neglect by Kahneman, scope neglect is the tendency to assign inappropriately low weight to the quantity, scale, or scope of the option in question. For example, when comparing the importance of abuse of different drugs, people tend not to take into account the scale of each drug's abuse: the differences in the numbers of abusers of each drug.
Evaluability Bias and technical debt
In the workplace, Evaluability Bias can have alarmingly deleterious effects. For example, nearly every organization depends on rational decision-making in the context of software development, either because it produces software products, or because it has business technology functions that produce software for internal use.
And because technical debt is a live issue that can afflict all software, it's important to make rational decisions about retiring existing technical debt and about preventing formation of new technical debt. Evaluability Bias is relevant because those decisions inevitably involve choosing which instances of technical debt we will retire. One important attribute of each such choice is the cost of not retiring that instance. That cost is often neglected, and even when we do consider it, it is notoriously difficult to measure.
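As a rough illustration only (the debt items, figures, and scoring below are hypothetical, not a prescribed method), consider how the priority order can flip when the hard-to-measure cost of not retiring an item is included in the decision, rather than only the easy-to-measure effort of retiring it:

```python
# Illustrative sketch only: hypothetical technical debt items and made-up figures.
# Each item has an easy-to-measure attribute (effort to retire, in dev-days) and a
# hard-to-measure one (estimated annual cost of NOT retiring it, in dev-days of drag).

debt_items = {
    "Outdated build scripts": {"effort_to_retire": 5, "cost_of_not_retiring": 8},
    "Tangled payment module": {"effort_to_retire": 40, "cost_of_not_retiring": 120},
}

# Decision driven only by the easy-to-evaluate attribute: retire the cheapest item first.
by_effort = sorted(debt_items, key=lambda k: debt_items[k]["effort_to_retire"])

# Decision that also weighs the hard-to-evaluate attribute: retire first the item whose
# ongoing cost most exceeds the one-time cost of retiring it.
by_net_benefit = sorted(
    debt_items,
    key=lambda k: debt_items[k]["cost_of_not_retiring"] - debt_items[k]["effort_to_retire"],
    reverse=True,
)

print("Priority by effort alone:", by_effort)
print("Priority including cost of not retiring:", by_net_benefit)
```

When the hard-to-measure attribute is omitted, the small, convenient item wins; when it's included, the priorities reverse. That reversal is the practical cost of letting ease of evaluation stand in for importance.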
Last words
When we consider training for ourselves or for others, we tend not to consider training in limiting the effects of cognitive biases in decision-making. Surely such training would be helpful, even if calculating its value with any useful degree of accuracy would be challenging. Evaluability Bias, ironically, might be playing a role in preventing organizations from training their people in methods for limiting the effects of Evaluability Bias.
Related articles
More articles on Cognitive Biases at Work:
- Effects of Shared Information Bias: II
- Shared information bias is widely recognized as a cause of bad decisions. But over time, it can also
erode a group's ability to assess reality accurately. That can lead to a widening gap between reality
and the group's perceptions of reality.
- Neglect of Probability
- Neglect of Probability is a cognitive bias that leads to poor decisions. The risk of poor decisions
is elevated when we must select an option from a set in which some have outstandingly preferable possible
outcomes with low probabilities of occurring.
- Motivated Reasoning
- When we prefer a certain outcome of a decision process, we risk falling into a pattern of motivated
reasoning. That can cause us to gather data and construct arguments that erroneously lead to the
outcome we prefer, often outside our awareness. And it can happen even when the outcome we prefer is
known to threaten our safety and security.
- Seven Planning Pitfalls: I
- Whether in war or in projects, plans rarely work out as, umm well, as planned. In part, this is due
to our limited ability to foretell the future, or to know what we don't know. But some of the problem
arises from the way we think. And if we understand this we can make better plans.
- Managing Dunning-Kruger Risk
- A cognitive bias called the Dunning-Kruger Effect can create risk for organizational missions that require
expertise beyond the range of knowledge and experience of decision-makers. They might misjudge the organization's
capacity to execute the mission successfully. They might even be unaware of the risk of so misjudging.
See also Cognitive Biases at Work for more related articles.
Forthcoming issues of Point Lookout
- Coming December 11: White Water Rafting as a Metaphor for Group Development
- Tuckman's model of small group development, best known as "Forming-Storming-Norming-Performing," applies better to development of some groups than to others. We can use a metaphor to explore how the model applies to Storming in task-oriented work groups. Available here and by RSS on December 11.
- And on December 18: Subgrouping and Conway's Law
- When task-oriented work groups address complex tasks, they might form subgroups to address subtasks. The structure of the subgroups and the order in which they form depend on the structure of the group's task and the sequencing of the subtasks. Available here and by RSS on December 18.