In many contexts in organizations, decision makers seek briefings from subject matter experts who are often subordinates or external consultants. These experts then present elements of a "business case" for or against some position relative to the issue at hand. And the preferred format for these cases is a series of bullet points. Often, it's a series of series, called a "presentation" or "slide deck." No matter how complex the argument of the business case, bullet points are the preferred form. This fixation on a single form for all arguments is bullet point madness, because it creates a risk of making poor decisions.
In support of this assertion, consider two possible risks. First, the bullet point form might be inherently limited with respect to the kinds of arguments it can represent with clarity. And second, the authors of bullet points might exploit any of the many available tools that can distort the audience's ability to assess the validity of the arguments presented. I'll address this second risk next time. For now, let's consider the inherent limitations of the bullet point format for making complex logical arguments.
The best any presentation can do is to convey an impression that's a simplified but serviceable representation of reality. Presenters and audience alike hope that the simplified forms correspond well enough to reality to ensure the workability of any decisions based on that representation. But the requirement that we must distill all arguments into bullet points, or a series of series of bullet points, limits the set of realities that we can represent accurately enough and completely enough. That leads to trouble, because the bullet point form isn't capable of presenting faithful representations of every reality or every logical argument. And the root of the problem lies in the nature of the bullet point itself.
Experts tell us that each bullet point must be concise, crisp, and restricted to a single salient idea. When we write our bullet points, we necessarily trim them down to conform to this ideal. Then, when we actually make the presentation, we restore the connections and supporting ideas that coalesce the bullet points into a coherent whole. When we do that, we try to convey the thought process that the bullet points represent. And therein lies the risk. We have difficulty tying together the bullet points because of a cognitive bias known as the illusion of transparency.
The illusion of transparency is the human tendency to attribute to others greater awareness of our own mental or emotional state than those others actually possess. Common examples of this effect relate to our emotions or our feelings about our own performance. For example, if we feel unsure about our public speaking skills, we tend to believe that the inadequacy we feel is more evident to others than it actually is.
But the illusion of transparency is more powerful than that. It can also affect our assessment of how well others understand what we're trying to communicate to them. We tend to overestimate how closely the audience's understanding aligns with the message we're trying to convey. And so, when we explain our bullet points — an activity necessitated by our having trimmed them down to their ideal level of crisp conciseness — we tend to overestimate the firmness of the audience's grasp of our complete message.
This overestimate of the audience's understanding arises from our inability to know what audience members are thinking about what we're presenting. We cannot know everything about their background or experience. We cannot know what meaning they're making of our bullet points or our words. We cannot even know how closely they're paying attention, unless something really unusual happens.
A second area of difficulty for the bullet point format is its inherently linear structure. The bullet points in each cluster of bullet points are arranged in some order. When people read them, or listen to them as they're presented, they take them in order, as if one leads logically to the next, or as if one depends logically on its predecessor. In many actual situations, there is no ordering among the bullet points. In other situations, there is an ordering, but the ordering isn't linear. Or there might be a linear ordering for some of the bullets, but the remaining bullets might affect each other mutually, in a loop, or even a web.
Makers of presentation software have provided templates for some of these situations. These templates, some of which are illustrated above, do help when the bullets in question are "near" each other in the thread of the logical argument, and near each other in their physical placement in documents. But some arguments truly do require sprawling webs of relationships among concepts.
Consider, for example, moving an entire information management system from on-premises configurations to the cloud. Hard work is involved, for both presenter and audience. A linear series of bullet points probably wouldn't be able to fairly present the business case for such a complex decision. Most likely, a sound decision would depend on an examination of the issues involved based on something more complex than a series (or series of series) of bullet points.
The bullet point format does have its place — for simple decisions, or for smaller, self-contained sectors of the knowledge space supporting more complex decisions. But that role is limited. For complex decisions, we actually do need to think.
Next time we'll examine some of the tools advocates can use to make the bullet point format appear to provide a stronger foundation for complex decisions than it actually can provide.
About Point Lookout
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
More articles on Cognitive Biases at Work:
- The Planning Fallacy and Self-Interest
- A well-known cognitive bias, the planning fallacy, accounts for many unrealistic estimates of project cost and schedule. Overruns are common. But another cognitive bias, and organizational politics, combine with the planning fallacy to make a bad situation even worse.
- Seven More Planning Pitfalls: II
- Planning teams, like all teams, are susceptible to several patterns of interaction that can lead to counter-productive results. Three of these most relevant to planners are False Consensus, Groupthink, and Shared Information Bias.
- Choice-Supportive Bias
- Choice-supportive bias is a cognitive bias that causes us to assess our past choices as more fitting than they actually were. The erroneous judgments it produces can be especially costly to organizations interested in improving decision processes.
- Illusory Management: II
- Many believe that managers control organizational performance more precisely than they actually do. This illusion might arise, in part, from a mechanism that causes leaders and the people they lead to misattribute organizational success.
- The Illusion of Explanatory Depth
- The illusion of explanatory depth is the tendency of humans to believe they understand something better than they actually do. Discovering the illusion when you're explaining something is worse than embarrassing. It can be career-ending.
Forthcoming issues of Point Lookout
- Coming December 13: Contrary Indicators of Psychological Safety: I
- To take the risks that learning and practicing new ways require, we all need a sense that trial-and-error approaches are safe. Organizations seeking to improve processes would do well to begin by assessing their level of psychological safety. Available here and by RSS on December 13.
- And on December 20: Contrary Indicators of Psychological Safety: II
- When we begin using new tools or processes, we make mistakes. Practice is the cure, but practice can be scary if the grace period for early mistakes is too short. For teams adopting new methods, psychological safety is a fundamental component of success. Available here and by RSS on December 20.