When a project is underway, or even when we're still in discussion stages, we sometimes stumble upon an unanticipated risk. Some unanticipated risks can be serious — fatal or near fatal to the project if they materialize. For convenience, I call these risks Very Bad News (VBN) risks. After serious study, if we can't devise practical approaches to avoid, transfer, reduce, or compensate for the VBN risk, we cancel the project and move on. That's what happens if we're thinking clearly.
Sometimes we don't think clearly enough. We decide to accept the VBN risk. Here are three scenarios that frequently lead to risk acceptance:
- Someone who's politically very powerful insists that this project must succeed no matter what, and "you'll find a way around the problem somehow."
- The team has found some approaches that might mitigate the VBN risk somewhat. The time, budget, or capabilities required are beyond what's available, but somebody says, "If we make a solid demo, they'll find the money."
- There's division of opinion about whether the VBN risk is real. Most expert members of the project team acknowledge the risk, but one ambitious fellow, currying favor with Management, disagrees.
These situations are fertile ground for toxic conflict. When the VBN risk is fundamentally technological, a common pattern for this conflict is an oppositional debate between the "technologists" (IT or product engineers) on one side and the "business" (product managers, executives, or Sales or Marketing officials) on the other. When the conflict becomes toxic enough, it can leave lasting social scars that limit future organizational performance.
The conflict that arises in the context of debates about VBN risks isn't unique. We can understand its dynamics in terms of a psychological phenomenon known as naïve realism. Naïve realism is our human tendency to assume that our perceptions of the world are accurate and objective. [Ross 1995] Consequently, when our understanding of the world conflicts with that of others, we attribute the difference to distortions in others' perceptions, due to their ignorance, false beliefs, irrationality, or biases. As we do this, so do our partners in conflict attribute to us ignorance, false beliefs, irrationality, or biases. Naïve realism thus provides an elegantly symmetric setup for toxic conflict. It can transform a disagreement from one in which each party critiques the value of the other's approach to the issues to one in which each party attacks the other's value as a person.
But although a setup for toxic conflict is necessary, it isn't sufficient to seriously damage relationships. For truly destructive toxic conflict, the participants need to care deeply about the outcome of the conflict. VBN risks can provide the missing element. We know that conflict participants care deeply about the outcomes of debates about VBN risks, because they all regard VBN risks as "Very Bad News."
On the "business" side, After we identify a "Very Bad News" risk,
a common pattern for the ensuing
conflict is oppositional debate between
product engineers and the "business"conflict participants strongly desire project success. They want what the project promises to deliver, and they can get what they want only if the project goes forward. On the "technologist" side, conflict participants want to work on successful projects, and they want to avoid working on projects that are doomed from the start. Too often, in their experience, project failures have been unjustly and incorrectly attributed not to foolhardy decisions to accept VBN risks, but to the lack of professionalism and low work quality of project teams. In the context of the VBN risk, the technologists can get what they want only if the VBN risk is properly acknowledged and managed, the cost of which can be prohibitive. The "business" faction wants the project to go forward despite the VBN risk; the "technologist" faction wants the project to be reconfigured or cancelled because of the VBN risk.
In many organizations, the "business" prevails. Because the people who represent the "business" want the project to proceed, and because they cannot afford to manage the VBN risk, they opt for "risk acceptance." In practice, that means the technologists are directed to execute the project with the VBN risk wholly or largely unmitigated.
But exactly how does this happen? What patterns of thought and decision making enable the group to proceed despite this stark disagreement? I'll explore this part of the story next time.
Projects never go quite as planned. We expect that, but we don't expect disaster. How can we get better at spotting disaster when there's still time to prevent it? How to Spot a Troubled Project Before the Trouble Starts is filled with tips for executives, senior managers, managers of project managers, and sponsors of projects in project-oriented organizations. It helps readers learn the subtle cues that indicate that a project is at risk for wreckage in time to do something about it. It's an ebook, but it's about 15% larger than "Who Moved My Cheese?" Order Now!
Your comments are welcome
Would you like to see your comments posted here? Send me your comments by email, or by Web form.
About Point Lookout
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
Related articles
More articles on Cognitive Biases at Work:
- Effects of Shared Information Bias: I
- Shared information bias is the tendency for group discussions to emphasize what everyone already knows. It's widely believed to lead to bad decisions. But it can do much more damage than that.
- The Stupidity Attribution Error
- In workplace debates, we sometimes conclude erroneously that only stupidity can explain why our debate partners fail to grasp the elegance or importance of our arguments. There are many other possibilities.
- The Trap of Beautiful Language
- As we assess the validity of others' statements, we risk making a characteristically human error — we confuse the beauty of their language with the reliability of its meaning. We're easily thrown off by alliteration, anaphora, epistrophe, and chiasmus.
- Seven Planning Pitfalls: I
- Whether in war or in projects, plans rarely work out as, umm well, as planned. In part, this is due to our limited ability to foretell the future, or to know what we don't know. But some of the problem arises from the way we think. And if we understand this we can make better plans.
- Unrecognized Bullying: III
- Much workplace bullying goes unrecognized because of cognitive biases that can cause targets, perpetrators, bystanders, and supervisors of perpetrators not to notice bullying. The Halo Effect and the Horn Effect are two of these biases.
See also Cognitive Biases at Work for more related articles.
Forthcoming issues of Point Lookout
- Coming September 4: Beating the Layoffs: I
- If you work in an organization likely to conduct layoffs soon, keep in mind that exiting voluntarily before the layoffs can carry significant advantages. Here are some that relate to self-esteem, financial anxiety, and future employment. Available here and by RSS on September 4.
- And on September 11: Beating the Layoffs: II
- If you work in an organization likely to conduct layoffs soon, keep in mind that exiting voluntarily can carry advantages. Here are some advantages that relate to collegial relationships, future interviews, health, and severance packages. Available here and by RSS on September 11.
Coaching services
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info