In Part I of this small survey of "red flags" that indicate future trouble for projects or collaborations, I described the effects of toxic conflict and the loss of capable leaders and team members. Although these are warning signs, they tend not to be the very earliest warning signs. Indeed, when they occur, it's fair to say that trouble has already arrived.
Fortunately, there are several other behavioral phenomena that do tend to provide earlier warning of trouble ahead. Among these behaviors are those that relate to interpersonal communication. Below are four examples.
- Fear of speaking truth to power
- When we speak (or think) of "the elephant in the room," we're referring to the idea that there's something many of us know about but dare not mention. Although many causes can create these elephants, one of the most difficult to address is the need to critique a position or belief that a powerful person holds. The common phrase that describes such actions is "speaking truth to power."
- When people with the power to address a problem are unaware of the problem, for whatever reason, the problem is likely to remain in place. With time, its negative consequences can increase in severity, and the cost of addressing it can grow. People with the power to address problems need to be made aware of the problems, even if they're reluctant to learn of them.
- One member of a team afraid to speak truth to power is a problem and a red flag. An entire team afraid to do so is much worse. And when the team lead is among the fearful, catastrophe is almost certain to arrive eventually.
- Fear of speaking truth
- Widespread fear of speaking truth to power is a red flag. But an even more significant red flag is fear of speaking truth at all. This fear is more damaging because it prevents truth from surfacing within the team even when people with power aren't participants in the conversation. It is therefore more effective at limiting the spread of unpleasant truths.
- But there is an even more dangerous possible interpretation of the observation that people are reluctant to speak truth. It's possible that team members fear that one among them is acting on behalf of someone with power, gathering intelligence about who is saying what to whom. It's possible that team members fear the consequences of expressing to each other opinions that differ from what the person with power wants them to believe.
- A team in which some members fear that other members are acting as "spies" for powerful people is a team that cannot solve problems that require acknowledging facts that differ from what the people in power want to believe. It is a team that cannot grapple with reality.
- High incidence of plausible miscommunication
- On occasion, someone with responsibility for addressing an issue — I'll refer to him as Edgar — is found to have failed to address it effectively, or to have failed to address it in a timely fashion. An appropriate response to these failures would be an investigation to determine what actually happened and what process changes might reduce the probability of recurrences. But too often, Edgar takes preemptive action. He provides a tale of plausible miscommunication that he hopes will obviate the need for investigation.
- A tale of plausible miscommunication is a narrative that explains Edgar's failure to act in a timely, effective fashion. It provides a believable story of how knowledge of the problem failed to reach Edgar. Edgar can then hardly be held responsible for the failure because he didn't know about the problem. And the tale neutralizes any desire for investigation because it suggests a root cause that's unlikely to be repeated or is beyond the control of anyone or any process inside the enterprise. Example: "The hurricane damage cut us off from the Internet." Another example: "Ella's sudden hospitalization for COVID prevented her from alerting Edgar about the system crash."
- Tales of plausible miscommunication can of course be truthful. But when they occur with any regularity, they could indicate that some people are using the technique to provide safety for themselves without harming innocent parties.
- Power-serving spin
- In politics and public relations, to spin a narrative is to present an intentionally misleading story by weaving together a series of facts and half-truths to suggest an interpretation that favors a particular position vis-à-vis some incident or situation. Within organizations, power-serving spin is spin that strengthens the position of those with political power.
- Power-serving spin carries risk for the enterprise because it can limit the chances of success for people seeking the truth of a situation. For example, those investigating the causes of miscommunication might fail to find causes if they let themselves be guided by tales that are biased by power-serving spin.
- Organizational leaders who seek accurate information about what's happening in their organizations would do well to learn how to de-spin the information that does come their way. Even better: they can adjust organizational culture in ways that encourage delivery of information free of spin.
In the final part of this series on red flags, I'll examine the abuse of political power in organizations.