Researchers have identified a cognitive bias known as shared information bias, which is the tendency of groups to discuss topics and share information that most members already know. [Forsyth 2010] Groups seem to prefer such discussions to discussions of topics that only a few members know. The consequences of this behavior include poor decision quality.
But exactly how does shared information bias compromise decision quality? And what other consequences might it have? In Part I of this exploration, I sketched how this bias can lead to a false sense of security, increased likelihood of groupthink, biased assessments of the relative importance of ideas, and increased persistence of wrong beliefs. Here are five more ways shared information bias can harm group processes.
- Increased difficulty in adopting valid ideas
- Shared information acquires credibility correlated with the frequency of its open mentions, independent of its validity. True or correct information that remains privately held has less opportunity to acquire credibility. When it finally comes into the open, if it ever does, its relative credibility can be significantly less than that of the information that has previously been disclosed. In this way, truth and fact can lose out to fiction and rumor.
- Establishment of alliances of the confused
- In groups that repeat the pattern of shared information bias, the comfort of the pattern can strengthen relationships between members who are caught in the consequences of the bias. Repeatedly unchallenged in their beliefs and preconceptions, and confirmed every time they hear another group member offer information they themselves already know, participants in these strengthened relationships can reinforce the pattern. When this happens, the group can have difficulty dealing with members who raise issues or offer perspectives not shared by most of the group.
- Shunning and expulsion of critical thinkers
- Critical thinking is the application of evidence and reason to reach an objective judgment. Data on the incidence of critical thinkers isn't generally available, but my own impression, based on my experience, is that critical thinking is less widely practiced in the workplace than one might hope. In groups intent on bringing their deliberations to a conclusion, shared information bias can lead members to believe that a valid conclusion is nearer than it actually is. In such situations, critical thinkers are likely to raise issues and pose questions that could make evident the true status of the group's deliberations. Because repeatedly "disrupting" deliberations in this way can cause discomfort to those seeking closure, critical thinkers risk shunning and expulsion. The irony is that groups that expel critical thinkers could be expelling the very people they most need.
- If the critical thinker who is shunned in this way is also a member of a disfavored demographic group, while the other group members are members of favored demographic groups — or worse, while those other group members are all members of the same favored demographic group — the harm to group cohesion can become complicated and severe.
- Elevation of uncritical thinkers
- Just as critical thinkers risk expulsion, group members can enjoy elevated status if they offer contributions that make other group members feel more comfortable, and that seem to support the group as it reaches for closure. Although critical thinkers can certainly choose to offer such contributions, most such contributions probably result from thinking uncritically.
- One word of warning: a devious person, capable of critical thinking, and intent on sabotaging the group's deliberations, can exploit shared information bias either to cause the group to come to closure prematurely, or to elevate his or her own standing within the group by confirming its members' preconceptions.
- Mistakenly favorable self-assessments
- Group members who make contributions consisting of information already in the possession of most other group members tend to experience validation by their colleagues. And when group members hear their colleagues make contributions consistent with their own beliefs and preconceptions, they also tend to feel validated. If they later make private, internal self-assessments, those assessments will likely be more favorable than is objectively justified, because the self-assessor's beliefs and preconceptions haven't actually withstood serious challenge.
- Group members repeatedly exposed to shared information bias thus tend to feel that they and their fellow members are more creative and insightful than they actually are. This mistaken assessment can make the group vulnerable when it confronts situations that are inconsistent with its beliefs and preconceptions. Overconfidence and unfamiliarity with reality can combine to lead the group to choose unsuitable leaders, or to rely upon the judgment of those of its members who don't actually merit such reliance.
Shared information bias does indeed lead to poor decisions. It does so not only by limiting what the group considers when it makes decisions, but also by distorting the group's thinking, by creating intra-group alliances based on shared confusion, and by affecting the group's choices of leaders. You can watch this happening in groups you belong to. If you see it happening, and if you're willing to risk being shunned, raise the issue and refer people to these two articles.