Researchers have identified a cognitive bias known as shared information bias, which is the tendency of groups to discuss topics and share information that most members already know [Forsyth 2010]. Groups seem to prefer such discussions to discussions of topics that only a few members know. The consequences of this behavior include poor decision quality.
But how exactly does shared information bias compromise decision quality? And what other consequences might there be? In Part I of this exploration, I sketched how this bias can lead to a false sense of security, increased likelihood of groupthink, biased assessments of the relative importance of ideas, and increased persistence of wrong beliefs. Here are five more ways shared information bias can harm group processes.
- Increased difficulty in adopting valid ideas
- Shared information acquires credibility correlated with the frequency of its open mentions, independent of its validity. True or correct information that remains privately held has less opportunity to acquire credibility. When it finally comes into the open, if it ever does, its relative credibility can be significantly less than that of the information that has previously been disclosed. In this way, truth and fact can lose out to fiction and rumor.
- Establishment of alliances of the confused
- In groups that repeat the pattern of shared information bias, the comfort of the pattern can strengthen relationships between members of the group who are caught in the consequences of the bias. Repeatedly unchallenged in their beliefs and preconceptions, and confirmed every time they hear another group member offer information they themselves already know, participants in these strengthened relationships can reinforce the pattern. When this happens, the group can encounter difficulty dealing with members who raise issues or offer perspectives not shared by most of the group's members.
- Shunning and expulsion of critical thinkers
- Critical thinking is the application of evidence and reason to reach an objective judgment. Data on the incidence of critical thinkers isn't generally available, but my own impression, based on my experience, is that critical thinking is less widely practiced in the workplace than one might have hoped. In groups intent on reaching conclusions to their deliberations, shared information bias might lead members to believe that a valid conclusion is nearer than it actually is. In such situations, critical thinkers are likely to raise issues and pose questions that could make evident the true status of the group's deliberations. Because repeatedly "disrupting" deliberations in this way can cause discomfort to those seeking closure, critical thinkers risk shunning and expulsion. The irony is that groups that expel critical thinkers could be expelling the very people they most need.
- If the critical thinker who is shunned in this way is also a member of a disfavored demographic group, while the other group members are members of favored demographic groups — or worse, while those other group members are all members of the same favored demographic group — the harm to group cohesion can become complicated and severe.
- Elevation of uncritical thinkers
- Just as critical thinkers risk expulsion, group members can enjoy elevated status if they offer contributions that make other group members feel more comfortable, and which seem to support the group as it reaches for closure. Although critical thinkers can certainly choose to offer such contributions, most such contributions probably result from thinking uncritically.
- One word of warning: a devious person, capable of critical thinking, and intent on sabotaging the group's deliberations, can exploit shared information bias either to cause the group to come to closure prematurely, or to elevate his or her own standing within the group by confirming its members' preconceptions.
- Mistakenly favorable self-assessments
- Group members who make contributions consisting of information in the possession of most other group members tend to experience validation by their colleagues. And when group members hear their colleagues make contributions that are consistent with their own beliefs and preconceptions, they also tend to feel validated. If they later make private, internal self-assessments, those assessments will likely be more favorable than is objectively justified, because the self-assessor's beliefs and preconceptions haven't actually withstood serious challenge.
- Group members repeatedly exposed to shared information bias thus tend to feel that they and their fellow members are more creative and insightful than they actually are. This mistaken assessment can make the group vulnerable when it confronts situations that are inconsistent with its beliefs and preconceptions. Overconfidence and unfamiliarity with reality can combine to lead the group to choose unsuitable leaders, or to rely upon the judgment of those of its members who don't actually merit such reliance.
Shared information bias does indeed lead to poor decisions. It does so not only by limiting what the group considers when it makes decisions, but also by distorting the group's thinking, by creating intra-group alliances based on shared confusion, and by affecting the group's choices of leaders. You can watch this happening in groups you belong to. If you see it happening, and if you're willing to risk being shunned, raise the issue and refer people to these two articles.
Many people who possess real organizational power have a characteristic demeanor. It's the way they project their presence. I call this the power affect. Some people — call them power pretenders — adopt the power affect well before they attain significant organizational power. Unfortunately for their colleagues, and for their organizations, power pretenders can attain organizational power out of proportion to their merit or abilities. Understanding the power affect is therefore important for anyone who aims to attain power, or anyone who works with power pretenders. Read more about this program.