Shared information bias is the tendency of groups to spend time and energy discussing information that most group members already know. Consequently they have less time and energy to devote to information that only a few members know. [Stasser 1985] [Van Swol 2007] [Forsyth 2010] This bias in the way the group invests its resources leads to misalignment between reality and the group's perceptions, and eventually to bad decisions.
For example, in discussing possible solutions to a technical problem, the portion of the discussion devoted to information that most group members already know tends to be disproportionately large, relative to its importance, compared to the portion devoted to technical subtleties known only to the few group members with relevant expertise. In part, this happens because more people are familiar with the commonly shared information than with the less commonly shared information. But research suggests that the shared information bias is greater than mere numbers would predict.
Although bad decisions are the most commonly cited effect of shared information bias, the damage it causes extends beyond the substance of the immediate decision at hand. Considering these other effects can motivate groups to address shared information bias with the attention it deserves.
Here, in Part I of this exploration, are four ways shared information bias harms group processes.
- Members experience a false sense of comfort and well being
- Repeated experiences of discussions that fail to challenge group members' beliefs and preconceptions can enhance their sense of comfort and well being, however false it might be. This misapprehension of the group's actual state exposes it to risk of chaos if it encounters a situation to which that false sense of security has left it vulnerable.
- Enhanced likelihood of groupthink
- Groupthink is a group-psychological dynamic that causes the group to converge on an outcome not on the basis of the tenets to which the group claims it subscribes, but instead as a means of achieving group harmony and conformity. The probability of an irrational and dysfunctional outcome is thus elevated. When groupthink is in effect, the group tries to minimize conflict and reach consensus, even at the cost of abandoning critical thinking, suppressing alternative viewpoints, and preventing access to external influence. Shared information bias thus facilitates groupthink by providing a false sense of comfort and well being and a variety of contributions that are consistent with the views and preconceptions of group members. For more about groupthink, see "Design Errors and Groupthink," Point Lookout for April 16, 2014.
- Biased assessments of importance
- In groups, especially in real or virtual meetings, a commonly used heuristic for assessing the importance of an idea or insight is group members' sense of the number of times it arises in discussion. People don't actually count occurrences; a subjective sense seems to be sufficient. If the group is experiencing shared information bias, that bias skews the subjective sense of how frequently ideas are mentioned. Group members then tend to assess the importance of frequently cited ideas as greater than it actually is. And that can steer the discussion away from directions that might reveal insights and perspectives far more important than anything discussed so far.
- Increased persistence of wrong beliefs
- If someone privately accepts an incorrect opinion, a piece of misinformation, or a misapprehension, but withholds it from the discussion, it's less likely to be refuted by another group member who knows that the withheld contribution is incorrect, misinformed, or confused, but who doesn't know that any group members subscribe to it. And the longer the confusion remains in the holder's mind, the longer it's available there to discredit truthful beliefs and accurate perceptions.