Shared information bias is the tendency of groups to spend time and energy discussing information that most group members already know. Consequently, they have less time and energy to devote to information that only a few members know [Forsyth 2010]. This bias in how the group invests its resources leads to misalignment between reality and the group's perceptions, and eventually to bad decisions.
For example, in discussing possible solutions to a technical problem, the portion of the discussion devoted to information that most group members already know tends to be disproportionately large, relative to its importance, compared to the portion devoted to technical subtleties known only to the few group members with relevant expertise. In part, this happens because more people are familiar with the commonly shared information than with the less commonly shared information. But research suggests that shared information bias is greater than mere numbers would predict.
Although bad decisions are the most commonly cited effect of shared information bias, the damage it causes transcends the substance of the immediate decision at hand. Considering these other effects can motivate groups to address the bias with the attention it deserves.
Here, in Part I of this exploration, are four ways shared information bias harms group processes.
- Members experience a false sense of comfort and well-being
- Repeated experiences of discussions that fail to challenge group members' beliefs and preconceptions can enhance their sense of comfort and well-being, however false it might be. This misapprehension of the group's actual state exposes the group to serious risk if it encounters a situation to which that false sense of security has left it vulnerable.
- Enhanced likelihood of groupthink
- Groupthink is a group-psychological dynamic that causes the group to converge on an outcome not on the basis of the tenets to which the group claims to subscribe, but as a means of achieving group harmony and conformity. The probability of an irrational and dysfunctional outcome is thus elevated. When groupthink is in effect, the group tries to minimize conflict and reach consensus, even at the cost of abandoning critical thinking, suppressing alternative viewpoints, and insulating itself from external influence. Shared information bias facilitates groupthink by providing both a false sense of comfort and well-being and a stream of contributions consistent with the views and preconceptions of group members. For more about groupthink, see "Design Errors and Groupthink," Point Lookout for April 16, 2014.
- Biased assessments of importance
- In groups, especially in real or virtual meetings, a commonly used heuristic for assessing the importance of an idea or insight is the group members' sense of how often it arises in discussion. People don't actually count occurrences; a subjective sense seems to suffice. If the group is experiencing shared information bias, that bias skews the subjective sense of how frequently ideas are mentioned. Group members then tend to assess the importance of frequently cited ideas as greater than it actually is. And that can steer the discussion away from directions that might reveal insights and perspectives far more important than anything discussed so far.
- Increased persistence of wrong beliefs
- When someone withholds an incorrect opinion, a piece of misinformation, or a misapprehension that they themselves have accepted, it's less likely to be refuted by another group member who knows that the withheld contribution is incorrect, misinformed, or confused, but who doesn't know that anyone in the group subscribes to it. And the longer the confusion persists in the holder's mind, the longer it's available there to discredit truthful beliefs and accurate perceptions.