Point Lookout: a free weekly publication of Chaco Canyon Consulting
Volume 18, Issue 50; December 12, 2018: Effects of Shared Information Bias: II

Effects of Shared Information Bias: II

by Rick Brenner

Shared information bias is widely recognized as a cause of bad decisions. But over time, it can also erode a group's ability to assess reality accurately, leading to a widening gap between reality and the group's perception of it.

Thomas Paine (1737-1809), considered one of the Founding Fathers of the United States, was an English-born political theorist and activist, and the author of the two most influential pamphlets that inspired the American Revolution: Common Sense and The American Crisis. The latter is less well known by its title, but far more famous for its opening line: "These are the times that try men's souls…" In a pamphlet entitled Agrarian Justice, published in 1797, Paine proposed that inheritance and wealth taxes could be used to fund a guaranteed income for all, as well as pensions for the elderly and "lame and blind." In The Age of Reason (published in three parts in 1794, 1795, and 1807), he disparaged all organized religion as "no other than human inventions, set up to terrify and enslave mankind, and monopolize power and profit." Although many nations today have implemented elements of his views, it is perhaps positions like these that account for his eventual shunning by contemporary society.

Critical thinking — thinking based on evidence and logic — can lead to views that make groups uncomfortable, and to the shunning of people who express those views.

Photo of a painting by Auguste Millière, after an engraving by William Sharp, after George Romney, oil on canvas, circa 1876, based on a work of 1792. Courtesy Wikipedia.

Researchers have identified a cognitive bias known as shared information bias, which is the tendency of groups to discuss topics and share information that most members already know. [Forsyth 2010] Groups seem to prefer such discussions to discussions of topics that only a few members know. The consequences of this behavior include poor decision quality.

But how exactly does shared information bias compromise decision quality? And what other consequences might there be? In Part I of this exploration, I sketched how this bias can lead to a false sense of security, increased likelihood of groupthink, biased assessments of the relative importance of ideas, and increased persistence of wrong beliefs. Here are five more ways shared information bias can harm group processes.

Increased difficulty in adopting valid ideas
Shared information acquires credibility correlated with the frequency of its open mentions, independent of its validity. True or correct information that remains privately held has less opportunity to acquire credibility. When it finally comes into the open, if it ever does, its relative credibility can be significantly less than that of the information that has previously been disclosed. In this way, truth and fact can lose out to fiction and rumor.
Establishment of alliances of the confused
In groups that repeat the pattern of shared information bias, the comfort of the pattern can strengthen relationships between members of the group who are caught in the consequences of the bias. Repeatedly unchallenged in their beliefs and preconceptions, and confirmed every time they hear another group member offer information they themselves already know, participants in these strengthened relationships can reinforce the pattern. When this happens, the group can encounter difficulty dealing with members who raise issues or offer perspectives not shared by most of the group's members.
Shunning and expulsion of critical thinkers
Critical thinking is the application of evidence and reason to reach an objective judgment. Data on the incidence of critical thinkers isn't generally available, but my impression, based on my experience, is that critical thinking is less widely practiced in the workplace than one might have hoped. When a group is intent on bringing its deliberations to a conclusion, shared information bias can lead its members to believe that a valid conclusion is nearer than it actually is. In such situations, critical thinkers are likely to raise issues and pose questions that could make evident the true status of the group's deliberations. Because repeatedly "disrupting" deliberations in this way can cause discomfort to those seeking closure, critical thinkers risk shunning and expulsion. The irony is that groups that expel critical thinkers could be expelling the very people they most need.
If the critical thinker who is shunned in this way is also a member of a disfavored demographic group, while the other group members are members of favored demographic groups — or worse, while those other group members are all members of the same favored demographic group — the harm to group cohesion can become complicated and severe.
Elevation of uncritical thinkers
Just as critical thinkers risk expulsion, group members can enjoy elevated status if they offer contributions that make other group members feel more comfortable and that seem to support the group as it reaches for closure. Although critical thinkers can certainly choose to offer such contributions, most such contributions probably result from thinking uncritically.
One word of warning: a devious person, capable of critical thinking, and intent on sabotaging the group's deliberations, can exploit shared information bias either to cause the group to come to closure prematurely, or to elevate his or her own standing within the group by confirming its members' preconceptions.
Mistakenly favorable self-assessments
Group members who make contributions consisting of information already in the possession of most other group members tend to experience validation by their colleagues. And when group members hear their colleagues make contributions that are consistent with their own beliefs and preconceptions, they also tend to feel validated. If they later make private, internal self-assessments, those assessments will likely be more favorable than is objectively justified, because the self-assessor's beliefs and preconceptions haven't actually withstood serious challenge.
Group members repeatedly exposed to shared information bias thus tend to feel that they and their fellow members are more creative and insightful than they actually are. This mistaken assessment can make the group vulnerable when it confronts situations that are inconsistent with its beliefs and preconceptions. Overconfidence and unfamiliarity with reality can combine to lead the group to choose unsuitable leaders, or to rely upon the judgment of those of its members who don't actually merit such reliance.

Shared information bias does indeed lead to poor decisions. It does so not only by limiting what the group considers when it makes decisions, but also by distorting the group's thinking, by creating intra-group alliances based on shared confusion, and by affecting the group's choices of leaders. You can watch this happening in groups you belong to. If you see it happening, and if you're willing to risk being shunned, raise the issue and refer people to these two articles.

Next issue: Embarrassment, Shame, and Guilt at Work: Creation

Footnotes

[Forsyth 2010]
Donelson R. Forsyth. Group Dynamics, Fifth Edition. Belmont, California: Wadsworth, 2010, pp. 327ff.
