Nan pushed the door open, and she and Trish stepped out of the conference center into the morning air. On their first break from the off-site meeting, they hadn't quite yet relaxed from the pressure cooker that was the final stretch of Marigold, their latest project. It hadn't gone well, and they were all spending three days trying to figure out what happened.
"So what do you think?" Nan opened.
"I've been to off-sites before," said Trish. "But this is the first time I've felt hopeful that truth would come out."
Nan agreed. "Me too. I liked the bit about myths and fallacies."
Nan sat down on one of the plastic chairs. Trish sat too. "But knowing these fallacies," Trish asked, "won't we just get better at fooling ourselves? If we could get any better, which I seriously doubt."
Nan smiled. "Well, I think his point was that by naming the fallacies, it gets harder to use them."
Trish sipped her coffee and set down the paper cup. Missing her own coffee mug was one thing she hated about off-sites. "I didn't quite get some of those fallacies," she said to Nan. "They're a little confusing."
Nan nodded. "Yeah, me too. But what did he say about that — something about the confusion is what makes them so common?"
Just then, Peter came through the doorway, carrying a paper cupful of coffee and three huge chocolate chip cookies wrapped in a napkin. He sat down in the empty chair next to Nan.
Nan smiled at Peter and, gazing at the cookies, she said, "Peter, how nice of you to think of us."
Peter smiled back, took a cookie, and pushed the others to Nan. Then he turned to Trish. "So what's your favorite project fallacy?"
Trish reached for a cookie. "I don't know," she said. "We were just saying that they're a bit confusing."
"Yeah," said Nan. "I think he was saying that their wrongness is so subtle that we just accept them as conventional wisdom."
And so it is with most fallacies. Their simplicity makes them seductive, and their subtlety makes them durable.
Our best defense against project management fallacies is to study them. By naming the fallacies, the patterns become obvious to everyone, which deters us from using them. Two common fallacies arise from our wish for simple solutions: the Fallacy of Positivism and the Bad Actor Fallacy. Two survive on the basis of their subtlety: the Naturalistic Fallacy and the Culturalistic Fallacy. Two have a deep connection to what we are as human beings: the Fungibility Fallacy and the Linearity Fallacy. And three arise from failures of critical thinking: the Normative Fallacy, the Availability Heuristic, and the Grandiosity Fallacy.
The Fallacy of Positivism
The Fallacy of Positivism holds that if we believe we can accomplish something, we're more likely to actually accomplish it; and inversely, if we express doubts about accomplishing something, we're less likely to execute it successfully.
This fallacy is especially tempting to leaders who want to motivate reluctant teams to attempt (or keep trying to do) the impossible. Used this way, it's a tool of manipulation. All things being equal, a positive attitude probably helps.
But truth matters most. Be positive when it's appropriate, and express doubts when they're real and relevant. Misplaced positivity and misplaced doubt can both lead to catastrophe.
The Bad Actor Fallacy
If a team exhibits a repeated pattern of dysfunction, we commit the Bad Actor Fallacy when we assume that one single team member is the likely cause of the problem.
Isolating the cause of a team problem to a single individual is tempting because it suggests that dealing with that individual can resolve the problem. No need for messy and expensive team interventions; no need for involving more than one person.
While it's possible for a single individual to keep a team in a state of dysfunction, more typically many individuals contribute to team problems. Team performance is an attribute of the team's system and of the organization in which that team is embedded.
The Naturalistic Fallacy
A cousin of the Fundamental Attribution Error, this fallacy holds that professional credentials — experience, education, seniority, or past performance — are equivalent to abilities. For instance, if a particular project manager led a few projects that failed, we conclude that he or she is incapable.
Judgments based on credentials and past performance alone are likely to omit from consideration the past prevailing context, which might have been a significant contributor to past results.
To assess the capabilities of a person, an organization, a technology, or a design, consider not only credentials and past performance, but also contextual factors.
The Culturalistic Fallacy
We commit this fallacy when we believe that the project manager, or some other organizational leader, creates a high-performance team without the assistance or influence of the people who belong to that team.
To measure the prevalence of this fallacy, track the attributed causes of team performance. In organizations where the credit for high performance tends to flow to leaders, while the blame for dysfunction tends to flow to team members, it's likely that the Culturalistic Fallacy is at work.
While any one person can undermine a team's performance, no single person is responsible for creating high performance. External factors certainly contribute, but a team's performance is most directly due to the choices of the members of that team.
These last two fallacies are closely related — the Naturalistic Fallacy undervalues contextual factors, while the Culturalistic Fallacy undervalues the contributions of people. They're two different ways to misperceive reality.
Now let's look at some fallacies that relate to our humanness.
In 1997, the Commonwealth of Massachusetts adopted a bill proposed by the third grade class of a school in Somerset, and thereby designated the chocolate chip cookie as the official state cookie of Massachusetts. Photo courtesy Lara Schneider.
Nan broke off a tiny chunk of her cookie, ate it, and sipped her coffee. "Mmmm, I thought so too," she said. "Knowing that we fall into these fallacy traps because of our humanness made me more accepting of it, less guilty."
Trish was puzzled. "Yeah, but how does that help the project?"
"That's just it," said Nan. "Knowing that the fallacies are part of being human makes it easier to acknowledge these errors when we make them."
Peter finished Nan's thought. "And that way we can own up to them faster, maybe even before they do any damage."
Nan picked up the last chunk of cookie and ate it. Peter and Trish had long ago finished theirs, but Nan liked making cookies last. "The critical thinking fallacies were my favorites," she said. "I like learning how to think more clearly."
Peter sipped his coffee. "Mmm." He swallowed. "But how do we avoid those fallacies?"
Nan had an idea. "Maybe we should inspect our project plans, like we inspect components."
Trish was intrigued. "Yeah, and I know what I'd put at the top of the checklist."
"OK, I'll bite," said Peter. "What?"
Trish was ready. "The Nine Project Management Fallacies."
Not a bad idea. These last five fallacies have a deep connection to what we are as human beings, and include errors of critical thinking. The first two of these arise because of our all-too-human hopes and wishes. We long for a world where we can substitute any person for any other, and where doubling resources halves the schedule. Sadly, longing doesn't make it so.
The Fungibility Fallacy
The Fungibility Fallacy holds that each person produces one hour of output in one hour, and that we can substitute people for one another. Terms that suggest this fallacy are man-month, headcount, and FTE.
Often, only a few people can perform certain tasks. Project management tools that do distinguish the skills of many unique individuals take time and effort to use, and even then they produce somewhat fictitious results.
And running "lean and mean" makes the problem worse. If you count the cost of delays and lost market windows due to overloading key people, running a little "fatter and kinder" might actually be more profitable.
The Linearity Fallacy
This fallacy holds that the human effort required to execute a project scales in proportion to project attributes such as project size or total budget.
Not only do operating costs per unit of output grow rapidly with project size; the problem also runs in the other direction: costs decline unexpectedly slowly as we scale a project down, because we have difficulty abandoning control processes as we shrink. We lose in both directions.
Project management is an inherently nonlinear activity. The complexity of an effort grows not in proportion to the effort, but combinatorially with the size of the effort, following the growth in the number of possible person-to-person interactions with increasing team size.
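To make the combinatorial growth concrete, here's a small sketch. The original doesn't give a formula, but the number of possible pairwise communication channels in a team of n people is the standard "n choose 2" count, n(n−1)/2:

```python
def channels(team_size: int) -> int:
    """Possible person-to-person communication channels
    in a team: "n choose 2" = n * (n - 1) / 2."""
    return team_size * (team_size - 1) // 2

# Doubling the team roughly quadruples the channels.
for n in (5, 10, 20, 40):
    print(f"{n:>2} people -> {channels(n):>3} channels")
# 5 -> 10, 10 -> 45, 20 -> 190, 40 -> 780
```

An effort estimate that scales linearly with head count ignores this roughly quadratic growth in coordination paths.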
Finally here are three fallacies that arise from failures of critical thinking.
The Normative Fallacy
This fallacy holds that when we ask some people their opinions, and most of them agree, then they're correct. Usually we select people non-randomly, choosing those who will give us desirable answers, or those we can trust, or those of high rank.
Non-random polling might provide comfort, but it's hardly scientific, and it almost always leads to biased conclusions.
To get truly useful polling data, you must poll people randomly.
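To see why non-random selection biases conclusions, consider a hypothetical simulation (the population and all the numbers here are invented for illustration):

```python
import random

random.seed(1)  # reproducible illustration

# Hypothetical population: 30% think the schedule is achievable.
population = [True] * 300 + [False] * 700

def agreement_rate(sample):
    return sum(sample) / len(sample)

# Random poll: an unbiased estimate of the true 30% rate.
random_sample = random.sample(population, 100)

# Non-random poll: we "happen" to ask mostly the optimists.
optimists = [p for p in population if p]
skeptics = [p for p in population if not p]
biased_sample = random.sample(optimists, 80) + random.sample(skeptics, 20)

print(agreement_rate(random_sample))   # typically near 0.30
print(agreement_rate(biased_sample))   # 0.8 -- comforting, and wrong
```

The biased poll delivers the comfortable answer; only the random sample tracks reality.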
The Availability Heuristic
In risk management, we often estimate the probabilities of certain events. We're using the Availability Heuristic [Tversky et al, Wikipedia] when we estimate these probabilities by sensing the difficulty of imagining or understanding the string of events that lead to the risk.
For instance, when we ask people whether being attacked by a shark is more or less likely than being hit by falling airplane parts, they usually answer that shark attack is more likely. Actually, being hit by falling airplane parts is 30 times more likely, but people are fooled because it's easier to imagine shark attacks.
Estimating probabilities by intuition is unlikely to produce reliable results. Use real data, or use huge error bars.
The Grandiosity Fallacy
Confronting a problem, we sometimes address a generalization of the problem instead, hoping to solve a host of similar problems, and thereby solving the original problem almost "for free." Rarely does the reality match the wish.
Grandiosity usually generates two kinds of trouble. First, it's often more expensive and time-consuming than originally estimated. Second, the people of the organization rarely want the general solution. If they did, they probably would have sought it in the first place.
Sometimes customers don't know the value of the general solution, and telling them about it might produce a better outcome. But usually they want only what they asked for. Work with them on that first.
One more fallacy is perhaps most common, though I don't consider it a project management fallacy. It's the Purity Fallacy, which holds that we're personally pure: we never use fallacies ourselves. We all use them, of course — we're human. The trick is to catch yourself when you do.
[Wikipedia] Wikipedia's article about the Availability Heuristic.