Google "scope creep" and you get over 10.1 million hits. "Mission creep" gets 15.6 million. "Feature creep" gets over 20.2 million. That's pretty good for a technical concept, even if it isn't quite "Miley" territory (222 million). The concept is known widely enough that anyone with project involvement has probably had first-hand experience with scope creep.
If we know so much about scope creep, why haven't we eliminated it? One simple possible answer: Whatever techniques we're using probably aren't working. Maybe the explanation is that we're psychologically challenged — that is, we, as humans, have a limited ability to detect scope creep, or to acknowledge that it's happening. If that's the case, a reasonable place to search for mechanisms to explain its prevalence is the growing body of knowledge about cognitive biases.
A cognitive bias is the tendency to make systematic errors of judgment based on thought-related factors rather than evidence. For example, the self-serving bias leads us to attribute our successes to our own capabilities, and our failures to situational factors.
Cognitive biases offer an enticing possible explanation for the prevalence of scope creep despite our awareness of it, because "erroneous intuitions resemble visual illusions in an important respect: the error remains compelling even when one is fully aware of its nature." [Kahneman 1977] Let's consider one example of how a cognitive bias can make scope creep more likely.
In their 1977 report, Kahneman and Tversky identify one particular cognitive bias, the planning fallacy, which afflicts planners. They discuss two types of information planners use. Singular information is specific to the case at hand; distributional information is drawn from similar past efforts. The planning fallacy is the tendency of planners to pay too little attention to distributional evidence and too much to singular evidence, even when the singular evidence is scanty or questionable. Failing to harvest lessons from the distributional evidence, which is inherently more diverse than singular evidence, planners tend to underestimate cost and schedule.
But because the planning fallacy leads to underestimates of cost and schedule, it can also lead to scope creep. Underestimates can lead decision makers to feel that they have time and resources that don't actually exist: "If we can get the job done so easily, it won't hurt to append this piece or that."
Accuracy in cost and schedule estimates thus deters scope creep. We can enhance the accuracy of estimates by basing them not on singular data alone, but instead on historical data regarding organizational performance for efforts of similar kind and scale. And we can require planners who elect not to exploit distributional evidence in developing their estimates to explain why they made that choice.
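The adjustment described above can be sketched in a few lines of code. This is a minimal illustration, not a prescribed method: the function name and all of the historical figures are hypothetical, and real reference-class data would of course come from your organization's project records.

```python
# Sketch: tempering a singular estimate with distributional evidence.
# All names and figures here are hypothetical, for illustration only.

def distributional_estimate(singular_estimate, history):
    """Scale a singular estimate by the average overrun ratio
    (actual / estimated) observed in similar past efforts."""
    ratios = [actual / estimated for estimated, actual in history]
    avg_overrun = sum(ratios) / len(ratios)
    return singular_estimate * avg_overrun

# (estimated weeks, actual weeks) for similar past projects
history = [(10, 14), (8, 12), (20, 26), (12, 15)]

# A planner's singular estimate of 16 weeks, adjusted for the
# organization's track record on comparable work:
print(distributional_estimate(16, history))  # ≈ 21.8 weeks
```

Even a crude adjustment like this makes the distributional evidence harder to ignore, and it gives decision makers a number that already accounts for the organization's history of underestimation.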
In coming issues we'll examine other cognitive biases that can contribute to scope creep.
Is every other day a tense, anxious, angry misery as you watch people around you, who couldn't even think their way through a game of Jacks, win at workplace politics and steal the credit and glory for just about everyone's best work including yours? Read 303 Secrets of Workplace Politics, filled with tips and techniques for succeeding in workplace politics. More info
Your comments are welcome
Would you like to see your comments posted here? Send me your comments by email, or by Web form.
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
Related articles
More articles on Project Management:
- Nine Project Management Fallacies: I. Most of what we know about managing projects is useful and effective, but some of what we "know" just isn't so. Identifying the fallacies of project management reduces risk and enhances your ability to complete projects successfully.
- Long-Loop Conversations: Asking Questions. In virtual or global teams, where remote collaboration is the rule, waiting for the answer to a simple question can take a day or more. And when the response finally arrives, it's often just another question. Here are some suggestions for framing questions that are clear enough to get answers quickly.
- Some Risks of Short-Term Fixes. When we encounter a problem at work, we must choose between short-term fixes (also known as workarounds) and long-term solutions. Often we choose workarounds without appreciating the risks we're accepting — until too late.
- Tuckman's Model and Joint Leadership Teams. Tuckman's model of the stages of group development, applied to Joint Leadership Teams, reveals characteristics of these teams that signal performance levels less than we hope for. Knowing what to avoid when we designate these teams is therefore useful.
- Lessons Not Learned: II. The planning fallacy is a cognitive bias that causes us to underestimate the cost and effort involved in projects large and small. Efforts to limit its effects are more effective when they're guided by interactions with other cognitive biases.
See also Project Management for more related articles.
Forthcoming issues of Point Lookout
- Coming December 11: White Water Rafting as a Metaphor for Group Development. Tuckman's model of small group development, best known as "Forming-Storming-Norming-Performing," applies better to development of some groups than to others. We can use a metaphor to explore how the model applies to Storming in task-oriented work groups. Available here and by RSS on December 11.
- And on December 18: Subgrouping and Conway's Law. When task-oriented work groups address complex tasks, they might form subgroups to address subtasks. The structure of the subgroups and the order in which they form depend on the structure of the group's task and the sequencing of the subtasks. Available here and by RSS on December 18.
Coaching services
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info