In many kinds of knowledge work the quality of the output is important enough that we require the people who perform the work to meet specified quality standards. When they don't, we intervene to take corrective action. That corrective action often requires changes to the way the work is performed, with the hope that the changes will lead to improved output quality. Sometimes this approach works well. But at other times, even though the changes seemed to bring about the desired improvement at the pilot scale, the results of the full-scale intervention are disappointing.
This post explores a plausible explanation for the disappointing results of some interventions intended to address defects in knowledge work. That explanation rests on a model of how defects arise, based on the idea of defect streams and their sources.
For concreteness, consider the case of an organization engaged in deploying a software security platform. That platform will support a large number of applications that an even larger number of people use in the course of their daily work. To ensure that users can do their work in a secure environment, it's important that the deployment meet strict security criteria — for the platform itself and for each application.
The people performing the work are required to follow complex procedures to ensure that each software application can be used safely. Although the procedures are similar in large part, they differ in detail depending on the type of software being installed, and on the particular product and release of that product. If the person performing the work executes it incorrectly, the evidence of the error might not appear immediately. Often, the error is discovered by the Quality Assurance team, which reports the error in a weekly defect report. We can regard these errors as comprising a defect stream.
Streams have sources. We usually assume that the sources of the errors lie somewhere within the organization that is executing the deployment. Although it's common to assert that the cause of the errors is shoddy work by the people who last touched the defective work product, their behavior might not actually be the root cause.
Here are three candidate root causes:
- Unfunded mandated effort
- Announcing a procedure or procedure change is an important step in taking corrective action. But unless the people who execute the procedure have the resources and time required to learn it, or to learn about the latest changes, they cannot acquire the knowledge they need to execute the altered procedure correctly.
- Those who design interventions intended to reduce the volume of the defect stream must provide the resources necessary to adopt the changes they are recommending. Absent those necessary resources, the people who execute the procedures in question will scrounge equivalent resources from someplace else. The result, often, is formation of new defect sources that feed the old defect streams, or which create new defect streams.
- Misallocating the cost of procedure competence
- The cost of procedure competence is the cost of training the people who execute the procedure, plus the cost of ensuring their continued competence after training is complete. We usually allocate that cost to the business unit in which those people perform their work. But in many cases, that allocation is erroneous. And it might even be the root cause of the defects.
- For example, consider a cumbersome and overly complicated procedure. In an extreme case, the procedure can be so gratuitously complex that nobody can apply it correctly with regularity. Defects are inevitable. Should not the cost of these defects be charged to the poor design of the procedure? Said differently, should not the cost of the training required to prevent defects be charged to the poor design of the procedure?
- That is, the cost of procedure competence ought not to be borne solely by the business units responsible for executing the procedure. Some of the costs might be due to procedure design. Allocating some costs to the unit responsible for the procedure design helps to ensure that procedures are designed with learning costs and execution costs in mind.
- Frequent changes in procedures
- Those who execute the procedures need to understand the procedures to avoid making errors. If changes are frequent, they need to monitor the status of procedures so that they'll be aware of changes that can affect their work. That monitoring activity competes with other responsibilities for their attention.
- The cost of a procedure change should include more than the cost of designing the change. Training and retraining are real costs. The unit that makes the change must provide resources to the units that must execute the change.
Although these observations are expressed in terms of defects in work product, they apply equally to compliance with policies. To see how, consider policies to correspond to procedures in the discussion above, and deviations from policy to be defects.
About Point Lookout
This article in its entirety was written by a human being. No machine intelligence was involved in any way.