
[Photo: Terminal 3 of Beijing Capital International Airport]
An extraordinary number of processes must work with precision for an airline to operate "within normal parameters" for even a single day: aircraft, fuel, seating assignments, luggage, flight crew, security, and on and on. That they work as well as they do is worthy of admiration.
Early in the morning of May 27, 2017, one of Britain's busiest annual travel days, British Airways canceled all flights from London's two biggest airports. More than 1,000 flights and 75,000 passengers were affected. In a statement, the airline announced that "a major IT system failure" had disrupted flight operations worldwide. [Johnston 2017] [Dans 2017]
On September 28, 2017, "network problems" struck the firm Amadeus IT Group SA, whose Altea software "is used by more than 100 airlines worldwide," including "Air France, Southwest, Lufthansa, British Airways, Qantas, China Air and Korean Air." [Rizzo 2017] Passengers around the world reported long lines. Although the system recovered that same day, delays of hours were widespread, and many international passengers missed connections.
On that same day, in the midst of worldwide air traffic disruption, Reuters reported that the United States Government Accountability Office would be investigating these disruptions and a string of others that had occurred in the previous six months. [Shepardson 2017] Fires, network outages, human error, and goodness knows what else were among the suspected causes.
Clearly something was not right with the airlines' management of technological risk. And since no major industry understands technological risk management better than the airlines, it's reasonable to suppose that if the airline industry is having trouble managing technological risk, just about everyone is.
However assiduously we avoid risk, we sometimes find — suddenly, as the airlines did — that we're up to our necks in it. How does this happen? How does risk creep into our projects and our operations? Let's consider projects, because they're time-limited and therefore a little less complicated.
When project champions are required to "sell" a project internally, they sometimes overcommit. If that happens because of an inordinately high bar imposed by senior management, one possible cause is a most curious phenomenon, related to what Boehm et al. call a "conspiracy of optimism" [Boehm 2016], and which is actually a variant of the n-person prisoner's dilemma. [Hamburger 1973] Specifically, senior management might be trying to manage enterprise-scale risk by requiring high returns at low risk from individual projects (or even individual portfolios of projects). Ironically, this approach elevates risk for those very projects and portfolios, because project champions must promise the nearly impossible, or the outright impossible, to gain access to resources. The paradoxical result is that risk aversion on the part of senior management fosters an environment in which nearly every activity underway is high risk. By attempting to wring risk out of the enterprise, management opens the door and invites it in.
It gets worse. It turns out that the risks confronting individual projects, arising from the unrealistic promises of project champions, are correlated. And that means that when one risk event materializes, others are likely to materialize too. We'll explore how project champions contribute to risk creep next time.
Are your projects always (or almost always) late and over budget? Are your project teams plagued by turnover, burnout, and high defect rates? Turn your culture around. Read 52 Tips for Leaders of Project-Oriented Organizations, filled with tips and techniques for organizational leaders. Order Now!
Thank you for reading this article. I hope you enjoyed it and
found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
Related articles
More articles on Project Management:
On Beginnings
- A new year has begun, and I'm contemplating beginnings. Beginnings can inspire, and sometimes lead to
letdown when our hopes or expectations aren't met. How can we handle beginnings more powerfully?
False Summits: I
- Mountaineers often experience "false summits," when just as they thought they were nearing
the summit, it turns out that there is much more climbing to do. So it is in project work.
The Risks of Too Many Projects: II
- Although taking on too many projects risks defocusing the organization, the problems just begin there.
Here are three more ways over-commitment causes organizations to waste resources or lose opportunities.
Power Distance and Teams
- One of the attributes of team cultures is something called power distance, which is a measure
of the overall comfort people have with inequality in the distribution of power. Power distance can
determine how well a team performs when executing high-risk projects.
The Risk Planning Fallacy
- The planning fallacy is a cognitive bias that causes underestimates of cost, time required, and risks
for projects. Analogously, I propose a risk planning fallacy that causes underestimates of probabilities
and impacts of risk events.
See also Project Management for more related articles.
Forthcoming issues of Point Lookout
Coming March 12: Embedded Technology Groups and the Dunning-Kruger Effect
- Groups of technical specialists in fields that differ markedly from the main business of the enterprise that hosts them must sometimes deal with wrong-headed decisions made by people who think they know more about the technology than they actually do. Available here and by RSS on March 12.
And on March 19: On Lying by Omission
- Of the many devious strategies of workplace politics, deception is among the most commonly used. And perhaps the most commonly used tactic of deception is lying. Since getting caught in a lie can be costly, people try to lie without lying. Available here and by RSS on March 19.
Coaching services
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details at rbrenjTnUayrCbSnnEcYfner@ChacdcYpBKAaMJgMalFXoCanyon.com or (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info