In some groups, disagreeing with the majority, or disagreeing with the Leader, can be a personally expensive act. Here is Part I of a set of tactics used by Leaders who choose not to tolerate dissent. Available here and by RSS on November 25.
And on December 2: Suppressing Dissent: Part II
Disagreeing with the majority in a meeting, or in some cases, merely disagreeing with the Leader, can lead to isolation and other personal difficulties. Here is Part II of a set of tactics used by Leaders who choose not to tolerate differences of opinion, emphasizing the meeting context. Available here and by RSS on December 2.
Wishful "thinking," as we call it, can arise in different ways. One source is the pattern of choices we make when we interpret what we see, what we hear, or any other information we receive. Here's Part II of an inventory of ways our preferences and wishes affect how we interpret the world.
Wishful thinking comes from more than mere imagination. It can enter when we interpret our own observations or what others tell us. Here's Part I of a little catalog of ways our wishes affect how we interpret the world.
Words of wisdom are pithy sayings that prove valuable so often that we come to believe them absolutely. Valuable as they usually are, though, they aren't universally valid. Here's Part IV of a growing collection.
Confidence in our judgments and ourselves is essential to success. Confidence misplaced — overconfidence — leads to trouble and failure. Understanding the causes and consequences of overconfidence can be most useful.
When things go wrong and remain undetected, trouble looms. We continue our efforts, increasing investment on a path that possibly leads nowhere. Worse, time — that irreplaceable asset — passes. How can we improve our ability to detect undetected issues?
In complex projects, things might have gone wrong long before we notice them. Noticing them as early as possible — and addressing them — is almost always advantageous. How can we reduce the incidence of undetected issues?
Words of wisdom are so often helpful that many of them have solidified into easily remembered capsules. And that's where the trouble begins. We remember them too easily and we apply them too liberally. Here's Part II of a collection of often-misapplied words of wisdom.
What we fail to notice about any situation — and what we do notice that isn't really there — can be the difference between the outcomes we fear, the outcomes we seek, and the outcomes that exceed our dreams. How can we improve our ability to notice?
The Halo Effect is a cognitive bias that causes our evaluation of people, concepts, or objects to be influenced by our perceptions of one attribute of those people, concepts, or objects. It can lead us to make significant errors of judgment.
The urge to identify as meaningful the patterns we see in winning streaks in sports, or streaks of successes in business, can lead us to accept bogus explanations prematurely. It's a common human tendency that can put people and organizations in desperate situations.
The nastiest part about solving complex problems isn't their complexity. It's the feeling of being overwhelmed when we realize we haven't a clue about how to get from where we are to where we need to be. Here's one way to get a clue.
Words of wisdom are so often helpful that many of them have solidified into easily remembered capsules. We do tend to over-generalize them, though, and when we do, trouble follows. Here are a few of the more dangerous ones.
Discussions in meetings and in written media can get long and complex. When a chain of reasoning gets long enough, we sometimes make fundamental errors of logic, especially when we're under time pressure. Here are just a few.
Recent research has uncovered a human tendency — possibly universal — to believe that we know others better than they know us, and that we know ourselves better than others know themselves. These beliefs, rarely acknowledged and often wrong, are at the root of many a toxic conflict of long standing.
Rhetorical fallacies are errors of reasoning that introduce flaws in the logic of arguments. Used either intentionally or by accident, they often lead us to mistaken conclusions. The Fallacy of Composition is one of the more subtle fallacies, which makes it especially dangerous.
Most of us interpret a confident manner as evidence of competence, and a hesitant manner as evidence of lesser ability. Recent research suggests that confidence and competence are inversely correlated. If so, our assessments of credibility and competence are thrown into question.
Often, at work, we interpret the behavior of others. Sometimes we base these interpretations not on actual facts, but on our perceptions of facts. And our perceptions are sometimes erroneous.
When we see or hear the goings-on around us, we interpret them to give them meaning and significance. Some interpretations are thoughtful, but most are almost instantaneous. Since the instantaneous ones are sometimes goofy or dangerous, here's a look at how we make interpretations.
Some decisions are difficult because they trigger us emotionally. They involve conflicts of interest, yielding to undesirable realities, or possibly pain and suffering for the deciders or for others. How can we make these emotionally difficult decisions with greater clarity and better outcomes?
Root Cause Analysis uses powerful tools for finding the sources of process problems. The approach has been so successful that it has become a way of thinking about organizational patterns. Yet, resolving organizational problems this way sometimes works — and sometimes fails. Why?
Some of what we believe is true about work comes not from the culture at work, but from the larger culture. These beliefs are much more difficult to root out, but sometimes just a little consideration does help. Here are some examples.
Maxims and rules make life simpler by eliminating decisions. And they have a price: they sometimes foreclose options that would have worked better than anything else. Here are some things we believe in maybe a little too much.
Stuck in uncomfortable situations, we tend to think of ourselves as trapped. But sometimes it is our own actions that keep us stuck. Understanding how these traps work is the first step to learning how to deal with them.
The phrase "You get what you measure" has acquired the status of "truism." Yet many measurement-based initiatives have produced disappointing results. Here's Part III of an examination of the idea — a look at management's role in these surprises.
Most of what we know about person-to-person communication applies when levels of stress are low. But when stress is high, as it is in emergencies, we're more likely to make mistakes. Knowing those mistakes in advance can be helpful in avoiding them.
You've just had some bad news at work, and you're angry or really upset. Maybe you feel like the target of a vicious insult or the victim of a serious injustice. You have work to do, and you want to respond, but you must first regain your composure. What can you do to calm down and start feeling better?
Keeping a journal about your work can change how you work. You can record why you did what you did, and why you didn't do what you didn't do. You can record what you saw and what you only thought you saw. And when you read the older entries, you can see patterns you might never have noticed any other way.
Up and down the org chart, you can find bits of business wisdom about motivating people. We generally believe these theories without question. How many of them are true? How many are myths? What are some of these myths and why do they persist?
Have you ever regretted saying something that you wouldn't have said if only you had known just one more little fact? Yeah, me too. We all have. Here are some tips for dealing with this sticky situation.
It goes by various names — self-talk, inner dialog, or internal conversation. Because it is so often disorganized and illogical, I like to call it inner babble. But whatever you call it, it's often misleading, distracting, and unhelpful. How can you recognize inner babble?
At times, we need information from each other. For example, we want to learn about how someone approached a similar problem, or we must interview someone about system requirements. Yet, even when the source is willing, we sometimes fail to expose critical facts. How can we elicit information from the willing more effectively?
Some of what we "know" about managing projects just isn't so. Understanding these last three of the nine fallacies of project management helps reduce risk and enhances your ability to complete projects successfully.
When we suddenly realize that what we've believed is wrong, or that what we've been doing won't work, our fear and discomfort can cause us to persevere in our illusions. If we can get better at accepting reality and dealing with it, we can make faster progress toward real achievement.
Most of what we know about managing projects is useful and effective, but some of what we "know" just isn't so. Identifying the fallacies of project management reduces risk and enhances your ability to complete projects successfully.
I have good news and bad news. The bad news is that if you wait long enough, there will be some bad news. The good news is that the good news helps us deal with the bad news. And it helps a lot more if we get the bad news first.
When things go badly, many of us experience stress, and we might indulge various appetites in harmful ways. Some of us say things like "My boss is driving me nuts," or "She made me so angry." These explanations are rarely legitimate.
Project status reports rarely acknowledge negative progress until after it becomes undeniable. But projects do sometimes move backwards, outside of our awareness. What are the warning signs that negative progress might be underway?
Where do the days go? How can it be that we spend eight, ten, or twelve hours at work each day and get so little done? To recover time, limit the fragmentation of your day. Here are some tips for structuring your working day in larger chunks.
We usually think of quibbling as an innocent swan dive into unnecessary detail, like calculating shares of a lunch check to the nearest cent. In debate about substantive issues, a detour into quibbling can be far more threatening — it can indicate much deeper problems.
When we try to understand the behavior of others, we often make a particularly human mistake. We tend to attribute too much to character and disposition and too little to situation and context. When we seek a better balance, we can adopt a more accepting view of events around us.
Steering the discussion away from the issues to attack the credibility, motives, or character of our debate partners is a technique known as the ad hominem attack. It's unfair, it's unethical, and it leads to bad, expensive decisions that we'll probably regret.
Politicians know that answering hypothetical questions is dangerous, but it's equally dangerous for managers and project managers to answer them in the project context. What's the problem? Why should you be careful of the "What If?"
In project work, we often make decisions with incomplete information. Sometimes we narrow the options to a few, examine their strengths and risks, and make a choice. In our deliberations, some advocates use a technique called the Straw Man fallacy. It threatens the soundness of the decision, and its use is very common.
When we notice similarities between events, or possible patterns of events, we often attribute meaning to them beyond what we can prove. Sometimes we guess right, and sometimes not. How can we improve our guesses?
Working on complex projects, we often face a choice between "just do it" and "wait, let's think this through first." Choosing to just do it can seem to be the shortest path to the goal, but it rarely is. It's an example of a Finger Puzzle.
When we notice patterns or coincidences, we draw conclusions about things we can't or didn't directly observe. Sometimes the conclusions are right, and sometimes not. When they're not, organizations, careers, and people can suffer. To be right more often, we must master critical thinking.
Feeling trapped, with no clear way out, often leads to anger. One way to defuse your anger is to notice false traps, particularly the false dichotomy. When you notice that you're the target of a false dichotomy, you can control your anger more easily — and then the trap often disappears.
When we think, "Paul doesn't trust me," we could be fooling ourselves into believing that we can read his mind. Unless he has directly expressed his distrust, we're just guessing, and we can reach whatever conclusion we wish, unconstrained by reality. In project management, as anywhere else, that's a recipe for trouble.
Although we sometimes make decisions with incomplete information, we do the best we can, given what we know. Sometimes, we make wrong decisions not because we have incomplete information, but because we make mistakes in how we reason about the information we do have.
At a dinner party I attended recently, Kris said to Suzanne, "You remind me of Helen Hunt." I looked at Suzanne, and sure enough, she did look like Helen Hunt. Later, I noticed that I was seeing Suzanne a little differently. These are the effects of hat hanging. At work, it can damage careers and even businesses.
"If we promote you, we'll have to promote all of them, too." This "slippery-slope" tactic for winning debates works by exploiting our fears. Another in a series about rhetorical tricks that push our buttons.