How to reduce the effects of emotion on decisionmaking
By Cmdr. Tony Schwarz
As military planners formulate recommendations for their commanders, they draw upon many kinds of data: joint and service doctrine, friendly and enemy capabilities, personal and shared experience. But one category is often missing from their calculations: emotion.
All humans experience emotion (broadly defined as “feelings”), which means that no one is capable of absolutely rational behavior. Not convinced? You’re not alone. In the study of economics, for example, it has long been generally assumed that human decision making is purely rational — that with few exceptions, humans can be expected to act in their own best interests. This assumption, already under assault by behavioral economists, was swept away by the economic crash of 2008, which obliterated $14 trillion of American wealth and proved just how irrational decision makers can be. (Even Alan Greenspan, one of the staunchest believers in rational economic theory, admitted that his ideology had pushed him to make regrettable decisions.)
Today, the work of the behavioralists is being embraced throughout academia, and it is quickly becoming generally accepted that while humans usually try to make rational decisions, our conscious, rational discernment processes are limited by emotions (or put another way, irrational wants and tendencies in our subconscious).
It is not possible, or even wholly desirable, to eliminate emotion when making decisions. But planners can — and indeed, should — work to manage its effects.
Emotions and controls
Our efforts to reduce the negative effects of emotions should not be confused with an attempt to eliminate emotion itself. In the first place, that’s impossible; in the second, some emotions may actually improve one’s ability to solve problems. For example, research by Jennifer Lerner, a professor at Harvard’s Kennedy School, shows that anger can increase decisiveness. (As British Field Marshal Bernard Montgomery reminds us, “When all is said and done, the greatest quality required in commanders is ‘decision.’”)
So how do we constructively think about emotion in decisions? The theory of Bounded Rationality, introduced by Herbert Simon in the 1950s, offers a useful description. Simon’s theory holds that the rationality by which we process information and make decisions is limited in several ways: by the amount and type of information we possess, by the cognitive limits of our minds, and by the finite amount of time we have to make a decision.
By this definition, the rational abilities of planners are extremely limited. Intelligence is never complete, planners (that is, humans) possess internal biases and tendencies, and, of course, there is never enough time to plan. But this construct also allows us to contemplate ways to reduce the negative effects of emotions on decision making. By implementing various controls, planners can push past these default limits.
Such controls can take many forms. A simple one: the OPT Lead can acknowledge, early and frequently, the existence of relevant emotions. She might say something like “We all know our commander’s tendency to prefer COAs with minimal risk to forces, but our guidance is to focus on a swift and decisive military victory. Let’s avoid a self-fulfilling prophecy by preventing our concern for risk from limiting our COAs in terms of initiative, dominance, and rapid phase transition.” A control might also be more subtle, like designating a team member to be a devil’s advocate and requiring that he critique each aspect of planning through an independent and reason-based lens.
Another way to limit the ill effects of emotion on decision making is to ensure the staff is given clear and detailed guidance. For example, a transition point must be universally understood: “X, Y, and Z must be achieved before the next operational phase can begin.” Formulating such guidance, of course, requires knowing exactly what the commander wants.
Here are more examples, drawn specifically from military planning processes, of the ways subjectivity impinges on planning:
• Determining the validity of courses of action: Joint planning doctrine states that COAs must be suitable, feasible, acceptable, distinguishable, and complete. All these measures are biased to some degree, but determining acceptability is the most subjective. There is a relevant definition in the U.S. Navy Planning Process: in order for a COA to be deemed acceptable, the operational advantage gained by executing the COA must justify the cost in resources, friendly losses, time, position, and opportunity. Still, what is acceptable for some won’t be for others. To reduce subjectivity, define those costs and advantages as concretely as possible before judging acceptability.
• Comparing COAs: After validating a set of COAs, the team must compare their relative advantages and disadvantages. Under joint planning doctrine, the OPT designs a decision matrix: a table with subjectively chosen and subjectively weighted governing factors. To reduce subjectivity, the team should repeatedly consider and refine their perceived governing factors through constant communication with the command element. If those factors are important to the commander, they must be important to the staff as well, and therefore weighted more heavily than other factors. The commander has the right to be subjective; the staff does not.
• Assigning measures of effectiveness for tasks: If there is any doubt as to how to define progress toward task achievement, rather than debate metrics, planners should consult the 1,100-page Universal Joint Task List Database.
Finally, the commander’s orders — the OPT’s final deliverable — must wrestle with the dichotomy of telling subordinate units what to do and not how to do it. Joint Pub 5-0 codifies what good leaders have long known: that the best plans allow subordinates the freedom to create their own means of achieving the commander’s end state. In effect, planners must leave enough room for subjectivity, but not too much.
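The decision matrix used in COA comparison reduces to simple weighted arithmetic: score each COA against each governing factor, multiply by the commander-approved weights, and sum. A minimal sketch in Python follows; the governing factors, weights, and scores here are invented for illustration, not doctrinal values:

```python
# Illustrative COA decision matrix. The factors, weights, and scores
# below are hypothetical examples, not values from doctrine.

# Commander-approved governing factors and weights (higher = more important)
weights = {"speed": 3, "force protection": 2, "simplicity": 1}

# Staff scores for each COA against each factor (1 = worst, 5 = best)
scores = {
    "COA 1": {"speed": 4, "force protection": 2, "simplicity": 5},
    "COA 2": {"speed": 3, "force protection": 4, "simplicity": 3},
}

def weighted_total(coa_scores, weights):
    """Sum of (score x weight) across all governing factors."""
    return sum(coa_scores[factor] * weights[factor] for factor in weights)

totals = {coa: weighted_total(s, weights) for coa, s in scores.items()}
best = max(totals, key=totals.get)

for coa, total in totals.items():
    print(f"{coa}: {total}")
print(f"Highest-scoring COA: {best}")
```

Note that shifting weight toward a different factor (say, raising the weight on force protection) can flip the result entirely, which is why the governing factors and their weights must reflect the commander’s priorities rather than the staff’s.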
Plan ahead for planning
Better than countering the negative effects of irrational planning as they arise is ensuring they don’t arise. Most senior military leaders have learned to avoid common mistakes on planning teams: don’t pull rank, don’t marginalize or alienate teammates, etc. But a comprehensive defense against irrationality requires a commander to put proper avoidance controls in place before planning starts.
Here are some common problems, with examples and possible solutions:
• The framing effect. Decision makers can be sensitive to irrelevant differences in how information is presented. During COA Comparison, for example, a briefer who casts a certain COA in terms of gains and a competing COA in terms of losses will often lead the former to be selected, even when the latter is objectively superior. To avoid this, a commander should remind the OPT to view all facets (especially risk assessments) of COA formation through the same lens, to brief all COAs consistently and describe their gains and losses in the same way, and to ensure that governing factors are the same for each COA assessment. Lean on doctrine and follow proven processes; research shows that more analytical or systematic thinkers make fewer framing errors.
• Sunk-cost bias. The world of Pentagon acquisition is notorious for its inability to ignore past investments when making a decision, but other kinds of planning are just as susceptible. For example, planners might reaffirm an initial Center of Gravity determination despite contrary indicators, thus creating an improper context for evaluations of critical capabilities and weaknesses. One antidote is to prearrange to have an outsider observe the OPT at intervals to ensure that planning is focused on future cost and benefit only. If evidence of sunk-cost bias arises, the OPT should accept responsibility for planning mistakes and not allow pride to prevent changing course.
• Availability heuristic. This means assessing the probability of an event by the ease with which instances and occurrences can be brought to mind. For example, planners might come to believe that a certain enemy COA is more likely solely because it can be most easily imagined. Or they might base their notion of an enemy’s strategy on recent precedent or personal experience in the face of facts and valid planning assumptions that indicate otherwise. Certainly, history can provide context when assessing potential enemy COAs, but before wargaming, the OPT lead should caution the team to focus on data specific to the current situation, particularly empirical evidence such as corroborated intelligence reporting and relative combat power analyses.
• Anchoring bias. This inability to adjust is what happens when planners, say, base their efforts on a standing OPLAN or some other directive even after the commander modifies his or her intent. To avoid this, don’t plan to an old plan. Keep operational plans on the shelf, but consider them notional; they cannot possibly be the perfect fit (or necessarily even a good fit) for specific situations without serious modifications. (This is why Eisenhower said that plans are worthless, but planning is indispensable.) Instead, start planning with higher headquarters’ intent and commander’s guidance, as received at the Mission Analysis Brief, the COA Decision Brief, and wherever else possible.
• Confirmation bias. Related to sunk-cost bias, this is the predisposition toward a particular outcome simply because it is believed to be the eventual one. In economic theory, estimates focused on a single possibility are called “single-outcome forecasting.” In military planning, one might, for example, identify the discovery and securing of WMD as a strategic end state, refuse to consider possible alternatives, and thus drive the purpose and method for all supporting operational planning in counterproductive directions. To counter this, the OPT must plan in a flexible way such that branch plans (with divergent end states) can be easily activated.
• Finally, there is status quo bias, the tendency for people to resist change even in the face of an apparent need for it. One reason for this is a reluctance to rock the boat. Most humans weight errors of commission far more heavily than errors of omission; that is, a flawed action is perceived as being far more damaging than the omission of a corrective action. Harvard economist Richard Zeckhauser says this can lead to “herding,” in which team members keep quiet and play wait-and-see rather than voice an objection about a merely potential disaster. Heidi Gardner, another Harvard professor, recalls that when she was at McKinsey & Co., the consulting firm’s standing rule was that anytime an employee did not agree with an assessment or remedy, he or she had “an obligation to dissent.”
That’s usually not easy to do, and it is even more difficult if one is relatively junior on the planning team. But an OPT lead cannot tolerate herding. He or she must create a dissent-friendly planning environment. Members should challenge norms, assumptions, and the appearance of groupthink mentality. They must keep open minds and search for outside-the-box alternatives. The extended discussions spawned by these challenges and searches will either prevent the team from continuing down an ineffective path or validate previous planning efforts; in either case they are helpful.
Conclusion
Operational planners are given a desired end state and work backward in a rational manner by using doctrine, precedent, situational data, and more. We owe it to our commanders to inject some level of emotional intelligence into planning the critical courses of action they will direct. This requires a deep understanding of attitudes, opinions, norms and social tendencies that quietly nudge us all into making certain decisions in certain situations. As trusted planners, we must see those factors in advance and acknowledge their power as we plan. We must be aware of our personal predispositions, biases and tendencies and we must not allow them to shape our planning efforts.
If we implement the proper controls before and during planning, the commander can be assured of the most rational planning possible. Moreover, some of us will one day receive recommendations from our own planning staffs. Understanding the effects of emotion as a staff officer will improve our own judgment in the face of great peril and great opportunity.
Since 2009, Cmdr. Tony Schwarz, USN, has been a Navy Reserve Operational Planner with the Maritime Planning Group on staff at U.S. Pacific Fleet Headquarters. A Naval Academy undergraduate and helicopter pilot, Schwarz has over 50 war-time flight hours and has served tours at the White House, State Dept and Pentagon.