A complex world demands a different kind of decision-making
As the insurgency in Iraq gathered strength in late 2003, leaders struggled to understand the cause of the unexpected, widespread violence.
One of the earliest explanations that seemed to make sense was that in firing Baathists and disbanding the Iraqi Army, we had stripped so many Sunnis of their means of making a living that the insurgency, while fueled by al-Qaeda affiliates, was supported by locals as one of the few paying options. This theory, while sensible from a Western point of view, was far more simplistic than reality. But once the theory became dominant, it was hard to abandon. It colored senior leaders’ decisions until the Sons of Iraq and the Awakening councils provided not only employment but, vitally, dignity and honor to those Sunnis who had been enabling the insurgency.
We attributed the problem to a direct cause because we had trained ourselves to think that way over the preceding decades.
This notion that there are specific, knowable causes linked to corresponding effects dominates military thinking and manifests in our drive to gather as much information as possible before acting. The concept was captured by Air Force Col. John Boyd’s decision loop: observe, orient, decide and act. The OODA Loop is an endless cycle in which each action restarts the observe phase, and it implies that collecting information will allow you to decide independently of acting. It also implies that you can determine measures of effectiveness that let you observe whether each action moves you toward your goal, so you can reorient. The result of this kind of thinking is that we spend a lot of time narrowing the focus of what we choose to observe in order to better orient and decide. That drives us to try to reduce the noise associated with understanding the problem, which we do by establishing priority intelligence requirements and other focused questions aimed at understanding the core problem well enough to control it.
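As a thought experiment, the loop’s implicit logic can be sketched in a few lines of code. This is a minimal, self-contained sketch; the functions and the toy “environment” are hypothetical stand-ins for staff processes, not any real planning system.

```python
# A minimal sketch of the classic OODA cycle as described above.
# All names here are illustrative stand-ins, not a real system.

def observe(env):
    # Gather as much information as possible before acting.
    return dict(env)

def orient(picture):
    # Narrow the focus of what we observe; filter out the "noise."
    return {k: v for k, v in picture.items() if k == "insurgent_activity"}

def decide(model):
    # Commit to a course of action before acting.
    return "patrol" if model["insurgent_activity"] > 0 else "hold"

def act(env, decision):
    # Each action restarts the observe phase.
    if decision == "patrol":
        env["insurgent_activity"] -= 1

env = {"insurgent_activity": 3, "unemployment": 0.4}
while env["insurgent_activity"] > 0:
    act(env, decide(orient(observe(env))))
print(env)
```

Note how the structure assumes the environment holds still long enough to be observed, modeled and decided upon before anyone acts; that assumption is exactly what complexity breaks.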
The unemployment theory fit our understanding of the problem and, while incorrect, was coherent with an OODA Loop approach. We observed lots of unemployed Sunnis in the streets and knew that the same cohort provided manpower for the insurgency. We oriented to the reality that coalition decisions had put them out of work. We decided that public-works projects would give them employment and take them off the streets. We acted by spending huge amounts of money on projects that were largely ineffectual in fixing the infrastructure or reducing the insurgency. Our mistake was in thinking that a fundamentally complex problem, one with so many seen and unseen variables that there is no longer any direct correlation between action and outcome, was merely a complicated one, with direct linkages between cause and effect.
This transition from complicated to complex manifests in every major problem in the world today. Thanks to cellphones, television, the Web and more, change comes faster and the number of actors impinging on any situation approaches the infinite. The recent implosion of dictatorships in the Middle East is a prime example: an impoverished man immolates himself to protest his inability to feed his family, setting a spark that touches off rebellions and protests throughout the Arab world.
Army Gen. Martin Dempsey, the new chairman of the Joint Chiefs, wrote in Armed Forces Journal in March that Army officers “must be comfortable with ambiguity and be able to provide advice and make decisions with less, not more, information,” and that officers must recognize that “the complexity of problems will increase over the course of an officer’s career and require strategic leaders to develop greater sophistication of thought.” Dempsey’s use of the word “complexity” is apropos. Decision-making in a complex world requires a different set of tools than those of a merely complicated one.
The following example illustrates the difference. Of the engagements U.S. ground forces trained for during the Cold War, the toughest was to “cross an obstacle covered by fire.” Recall the challenges of this operation. The friendly force has to cross ground sown with minefields, anti-tank ditches, wire and other obstacles. The enemy has surveyed the area for precise artillery fire, keeps it under direct observation, and can call on direct and indirect fire and perhaps even reserves prepared to counterattack. As challenging as this sounds, it is fundamentally a complicated problem, not a complex one. It is ruled by the laws of physics.
To solve the problem, we massed critical forces and combat enablers in time and space to win the physics of the problem: obscurants to defeat the enemy’s ability to see, deep fires to take out his indirect fire, obstacle-clearing demolitions, and our own aerially delivered obstacles dropped to hinder the movement of reinforcements. The former Soviet Union so embraced this philosophy of warfare that it developed an entire science around it: COFM, for correlation of forces and means.
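In its simplest form, a correlation-of-forces calculation reduces to a weighted force ratio. The expression below is an illustrative sketch, not the full Soviet methodology; the unit counts and combat-potential weights are hypothetical planning inputs.

```latex
% Illustrative sketch of a correlation-of-forces ratio (hypothetical weights)
\[
\mathrm{COFM} \;=\;
\frac{\sum_{i} n_i^{\text{blue}} \, w_i}{\sum_{j} n_j^{\text{red}} \, w_j}
\]
```

Here each n counts units of a given type and each w is the combat-potential weight assigned to that type. A planner might require, say, a 3:1 advantage at the point of breach before committing to the crossing. The point is that the whole problem can be expressed, and won, arithmetically.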
Compare this with delivering security, assisting with the development of a functioning government, or changing the cultural role of the military inside a foreign country. These missions are not governed by the laws of physics. Rather, they are affected by the actions of every individual in the area of operations, any one of whom may choose to do something completely unanticipated, like the Tunisian man whose desperate act of protest was captured on video.
In an era of complexity, decision-makers must beware of falling in love with their understanding of the problem. This can be difficult; we have become used to spending enormous energy trying to understand risk-laden situations. But the metrics we develop often give us only the illusion of precision: analysis matrices built on subjective measures and biases that get lost in the math.
A truly complex problem is completely opaque; we have little idea what will happen until we act. Complex problems evolve rapidly, their actors continuing to learn and change, enabled by the speed of information and a global support network. As Dempsey pointed out, these situations require officers to make decisions and give advice based on less, not more, information. This requires a leap of faith; we must act without expecting to really know what will happen. We must, in effect, discard the OODA Loop: act first, then observe, then act again, without ever actually deciding that we know enough.
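To make the contrast with the earlier sketch concrete, the inverted cycle might look like the following. Again, the probe-and-response model is a hypothetical toy, not a real system; the random response stands in for an environment whose reaction we cannot predict.

```python
# A contrasting sketch: act first with small probes, observe the response,
# and keep adjusting without ever concluding that we "know enough."

import random

random.seed(1)

def probe(env):
    # Act first: a small, recoverable step whose effect we cannot predict.
    env["tension"] += random.choice([-1, 0, 1])

def observe(env):
    # Observe only after acting; the system reveals itself through its response.
    return env["tension"]

env = {"tension": 5}
history = [observe(env)]
for _ in range(10):
    probe(env)
    history.append(observe(env))
    # Reorient continuously: posture stays revisable, never a final decision.
    posture = "de-escalate" if history[-1] > history[0] else "continue"
print(history, posture)
```

The structural difference is where the loop begins and what it never does: it never terminates on a settled model of the problem.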
ENTER THE RED TEAM
One method for coping with decision-making in the era of complexity is to invest in “Red Teams.” In a military context, “red” generally implies something associated with the enemy. In this case, it means a group independent of “blue” — everyone else associated with addressing the complex problem.
Properly trained, Red Teams provide two critical elements. First, they can look at problems from a decidedly different perspective than the rest of the staff. If everyone else is looking at how to achieve the desired outcome, the Red Team examines how to avoid the worst possible outcome. While the rest of the team looks at the equities of the major players in the operational environment, the Red Team examines the lesser actors, how they would see events and how they might affect things from the periphery. Second, they create space inside the staff for slowing down, questioning common wisdom and giving a charter to an in-house skeptic.
Pre-mortem analysis is among the more useful tools in the Red Team’s kit. If the standard military decision-making process is designed to maximize outcome, pre-mortem analysis is designed to minimize risk. As the staff develops a plan to achieve a desired end state, the Red Team envisions the worst-case future: it describes the nightmare scenario in detail, lays out the events that led to it, then examines the plan to see how well it heads off those events. Invariably, this leads the staff to see things it otherwise would not.
Such analysis was used to develop U.S. Forces-Iraq Order 11-01, which directed the handover of operations to the Iraqis and the maneuver of U.S. forces out of Iraq. Taking the time to highlight potential causes of failure made the discourse and analysis more rigorous.
Another key tool is Four Ways of Seeing, a simple methodology designed to highlight bias. Given a situation, the team picks two protagonists and describes four perspectives: how each actor sees itself and its role (how X sees X, how Y sees Y) and how each sees the other (how X sees Y, how Y sees X). This helps identify sources of friction; that is, the ways each protagonist’s self-image differs dramatically from how his adversary views him.
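The method’s output is essentially a two-by-two matrix of (viewer, viewed) perspectives. The toy sketch below shows only the structure; the actors and narratives are hypothetical illustrations, not drawn from any real assessment.

```python
# Four Ways of Seeing as a 2x2 of perspectives: (viewer, viewed) -> narrative.
# All actors and entries are hypothetical illustrations.

four_ways = {
    ("X", "X"): "We are the legitimate defenders of our community.",
    ("Y", "Y"): "We are restoring order and basic services.",
    ("X", "Y"): "They are occupiers propping up our rivals.",
    ("Y", "X"): "They are spoilers blocking reconstruction.",
}

# Friction lives where a protagonist's self-image diverges from how his
# adversary sees him.
for actor, other in (("X", "Y"), ("Y", "X")):
    self_view = four_ways[(actor, actor)]
    adversary_view = four_ways[(other, actor)]
    print(f"{actor}: self-image '{self_view}' vs. adversary view '{adversary_view}'")
```

Laying the four cells side by side is what surfaces the friction; any single cell, read alone, looks perfectly coherent.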
Because complex problems deal with human dynamics, trying to get inside the collective mind of key interest groups provides insights that a capabilities-based assessment will not. In 2008, the 25th Infantry Division asked the University of Foreign Military and Cultural Studies, the Army’s Red Team training organization, to help the staff prepare to deploy to Iraq and take control of Multi-National Division-North. The chief of staff asked the Red Team, “Where are those who want success and where are those who want failure?” The team used the planning documents produced by the division, interviews with the staff and information available through classified and open sources to build a four-ways perspective that helped answer the question.
Many reading this may question whether they need formal Red Teams: “I’ve had [three, four, five] deployments. I know we have to engage with other cultures. I know problems are complex. I encourage imaginative thinking. Why do I need a Red Team?”
The answer is that we are captured by our own experiences. We need continual help to recognize that what we experienced yesterday may not have bearing on what we are experiencing today.
To cope with time pressures, we use mental tricks: best practices; analogies between the current situation and our experience or training; and numerous stated and unstated biases about ourselves, other actors and the environment. These shortcuts are handy when our problem is one that understanding and physics can solve.
But when the problem is complex, these shortcuts delude us into believing we know more about the situation than we actually do. Analogies are always imperfect: “No man crosses the same stream twice” because both the man and the stream have changed. Best practices are extraordinarily dangerous if applied to an environment that has radically changed.
Red Teams have the charter and the time to think about problems in a way the rest of the staff does not. Their raison d’être is to identify how the current situation is different from our analogized version.
Most importantly, the Red Team has emotional distance from the plan. Once a staff has invested in developing a plan based on a mental model of the operational environment, it is very difficult to abandon that understanding, even in the face of overwhelming evidence that it is no longer true. Recall how long the Defense Department prohibited the use of the word “insurgency” to describe what was happening in Iraq in 2003-2006.
More generally, adding a Red Team requires leaders and staff to have the humility to admit not knowing, and to be open to information from the edge and from outsiders. The hierarchy of military organizations reinforces the notion of the leader as the “smartest person” in the room, and in an era of complicated problems, the commander confronting the obstacle was, in fact, likely to have faced such situations more often than his or her subordinates. But no one, however senior, has any more experience than anyone else in dealing with a complex problem that morphs in unexpected and unpredictable ways.
This kind of culture change requires long-term investment. It requires building Red Teams inside Army formations, creating tools that help bring alternative perspectives forward and training team members to use them.
The composition of the student and instructor cadres is critical to red teaming. Not everyone is suited to serve. Those with an intrinsic ability to think critically and challenge conventional wisdom are good candidates, but successful red-teamers must also communicate well and have bona fides in the organization in which they will work.
Decision-making in the 21st century will take place under conditions of ambiguity and hyperspeed information flow: in a word, complexity. Red Teams are not the only element of change required to cope with complexity. Across the force, we need to embrace abductive reasoning instead of inductive reasoning, free ourselves from the belief that quantitative metrics are the only valid way to measure progress, and value the fingerspitzengefuehl of leaders at the edge.
Critical thinkers require education and experience. Commanders require critical thinkers who can challenge assumptions and offer alternative perspectives. Red Teams educated to support decision-making can contribute to more nuanced decisions and serve to inculcate critical thinking across the force.
Design, better visualization of the battlefield and other improvements to the commander’s and staff’s understanding of the operational environment are not substitutes for teams of trained and educated contrarians on the staff.