
The Case of Kabul: Foresight and the Hindsight Bias Trap

The chaotic evacuation of US and NATO forces from Afghanistan has generated a storm of speculation—some informed and some not—about what led to the crisis in Kabul and how it could have been prevented. In this special edition of the Analytic Insider, I collaborate with my German colleague, Ole Donner, to review how Foresight and Hindsight analytic processes are both used, with varying success, to assess what occurred in Kabul and why.

Many commentators appear to have fallen into the trap of conflating what is known today with what could have been anticipated before Kabul fell to the Taliban. The neurological processes for anticipating what is about to happen and for evaluating what has happened are quite different.

Our brain consists of billions of neurons and the connections between them. When we learn something, it changes which neurons are connected to each other and how. This physical change in the brain is referred to as neuronal plasticity. The process takes place unconsciously and irreversibly changes the way we think about an event.

When using Foresight analysis to anticipate what is about to happen, the objective is to:

  1. Identify a set of key drivers (or key variables) that best frame the issue and will determine how events will play out.
  2. Give different weights to these drivers to generate a set of mutually exclusive but comprehensive scenarios or potential trajectories.
  3. Develop a set of indicators to alert decisionmakers to which scenario or alternative trajectory is beginning to unfold.

For example, in the weeks before Afghanistan fell to the Taliban, many were asking the question: When will the Taliban take Kabul? Answering this question required analysts to select pieces of information from a sea of incomplete and contradictory data and then formulate a set of potential scenarios, even though much of the data needed to conduct the analysis was missing.

In the Kabul case, the scenarios most often discussed in the press posited different time estimates (weeks, months, and years) for when the Taliban would gain control of Kabul. Instead of trying to predict a date when the Taliban would take over (an almost impossible task at the time), a better approach would have been to identify the key drivers that would determine when a takeover would occur. Some examples of key drivers are: the will of government leaders and Afghan soldiers to resist the Taliban, the extent of popular support for the government and the Taliban, prospects for installing a transition government, and US willingness to increase or decrease its military footprint.
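
To make the first two steps of the Foresight process more concrete, the sketch below represents key drivers and scenario weightings as simple data structures. It is a minimal illustration in Python; the driver names, scenario labels, and weights are hypothetical stand-ins, not an actual assessment of the Kabul situation or a tool used in the analysis.

```python
# Purely illustrative sketch: the driver names, scenario labels, and weights
# below are hypothetical, not an actual assessment of the Kabul situation.
from dataclasses import dataclass


@dataclass
class Driver:
    name: str
    weight: float  # relative influence this driver is assumed to exert


# Each scenario weights the same set of key drivers differently,
# producing distinct potential trajectories.
scenarios = {
    "rapid_collapse": [
        Driver("will_to_resist", 0.1),
        Driver("popular_support_for_government", 0.2),
        Driver("transition_government_prospects", 0.1),
        Driver("us_military_footprint", 0.1),
    ],
    "negotiated_transition": [
        Driver("will_to_resist", 0.4),
        Driver("popular_support_for_government", 0.5),
        Driver("transition_government_prospects", 0.8),
        Driver("us_military_footprint", 0.3),
    ],
    "prolonged_stalemate": [
        Driver("will_to_resist", 0.7),
        Driver("popular_support_for_government", 0.6),
        Driver("transition_government_prospects", 0.3),
        Driver("us_military_footprint", 0.7),
    ],
}
```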

As the situation became increasingly fluid, a primary analytic function was to track the indicators relating to each of these key drivers and alert decisionmakers to which scenario or alternative trajectory was emerging as the most likely. For example, one indicator that there would be sufficient time to evacuate Americans and Afghans would have been the successful establishment of a transition government.
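
Continuing the illustrative sketch above, the third step (tracking indicators and alerting decisionmakers to the scenario that is emerging) can be pictured as a simple tally of observed indicators against the scenarios they point toward. The indicator names are likewise invented for the example.

```python
# Purely illustrative: hypothetical indicators mapped to the hypothetical
# scenario labels defined in the previous sketch.
indicators_by_scenario = {
    "transition_government_announced": "negotiated_transition",
    "provincial_capitals_fall_without_resistance": "rapid_collapse",
    "army_units_negotiate_local_surrenders": "rapid_collapse",
    "front_lines_hold_around_major_cities": "prolonged_stalemate",
}


def emerging_scenario(observed: list[str]) -> str:
    """Tally observed indicators per scenario and report the one gaining support."""
    tallies: dict[str, int] = {}
    for indicator in observed:
        scenario = indicators_by_scenario.get(indicator)
        if scenario:
            tallies[scenario] = tallies.get(scenario, 0) + 1
    return max(tallies, key=tallies.get) if tallies else "no clear signal yet"


# Example: two observed indicators both point toward the rapid-collapse scenario.
print(emerging_scenario(["provincial_capitals_fall_without_resistance",
                         "army_units_negotiate_local_surrenders"]))
```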

Hindsight is a very different matter. If a certain event already has occurred, such as the fall of Kabul to the Taliban, that changes how we—as analysts, journalists, or politicians—look at that event. The challenge is to avoid the cognitive trap of Hindsight Bias—or at least mitigate its impact.

Hindsight Bias: Claiming that the key items of information, events, drivers, forces, or factors that shaped a future outcome could have been easily identified.

Information related to the event, which may have seemed less important than other items in a Foresight analysis, suddenly takes on enormous significance because—and only in retrospect—it clearly pointed to the event that occurred. Before the event occurred, this same information was just part of the background noise in a sea of other information.

The tendency to retroactively give undue weight to some items of information is the essential difference between how the brain processes data in Foresight versus Hindsight. After we know about an event, it is not possible to put ourselves in the same cognitive state we experienced before it occurred. The physical structure of our brain is changed by what we have just learned has occurred, and this neuronal plasticity cannot be reset.

The Hindsight Bias cognitive pitfall helps explain why some observers claim that the event behind an apparent “intelligence failure,” in this case the rapid fall of Kabul, should have been easier to predict than it actually was. Commentators and politicians who understand the difference between Foresight and Hindsight know to exercise restraint when assessing how well analysts predicted a complex event or anticipated a surprise development. They recognize that Hindsight Bias is one of the most important cognitive pitfalls to protect against.

For more information on Hindsight Bias and other cognitive biases, see Richards J. Heuer Jr.’s Psychology of Intelligence Analysis (2007), page 161. To learn more about Key Drivers and Foresight techniques, check out my Handbook of Analytic Tools & Techniques, 5th ed. (2019).
