Proving Analysts Wrong – Part IV

Do you have trouble admitting you are wrong? Or convincing a colleague that his or her analysis is incorrect? Most of us find these tasks challenging because our egos are involved and we usually focus our attention on information that supports our view. This issue of The Analytic Insider presents Analysis of Competing Hypotheses, the fourth and last of a select group of structured analytic techniques—including Indicators, Argument Mapping, and Deception Detection—that can spur analysts to admit their initial analysis was flawed and to work toward a better result.

Technique #4: Analysis of Competing Hypotheses

As political campaigns heat up in advance of the US national elections in November, the airwaves are filled with competing interpretations of past events. Our natural tendency is to weigh the facts and explanations one candidate offers against the other candidate's list and agree with whoever has made the stronger case. A more thorough and efficient approach is to focus instead on how many facts or explanations are inconsistent with a candidate's position. Sometimes a single, highly persuasive item of inconsistent information is sufficient to reject an entire line of analysis. When confronted with an item of information or piece of evidence that could not be true if his or her stated position is correct, most analysts (but probably not most politicians) will agree that the initial analysis is flawed and a new hypothesis is needed. More often than not, however, the contrary information is less compelling. In such cases, it is more efficient to focus on what is inconsistent with each hypothesis and embrace the position supported by the least inconsistent data.

The good news is that technology is making it easier to identify compelling contrary information. Mobile phone pictures, videos, and audio recordings, for example, can provide incontrovertible evidence of what was said or done. The key to being a successful analyst is to consciously seek out and record inconsistent or anomalous data and to consider more than one explanation as potentially true until all the data and facts are on the table.
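
To make this inconsistency-focused logic concrete, below is a minimal sketch of how an ACH-style matrix can be scored in code. The hypotheses, evidence items, and ratings are hypothetical placeholders, and counting "I" ratings is a simplification of the full eight-step method described in the Handbook.

```python
# Minimal Analysis of Competing Hypotheses (ACH) sketch.
# Each item of evidence is rated against each hypothesis:
#   "C" = consistent, "I" = inconsistent, "N" = not applicable.
# Per the technique, hypotheses are ranked by how LITTLE evidence
# contradicts them, not by how much supports them.

# Hypothetical hypotheses and ratings, for illustration only.
hypotheses = ["H1: rival campaign", "H2: rogue hackers", "H3: state actor"]

evidence = {
    "timing just before convention": ["I", "N", "C"],
    "sophistication of intrusion":   ["I", "I", "C"],
    "documents partially doctored":  ["C", "C", "C"],
}

def inconsistency_score(h: int) -> int:
    """Count evidence items rated inconsistent with hypothesis h."""
    return sum(1 for ratings in evidence.values() if ratings[h] == "I")

# The hypothesis with the fewest inconsistencies survives best.
for h in sorted(range(len(hypotheses)), key=inconsistency_score):
    print(f"{hypotheses[h]}: {inconsistency_score(h)} inconsistent item(s)")
```

Note that the ranking deliberately ignores how much evidence is consistent with each hypothesis; only inconsistencies drive the ordering, mirroring the logic described above.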

To learn more about how the eight-step Analysis of Competing Hypotheses methodology works, check out the Handbook of Analytic Tools and Techniques. For a fuller discussion of when to use the technique, its value added, and potential pitfalls to avoid, see Chapter 7 in Structured Analytic Techniques for Intelligence Analysis.

Proving Analysts Wrong – Part III

Do you have trouble admitting when you are wrong? Most of us do because it is hard to admit we have made a mistake. We have a natural tendency to accept information we read or hear as correct, assuming it comes from an authoritative source. We are particularly prone to accept information when what we hear is consistent with our world view. Those who want to manipulate how we think understand this tendency, and they will supply deceptive information in the hope of changing our behavior. This issue of The Analytic Insider presents Detecting Deception, the third of a select group of analytic techniques, including Indicators and Argument Mapping, that can spur analysts to admit they were wrong to believe that what they first heard was true.

Technique #3: Detecting Deception

Who hacked the Democratic National Committee server? Was it a member of a rival political organization? Rogue hackers in Berlin? Russia? China? Hackers employed by WikiLeaks? Ukrainian cyber hackers? Are these real messages? Why did this information surface only days before the Democratic National Convention? When might more hacked messages appear in the media?

If a deception campaign is underway, how would you know who is conducting it? How strong is the case that Russia is behind it and not just some rogue hackers? In the Handbook of Analytic Tools and Techniques, we offer the following signs that address these questions; they will help you assess whether someone is conducting a deception campaign.

Baseline conditions

Does the potential deceiver have:

  1. A history of conducting deception campaigns? (Russia is one of the most notorious users of deception.)
  2. A feedback channel that would allow it to track how the information is being processed and to what effect? (This has become a major consequence of the 24/7 media environment.)
  3. A great deal to lose or gain depending on the outcome? (Such as the integrity of democratic institutions and the US electoral process.)

Tactical tipoffs

Did key information enter the public domain:

  1. At a critical time? (For example, just before a national convention or just before it is time to vote.)
  2. From a source whose bona fides are questionable? (WikiLeaks.)
  3. From a single source or based on a few attention-grabbing documents?

Would accepting the information as true prompt you to:

  1. Alter a key assumption—something you previously had considered common wisdom?
  2. Reallocate substantial resources or reposition substantial personnel?

On 6 September, The Washington Post reported that senior US intelligence officials have recently concluded that Russia may well be engaging in a covert operation during the US presidential race to sow public distrust of US political institutions and disrupt the US electoral process. (Read the article here.) A lead indicator that Russia may have launched such a campaign was the hacking and release of Democratic National Committee emails, some of which had been doctored, just days before the Democratic Party Convention. Officials speculate that several motives are possible, including the desire to 1) influence US elections and 2) generate propaganda fodder to counter US democracy-building initiatives around the world, particularly in the countries of the former Soviet Union.

To learn more about how to evaluate whether a report is deceptive, employ the Deception Detection Checklist (included in the Handbook of Analytic Tools & Techniques), which consists of 18 questions focusing on the following categories (a short sketch of how such a checklist can be tallied appears after the list):

  • MOM – Motive, Opportunity, and Means
  • POP – Past Opposition Practices
  • MOSES – Manipulability Of SourcES
  • EVE – EValuation of Evidence
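
As an illustration only, the warning signs above can be treated as data: each affirmative answer raises a flag, and enough flags should prompt a closer look. The questions below paraphrase the Baseline and Tactical lists above; the answers, the flag threshold, and the structure are assumptions for this sketch, not part of the published 18-question checklist.

```python
# Sketch of a deception-warning tally using the baseline and tactical
# questions above. Answers for a hypothetical case are True (yes) / False (no).

checklist = {
    "Baseline": [
        ("History of conducting deception campaigns?", True),
        ("Feedback channel to track how the information is processed?", True),
        ("A great deal to lose or gain from the outcome?", True),
    ],
    "Tactical": [
        ("Key information surfaced at a critical time?", True),
        ("Source whose bona fides are questionable?", True),
        ("Single source or only a few attention-grabbing documents?", False),
    ],
}

flags = sum(answer for section in checklist.values() for _, answer in section)
total = sum(len(section) for section in checklist.values())

print(f"{flags} of {total} deception flags raised")
if flags >= total / 2:  # illustrative threshold, not from the Handbook
    print("Pause and ask yourself: am I being deceived?")
```

A tally like this does not prove deception; it simply tells you when to pause and apply the full MOM/POP/MOSES/EVE review.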

Remember, just becoming sensitive to the possibility of deception usually will not protect you from being deceived. It is simply too hard cognitively to challenge every item of information you read or hear. A better strategy is to train yourself to recognize when you are most susceptible to deception. When one of these flags goes up, pause to ask yourself “Am I being deceived?”

Proving Analysts Wrong – Part II

Do you have trouble admitting when you’re wrong?

Most of us do because it is hard to admit we have made a mistake. Once we have come to a conclusion (like which political candidate to support), we tend to accept data that supports our view and ignore data that would undercut that decision. We fall into the traps of Confirmation Bias, Ignoring Inconsistent Evidence, Relying on First Impressions, and the Anchoring Effect. Structured Analytic Techniques (SATs) are designed to save us from these pitfalls. This issue of The Analytic Insider presents Argument Mapping, the second of a select group of analytic techniques, including Indicators and Analysis of Competing Hypotheses, that can spur analysts to admit they were mistaken in their initial analysis.

Technique #2: Argument Maps

Argument Maps are used to test a single hypothesis through logical reasoning. An Argument Map is a tree diagram that starts out with a lead hypothesis or conclusion and then branches out to show the logic, evidence, and assumptions that support or undermine that hypothesis or conclusion. An Argument Map makes it easy for analysts and, more importantly, decision makers to clarify and organize their thoughts. With the entire map arrayed in front of them, they can more easily and accurately evaluate the soundness of the analysis and their resulting decisions.

Imagine you are engaged in a discussion of whether Brexit makes sense or Obamacare is worthwhile. Those on each side of the issue will marshal piles of evidence and logic to support their view, leaving it to you to decide who has made the stronger case. Each side will emphasize what best supports its view and ignore facts or logic that argue against its position. When the issues are this complex, it becomes very hard to figure out what makes the most sense. This is an ideal setting for a structured technique to help you sort through the thinking process.

An Argument Map forces both sides to put everything on the “same table” in an organized fashion. Each side lists the arguments (or claims) and evidence that support the hypothesis, the claims and evidence that undermine it, rebuttals for all claims, and the assumptions inherent in the argument. With all this information arrayed in the context of a single diagram, both sides can then stand back and evaluate all aspects of the issue. Once all the evidence is arrayed, evaluated, and prioritized, the map will show whether the conclusion is fundamentally supported by the evidence, logic, and assumptions or whether it falls apart.

The construction of an Argument Map makes it much harder for advocates to focus attention only on arguments that best support their position. All evidence must be evaluated within the context of the entire argument. If one side does not agree with the conclusion, they are free to add new data or new logic at the appropriate location in the map or alter initial assumptions.

When a team of analysts constructs an Argument Map, a picture will emerge showing the overall strength or weakness of the initial hypothesis. If the analysts agree to approach the topic with an open mind, a consensus will usually form within the team over what is the proper analytic bottom line. In most cases, analysts will freely admit that some initial positions they took were wrong because the technique forced them to 1) consider new evidence, 2) weigh the significance of all the evidence, 3) discover possible faults in their logic, or 4) recognize that some of their initial assumptions were flawed.
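
Because an Argument Map is a tree, it translates naturally into a nested data structure, as in the minimal sketch below. The Claim class, the tree walk, and the example claims (loosely paraphrasing the yellowcake case discussed next) are all illustrative assumptions, not a prescribed implementation.

```python
from dataclasses import dataclass, field

# Minimal Argument Map sketch: a tree of claims, each carrying the child
# claims/evidence that support or undermine it. Content is illustrative.

@dataclass
class Claim:
    text: str
    supports: list["Claim"] = field(default_factory=list)
    undermines: list["Claim"] = field(default_factory=list)

def leaves(claim: Claim, path: str = "") -> list[str]:
    """Return the leaf claims/evidence: the items to evaluate directly."""
    here = f"{path} > {claim.text}" if path else claim.text
    children = claim.supports + claim.undermines
    if not children:
        return [here]
    out = []
    for child in children:
        out.extend(leaves(child, here))
    return out

conclusion = Claim(
    "Iraq imported yellowcake from Niger",
    supports=[Claim("Documents describe a uranium sale")],
    undermines=[Claim(
        "The documents are forgeries",
        supports=[Claim("Technical analysis of the documents")],
    )],
)

for item in leaves(conclusion):
    print("Evaluate:", item)
```

Walking the tree this way surfaces the ground-level claims and evidence each side must evaluate or rebut, which is exactly the discipline the map imposes on advocates.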

Globalytica’s case study, Iraq WMD: Facts, Fiction, and Yellowcake, uses an Argument Map to show how each argument that supported the conclusion that Iraq had imported yellowcake from Niger can be refuted. Click here to purchase the case study today.

Argument Map:

Is Iraq importing yellowcake from Niger for its nuclear weapons program?


Meet the 2016 IAFIE Instructor of the Year – Mary O’Sullivan

Congratulations to Mary O’Sullivan on being named 2016 Instructor of the Year by the International Association for Intelligence Education (IAFIE). Mary has been an integral part of the Pherson Associates team since 2008, developing and instructing classes that have reached thousands of analysts and students in the intelligence community and beyond.

Mary’s notable achievements include:

  • Leading the design and management of analytic training programs for government audiences as Dean of Pherson Associates’ educational facility, The Forum.
  • Developing and managing Globalytica’s certificate courses, offered internationally and to private industry.
  • Teaching, on average, more than 50 courses per year to government and private sector analysts.
  • Contributing to publications, including development of the CREATE methodology and a chapter in Intelligence Communication in the Digital Era (see below for details).
  • Serving as Chancellor and co-founder of CIA University, with established credentials as a gifted instructor, innovator, and mentor.

We are proud to have Mary O’Sullivan on our team as she continues to make strides in strengthening the analytic workforce. We are honored that Mary’s instructional excellence and innovative curriculum development have been recognized by IAFIE.


Intelligence Communication in the Digital Era: Transforming Security, Defence and Business

By Rubén Arcos and Randolph H. Pherson (Editors). This volume shows how to:

  • Present analysis and customize information for decision makers.
  • Leverage advancements in technology to communicate information differently.
  • Encourage analysts to change the way they work, and provide them with the resources and production technology to do so.
  • Invest in new analytic tradecraft that prioritizes developing producer/consumer relations rather than predicting future events.

FIND OUT MORE

Proving Analysts Wrong – Part I

Do you have trouble admitting when you’re wrong?

Most of us do, and one reason for this reticence is that we are slow to acknowledge the possibility that we have made a mistake. Let’s face it: the only thing worse than being wrong is taking too long to realize it! Mistakes are inevitable, but we have the tools to help you avoid looking like a fool who can’t admit a blunder. Over the next few months, The Analytic Insider will explore a small group of Structured Analytic Techniques that can spur an analyst to admit that his or her analysis is wrong.

Technique #1: Indicators

When an analyst generates a scenario or predicts that a certain event is likely to play out, a best practice is to develop a list of indicators, or “things one would see,” that suggest the scenario is actually emerging. A good analyst will also develop a set of observables indicating the scenario or event is not going to happen. If most of the indicators that point to the scenario start to appear, the analyst will be vindicated in her or his prediction and congratulated for outstanding insight. If, on the other hand, few or none of the indicators come into play and some of the negative indicators start to “light up,” the analyst has little choice but to admit that the analysis is flawed or was undermined by unanticipated intervening events.

If the analyst has not developed a set of indicators to provide an objective baseline for the analysis, he or she most likely will be inclined to stick to the analysis as long as possible, focusing on the data points that confirm the proffered view and ignoring the mounting pile of evidence that contradicts the prediction.

The existence of a pre-determined set of baseline indicators, however, quickly illuminates this mistake. If “we all agreed we would be wrong if these things happen” and they start to happen, only a fool would refuse to admit that the initial analysis is wrong.

In today’s uncertain and increasingly confusing world, analysts (as well as political pundits) should develop lists of indicators to accompany their projections and predictions. At the Dahrendorf Symposium in Berlin last June, we developed five sets of scenarios and associated indicators for how EU relations would evolve with the United States, China, Ukraine, Turkey, and the Middle East (link to the report in Recommended Resources above). The next step will be to track the indicators to see which events are actually occurring, indicating which scenarios are the most apt forecasts for the EU’s future.

Indicators are a pre-established list of observable events that analysts periodically review to track events, spot emerging trends, and warn of unanticipated change.

Indicators can be grouped into two categories:

  • Validating or backward-looking: Past or current actions or activities that help confirm a target’s activities or behavior are consistent with an established pattern or historical norm, and characteristics that help an analyst determine whether something meets a set of criteria.
  • Anticipating or forward-looking: Future activities that one would expect to observe if a given hypothesis is correct or a predicted scenario is emerging. Developments that allow analysts to track how a situation is playing out over time.

The method for developing validating indicators is fairly straightforward. The first two steps usually are done working alone; the third and fourth steps can be an individual or a group process. A sketch of how the last two steps can be scored follows the list.

  1. Precisely define the phenomenon being considered.
  2. Locate an existing list of indicators that best describes this phenomenon (or develop your own based on historical research).
  3. Assess how many of the historically derived indicators are present in the case.
  4. Assess whether the correlation is strong enough (enough indicators are present) to justify applying that label to the current case.
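
Here is a minimal sketch of steps 3 and 4, assuming a historically derived indicator list: count how many indicators are present in the current case, then apply the label only if the match clears a threshold agreed on in advance. The indicator names and the 75 percent threshold are placeholders.

```python
# Sketch of steps 3 and 4: count historically derived indicators present
# in the current case, then decide whether the label applies.
# Indicator names and the threshold are placeholders.

historical_indicators = ["indicator A", "indicator B", "indicator C", "indicator D"]

# Step 3: note which indicators are observed in the current case.
observed_in_case = {"indicator A", "indicator C", "indicator D"}

present = [i for i in historical_indicators if i in observed_in_case]
ratio = len(present) / len(historical_indicators)

# Step 4: apply the label only if the correlation is strong enough.
THRESHOLD = 0.75  # agreed on in advance; an assumption for this sketch
print(f"{len(present)}/{len(historical_indicators)} indicators present ({ratio:.0%})")
print("Label applies:", ratio >= THRESHOLD)
```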

The method for developing anticipating indicators of change is somewhat more involved. The process can be done working alone or in a group, and a sketch of the tracking step appears after the list:

  • Identify a set of alternative scenarios or competing hypotheses.
  • Generate a list of activities, statements, or events that one would expect to observe if the scenario is beginning to emerge or the hypothesis is coming true.
  • Examine the list to ensure that all the indicators are:
    • Observable and collectible. Can it be observed and reported by a reliable source and reliably collected over time?
    • Valid. Is it clearly relevant to the end state the analyst is trying to predict or assess? Does it accurately measure the concept or phenomenon at issue?
    • Reliable. Is data collection consistent when comparable methods are used? Those observing and collecting data must observe the same things. Reliability requires precise definition of the indicators.
    • Stable. Is it useful and consistent over time to allow comparisons and track events?
    • Unique. Does it measure only one thing and, in combination with other indicators, point only to the phenomenon being studied?
  • Periodically check the indicators list to track which scenario appears to be emerging or which hypothesis appears to be most correct.
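
The final tracking step can be as simple as a periodic tally of which indicators have been observed for each scenario, as in this sketch. The scenario names, indicator labels, and observations are invented for illustration.

```python
from collections import Counter

# Sketch of the periodic tracking step: tally which indicators have been
# observed for each scenario. Scenarios and indicators are invented.

scenario_indicators = {
    "Scenario 1": {"ind 1a", "ind 1b", "ind 1c"},
    "Scenario 2": {"ind 2a", "ind 2b", "ind 2c"},
}

# Indicators actually observed during this review period.
observed = {"ind 1a", "ind 1c", "ind 2b"}

tally = Counter({name: len(inds & observed)
                 for name, inds in scenario_indicators.items()})

for name, hits in tally.most_common():
    print(f"{name}: {hits}/{len(scenario_indicators[name])} indicators observed")
```

Rerunning the tally each review period shows which scenario is accumulating evidence over time.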

School’s Out – What’s Next? You Decide!

The end of June brings the last day of school and months of new activities. Family members’ various opinions and schedules can bring complexity to even the most basic decisions. Fortunately, we have some Decision Support Techniques to help you with your summer planning!

We know making decisions is hard when they involve tradeoffs among competing goals, values, or preferences. Because of the limitations of short-term memory, we often struggle to evaluate rigorously all aspects of a problem before deciding what to do. One way to deal with this challenge is to use simple decision-support techniques that lay out the options in graphic form so that you can evaluate the results of alternative solutions while keeping the problem as a whole in view. The technique you choose should depend on the type of problem you are confronting. In Structured Analytic Techniques for Intelligence Analysis, we describe seven techniques that you can use at work, but just as easily at home. In fact, many of our students find that the techniques work surprisingly well when dealing with real-life problems like which car to buy, how best to explain a new rule, or even whom to date!

Below are some everyday scenarios you may face this summer, paired with the tools that can help you make your decision.

Globalytica’s Decision Support Techniques

Decision Trees are a simple way to chart the range of available options, estimate the likelihood of each one, and weigh alternatives. Use this technique to chart various vacation activities.
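
As a hypothetical illustration, a vacation decision tree can be reduced to options with estimated probabilities and payoffs, compared by expected value. The options, probabilities, and 0-10 payoff scores below are invented, and real Decision Trees can branch much more deeply.

```python
# Decision Tree sketch: compare options by expected value.
# Probabilities and 0-10 payoff scores are invented examples.

options = {
    "beach trip":    [(0.7, 9), (0.3, 3)],   # (probability, payoff) branches
    "mountain trip": [(0.5, 10), (0.5, 5)],
    "staycation":    [(0.9, 6), (0.1, 4)],
}

def expected_value(branches):
    """Probability-weighted sum of payoffs for one option."""
    return sum(p * payoff for p, payoff in branches)

for name, branches in options.items():
    print(f"{name}: expected value {expected_value(branches):.1f}")

print("Best by expected value:", max(options, key=lambda o: expected_value(options[o])))
```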

The Decision Matrix is a simple but powerful device for making tradeoffs among conflicting goals or preferences. For example, if you are buying a car, the matrix can help you evaluate options such as gas efficiency and the number of safety features. The technique forces you to determine how to decide and sometimes surprises you when you see the outcome.
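
The arithmetic behind a Decision Matrix fits in a few lines: weight each criterion, score each option against it, and sum the weighted scores. The cars, criteria, weights, and 1-5 scores below are hypothetical.

```python
# Decision Matrix sketch for the car-buying example.
# Criteria weights and 1-5 scores are hypothetical.

weights = {"gas efficiency": 0.4, "safety features": 0.4, "price": 0.2}

scores = {
    "Car A": {"gas efficiency": 5, "safety features": 3, "price": 4},
    "Car B": {"gas efficiency": 3, "safety features": 5, "price": 3},
}

def weighted_total(car: str) -> float:
    """Sum each criterion score multiplied by its weight."""
    return sum(weights[c] * s for c, s in scores[car].items())

for car in scores:
    print(f"{car}: {weighted_total(car):.2f}")
print("Preferred:", max(scores, key=weighted_total))
```

Changing a weight and rerunning the comparison makes the tradeoffs explicit, which is where the surprising outcomes tend to appear.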

Pros-Cons-Faults-and-Fixes is a useful tool for critiquing new ideas. Most of us have made lists of Pros and Cons, but this technique takes the decision process one step further by asking you to also consider how one could “Fault the Pros” or “Fix the Cons.” You could use it to decide if it is worth upgrading your current home computer system.

Force Field Analysis helps you decide how to solve a problem or if it is even possible to do so. It can answer questions like “What will it take to get this new policy adopted?” or “Who do we need to lobby the most to accomplish our stated goals?” This summer, use this technique to help you convince your relatives to organize a family reunion at your choice of location.

If you have set an ambitious goal, but you are not sure if it can be achieved, use SWOT Analysis. This tool helps you develop a plan or strategy for achieving a goal by focusing on Strengths and Weaknesses of your organization along with the Opportunities and Threats in the external environment.

The Impact Matrix is used by leaders to figure out the most effective way to implement a new (and sometimes highly unpopular) policy in an organization by evaluating what impact it will have on all key actors before the new policy is announced. It is a good way to anticipate trouble before it is too late! Use the Impact Matrix to help you explain to potentially uncooperative family members why taking a cross-country road trip is more fun than flying.

The Complexity Manager provides a simple but rigorous approach for understanding highly complex problems. It helps uncover unintended consequences, assess chances for success, and identify opportunities for influencing the outcome of a decision. Use this tool to help you anticipate all the ways a wedding could go wrong or (preferably) right.

How to Make Your Garden Grow

Spring is upon us, and for many of us the warmer weather means working in our yards and gardens. Gardeners are often praised for their green thumbs, but in reality there is no secret to a healthy lawn or blooming plants – all it takes is proper planning to make your garden grow. The same can be said for analysis: just as the best gardens start with a design or a plan, if you invest some time to design your project or outline your paper before getting started, your analysis will thrive!

One of the biggest mistakes analysts make is to plunge in and begin writing or researching as soon as they are given a task or a question to answer. When Globalytica developed its Analysts’ Roadmap listing the primary tasks all analysts should perform when generating an analytic paper, the process was divided into five steps. The first, and most important, step is to Stop and Reflect before plunging in.

When beginning a web search, for example, we often tell students to write down their key search terms before typing them into the computer. This may sound like a waste of valuable time, but writing down the terms on a piece of paper forces you to reflect on which keywords would be most effective for your specific search. As a result, you do not lose minutes or hours reviewing bad “hits” or revising search parameters multiple times.

Similarly, considerable time can be saved if you ask yourself simple questions before beginning a project such as “Has anyone written on this topic before?” or “Who can I talk to who would know how best to begin my research?” or “Where would I expect to find the best information?” At Globalytica we have captured some of the best questions to ask at the start of a project in the Getting Started Checklist (below). We consider the checklist so important that we published it in two of our books: Critical Thinking for Strategic Intelligence and Structured Analytic Techniques for Intelligence Analysis.

Stopping to reflect can also be a group activity. We recommend that analytic units create a short list of key questions they should address before beginning work on any paper. They should review the list as a group. Typical questions include: What is a realistic deadline? What is the client expecting? Are we answering the right question? Do we need to reach out to other experts? What have we told the client about this issue before?

Attempting to jump into the analysis without these simple but effective tools is like planting without preparing your soil. By using the Getting Started Checklist, your analysis is much more likely to produce fruit. Happy planting!

Globalytica’s Getting Started Checklist

The Getting Started Checklist is a simple tool to help analysts start a new project. Analysts should answer the following questions before they start to draft.

  1. What has prompted the need for the analysis? For example, was it a news or intelligence report, a new development, a perception of change, or a customer request?
  2. What is the key intelligence question that needs to be addressed?
  3. Why is this issue important, and how can analysis make a meaningful contribution?
  4. Has your organization or any other organization ever answered this (or a similar) question before, and if so, what was said? To whom was that analysis delivered, and what has changed since then?
  5. Who are the principal clients? Are their needs well understood? If not, try to gain a better understanding of their needs and the style of reporting they like.
  6. Are there other stakeholders who would have an interest in the answer to this question? Who might see the issue from a different perspective and prefer that a different question be answered? Consider meeting with others who see the question from a different perspective.
  7. From your first impressions, what are all the possible answers to this question? For example, what alternative explanations or outcomes should be considered before making an analytic judgment on the issue?
  8. Depending on responses to the previous questions, consider rewording the key intelligence question. Consider adding subordinate or supplemental questions.
  9. Generate a list of potential sources or streams of reporting to be explored.
  10. Reach out and tap the experience and expertise of analysts in other offices or organizations – both within and outside the government – who are knowledgeable on this topic. For example, call a meeting or conduct a virtual meeting to brainstorm relevant evidence and to develop a list of alternative hypotheses, driving forces, key indicators, or important players.

Don’t Be an April Fool

One of the biggest tricks analysts can play on themselves is to forget to stop and reflect before plunging into a project or initiating a keyword search for information. For example, we tell analysts that if they enter the first keywords that come to mind when searching the web, they usually will discover, often only after considerable time has gone by, that they have reviewed several screens of data without finding a high-quality source or citation. In essence, they fooled themselves into thinking that plunging in would save time when the opposite is true.

A far better approach is to pick up a pencil and write down the search terms you intend to use before typing them on the keyboard. This forces you to stop and reflect on what would be the most effective terms to use. If you wanted to learn more about the value of using Structured Analytic Techniques to combat cognitive bias, it would be highly inefficient to enter keywords such as “structured,” “analytic,” “techniques,” “cognition,” or “bias.” Words like “Heuer,” “Pherson,” or “intuitive traps” would get you to key materials much faster.

The need to stop and reflect is just as important when writing analytic papers. Before submitting a draft for review, use Globalytica’s Critical Thinker’s Checklist to ensure that you have produced a high quality product. By taking a little time up front to make sure your paper is soundly written, you and your editors can avoid wasting hours going back and forth on how the paper needs to be improved. Use the Critical Thinker’s Checklist before delivering your product or presentation to avoid being an April Fool!

Critical Thinker’s Checklist

Before delivering your product for review, these are the “must do” questions you should ask yourself. Does the report, assessment, or presentation…

  1. Answer the client’s key questions.
  2. Include both the What and the So What?
  3. Articulate a clear line of analysis, putting the bottom line up front.
  4. Provide new insights and further our understanding of the issue.
  5. Present clearly and accurately all the forces and dynamics at play.
  6. State the main point of the paper in the title and the first paragraph.
  7. Support all key assumptions underpinning the analysis.
  8. Reflect on whether all the elements of the Who, What, How, When, Where, Why, and So What? have been adequately researched and addressed in the paper.
  9. Provide sufficient reasoning and compelling evidence to support all judgments.
  10. Consider alternative hypotheses, including the null hypothesis; assume the activities observed could be legal until proven otherwise.
  11. Identify important contrary evidence and gaps.
  12. Ensure that each section, paragraph, and sentence advances the storyline.
  13. Avoid bias, advocacy, and value-laden terms.
  14. Incorporate graphics to advance and underscore the message.
  15. Ensure that the paper includes required citations, and that all source references are correct.
  16. Express clearly the analysts’ levels of confidence in the key judgments and present the reasons for any uncertainty.

Upcoming Certificate Workshops

Globalytica, in conjunction with the International Association for Intelligence Education (IAFIE) and IAFIE-Europe, is offering those attending IAFIE’s 12th Annual Conference in Breda, the Netherlands, a unique opportunity to earn certificates in Strategic Foresight Analysis and Critical Thinking & Effective Writing.

Join Globalytica’s team of expert instructors prior to the start of the Annual IAFIE Conference at The Apollo Hotel in Breda, the Netherlands on June 22, 2016.

Randy Pherson, Globalytica CEO, and Kathy Pherson, Globalytica President, will lead exclusive, one-day workshops that provide attendees with:

  • Interactive, hands-on experience with experts in the field
  • Certificates in new skill-sets
  • Opportunities to connect with other conference attendees

Strategic Foresight Analysis Certificate Workshop: Redefining the Trans-Atlantic Security Paradigm. How might the EU and the US restructure their security framework in the wake of potential disruptors such as the Brexit vote on 23 June, a Trump (or Republican) presidency, and the migration crisis?

Critical Thinking and Effective Writing Certificate Workshop: Practice leading-edge critical thinking skills that all good analysts should master. Visit our website for more information.

The Power of Silent Brainstorming

Conducting brainstorming sessions in multiple countries for diverse customers, including banks, intelligence services, and political action committees, we have learned one major lesson: all brainstorming activities need to include a time when the participants are not allowed to talk. It sounds counterproductive, but it works!

Why does silent brainstorming work? People process information differently, and a significant portion of the population cannot think productively when others are talking. In every brainstorming session I facilitate, I ask the participants to specify whether they think better as part of a give-and-take discussion or would prefer to work in silence, gathering their thoughts before the discussion begins. Usually, at least 20 percent of the group falls into the second category; in classes dominated by analysts, the “silent worker” percentage can be as high as 80 percent. If the brainstorming session consists of constant dialogue, these individuals usually will opt out, and their contributions will be lost.

How can you be inclusive of all types of thinkers? Some steps you can take include:

  1. Begin a brainstorming session by asking participants to write down their three best answers to the focal question (the question the brainstorming is supposed to answer). Collect these ideas and write them on the whiteboard. Then everyone has already contributed to a solution, and everyone’s voice has been heard.
  2. If several people appear to be dominating the discussion, pass out 3″ x 5″ notecards and ask everyone to provide input on a key issue under discussion. When you collect the cards, discreetly put the cards belonging to the most talkative participants at the bottom of the pack. Then read out the answers or write them on the whiteboard to stimulate further discussion.
  3. At the conclusion of a brainstorming session, always ask participants to write down their key takeaway on a 3″ x 5″ notecard and give it to you before they leave the room. Then consolidate everyone’s key takeaways and send them out to the group. You will be surprised by how effective this technique can be in validating the idea of holding the session as well as promoting a continuing dialogue on the topic. You may even want to send the list to any invitee who was unable to attend, asking for his or her input as well.

Over the years, we have learned that almost all structured analytic techniques benefit from having one or two silent brainstorming sessions incorporated into the process. This extra consideration helps ensure that you generate a superior final product.

Don’t Hibernate – Use these Eight: 8 Rules for Successful Brainstorming

When winter weather forces us indoors, do not retreat behind your desk. Shake things up by conducting a brainstorming session!

The August 2015 issue of Analytic Insider highlighted the importance of Structured Brainstorming, explained when to use the technique, and outlined how to conduct a Structured Brainstorming session. (Click here to reread the article.)

The eight rules shown below are simple but important guidelines to follow when running any brainstorming session. We hope you use them soon!

New Year’s Resolutions – Now is the Time to Refresh Your Analysis

While you are thinking about your 2016 resolutions, consider adopting new practices to counter weaknesses in your analysis. The ideal structured analytic technique to help you get started is the Structured Self-Critique.

Structured Self-Critique is a systematic procedure used to identify weaknesses in analysis, where team members change perspective to become critics rather than supporters of their own analysis. By responding to a list of questions about potential weaknesses in their evidence, assumptions, logic, and cognitive processes, the team is forced to reexamine its own analysis and identify how it might be spectacularly wrong!

How To Get Started

Form a small team composed of the authors, peer reviewers, editors, or other potential stakeholders in the paper. Make sure all the members of the Structured Self-Critique group are wearing the “black hat” of a critic. They should compete among themselves to see who can find the most glaring errors in the analysis. The group should work from a list of known past mistakes, including some or all of the following topics:

  • Sources of uncertainty: Identify the sources and types of uncertainty, using these questions:
    • Is the question being analyzed a puzzle or a mystery? Puzzles have answers, and correct answers can be identified if enough pieces of the puzzle are found. A mystery has no single answer; it depends on the future interaction of many factors, known and unknown.
    • How does the team rate the quality and timeliness of its evidence? Are there a greater than usual number of assumptions because of insufficient evidence or the situation’s complexity?
    • Is the team dealing with a relatively stable situation or one that is undergoing, or likely to experience, significant change?
  • Analytic process: If the team did not perform the following actions in the initial analysis, consider doing them, or lower the level of confidence in your judgments.
    • Identify alternative hypotheses and seek out information based on these hypotheses.
    • Identify and challenge key assumptions.
    • Seek a broad range of diverse opinions by including analysts from other sectors.
  • Critical assumptions: If the team has identified key assumptions, focus on those that would have the greatest impact on the analytic judgment, if they turned out to be wrong. How recent and well-documented is the evidence that supports each key assumption? Brainstorm circumstances that could cause each one to be wrong. Would the reversal of any of these assumptions support any alternative hypotheses?
  • Diagnostic evidence: Identify alternative hypotheses and the most diagnostic items of evidence that enable the team to reject alternative hypotheses. Brainstorm reasonable alternative interpretations of these items of evidence that could make them consistent with alternative hypotheses.
  • Information gaps: Reevaluate confidence in your conclusion based on gaps in available information, dated information, and absence of information.
  • Missing evidence: Are you missing evidence that one would expect to see in the regular flow of intelligence or open source reporting?
  • Anomalous evidence: Is there any item of evidence that would have been important if it had been believed or had been related to the issue of concern, but was rejected as unimportant because it was not believed or its significance was not known? If so, try to imagine how this item might be a key clue to an emerging alternative hypothesis.
  • Changes in the broad environment: Could social, technical, economic, environmental, or political changes play a role in what is happening or will happen? Could these factors have an impact on whether the analysis proves to be right or wrong?
  • Alternative decision models: If the analysis deals with decisions by a foreign government or non-state actor, was the group’s judgment about foreign behavior based on a rational actor assumption? If so, consider the potential applicability of other decision models.
  • Cultural expertise: If the topic being analyzed involves a foreign or unfamiliar culture, does the team have cultural expertise on thought processes in that culture?
  • Deception: Does another country, NGO, or commercial competitor have a motive, opportunity, or means to engage in deception to influence US policy or to change your behavior? Does this country, NGO, or competitor have a history of engaging in denial, deception, or influence operations?

After reviewing these questions, the team must decide a) what additional research is needed, b) what text should be revised, and c) whether the level of confidence in the judgments provided is appropriate. If few problems are identified, then the initial judgments have been reaffirmed; if problems emerged from the process, then the paper should not go forward until they have been corrected. More information, such as the best time to use this technique, its value added, and potential pitfalls to avoid, can be found in Structured Analytic Techniques for Intelligence Analysis, 2nd Edition, by Richards J. Heuer Jr. and Randolph H. Pherson. Details on this publication can be found here.

Don’t Take Near Misses for Granted

Analytic units will often create their own list of Structured Self-Critique questions that are tailored to their specific work environment. Usually the first items on this list are examples of past errors that the unit does not want to repeat. This is called learning from your mistakes. For most of us, however, it is pretty easy to recall when we made a mistake and remember not to do it again. A bigger challenge is to remember past “near misses”: incidents when we got it wrong, or almost got it wrong, and were lucky that no one noticed.

For example, when a major 5.8 magnitude earthquake centered in Mineral, Virginia, hit the US East Coast on August 23, 2011, the North Anna Nuclear Power Plant was only 11 miles away. The earthquake shut down the plant’s two nuclear reactors, and three of its four diesel generators started up to supply electricity to the safety systems. The initial reaction was: “Good news! A nuclear disaster similar to what occurred at Japan’s Fukushima Daiichi plant the previous March has just been averted!” Fortunately, a fifth backup generator was brought online to replace the failed generator, which had suffered a coolant leak.

But no one had anticipated that one of the diesel generators would not work. What would have happened if all four or even two or three of the generators failed when the earthquake hit? The plant was designed to survive an earthquake of a magnitude of 5.9 to 6.2 which suggests the event qualified as a very fortunate near miss. The near miss gave Dominion Power the opportunity to review its safety standards and make appropriate adjustments. As analysts we also need to learn from our near misses as well as our failures and use techniques such as the Structured Self-Critique to ensure that we do.