Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


    December 1995
 


Note 9

 
 

Conclusions

 
 
This is the ninth in a series of Product Evaluation Staff notes to clarify the standards used for evaluating DI assessments and to provide tradecraft tips for putting the standards into practice.
 


Conclusions, the analysts' findings based on organizing, evaluating, and interpreting the all-source information available on the issue at hand, usually are the most important value added for key consumers of DI assessments. Because policymakers and action takers rely on DI assessments as they define and defend US interests, analysts must be precise in conveying the level of confidence in their conclusions, taking appropriate account of the prospect of deception and other sources of uncertainty.

Heavily engaged policy officials often receive directly much of the open-source and intelligence information available to DI analysts. But the policy officials need DI assistance in sorting out the facts, the fictions, and the trivia. The important event has to be identified from among the ordinary, underlying patterns have to be pieced together from seemingly unrelated pieces of information, and reliable sources have to be distinguished from the self-serving.

On complex national security issues the information available to the analysts rarely speaks for itself. Gaps and inconsistencies are the rule. This is where the DI analysts' expertise comes into play. When seasoned and skeptical DI analysts believe they have made their case by a careful scrubbing of ample all-source information, they should lean forward in making their conclusions precise and clear to the policy officials responsible for management of the issue. Analysts who have organized and evaluated their information are able to conclude with authority, for example:

DI assessments are particularly valuable to policy officials when the analytic findings are derived from the collective databases of a multidisciplinary team. Depiction of the political context for a foreign country's lax policy on narcotics controls, for example, or of its financial and technological potential for pursuing a nuclear weapons program, enables users of DI analysis to take a better measure of the potential risks and benefits of contemplated US policy initiatives.

In contrast, when available information is incomplete or susceptible to foreign deception operations and other sources of ambiguity, the analysts' reasonable doubts about, say, cause-and-effect relationships should be shared with the officials who may rely on DI assessments in taking policy actions.

Many of the issues that the DI tackles are inherently complex and thus shrouded in uncertainty. The analysts have a professional obligation, where warranted, to conclude that they do not know. In such instances, presenting two or more plausible interpretations of the available information makes a more useful conclusion than a single unreliable one masked in vague language (for example, "a real possibility").

Analysts should be particularly wary about projecting thin information as a DI conclusion. When, for example, analysts do not have a solid informational base and are relying on a small number of reports depicting an event as unprecedented or a pattern as well-established, they should attribute such conclusions to the source. Clandestine agents, foreign officials, and the local media may jump to conclusions. DI analysts should not.

In sum, the value that conclusions in DI memorandums and briefings add for policymakers, negotiators, warfighters, and law enforcement officials rests on:


Tradecraft Tips

DI veterans offer the following recommendations for conveying conclusions effectively:

1. When analysts have reached firm conclusions on complex and especially controversial issues, they should take the time to present the data, to point to the relationships and other implications, and to state the conclusions forcefully. For example:

2. Again, when an issue is complex and controversial, describe the credentials that lie behind the findings, in a textbox or footnote if not in the main text. For example: depict the reliability of the sources of the information and other specific characteristics of a database; spell out the indicators used to determine diversion of dual-purpose technology imports to military use.

3. To minimize confusion when conveying a necessarily qualified conclusion, think of supplementing adverbial descriptors with a statement of rough numerical odds:

4. When the quality of available information requires either reserving judgment about conclusions or presenting multiple plausible interpretations, consider including a textbox or annex on information gaps and collection requirements.

5. When the text must be kept brief because of the space limitations of a DI art form, the findings can be laid out in some detail in a textbox. This coverage can be useful both for consumers who need a quick study of the issue and for those with direct responsibility for decision and action who have an interest in taking precise account of what the DI knows.

6. Also use a textbox to explain any major shift in a DI conclusion from previous assessments or the basis for a contrasting conclusion held by other Intelligence Community analysts.

7. When appropriate, use chronologies, matrices, and other graphics to supplement the text in conveying complex trends and relationships. Even the best-informed policy officials appreciate graphics that help them evaluate important information.

8. Conclusions are the bedrock for estimative judgments in DI assessments that address future patterns of development. In papers that are divided between sections that set out the findings and those that make predictive judgments, analysts may find it useful to summarize the findings in a textbox immediately preceding the estimative portion. This helps clarify the argument for the critical reader. It can also help analysts audit their own logic trail.
