Intelligence analysis: Cognitive traps 301
Posted on Thursday August 27th, 2020 @ 3:11pm by Lieutenant Commander T'Pri
Location: Classroom 1A
Lieutenant Commander T'Pri strode into the classroom to teach her cadets.
"Please take your seats, Cadets. Today's course will be on the cognitive traps of Intelligence analysis.
Intelligence analysis is a way of reducing the ambiguity of highly ambiguous situations. Many analysts prefer the middle-of-the-road explanation, rejecting high- or low-probability explanations. Analysts may apply their own standard of proportionality to the opponent's risk acceptance, rejecting the possibility that the opponent may take an extreme risk to achieve what the analyst regards as a minor gain. Above all, the analyst must avoid the special cognitive traps of intelligence analysis: projecting what she or he wants the opponent to think, and using available information to justify that conclusion. To assume that one's enemies try to confuse is not paranoid but realistic, especially in the area of intelligence and its subdiscipline, counterintelligence. During the Dominion War, the Dominion term for the counterintelligence art was Runkendar, or 'radio game': not a game in the sense of playing fields, but something that draws from game theory and seeks to confuse one's opponents.
Obviously, a set of problem-solving talents is essential for analysts. Since the other side may be hiding its intentions, the analyst must be tolerant of ambiguity, of false leads, and of information far more fragmentary than that facing the experimental scientist. According to Starfleet Intelligence, in an experiment in which analyst behavior was studied, the process is one of incremental refinement: "with test subjects in the experiment demonstrating that initial exposure to blurred stimuli interferes with accurate perception even after more and better information becomes available...the experiment suggests that an analyst who starts observing a potential problem situation at an early and unclear stage is at a disadvantage as compared with others, such as policymakers, whose first exposure may come at a later stage when more and better information is available."
The receipt of information in small increments over time also facilitates assimilation of this information into the analyst's existing views. No single item of information may be sufficient to prompt the analyst to change a previous view. The cumulative message inherent in many pieces of information may be significant, but it is attenuated when this information is not examined as a whole. The Intelligence Community's review of its performance before the 2368 Bajoran war for independence stated, in the only declassified paragraph:
"The problem of incremental analysis—especially as it applies to the current intelligence process—was also at work in the period preceding hostilities. Analysts, according to their own accounts, were often proceeding on the basis of the day's take, hastily comparing it with material received the previous day. They then produced in 'assembly line fashion' items which may have reflected perceptive intuition but which [did not] accrue from a systematic consideration of an accumulated body of integrated evidence."
Writers on analysis have suggested reasons why analysts come to incorrect conclusions by falling into cognitive traps for intelligence analysis. Without falling into the trap of avoiding decisions by always wanting more information, analysts also need to recognize that they can always learn more about the opponent.
Intelligence analysis is plagued by many of the cognitive traps encountered in other disciplines. The first systematic study of the specific pitfalls lying between an intelligence analyst and clear thinking was carried out in the 20th century, and its findings still hold true today. According to Starfleet Intelligence, these cognitive traps may be rooted either in the analyst's organizational structure or in his or her own personality.
Types of cognitive traps:
The most common personality trap, known as mirror-imaging, is the analyst's assumption that the people being studied think like the analysts themselves. An important variation is to confuse actual subjects with one's information or images about them, such as confusing the apple one actually eats with the ideas and issues the idea of an apple may raise. This poses a dilemma for the scientific method in general, since science uses information and theory to represent complex natural systems as if theoretical constructs might be in control of indefinable natural processes. The inability to distinguish subjects from what one is thinking about them is also studied under the heading of functional fixedness, first studied in Gestalt psychology, and in relation to the subject–object problem.
Experienced analysts may recognize that they have fallen prey to mirror-imaging if they discover that they are unwilling to examine variants of what they consider most reasonable in light of their personal frame of reference. Less-perceptive analysts affected by this trap may regard legitimate objections as a personal attack, rather than looking beyond ego to the merits of the question. Peer review (especially by people from a different background) can be a wise safeguard. Organizational culture can also create traps which render individual analysts unwilling to challenge acknowledged experts in the group.
Another trap, target fixation, has an analogy in aviation: it occurs when pilots become so intent on delivering their ordnance that they lose sight of the big picture and crash into the target. This is a more basic human tendency than many realize. Analysts may fixate on one hypothesis, looking only at evidence that is consistent with their preconceptions and ignoring other relevant views. The desire for rapid closure is another form of idea fixation.
"Familiarity with terrorist methods, repeated attacks against Starfleet facilities, and indications that the headquarters on Earth was at the top of the Dominion target list might have alerted us that we were in peril of a significant attack. And yet, for reasons those who study intelligence failure will find familiar, the Battle of Betazed fits very much into the norm of surprise caused by a breakdown of intelligence warning." The breakdown happened, in part, because of poor information-sharing among analysts (in different SFI offices, for example). At a conceptual level, Starfleet Intelligence knew that Jem'Hadar actions almost always involve multiple, near-simultaneous attacks; however, SFI did not assimilate piecemeal information on oddly behaving Dominion troop movements into this context.
On the day of the planetary assault (under tremendous Dominion pressure), no analyst associated the multiple strikes of the changelings with the multiple-attack signature of the Dominion. The failure to conceive that a major attack could occur within the heart of Federation space left Starfleet unprepared. For example, irregularities detected by the Federation trading guild and the Klingon Defense Force did not flow into a center where analysts could consolidate this information and (ideally) collate it with earlier reports of odd behavior among certain troop movements, or with the possibility of hijacked freighters being used as weapons.
Inappropriate analogies are yet another cognitive trap. Though analogies may be extremely useful, they become dangerous when forced, or when they are based on assumptions of cultural or contextual equivalence. Avoiding such analogies is difficult when analysts are merely unconscious of differences between their own context and that of others; it becomes extremely difficult when they are unaware that important knowledge is missing. Difficulties associated with admitting one's ignorance are an additional barrier to avoiding such traps. Such ignorance can take the form of insufficient study: a lack of factual information or understanding, an inability to mesh new facts with old, or a simple denial of conflicting facts."
A post by:
Lt. Cmdr T'Pri
Chief Intelligence Officer