by David T. Moore and William N. Reynolds
In recent years, the Intelligence Community has paid increasing attention to the role of complexity science in intelligence analysis. It is well known that intelligence problems are "complex," insofar as they are detailed, multi-faceted, and extremely dynamic. However, this insight has remained qualitative in nature, precluding the development of methods for reducing the complexity of these problems to a level within the capabilities of human reasoning. A more realistic goal is to quantitatively measure the complexity of modern intelligence problems so that their difficulty can be assessed up front. Such measurements will help determine the cost-benefit tradeoffs associated with a problem, the resource allocation it demands, and the most useful analytic methods for solving it.
In this paper, we present a simple quantitative metric for estimating the complexity of denial and deception problems. We show that the complexity of a problem grows as the product of the numbers of possible states (e.g., true, deceptive, etc.) for each of the possibly deceptive pieces of evidence. We enumerate the complete set of contingencies that must be considered in a denial and deception problem and provide a heuristic-based method for pruning these down to a manageable set that can reasonably be considered by analysts and decision makers.
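The multiplicative growth described above can be illustrated with a minimal sketch. The evidence labels and state names below are hypothetical illustrations, not drawn from the paper itself: each piece of evidence is assigned a set of possible interpretive states, and the total number of contingencies is the product of the state counts.

```python
from itertools import product

def contingency_count(state_counts):
    """Total contingencies = product of the number of states per evidence item."""
    total = 1
    for n in state_counts:
        total *= n
    return total

def enumerate_contingencies(evidence):
    """Yield every combination of interpretive states across the evidence set."""
    names = list(evidence)
    for combo in product(*(evidence[name] for name in names)):
        yield dict(zip(names, combo))

# Hypothetical example: three evidence items, each with three possible states,
# yields 3 * 3 * 3 = 27 contingencies to consider.
evidence = {
    "report_A": ["true", "deceptive", "denied"],
    "report_B": ["true", "deceptive", "denied"],
    "report_C": ["true", "deceptive", "denied"],
}

print(contingency_count(len(v) for v in evidence.values()))  # 27
```

Even this small example shows why pruning is needed: adding a fourth item with three states triples the count to 81, so the full contingency set quickly outgrows what analysts can examine exhaustively.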
The authors gratefully acknowledge the assistance of Rita Bush, David Dixon, William Mills, George Mitroka, Amanda Redmond-Neal, William Parquette, Suzanne Sluizer, and Marta Weber in the preparation of this paper.