Small Wars Journal

Examining the Base Space: Regressing to Reveal Flaws in Biological Analytics

Sun, 09/06/2015 - 9:17am

Douglas Rose

The complex nature of today’s data-rich landscape poses several problems for academia and strategic security, and by extension for those who strive to improve the quality of exportable, teachable methods aimed at advancing each discipline.  Applications that require and demonstrate large bodies of information are legion.  Without commenting on the parallel scale of available data and implied applications, this missive seeks to identify a useful nexus between bioinformatics, complexity, and the predictive nature of syndromic surveillance, and to argue for the development and adoption of a flexible model for predicting and interdicting the catalyst of a biological attack by a technologically unsophisticated actor.

In an increasingly complicated security environment obscured by unsophisticated terror attacks – here referring to the relatively crude tools used, not to advanced methodologies like those displayed on 9/11/2001 – there is a security and awareness vacuum: the possibility of a significant bioterrorist attack that, in a complex, networked sense, exists in that space.  Fusing the disciplines of syndromic surveillance, complexity, and bioinformatics is not enough to increase awareness without a tailored model on which to base the combination.  We must move toward a multi-dimensional understanding of information, connectivity, and signatures in order to turn the requisite data on its axis and consider the hierarchical as physical, so as to interdict not only biological weapons development but employment.

Given the existing limitations of medical surveillance techniques, it is precisely the absence of a representative, adaptable model demonstrating the unique intersection of the disciplines mentioned here that this paper addresses.  Construction of such a filter will remain in the realm of the abstract until it is applied, since the goal is the identification of the exact point of intersection and not merely the movement toward it.  Essentially, this can be done in such a way that an abstract begins to be accepted as a cognitive “thing” that bridges analysis and analytics.  It is within the spirit of an overabundance of representative data that such a model exists, and why significant discussion of complexity and entropy serves as the initial filter in an environment that requires substantial bounding in order to define borders amid fluidity.  Asserting this as a necessity would seem to defeat the efforts of the strategist aiming to cage unique inclusions; but this approach is an attempt not to constrain but to enable.

Even so, the deliberate identification of the foundations of such a model – hereafter referred to individually as “pillars” – will aid visualization and conceptualization for thinkers and analysts with a broad range of abilities, informing follow-on discussions of usefulness and implementation.  This marriage of bioinformatics, complexity, syndromic surveillance, and several of their subsets seeks to build an adaptable framework that first explains connectivity between disparate areas of focus, with an intermediate-level emphasis on complexity as a discipline born of the need for adaptive boundaries.  Both syndromic surveillance and complexity provide some predictive value within their historically accepted methodologies, with bioinformatics providing the progressive, associative structure from which this model springs.  The key will be to identify the base elements of these approaches in order to isolate those specific elements within each that may be most useful in determining at what point the evolution of a pathogen within a biological system intersects with the awareness of a rogue actor bent on impacting a target audience.

The challenges that structured or unstructured metadata present to the intelligence or counterterrorism practitioner are neither emergent nor without multitudes of suggested, software-based solutions.  Software is vital to the evaluation of massive bodies of information in any discipline that champions protection or prevention, but this dependency should not come at the risk of devolution of the abilities cultivated within the analyst or strategist charged to do so.  Fusion of computation and comprehension can and should result in “…tangible, real-world benefits by helping analysts connect fragments scattered across massive amounts of data so they can identify potential threats.”[i]

The justification for such a model, at the broadest characterization possible, is perhaps best represented by examining the ruminations of Myhrvold and Hoffman, whose observations are separated by two decades.  Myhrvold’s applicable theory is as follows:

“The novelty of our present situation is that modern technology can provide small groups of people with much greater lethality than ever before.  We now have to worry that private parties might gain access to weapons that are as destructive as—or possibly even more destructive than—those held by any nation-state. The gap between what is necessary and what is being contemplated, much less being done, is staggering.”[ii]

Hoffman offers a view that exists on a complementary level:

“The contrast between the means and methods of modern warfare and the tactics and techniques of contemporary terrorism is striking. Whereas technological progress has produced successively more complex, lethally effective and destructively accurate weapons systems that are deployed from a variety of air, land, and sea platforms, terrorism has functioned largely in a technological vacuum, aloof or averse to the continual refinement and growing sophistication of modern warfare.”[iii]

The synchronicity between these two authors is complementary and prevalent; over the span of two decades the terroristic threat stream has evolved, but it has not been forced to.  While a malicious actor may at some point achieve advanced technological wherewithal, terrorism as a body relies on tactical methods and, in no small measure, the opportunities presented by a relatively advanced society.  The intersection between the weapons Myhrvold fears and the simplicity demonstrated by Hoffman is the focus of this paper; it is the application of an advanced construct to interdict the simplistic that is required.  Said differently, the use of complexity without requiring understanding from the adversary should recover a level of understanding otherwise forgotten or overlooked.

Complexity, bioinformatics, syndromic surveillance: all are advanced concepts with higher-order, obvious implications across several specialized academic areas before one wanders into the practical applications or divisions of each.  Underneath these, or in spite of them depending on one’s view, are big data, cyber-warfare, and the asymmetric terrorist – advanced, almost mythical entities that exist either in our own operational space or on some distant, multi-dimensional battlefield.  All are challenges to the strategist or analyst, but rarely are they grasped at the base level; we have moved away from the realization that, while our technical adaptations and advanced weaponry are unparalleled amongst the adversaries touched on within these pages, adversary success thrives in the “base space” – that area which academia, automation, and strategy consider elementary, and where this monograph asserts itself.

The need for an emergent model that may interdict a biological attack, while largely able to spatially manipulate the actual space travelled by the attacker, mirrors not only the multi-dimensional construct that is cyberspace but also the higher-order relationships between the disciplines used as supports of this theory.  Naturally, then, the following points include the instance when a progressive infection becomes so severe that travel becomes impossible without detection; even without regard to physical distance, this remains a plausible scenario, since both conditions play into the negative-technology, or base, space that malicious actors commonly occupy.  It is important to note that this backwards environment can thrive, and indeed be thrust ahead of the technologically advanced society, simply by a body aware of its existence.  This is a premise with a multitude of deadly examples, and one that needs no further justification in that sense.  The growing record of global pandemics provides the springboard for such an attack, and across the body of knowledge represented here, we have no filter for modeling it.

The solution this essay presents is neither a variant of the aforementioned methodologies nor a suggestion that their predecessors are obsolete.  A system itself, by its self-limiting nature, would not allow for the introduction or blending of multiple disciplines given their disparate origins.  In a general sense, the entropy resulting from combining two systems without first assessing their compatibility would be destructive to the information itself on a systemic level and inhibiting to an analyst trying to produce a product from it – which is another reason this model forces selective entropy as a means of enabling genesis.  This is, of course, only true of the initial interaction; the longer-term relationships, over generations, would produce another product entirely.

In the case of this proposal, the increasing size of a data field is not directly linked to an automatic labeling of a congruent state of complexity.  While some schools of thought assert that larger objects are by definition more complex[iv], the multi-disciplinary intersection within these pages is neither an attempt to parse large portions of a data field nor simply a filter designed to reduce complexity.  This blending of theory is designed to subtract inapplicable data by forcing out information with incompatible messages, increasing the likelihood of accurate forecasting and reducing the angles from which unknown information could flow into a particular analytical structure.  Taking this tack requires the decomposing environment to work in the favor of the user of this model, and it accomplishes some degree of the shaping operations discussed earlier, which we presented as necessary to consider an abstract in this fashion.  The underlying organization and coding methods of syndromic surveillance contain some of the same data structures, so from a systemic perspective we avoid excessive entropy from the start, while remaining able to adjust tolerance for reductionism based on particular need instead of insisting it be present as a staple of this model.
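The "forcing out" of information with incompatible messages can be loosely illustrated in information-theoretic terms: streams whose symbol distributions are near-random carry no discernible message and can be discarded before analysis.  A minimal sketch, in which the stream names and the entropy threshold are purely hypothetical illustrations, not part of the model as formally proposed:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits per symbol) of a sequence."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def filter_streams(streams, max_entropy=1.5):
    """Keep only streams whose entropy falls below an illustrative threshold,
    i.e., streams carrying a structured, low-noise message."""
    return {name: s for name, s in streams.items()
            if shannon_entropy(s) <= max_entropy}

# Hypothetical symbol streams: structured reporting vs. incompatible noise.
streams = {
    "clinic_reports": "AAABAAABAAAB",   # regular, low-entropy pattern
    "noise_feed":     "QXZRMPLKWVTY",   # every symbol distinct: maximal entropy
}
kept = filter_streams(streams)
print(sorted(kept))  # → ['clinic_reports']
```

The point of the sketch is only the shape of the filter: data is excluded by the incompatibility of its message, not by its type or volume.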

The issue with syndromic surveillance is its reactive nature.  While an absolutely invaluable metric for enabling fusion analysis of biomedical events and trends, the timeline involved in this approach continues to provide cover for the determined state-sponsored or individual actor, as does its missing back end.  From a systemic perspective, it could serve as the front end of a construct; however, syndromic surveillance as a method lacks the ability to ascribe definitions or designations to data, a factor that precludes making expeditious connections in support of tactical prevention of a biological attack.
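The reactive character described above is visible in the shape of the statistical aberration-detection methods that syndromic surveillance commonly employs: a one-sided CUSUM over daily syndrome counts, for instance, alarms only after an excess has accumulated.  A minimal sketch with hypothetical daily counts and illustrative parameters (the baseline, `k`, and `h` values are not drawn from any specific deployed system):

```python
def cusum_alarm_day(counts, baseline, k=0.5, h=4.0):
    """One-sided CUSUM over daily counts, standardized against a known
    baseline mean/std; returns the first day index that alarms, else None."""
    mean, std = baseline
    s = 0.0
    for day, c in enumerate(counts):
        z = (c - mean) / std
        s = max(0.0, s + z - k)   # accumulate standardized excess
        if s > h:
            return day
    return None

# Hypothetical daily counts: the outbreak begins on day 5.
counts = [10, 11, 9, 10, 12, 16, 18, 19, 21, 22]
print(cusum_alarm_day(counts, baseline=(10.0, 2.0)))  # → 6 (alarm trails onset)
```

Even in this toy setting the alarm trails the onset; with subtler excursions the lag grows, which is precisely the cover the text notes a determined actor enjoys.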

Individual networks designed by some confluence of actor and vector self-organize in order to achieve a purpose; that organization, along with the emergence and presence of such a network, would overwhelm the filters organic to syndromic surveillance precisely because those filters are neither built nor designed to detect such signatures.  Nodes in such a network would not fit typical analytic profiles or pre-constructed tolerances for automated alarms within connected communities of practice, largely because the centers of gravity, or even attractors, of the information designed to fit modern surveillance methods are nodes in the typical sense (care centers, practitioners, or agencies).  Moving beyond the existing structure means that identification, shaping, and focus of analysis in this vein treat connections as the centers of gravity and the network signature, rather than relying on nodal analysis.  Nodes should be less of a focus in this form of analysis even with the dependence on, and use of, a self-limiting model and the associated properties common to complexity.
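The contrast between nodal analysis and connection-centric analysis can be sketched concretely: a node-level alarm looks at each facility's count in isolation, while a connection-level view flags pairs of facilities that repeatedly co-report the same syndrome.  A minimal sketch in which the facility names, syndromes, and thresholds are all hypothetical illustrations:

```python
from collections import Counter
from itertools import combinations

# Hypothetical daily reports: (facility, syndrome) pairs per day.
daily_reports = [
    [("clinic_A", "GI"), ("clinic_B", "GI")],
    [("clinic_A", "GI"), ("clinic_B", "GI"), ("clinic_C", "resp")],
    [("clinic_A", "GI"), ("clinic_B", "GI")],
]

def node_alarms(daily_reports, threshold=5):
    """Node-centric view: alarm on any single facility's total count."""
    totals = Counter(fac for day in daily_reports for fac, _ in day)
    return [fac for fac, n in totals.items() if n >= threshold]

def edge_signatures(daily_reports, min_cooccur=3):
    """Connection-centric view: flag facility pairs that repeatedly report
    the same syndrome on the same day -- a network signature, not a node count."""
    pair_counts = Counter()
    for day in daily_reports:
        by_syndrome = {}
        for fac, syn in day:
            by_syndrome.setdefault(syn, set()).add(fac)
        for facs in by_syndrome.values():
            for pair in combinations(sorted(facs), 2):
                pair_counts[pair] += 1
    return [pair for pair, n in pair_counts.items() if n >= min_cooccur]

print(node_alarms(daily_reports))      # → [] (no single node crosses threshold)
print(edge_signatures(daily_reports))  # → [('clinic_A', 'clinic_B')]
```

No individual facility trips a nodal alarm, yet the persistent A–B connection stands out immediately once the connection itself is treated as the center of gravity.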

Commentary within these pages already suggests the ability to eliminate spatial relationships, as it should if one is to forcefully assert entropy in order to mitigate physical distance and the time it takes to traverse space.  This missive does not challenge, and indeed concedes, several key evolutions in the detection, modeling, and to some extent treatment of biological incidents.  This model is meant to transcend multiple, simultaneous real-time incidents in order to disregard time in the associated decision cycle, and to demonstrate that in the cyber-ether where this abstract functions, it is meant to exceed the categories already resident in the syndromic surveillance model.

Bioinformatics, budding discipline that it is, makes sense for inclusion in this theory because it is not the author’s intent to regress to the point where the underlying process of any methodology borrowed within these pages begins to disintegrate as a process.  There are specific datasets that the existing intersection between bioinformatics and medical surveillance leverages in order to inform diagnosis;[v] however, the intent behind the selection of those models and categories does nothing to accomplish the tactical goals of this paper, and it involves levels of dissemination that hinder the underlying analytics.

Remaining within the base space associated with bioinformatics are the bounding methodologies built on microarray-type analytical methods already used “to explore the vast amount of data obtained from those microarrays most effectively”.[vi]  Similar to the pillars extracted from the methods previously touched on in this paper, the structure gained from the microarray approach organically lends itself to the clustering and modeling of data that naturally confines itself, because of the presence of self-organizing attractors that physically fit into the abstract being developed here.  This natural tendency, which should automatically accomplish the pairing or elimination of datasets to and through our intended use of entropy, depends less on the actual data itself than on the structure it produces.  This selection is based not on the type of data – that requirement has been eliminated by the very construction of this theory – but on the connections formed when forcing disparate, unnatural connections between portions of existing systems.  While the closest fully developed model to this thesis might be the Cynefin framework[vii], that model allows for the presence of disorder, and as a result the base space therein is more restrictive in its forcing of ambiguity.
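The clustering the microarray approach lends itself to typically reduces, at base, to grouping expression profiles so that structure emerges from the data rather than from imposed labels.  A minimal sketch of single-linkage agglomerative clustering over hypothetical expression vectors (the gene names and values are invented for illustration):

```python
import math

def dist(a, b):
    """Euclidean distance between two expression profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def single_linkage(profiles, n_clusters):
    """Agglomerative clustering: repeatedly merge the two clusters whose
    closest members are nearest (single linkage) until n_clusters remain."""
    clusters = [[name] for name in profiles]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(profiles[a], profiles[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return [sorted(c) for c in clusters]

# Hypothetical expression profiles (gene -> measured levels across samples).
profiles = {
    "gene_a": (1.0, 1.1, 0.9),
    "gene_b": (1.1, 1.0, 1.0),
    "gene_c": (5.0, 5.2, 4.9),
    "gene_d": (5.1, 5.0, 5.1),
}
print(single_linkage(profiles, n_clusters=2))
# → [['gene_a', 'gene_b'], ['gene_c', 'gene_d']]
```

The groupings fall out of proximity alone, an analogue of the self-organizing attractors the text relies on: no dataset need declare its type in advance for the structure to assert itself.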

While microarray analysis plays less than a primary role in this model when contrasted with the other principal pillars utilized here, sections of the base space can be, and are, applied in order to frame the practical use of this theory.

Whereas choice is eliminated for the practitioner looking to exercise the theory within these pages, and that forcing may occur at a juncture outside the initiation of awareness of an instance, the hybridization illustrated above is where our abstract starts to move past its cognitive state toward a construct that can be demonstrated at a technical level on a two-dimensional medium, despite its potentially five-dimensional nature.  This transition and subsequent representation are where insisting on a base space and forcing regression show their value, forming the derivative product codified by staying loyal to the degradation and combination of existing theories.

The contrasts here between existing and proposed frameworks are intended themselves to represent varying structure, from the ethereal to the physical, again by design.  It is not the function or calling of this developmental model to restrict itself to any front-end medium; instead, we intend for it to be purposefully ambiguous enough to be tailored to any number of applications, though not necessarily platforms.  No proposition can be designed for the intricacies of every existing analytic, but this line of logic remains demonstrably open in order to maximize application while remaining true to the spirit and intent of its use.  Insisting that the macrostate of the combinations proposed herein be informative enough to produce value in a relatively short period of time is one of the only firm requirements of an abstract that, in its beginning stages, would not present such a visual.

The several implications of blending systems – particularly the insistence on the inclusion of forced entropy despite the risk of severe, unwanted degradation – have not escaped the attention of the author of this proposition.  It is precisely because of these risks that this model asserts itself as both valuable and emergent, since disparate, associated theories cage their constructs within the bounds of a typical system and do not seek to frame data in a similar manner.  Especially if this model, or any construct for that matter, is to effectively exist and self-organize in a cyber-environment, its backbone must be flexible and organically incomplete in order to provide maximum value to the strategist or analyst at the earliest possible opportunity, without the consideration of time.

End Notes

[i] SAS. (2013). Using Advanced Analytics to Facilitate Intelligence Analysis: Make Connections Between Disconnected Data Fragments to Reveal Hidden Threats [White paper]. Retrieved May 12, 2015, from the SAS Analytic Procedures and Processes Database.

[ii] Myhrvold, N. (2013, July 3). Terrorism: A call to action [Web log post].

[iii] Hoffman, B. (1994). Responding to terrorism across the technological spectrum. Retrieved from the U.S. Army War College Strategic Studies Institute website.

[iv] Mitchell, M. (2009). Complexity: A Guided Tour. New York: Oxford University Press.

[v] Parks, L. I. (2004, June). Homeland security and HIM. Appendix B: Syndromic surveillance systems in bioterrorism and outbreak detection. Journal of AHIMA, 75(6), web extra.

[vi] Van Der Spek, P., & Stubbs, A. (2003). Microarray bioinformatics. In Encyclopedia of the Human Genome. Hoboken, NJ: Wiley.

[vii] Snowden, D. J., & Boone, M. E. (2007). A leader's framework for decision making: Wise executives tailor their approach to fit the complexity of the circumstances they face. Harvard Business Review, 85(11), 68+.


About the Author(s)

Douglas Rose is an accomplished and credentialed member of the U.S. Intelligence Community with over 20 years of experience spanning all levels of recognized military operations.  His subject matter expertise spans the disciplines of Information Operations, human intelligence, and law enforcement.  His work on the movement of information through a complex system codified as a formal Information Domain represents the cornerstone of dynamical systems analysis in an emergent doctoral program at American Military University.  He is now pursuing his doctorate in Strategic Security and intends to continue advocating for the establishment of a formalized information construct separate from cyberspace and host systems in order to enable the fractal manipulation of large datasets within the discipline of intelligence analysis.