Some problems are so complex that you have to be highly intelligent and well-informed just to be undecided about them. Laurence J. Peter
A Systems View
A system is an internally organized whole, where elements are so intimately connected that they operate as one in relation to external conditions and other systems. A set of objects or a collection of people is not a system unless its members are in regular interactions that result in system behavior as a whole. An important criterion for applying a complex adaptive systems (CAS) paradigm to conflict, therefore, is that parties to the conflict and other actors must be in regular interactions that lead to system behavior as a whole. While this may seem obvious, many instances of conflict dynamics may not meet this fundamental criterion.
Additional criteria for applying the CAS paradigm to civil conflict derive from system properties. Systems can be linear or non-linear, open or closed, and simple or complex, depending on the nature of the interactions among actors within the organization and between those actors and actors external to the organization. In the context of civil conflict, these properties may change significantly over time, depending on the amount of outside intervention by third-party actors. Models of conflict dynamics need to account explicitly for these state properties and how they change in response to exogenous and endogenous conditions.
A linear system is one in which one or more perturbations to parts of the system evoke a response of the system as a whole that is linearly proportional to the stimuli. Cause and effect are easy to observe, as big changes result in big and proportionate responses. In non-linear systems proportionality and summation no longer hold: small changes in initial conditions or interventions can result in massive changes to the system, and vice versa. Nonlinearity makes the relationship between causes and effects difficult to observe, which can be a problem when trying to validate models of resiliency in complex adaptive systems, especially those which may be relatively closed, such as many of the organizations involved in conflict.
A closed system is one that is fully self-contained and does not interact with its environment. The second law of thermodynamics dictates that such systems always move over time towards increasing disorder, absent any outside forces. In contrast, open systems support ongoing exchanges of materials and information with the environment. This allows negative entropy to accumulate; that is, order can develop spontaneously, as CAS self-organize to find optimal positions in fitness landscapes. All else being equal, then, both open and closed systems can have mechanisms that act in opposite directions to impact resiliency, depending on the structures that develop and whether they impede or amplify the inflow and transmission of information and resources.
Open systems can interact with unorganized elements of the environment or with other systems. When systems interact with other systems, the result is a system of systems (SoS). An SoS exhibits emergent properties different from those of the constituent systems themselves.
These simple system principles have important implications for third-party interventions in civil conflict, which by their very nature create open systems from what might otherwise have been closed ones, causing major disruptions to system structures and thereby increasing system complexity.
There are many different reference frames for conceptually differentiating simple systems from complex ones, drawing on analogies from thermodynamics, information theory, structural mechanics, and graph theory. Crutchfield has demonstrated that deterministic conceptions — such as temperature, information density, and entropy — are in reality different measurements of the underlying order, or randomness, in the system (Crutchfield, 2003). This is illustrated graphically in Figure 1.
System complexity is introduced by allowing self-organizing interaction between the elements. This results in a variety of models of cooperation or competition, such as predator-prey dynamics, some of which are capable of reaching a quasi-equilibrium state regulated by interactions between two mutually dependent elements. This often happens in war economies. However, in other models of competition no such regulation occurs and the system may become unstable.
Evolution and Adaptation in Complex Systems
Adaptation, evolution, learning and innovation are key features of complex adaptive systems that can be conceptualized as the response to feedback from, and interactions with, the environment. These behaviors are self-organizing mechanisms by which a system responds to disequilibrium states resulting from initial conditions, from internal drivers (such as competitive goal-seeking) that change resource utilization distributions and impact production/dissolution rates, and from external forces or shocks.
Evolution is the process of natural selection of “accidents”, such as mutants, based on their ability to improve the overall fitness of the system relative to its goal. Evolution occurs over long periods of time through successive generations, as those with the mutation are more successful in surviving and reproducing than those without. Co-evolution may occur, in which the existence of one element (such as a species) is tightly bound up with the existence of another.
Adaptation, though, occurs on a much shorter time-scale than evolution. Both involve information exchange with the environment and with elements within the system. Learning is the process of modifying existing knowledge, behaviors, skills, values, or preferences. Learning involves synthesis of different types of information. Imitation occurs by mimicking the activities of others due to observed causes and effects of their actions, whereas repetition generates learning through feedback on one’s own actions. Learning can occur at the individual element level or at the system level.
Network Structures, Evolution & Adaptation, and Resiliency
A key principle of complex systems is that network structures evolve from system dynamics, and influence and constrain the processes of evolution and adaptation through information and resource exchange mechanisms. These in turn impact the resilience of the system, where resilience is defined as the capacity of a system to absorb or re-organize in response to disturbance and stressors so as to maintain functionality.
Random networks, in which there is an equal probability of a connection between any two nodes, result in short average and overall path lengths, providing a robust and efficient means of information exchange (high resiliency). Random graphs evolve slowly, and it is difficult for outliers to have much of an impact on the rest of the network. Even so, there is a critical threshold probability, related to the number of nodes in the network, beyond which a cascade effect will generate a single large, or even “giant”, component. In this case, actions of outliers rapidly spread through the network. Research into collaboration networks validates the existence of random networks with giant components among diverse communities of social actors, such as scientists, movie actors, and board directors. Empirical data suggest the existence of giant components in several “dark” networks, e.g., Islamic jihadists, drug rings, and criminal organizations.
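The giant-component threshold can be illustrated with a small simulation (node count and edge probabilities here are illustrative, not drawn from any empirical network): below the critical probability the largest connected component stays small, while above it most of the network fuses into a single giant component through which outlier actions can cascade.

```python
import random
from collections import deque

def random_graph(n, p, seed):
    """Erdos-Renyi style random graph: each pair linked with equal probability p."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def largest_component(adj):
    """Size of the largest connected component, via breadth-first search."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        seen.add(start)
        queue, size = deque([start]), 0
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u] - seen:
                seen.add(v)
                queue.append(v)
        best = max(best, size)
    return best

n = 400
sub = largest_component(random_graph(n, 0.5 / n, seed=1))  # below the ~1/n threshold
sup = largest_component(random_graph(n, 3.0 / n, seed=1))  # above the threshold
print(sub, sup)  # the supercritical graph contains a far larger ("giant") component
```

The qualitative contrast, not the exact sizes, is the point: the same number of nodes flips from fragmented to globally connected as the link probability crosses the threshold.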
Scale-free networks are those in which the distribution of connections within the network follows a power law:
P(k) = c k^(−γ),
where P(k) is the fraction of nodes in the network having k connections to other nodes, c is a normalization constant, and γ is an exponent with values typically between 2 and 3. Preferential attachment and evolutionary processes are mechanisms that can generate scale-free networks, which exemplify the adage, “the rich get richer”. Computer simulations have shown that scale-free networks are able to evolve to perform new functions more rapidly than random graphs with equal probability of connections. Scale-free networks are resilient to accidental, random failures. However, they are more vulnerable to directed attacks than random networks. While scale-free networks are among the most ubiquitous in natural, social, and technological systems, they are not prevalent among most covert organizations.
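A minimal preferential-attachment sketch illustrates the “rich get richer” mechanism (the growth rule follows the Barabási–Albert style; the parameter values are illustrative): each new node attaches to m existing nodes with probability proportional to their current degree, yielding a heavy-tailed degree distribution dominated by a few hubs.

```python
import random

def preferential_attachment(n, m, seed):
    """Grow a network to n nodes; each new node attaches to m existing nodes
    chosen with probability proportional to degree. Returns the degree list."""
    rng = random.Random(seed)
    degree = [m] * (m + 1)            # seed with a small complete core
    endpoints = []                    # each node listed once per incident edge
    for i in range(m + 1):
        for j in range(m + 1):
            if i != j:
                endpoints.append(i)
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:       # sampling from endpoints = degree-weighted choice
            targets.add(rng.choice(endpoints))
        degree.append(m)
        for t in targets:
            degree[t] += 1
            endpoints.extend([new, t])
    return degree

deg = preferential_attachment(n=2000, m=2, seed=7)
deg_sorted = sorted(deg, reverse=True)
print(deg_sorted[:5], deg_sorted[len(deg_sorted) // 2])
# a few hubs accumulate degrees far above the median node's handful of links
```

The early-arriving nodes keep attracting links, which is exactly the vulnerability to directed attacks noted above: removing the top hubs disconnects much of the network.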
Small world networks are characterized by higher clustering coefficients (which measure the degree to which all nodes within a neighborhood are connected to all other nodes in that neighborhood) than random graphs, while maintaining the same median shortest path length for the overall network. Like scale-free networks, small world networks are ubiquitous in self-organizing natural systems. As one might intuitively expect, adaptation in small world networks occurs in spurts, through a type of punctuated equilibrium process that is highly dependent on the existence of the weak links. They are therefore resilient, but that resilience can be vulnerable, depending on the mechanisms for formation and reconstitution of these weak links. Many of the Islamic extremist organizations today exhibit small-world network properties.
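The interplay of clustering and weak links can be sketched with a Watts–Strogatz-style construction (parameters here are illustrative): start from a ring lattice in which each node links to its k nearest neighbors, then rewire a small fraction p of edges at random. The few resulting long-range “weak links” collapse average path lengths while neighborhood clustering largely survives.

```python
import random
from collections import deque

def small_world(n, k, p, seed):
    """Ring lattice (k nearest neighbors) with each edge rewired with probability p."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k // 2 + 1):
            adj[i].add((i + d) % n)
            adj[(i + d) % n].add(i)
    for i in range(n):
        for d in range(1, k // 2 + 1):
            if rng.random() < p:              # rewire this lattice edge to a random node
                j, t = (i + d) % n, rng.randrange(n)
                if t != i and t not in adj[i]:
                    adj[i].discard(j); adj[j].discard(i)
                    adj[i].add(t);    adj[t].add(i)
    return adj

def clustering(adj):
    """Average fraction of a node's neighbors that are linked to each other."""
    total = 0.0
    for i, nbrs in adj.items():
        nbrs = list(nbrs)
        if len(nbrs) < 2:
            continue
        links = sum(1 for a in range(len(nbrs)) for b in range(a + 1, len(nbrs))
                    if nbrs[b] in adj[nbrs[a]])
        total += 2.0 * links / (len(nbrs) * (len(nbrs) - 1))
    return total / len(adj)

def avg_path_length(adj):
    """Mean shortest-path length over all reachable pairs (BFS from every node)."""
    total, pairs = 0, 0
    for s in adj:
        dist, queue = {s: 0}, deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values()); pairs += len(dist) - 1
    return total / pairs

lattice = small_world(200, 6, 0.0, seed=3)
sw      = small_world(200, 6, 0.1, seed=3)
print(clustering(lattice), clustering(sw))            # clustering stays high
print(avg_path_length(lattice), avg_path_length(sw))  # paths shorten sharply
```

Deleting the handful of rewired edges would restore the long lattice distances, which is the sense in which small-world resilience hinges on how weak links form and reconstitute.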
As with scale-free and small-world networks, core-periphery networks exhibit a high degree of clustering. However, the clustering is confined to a densely connected core surrounded by sparsely connected peripheral nodes. Core-periphery networks are highly resilient and evolve as elements on the periphery join the core to exploit economies of scale, or as cores expand into outlying neighborhoods for resource exploitation. Information diffusion and virus propagation on many on-line networks exhibit core-periphery structures. Terrorist organizations that enjoy state sponsorship, such as Hezbollah, often evolve into core-periphery networks.
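A toy construction illustrates core-periphery resilience (the sizes are illustrative): with a densely connected core and sparsely attached peripheral nodes, removing a large share of the periphery leaves the core, and hence core function, intact.

```python
def core_periphery(core_n, periph_n):
    """Complete core of core_n nodes; each peripheral node attaches to one core node."""
    adj = {i: set() for i in range(core_n + periph_n)}
    for i in range(core_n):                 # densely connected (complete) core
        for j in range(i + 1, core_n):
            adj[i].add(j); adj[j].add(i)
    for p in range(core_n, core_n + periph_n):
        hub = p % core_n                    # sparse attachment to a single core node
        adj[p].add(hub); adj[hub].add(p)
    return adj

adj = core_periphery(core_n=8, periph_n=32)
for victim in range(8, 24):                 # remove half of the peripheral nodes
    for nbr in adj.pop(victim):
        adj[nbr].discard(victim)
core_intact = all(j in adj[i] for i in range(8) for j in range(8) if i != j)
print(core_intact)                          # True: the core clique is untouched
```

Compare this with the scale-free case: here no single node's removal can fragment the core, because redundancy is concentrated exactly where function is concentrated.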
Recent studies on the spread of complex contagions suggest that core-periphery structures can have much higher transmission rates of risky behavior than small worlds. A complex contagion is one requiring multiple exposures for the contagion to spread. In small world networks, the linkages between community structures are long (which increases effective transmission rates for simple contagions) but “thin”. The thinness of these linkages slows the spread of risky contagion. In contrast, the multiple short paths between nodes in overlapping community structures build many “wide” bridges in the core-periphery network, creating high effective transmission rates. This has significant implications for the resiliency of different actors in civil conflict, depending on communal resources and the network structures available to support innovation against adversaries.
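The thin-versus-wide-bridge argument can be made concrete with a toy threshold model (the graph construction and the threshold of two exposures are illustrative): when adoption requires two active neighbors, a single thin tie between clusters cannot transmit the contagion, while a wide bridge of two overlapping ties can.

```python
def spread(adj, seeds, threshold=2):
    """Complex contagion: a node adopts once >= threshold neighbors have adopted."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node in adj:
            if node not in active and len(adj[node] & active) >= threshold:
                active.add(node)
                changed = True
    return active

def two_cliques(bridge_edges):
    """Two 5-node cliques (0..4 and 5..9) joined by the given bridge edges."""
    adj = {i: set(range(5)) - {i} for i in range(5)}
    adj.update({i: set(range(5, 10)) - {i} for i in range(5, 10)})
    for a, b in bridge_edges:
        adj[a].add(b); adj[b].add(a)
    return adj

thin = two_cliques([(4, 5)])              # one "thin" tie between the clusters
wide = two_cliques([(3, 5), (4, 5)])      # a "wide" bridge: two ties into node 5
print(5 in spread(thin, {0, 1}))          # False: node 5 sees only one exposure
print(5 in spread(wide, {0, 1}))          # True: two active neighbors suffice
```

The same thin tie would transmit a simple contagion (threshold of one) instantly, which is why the small-world advantage reverses for risky, multiple-exposure behaviors.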
Ring networks are simple structures in which each node connects to exactly two other nodes, forming a single continuous pathway for transmission events through each node. Obviously, these networks are highly vulnerable to the removal of any one of the links. Typically, this vulnerability is managed through redundancies – by sending simultaneous, duplicative transmissions in opposite directions and by utilizing secondary, overlapping and counter-rotating rings. The idea is that not all transmissions will get through all rings, but that the probability of complete system failure is low, as every node has the information necessary to be transmitted and no central node is required to manage the system. For networks with small numbers of nodes, ring networks have been shown to provide an optimal configuration for protecting secrecy while maintaining operational efficiency, if not robustness, but they do not facilitate resiliency. This finding has implications for small covert cells, or organizations in start-up stages, and is consistent with the large numbers of short-lived spin-off organizations that emerge in conflict settings.
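The counter-rotating redundancy described above can be sketched with a directed-reachability toy model (the node count is illustrative): in a unidirectional ring, one severed link cuts off every downstream node, while adding a second ring in the opposite direction restores full reachability after any single link failure.

```python
def reachable(edges, start, n):
    """Set of nodes reachable from start along directed edges (depth-first search)."""
    adj = {i: set() for i in range(n)}
    for u, v in edges:
        adj[u].add(v)
    seen, stack = {start}, [start]
    while stack:
        for v in adj[stack.pop()] - seen:
            seen.add(v)
            stack.append(v)
    return seen

n = 6
forward  = [(i, (i + 1) % n) for i in range(n)]   # clockwise transmission ring
backward = [((i + 1) % n, i) for i in range(n)]   # counter-rotating ring

broken = [e for e in forward if e != (2, 3)]      # sever one clockwise link
print(len(reachable(broken, 0, n)))               # 3: nodes 3, 4, 5 are cut off

dual = broken + backward                          # add the second, opposite ring
print(len(reachable(dual, 0, n)))                 # 6: every node reachable again
```

No node plays a privileged role in either ring, consistent with the text's point that ring redundancy avoids any central manager while tolerating single failures.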
For networks of more moderate size, between 20 and 40 nodes, “windmill” and “reinforced wheel” networks have been shown to be most efficient for achieving the dual objectives of secrecy and efficiency, but they lack resiliency. These structures were generated in computer experiments to optimize network structures for the dual objectives of secrecy and information efficiency in covert networks. Hub-and-spoke networks evolve naturally to optimize self-organizing distribution systems. While they are low in complexity, complicated operations that are identically required by every node can be carried out at the hub. Drawbacks include the longer path lengths required for distribution to every node, and the inflexibility of the hub to adapt quickly to changing environmental conditions, making it a single point of failure for the system. In spite of these drawbacks, the hub-and-spoke paradigm remains ubiquitous in systems that can realize large improvements in efficiency through centralization of operations.
Analyzing characteristics of network structures in the context of general systems theory, one can predict that resiliency driven by innovation and adaptation should be most likely to emerge from small world networks. The individual clusters in a small world network undergo continual learning and increasing specialization. At some point, randomly generated long connections between previously unconnected clusters will lead to discoveries of these differentiated skills, and whole clusters can experience a step-change in functionality by wholesale adoption of the discovery. When enough of these connections happen with complementary discoveries, non-linear, holistic systemic change may occur in “epochal” leaps, while the networks remain small worlds. Such communities and/or organizations are highly conducive to constant innovation, taking advantage of the diversity of the skills and resources of the constituent clusters.
Alternatively, if the structure of the complex system is scale-free, there will be preferential attachments to new ideas generated by particular nodes, resulting in swarming behaviors, with the swarm following new initiatives of a small number of nodes. These initiatives may or may not be optimal solutions for the system. Eventually such systems may become increasingly ordered, act more like hierarchical systems, and lose resiliency.
Random networks and core-periphery networks, while they do not encourage adaptation, support resiliency by redundancy and relative ease of access to resources.
Chaotic systems are breeding grounds for innovation and adaptation that can feed resiliency, but they require some degree of order to yield optimal benefit within the system. This order can be effected through self-organized convergence to a state of complexity, or through the imposition of order to a state of simplicity. The emergence of improvised explosive devices (IEDs) in Iraq, and the subsequent regularization of their production, use, and continual improvement, is an example of this type of innovation resulting in resiliency through increasingly ordered adaptation to chaotic conditions.