
OPINION: The war against complexity: What happens when the world outgrows our maps

An essay on abstraction, control and the limits of understanding


Modern societies operate on a paradox. The world we inhabit – technological, ecological, informational – is becoming ever more complex, while the institutions meant to govern it still operate on comparatively simple, linear assumptions. We continue to rely on causal chains, step-by-step logic and reductionist categories in an environment that is non-linear, networked and full of feedback loops. When such worlds meet, friction arises. I call this friction a complexity conflict – a structural tension that appears whenever a complex reality must pass through a subcomplex system of interpretation or decision.

 

Human cognition, organizations and sciences evolve by progressively integrating higher levels of complexity. In early stages, we perceive the world through direct, linear cause and effect: action and consequence, input and output. As knowledge accumulates and interdependencies reveal themselves, new connections emerge, forming webs of relation that defy simple modeling. But each time we integrate additional layers of data or understanding, we also lose touch with the lower levels that generated them. This phenomenon – an inevitable by-product of progress – can be called abstraction amnesia. We forget the origins of our abstractions; we forget that our elegant models, indicators or metrics are built upon earlier simplifications. Think of learning to walk: you no longer consciously activate each individual muscle that moves you, you simply operate from a higher level of complexity.

 

In organizations, on the other hand, this amnesia manifests itself when key performance indicators or dashboards take on a life of their own, detached from the messy, human processes they were meant to measure. In science, it surfaces in computational models whose creators can no longer fully explain their internal mechanisms. In digital systems, it appears when users merely press “Go” and receive a result, with no visibility into the once-manual chain of actions that has been automated away (think vibe-coding). Each of these examples shows the same trajectory: complexity is absorbed and simplified until transparency disappears.

 

A subcomplex system can be described as one whose internal structure is too simple to represent the reality of its environment – which is unfortunately the case for a majority of current systems. The cybernetician Ross Ashby formulated this principle decades ago as the Law of Requisite Variety: only variety can absorb variety. When the external world multiplies its variables – through technology, globalization or ecological entanglement – systems that remain linear or hierarchical begin to fail. To remain operable, they simplify reality rather than adapt to it. Subcomplexity thus performs a protective function, shielding systems from overload, yet it simultaneously blinds them to emergent patterns that could ensure long-term survival.
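To make Ashby’s principle tangible, here is a minimal, purely illustrative sketch in Python (the setup and names are my own, not Ashby’s formalism): a regulator that commands fewer distinct responses than the environment has disturbances must, by simple counting, let some disturbances through unabsorbed.

import random

# Illustrative toy model of the Law of Requisite Variety: a regulator can
# only hold its outcome steady if it has at least as many distinct responses
# as the environment has kinds of disturbance. Numbers are arbitrary.

DISTURBANCES = range(6)  # six kinds of environmental disturbance

def regulate(disturbance, repertoire):
    # Pick the response that best cancels the disturbance; 0 means fully absorbed.
    response = min(repertoire, key=lambda r: abs(disturbance - r))
    return disturbance - response

def failure_rate(repertoire, trials=10_000):
    misses = sum(
        regulate(random.choice(DISTURBANCES), repertoire) != 0
        for _ in range(trials)
    )
    return misses / trials

# A regulator with requisite variety: one response per kind of disturbance.
print(failure_rate(repertoire=list(range(6))))  # ~0.0, every disturbance is absorbed
# A subcomplex regulator with only three responses.
print(failure_rate(repertoire=[0, 2, 4]))       # ~0.5, half the disturbances leak through

The numbers are arbitrary; the structure is not. Whatever the subcomplex regulator cannot match, it must ignore or misclassify, and that residue is precisely the blindness described above.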

 

This dual nature explains much of today’s institutional inertia. Bureaucracies thrive on reduction – they translate multidimensional phenomena into checkboxes and standard procedures. Political discourse reduces global issues to binary slogans. Scientific disciplines guard their boundaries instead of merging insights. Everywhere, the instinct is the same: contain complexity by simplifying it. The problem is that simplification, while cognitively soothing, increasingly misrepresents the world it tries to manage.

 

The history of organizational forms illustrates this evolutionary ladder. Early hierarchies such as Ford’s assembly line or Weber’s bureaucracy were strictly linear: each actor performed a defined task, and feedback was minimized to maintain control. Later, the rise of matrix structures and process management introduced coordination across functions, creating a first glimpse of interdependence. With the emergence of agile and networked organizations, such as the famous Spotify model or Haier’s self-organized micro-enterprises, companies began to operate more like living systems, adaptive and decentralized. And now, at the frontier of the digital economy, we encounter platform models and decentralized autonomous organizations (DAOs) that function without a clear center at all. Each step integrates more environmental variety, yet each also erodes transparency. The organization becomes more realistic and resilient, but no individual can oversee its totality. The subjective experience of this transformation is often described as “loss of control” – the emotional symptom of growing complexity.

 

Subcomplex systems share a deep cognitive habit: they think linearly. They describe the world as a sequence – input, process, output – while reality behaves as a dense mesh of feedback and mutual causation. Linear models intentionally avoid feedback because feedback undermines predictability. For the same reason, emergent orders – self-organized patterns arising from local interactions – are often perceived as chaos or failure. And while hierarchies value stability, networks thrive on oscillation. In complex systems, control is not imposed from above but emerges from ongoing adaptation. To a subcomplex observer, this looks like disorder and therefore triggers resistance.
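The difference can be made visible with a toy comparison (hypothetical and deliberately simple, in Python): a linear pipeline keeps nearby inputs close together, while a system whose output feeds back into its next input quickly makes nearby starting points unrecognizable to one another.

# Toy contrast between linear, input-process-output logic and a feedback loop.

def linear_pipeline(x, gain=2.0):
    # Doubling the input doubles the output: fully predictable.
    return gain * x

def feedback_system(x, steps=40, r=3.9):
    # The output becomes the next input (the logistic map); with r = 3.9
    # the trajectory is chaotic and tiny differences are amplified.
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

print(linear_pipeline(0.1000), linear_pipeline(0.1001))   # outputs stay close
print(feedback_system(0.1000), feedback_system(0.1001))   # outputs bear no resemblance

To a subcomplex observer, the second pair of numbers looks like noise; to a complex one, it is simply what feedback does.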

 

When systems of different maturity collide, complexity conflicts appear. We can see them everywhere: when bureaucratic rules meet non-linear crises such as a pandemic, when populist rhetoric simplifies ecological and social interdependence into moral slogans, when machine-learning algorithms make decisions that regulators can no longer interpret, or when linear management logic collides with agile, self-organizing teams. These are not simply communication failures – they are ontological mismatches between systems of unequal complexity.

 

Powerful but subcomplex institutions often respond by exerting what might be called downward pressure: they demand that more complex systems reduce themselves to stay communicable. This is what happens when science has to translate uncertainty into headlines, or when data scientists deliver single numbers instead of multidimensional heatmaps. These days it might be AI researchers who are asked to make their neural networks “explainable” even when such explanations are only metaphors. Even diplomats compress global interdependence into national narratives. This downward pressure creates symbolic clarity at the cost of systemic fidelity. It is why public discourse often prefers simple falsehoods to complex truths – not necessarily out of bad faith, but because complexity cannot travel through subcomplex channels without distortion.

 

Think of the profound socio-political consequences: populism thrives on reduction; technocracy hides decisions within opaque systems that citizens can no longer decode; bureaucracy defends stability even where adaptation is required; media ecosystems reward simplicity through attention economics; and science communication struggles to compress uncertainty without falsifying it. Ulrich Beck’s notion of the “risk society” already hinted at this tension: institutions designed for linear, industrial problems now face global networks of cause and effect. The result is paralysis: systems that cannot perceive the world in sufficient resolution to act upon it.

 

Yet complexity conflicts are not merely symptoms of dysfunction – they are also engines of transformation. Every historical leap in organizational or cognitive evolution begins with such friction. Bureaucracy yields to networks, disciplinary science to transdisciplinary systems, linear automation to agentic artificial intelligence. Conflict marks the threshold where a system’s internal complexity no longer matches its environment’s demands. In that sense, complexity conflicts are not pathological – at best, they are evolutionary signals that adaptation has become unavoidable.

 

Niklas Luhmann’s social systems theory provides a deep theoretical lens for this dynamic. Luhmann saw society as composed of self-referential communication systems – law, economy, politics, science – each operating with its own binary code. These systems maintain themselves by reducing environmental complexity to something communicable within their own language: the legal system sees legal/illegal, the economy sees payment/non-payment, and so on. Reduction is not always a flaw; it is the very condition of communication. But as the environment grows more differentiated, the gap between the system’s code and the world’s complexity widens. Luhmann’s insight foreshadows today’s complexity conflicts: systems can only perceive what their own communication allows them to perceive. When politics communicates in binaries while the environment operates in continua, misrepresentation becomes systemic. The system remains internally coherent but externally detached.

 

Technology offers a striking mirror to this evolution. Early computer interfaces required explicit, sequential input: press button A, then B, then C. Modern AI agents perform all three steps invisibly while the user merely expresses an intention, and with multi-agent systems the next evolutionary step is already underway. What has occurred here is not just automation but complexification: agency migrates from human sequences to algorithmic networks. The interface becomes simpler even as the internal logic becomes more intricate. We gain efficiency but lose visibility – a textbook case of abstraction amnesia. The same pattern repeats in finance, where algorithmic trading replaces explicit strategy; in logistics, where self-optimizing supply chains adapt faster than any human planner; and even in governance, where predictive analytics inform policies whose inner reasoning remains opaque. Everywhere, we seem to exchange understanding for performance.
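A deliberately simplified sketch of that migration (all names are hypothetical, not any real agent framework): the user-facing surface shrinks to a single call, while the chain of steps it hides grows into something the user never sees.

# Hypothetical illustration; none of these functions belong to a real framework.

def plan(intention: str) -> list[str]:
    # Decompose the intention into the steps the user once performed manually.
    return [f"gather data for '{intention}'",
            f"analyse data for '{intention}'",
            f"draft an answer for '{intention}'"]

def execute(step: str) -> str:
    # Stand-in for whatever tool or model would actually carry out the step.
    return f"result of ({step})"

def aggregate(results: list[str]) -> str:
    return " -> ".join(results)

def run(intention: str) -> str:
    # The entire once-manual chain (press A, then B, then C) collapses into
    # one opaque call: the interface simplifies, the internals do not.
    return aggregate([execute(step) for step in plan(intention)])

print(run("summarize this quarter's incidents"))

The one-line interface is the abstraction; the three functions behind it are what abstraction amnesia forgets.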

 

At the philosophical level, complexity conflicts invite a redefinition of truth itself. For centuries, epistemology equated truth with clarity: the truest statement was the simplest one. Complexity thinking inverts that relation. Truth becomes a question of adequacy, of how faithfully a model captures the interdependence of reality, even if that makes it harder to express. The map can no longer be simple if the territory is fractal. Yet societies continue to reward simplicity over adequacy. We still equate intelligibility with correctness, even as our problems demand multi-layered, probabilistic reasoning.

 

The path forward is not to reject simplification but to make it conscious. Mature systems recognize their own subcomplexity and build mechanisms of meta-reflexivity: feedback on their own feedback loops, audits of their abstractions, governance for their models. Explainable AI, cross-disciplinary review boards, participatory policymaking and organizational double-loop learning all serve this purpose. They do not eliminate complexity but cultivate awareness of the simplifications that enable communication. Abstraction amnesia becomes abstraction awareness.

 

Complexity conflicts, in this view, are not temporary crises but the normal condition of an intelligent civilization. As our models, technologies and institutions grow in depth and interconnection, simpler structures will always lag behind, trying to pull reality down to their communicable level. The challenge of our century is to navigate this gap without collapsing into chaos or regression – to learn to live with complexity rather than against it.

 

Doing so requires a shift in epistemic attitude: from the illusion of control to the art of coordination, from the comfort of reduction to the discipline of reflection, from the ideal of simplicity to the ethics of adequacy. Gregory Bateson once warned that “the major problems of the world result from the difference between how nature works and the way man thinks.” Bridging that difference without denying either side may be the defining task of our age.

 

Furthermore:

If the argument of this essay holds, then another question arises almost inevitably:
Are there actually any conflicts that are not complexity conflicts?
After tracing the phenomenon across organizations, politics, science and technology, it becomes difficult to find even a single counterexample.

 

Every conflict, in some form, expresses a mismatch in how much complexity the opposing systems can perceive, tolerate or process. It’s the friction between two different simplifications of the same world.

 

In personal life, the clash between partners, colleagues or ideologies is rarely about the facts themselves. It’s about how many dimensions of those facts each side is prepared to consider. One mind collapses nuance into certainty while another insists on context, ambiguity and interdependence. The quarrel is not over content, but over the degree of compression. The less complex position always feels more stable, the more complex one more truthful, and the tension between them becomes the emotional signature of modernity.

 

In organizations, this pattern is structural. Every department, rule or dashboard is a device for simplifying reality just enough to act upon it. Conflicts arise when these simplifications are no longer aligned: when finance sees numbers, HR sees people, and operations sees processes. None of them is wrong; they are merely tuned to different frequencies of complexity. Coordination, therefore, is not a matter of hierarchy but of complexity calibration.

 

Politics and culture magnify this principle. Ideologies are complexity filters: conservatism, liberalism, nationalism and technocracy each draw boundaries around what counts as relevant complexity. Populism, in this light, is not the opposite of elitism but its subcomplex counterpart – a style of communication that flattens multidimensional crises into one-dimensional emotions. The ensuing polarization is not accidental; it is the social form of a complexity conflict between a hyperconnected world and human systems still wired for linearity.

 

Even science, often idealized as the domain of reason, is shaped by such tensions. Paradigm shifts occur when older theories can no longer accommodate the growing variety of phenomena. Competing schools of thought rarely disagree about data; instead, they disagree about how much of the world must be included in an adequate explanation. Thomas Kuhn described this as “incommensurability,” but it can equally be seen as a difference in complexity tolerance between epistemic communities.

 

There may be a few edge cases that escape this logic. A purely mechanical collision, like a rock breaking another rock, contains no representation and thus no complexity conflict. The same might be said of raw biological competition before cognition arises. But the moment perception, meaning or organization play a role, complexity reappears. Conflict, in that sense, begins where systems start to model their environments. Once modeling exists, so does the possibility of mismatch. And mismatch is the seed of conflict.

 

What follows from this is unsettling yet illuminating: conflict is not an aberration of civilization but its basic metabolic process. Wherever there is a difference in complexity capacity, energy will flow – social, emotional, informational – until equilibrium is reached or the systems diverge. Peace, then, is not the absence of conflict but a temporary synchronization of complexity levels. The instant one side evolves or simplifies again, the gradient re-emerges and with it the potential for discord.

 

To live intelligently in such a world is not to dream of ending conflict, but to recognize its informational function. Every argument, every crisis, every breakdown is a message about the gap between the world’s complexity and our current ability to process it. The task is not to erase that gap but to learn from it – to treat conflict as the rough surface of evolution, the necessary turbulence of an adaptive species.

 

In that sense, the universality of complexity conflict is not tragic but generative. It is the friction that keeps intelligence alive. The more we understand this, the less we will see disagreement as failure and the more we will see it as feedback: an invitation to expand our models of reality until they can hold, however briefly, the impossible intricacy of the world we inhabit.

 

by mario