OEC · David Cota · Ontology of Emergent Complexity

Chapter 2

Chaos Is Not Confusion

General Index · Field-Book I · Theme I · Chapter 2

Introduction

What has always existed was never absence. It was what thought refused to acknowledge.

Chaos evokes, in contemporary language, radical disorder, collapse of all form, a retreat prior to any intelligibility. The word carries centuries of philosophical operations sedimented within its meaning — each successive layer reinforced the association between material instability and the absence of meaning. But this reading is not neutral. When Western thought first confronted the unstable, it did not ignore it: it recognised it, named it — and immediately subordinated it. Hesiod's Khaos was not ignorance; it was a cosmic gesture of opening. The genealogies that followed, however, performed a systematic movement of domestication: instability was always received as an original condition but immediately deactivated as a permanent ontology. Logos would dominate it, then form, then act, then law. Each epoch of Western thought has developed its own technique for expelling the unstable into the remote past, transforming it from a condition into an abandoned origin.

This operation is not philosophically neutral — it is constitutive of an entire picture of being. Presuming stability as a natural state, order as an inevitable horizon, is equivalent to committing to an unproven thesis: that reality has a propensity for rest, that diversity would emerge from a homogeneous background only when external forces disturb it. In turn, assuming that instability is deviant allows us to construct systems of thought whose aim is to restore the lost order: theology, Hegelian dialectics, Newtonian mechanism — all are based on the same tacit assumption, that, without interference, the universe would tend towards some form of final equilibrium. The thesis is about tradition, not reality — and it is precisely this unexamined presumption that this chapter sets out to dismantle. The cost of this presumption was high: it made invisible what 20th-century physics would unavoidably reveal.

What these traditions share — from Hesiod to Hegel, from Aristotle to Laplace — is not a common thesis about the nature of order; it is a common presumption about the nature of instability. Instability is what must be overcome, resolved, tamed, eliminated or absorbed into a narrative that gives it meaning. None of these traditions accept that instability can be a permanent regime, a condition without resolution, a constitution of the real without hope of rest.

Contemporary physics—quantum and relativistic alike—has dismantled this scenario. The quantum vacuum is not rest but permanent fluctuation. The universe did not begin in stability only to be later disturbed by occasional events; it began in radical instability, and every observable organisation is a local, provisional effect of that constitutive condition. Electrons in atoms, molecules in crystals, stars in galaxies: none of these are foundations. They are reorganisations, metastable configurations generated by the turbulence of the real. The stability we perceive is not the rule but an occurrence. Where it endures, it does so because certain material configurations retain internal tensions that keep them from absolute rest—if such rest ever existed. This chapter therefore introduces a threefold distinction that tradition has repeatedly collapsed: instability as the ontological condition of the real, chaos as material operation prior to a stabilised regime of legibility, and disorder as a symbolic judgement passed on organisations that escape capture. Without that distinction, the rest of the book would lack the vocabulary it requires. What follows lays the groundwork for it.

Main text

Axis 1 — Instability as a basis

1.1 Ontological deafness to the unstable

The task that arises here is not historical in the conventional sense. This is not about tracing the evolution of ideas about chaos — that would still be accepting the fiction of linear progress in images of the world. It is, conversely, diagnosing an operation: the systematic repetition, across centuries and different traditions, of the same logical gesture — the recognition of instability followed by its ontological deactivation. Instability was always perceived; it was always simultaneously expelled from the condition. This pattern is not accidental. It reveals a structural vulnerability of thought: the difficulty in supporting the unstable as permanence, in enabling oneself to think about a reality without possible rest, without an inert background from which everything emanates.

Hesiod opens Theogony with an invocation that deserves careful reading. "Yes, first Chaos was born" (v. 116). Khaos, in this original context, is not pure confusion. The etymology is revealing: khainein, to open, to split. Khaos is an opening, an originary spacing, a difference that inaugurates. It is nothing in the sense of denial; it is the interval preceding any form, which allows subsequent proliferation. But note the sequence that Hesiod installs: after Khaos comes Gaia, solid land; then Tartarus and Eros; then the generation of the Titans, then Zeus and the definitive cosmic order. Khaos does not persist. It is exposed as a primordial opening and immediately left behind as an overcome phase. Hesiodic cosmogony operates a fundamental deactivation: the unstable is necessary for the cosmogonic narrative to begin, but not for it to continue. Once divine generations follow one another, Khaos is reduced to an archaeological episode, a remainder without ontological relevance. The operation is subtle: it does not deny the unstable; it recognises it as a starting point and, through the succession narrative, makes it irrelevant. What was a condition becomes an abandoned origin.

Later tradition intensifies this domestication. Apuleius, Nonnus and the Neoplatonic philosophers progressively convert Khaos into a synonym for confusion, pure indeterminacy, that which counts for nothing of importance. What Hesiod left in suspense — a fertile opening, a productive interval — is slowly reconfigured as the enemy of form. The original etymology is erased. Khaos becomes the name of lack, of inadequacy, of that which always requires external intervention to achieve any value. This reconfiguration is not a mere change of meaning; it is an operation of complete domestication, in which the original instability — the material difference that Hesiod recognised as inaugural — is reduced to insignificance, to pure negative. The archaic cosmogony that had maintained opening as a transitional moment is reinterpreted by tradition as a narrative of progression from chaos to order, as if chaos were a deficit that only order could supply.

Plato inherits and refines this pattern in the Timaeus. The Demiurge contemplates the eternal Forms and impresses them upon the chôra—the receptacle of all becoming, that which 'receives' qualities without possessing a nature of its own. The chôra is perpetual agitation, receptivity without a figure of its own; yet, decisively, it is passive receptivity. It does not produce; it only receives what the Demiurge impresses upon it from without. Instability is acknowledged as a material substrate only to be immediately subordinated to an external principle of order: the Forms. The chôra remains what is left when all Forms are removed—and what remains is an indeterminate flow held to be philosophically useless. Plato himself concedes the difficulty: the chôra is a 'difficult and obscure' thing, graspable only through 'a kind of bastard reasoning'—not by true intellect, but by a secondary mode of thought that never reaches full clarity.

The Platonic operation is more sophisticated than the Hesiodic one: it does not expel what is unstable into the past, but fixes it as a subordinate substance, structurally incapable of self-generation. Instability is converted into disability — not into movement whose cause is unknown, but into inability to move itself. Only through the intervention of external intelligence does it acquire value.

Aristotle completes and crystallizes this genealogy through the distinction between dynamis and energeia, potency and act. The argument is implacable: the act is prior to potency. Potency is that which can be but is not yet — it is deprivation of form, absence of realisation. Therefore, natural change is a transition from potency to act, from indeterminate to determined, from unstable to stable. Nature operates according to ends (telos); each thing moves towards its realisation, its complete form. "Act is prior to potency" (Metaphysics IX, 1049b5). Instability is not a permanent condition: it is a transition; it is not its own engine: it is a deficiency that seeks consummation. Instability becomes that which makes sense only when oriented outside of itself — towards its negation, towards the form that overcomes and resolves it.

The legacy that the Aristotelian tradition bequeathed to medieval and modern thought was profound and rarely questioned: the presumption that stability is a natural state, that things have an intrinsic inclination to be what they are in full. Aquinas, Scotus, Leibniz — thinkers separated by centuries — presume without discussion that rest is a desirable horizon, that movement has worth only insofar as it is oriented towards final rest. Movement is imperfection — not because it is defective, but because it is incomplete. Rest is perfection — not through the absence of movement, but through arrival at a form that no longer needs to change. Instability is subordinated to this final scheme.

Heraclitus presents himself as an apparent exception. "Everything flows; nothing remains" — the cosmos is in perpetual transformation. Living fire, continuous death, life in death. No ancient thinker articulates the permanence of change with greater force. But careful reading reveals a similar operation. The Heraclitean flow is governed by logos — universal reason that regulates transformations. "Listening not to me but to the logos, it is wise to agree that all things are one" (DK 22 B50). Fire lives and dies "according to measure" (metron); there is harmony in opposites; there is law in change. What looks like radical instability is in fact governed change, change under the rule of universal intelligence. Heraclitus recognises the permanence of flux; he refuses chaos. Instability, in his thinking, is never chaos — it is always an intelligible process, readable according to the principles of logos. The operation is homologous to the Platonic one: the unstable is recognised, but its recognition is immediately neutralised by subordination to a prior principle of intelligibility. The logos does not emerge from fluidity; it precedes it. The real flows, but it flows according to law.

Newton offers an even more radical variation, because he changes the strategy: instead of taming instability through universal law, Newtonian mechanism eliminates it as such. Absolute space, absolute time, reversible deterministic laws, a system of bodies in predictable movement. Laplace formulated the ultimate consequence in 1814: "We can consider the present state of the universe as the effect of its past and the cause of its future. An intellect that, for a given instant, knew all the forces by which nature is animated and the situation of everything that constitutes it — if it were vast enough to submit this information to analysis — would encompass in the same formula the movements of the largest bodies in the universe and those of the lightest atoms. Nothing would be uncertain for it; and the future, like the past, would be present before its eyes." There is no place for instability in the ontological sense. What seems random is just ignorance — if we knew precisely enough, everything would be predictable. Determinism is so absolute that instability is no longer even a necessary concept.

The consequence of this strategy is that instability is not tamed — it is erased. Not because it was recognised and then subordinated; rather, it was forgotten as a problem. The Newtonian universe is a machine, a clock, a system of linked causes and effects. Instability would mean a breakdown of this machine, a defect in the clock. The Newtonian operation is elimination: instability is not subordinated to divine order or universal law — it is denied as such, reduced to appearance. The machine works; movement is variation without ontological importance. This is perhaps more dangerous than previous forms of domestication, because it leaves no trace — there is no recognition of instability followed by subordination, just silent erasure.

Hegel reintroduces maximum sophistication — and maximum danger. The Hegelian system welcomes negativity like never before in Western thought: contradiction is the engine of history, instability is a productive force, conflict generates synthesis. Unlike Aristotle, Hegel makes continuous transformation a fundamental explanatory principle, not a deficiency to be overcome. Unlike Newton, Hegel rejects mechanical determinism and installs historical contingency as a constitutive regime. Unlike Plato, Hegel denies that there are eternal Forms that transcend the process.

But — and here lies the trap — negativity is admitted only as a moment, as an instance in a trajectory that converges to the self-realisation of the Absolute. "The true is the whole" — and the whole is the process by which the Absolute knows itself through its finite determinations. Instability becomes an instrument of progress. Every denial is Aufhebung, every conflict is anticipated reconciliation. Movement converges to synthesis; the synthesis to the Absolute; the Absolute is the truth of all previous movement. What seemed like a radical recognition of instability as a condition becomes, upon careful reading, its most sophisticated subjugation: instability admitted as a driver, but subordinated to purpose. The Hegelian operation is the most dangerous because it is the most seductive. It invites full recognition of instability, change, conflict — but only as moments of a narrative that absorbs them, that makes them instruments of a meaning that goes beyond them. Instability is no longer an autonomous regime; it becomes a structural element of a drama whose end has already been written.

The sequence of these operations reveals a cumulative pattern. The Hesiodic Khaos (abandoned origin) becomes the Platonic chôra (subordinate substance), which becomes Aristotelian potency (deficiency), which becomes Heraclitean flux (order under logos), which becomes Newtonian determinism (chance as ignorance), which becomes Hegelian contradiction (a moment in a teleology). In each case, the same fundamental pattern: recognition followed by deactivation. The unstable is not ignored — it would be easier if it were. It is recognised and immediately expelled from the condition, subordinated to an external or superior principle that makes it intelligible only insofar as it ceases to be what it is — autonomous, without fixed destiny, capable of non-subordinate material operation.

The distinction that must now be installed is triple and deconstituting — triple because it breaks the superficial unity of three terms that Western thought treated as synonymous, deconstituting because it refuses to capture that unity in a narrative that reconciles them. Instability is an ontological category — material condition of the real regardless of any regime of inscription, that which fundamentally characterises reality before it is made legible. Chaos is an operative category — the ongoing material difference, the material tensions prior to any regime of symbolic capture, which comes into play when configurations are reorganised. Disorder is a symbolic category — judgment according to the installed legibility, the attribution of negative value to configurations that the registration regime cannot integrate coherently. These are not synonymous — they are operators in different registers. To confuse them is to lose the conceptual precision that the rest of this book demands. The deafness of Western thought consisted precisely in assuming that, without the intervention of logos, form, act or law, the material would be nothing but indeterminate chaos. This presumption is unjustified. The material is always already organised — just in a language that symbolic thought has not yet learned to read.

1.2 The real as fluctuation

Twentieth-century quantum physics provided experimental proof that instability is neither a cosmological exception nor a transition to be eliminated: it is a permanent condition of reality in all its regimes of manifestation. The central lesson lies precisely where classical thought expected to find absolute rest — in the quantum vacuum. The vacuum is not absence. The vacuum is continuous fluctuation, a permanent seething of events whose reality theory demonstrates and experiment confirms. This is neither theological metaphor nor philosophical speculation: it is a result deduced from Heisenberg's relations and confirmed in multiple experimental manifestations whose precision leaves little room for reinterpretation.

The Heisenberg uncertainty relation (ΔEΔt ≥ ℏ/2) establishes that energy and time are conjugate variables — they cannot be simultaneously and independently determined with arbitrary precision. Immediate consequence: in a sufficiently small time interval, energy fluctuations are possible without violating conservation laws. The minimum energy state of a quantum field — the ground state — has irreducible zero-point energy: uncertainty relations prevent any mode of the field from reducing to absolute rest. The vacuum is not the absence of activity; it is the regime in which no mode can cease entirely. Every configuration of spacetime is immersed in this fluctuating activity — not movement of something within the vacuum, but the character of the vacuum itself. Instability is not an occasional deviation in a fundamentally stable system. It is a regime without exception.
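
A minimal numeric sketch can make these orders of magnitude concrete. The constants are standard; the chosen mode frequency and time interval are illustrative assumptions, not values taken from the text.

```python
# Zero-point energy and energy-time uncertainty: a minimal numeric sketch.
# The frequency and time interval are illustrative values, not taken from the text.

import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def zero_point_energy(frequency_hz: float) -> float:
    """Ground-state energy E0 = hbar*omega/2 of a single field mode (joules)."""
    return HBAR * 2 * math.pi * frequency_hz / 2

def min_energy_spread(delta_t_s: float) -> float:
    """Smallest energy spread compatible with dE*dt >= hbar/2 (joules)."""
    return HBAR / (2 * delta_t_s)

if __name__ == "__main__":
    # A visible-light field mode (~5e14 Hz) can never fall below ~1.7e-19 J.
    print(f"E0 of one optical mode  : {zero_point_energy(5e14):.3e} J")
    # Over a 1e-21 s window, fluctuations up to ~5.3e-14 J are compatible.
    print(f"dE allowed over 1e-21 s : {min_energy_spread(1e-21):.3e} J")
```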

Two experimental confirmations establish this conclusion in the realm of the observable — and therefore, according to the inscriptive discipline that governs this work, in the realm of the concrete. The Casimir effect, theoretically proposed by Hendrik Casimir in 1948 and experimentally confirmed by Steven Lamoreaux in 1997, offers proof that the vacuum is not uniform. Two conductive metal plates, placed at a very close distance, in a chamber under vacuum, experience an attractive force of measurable magnitude. The explanation: Vacuum fluctuations are not equally possible on all wavelength scales. Between the plates, only vibrational modes whose wavelengths fit the separation can exist. Outside the plates, additional modes with different wavelengths contribute to the energy density. The pressure of the outer vacuum is greater than the inner one; the plates are mutually attracted. What seemed to be nothing — the vacuum — exerts a measurable force, experimentally differentiable from zero. The vacuum is not rest: it is productive asymmetry, a regime where the geometry of the boundaries shapes the available energy. What classical metaphysics would imagine as a total absence manifests itself, in a domain where inscription is possible, as a material difference with measurable and reproducible consequences.
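
The ideal-plate formula makes the effect quantifiable. The sketch below assumes the textbook expression for perfectly conducting parallel plates, P = π²ℏc / (240 d⁴); the separations are illustrative, and real measurements such as Lamoreaux's used a sphere-plate geometry with corrections not modelled here.

```python
# Casimir pressure between ideal parallel plates: P = pi^2 * hbar * c / (240 * d^4).
# A rough order-of-magnitude sketch; finite conductivity and geometry corrections
# present in actual experiments are not modelled.

import math

HBAR = 1.054571817e-34  # J*s
C = 2.99792458e8        # m/s

def casimir_pressure(separation_m: float) -> float:
    """Attractive pressure (Pa) between two ideal conducting plates at distance d."""
    return math.pi**2 * HBAR * C / (240 * separation_m**4)

if __name__ == "__main__":
    for d in (100e-9, 500e-9, 1e-6):
        print(f"d = {d * 1e9:7.1f} nm -> P ~ {casimir_pressure(d):.3e} Pa")
```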

Lamb's spectral shift, measured by Willis Lamb in 1947, offers a second, equally robust manifestation. Hydrogen spectral lines present a shift of around 1 GHz in relation to that predicted by Dirac's theory for the electron in an atom. The cause: the electron not only interacts with the nucleus, but with the sea of vacuum fluctuations that surround it. These fluctuations polarize the electron cloud, slightly alter the electron's probability distribution pattern, modify the energies of the quantum levels, and result in measurable displacement of the lines. Once again, the vacuum is not passive absence; it actively intervenes, shaping the properties of the particles immersed in it. Instability is not a remote and unobservable regime — it is a condition of the most common material, the hydrogen atom that permeates the observable universe.

The convergence of these manifestations — vacuum fluctuation, Casimir, Lamb — points to a conclusion that goes beyond each individual case: instability is not a defect located in special conditions, it is not an anomaly in exotic systems. It is the fundamental regime of the most elementary material substrate. Where classical thought postulated rest — in a vacuum, in "nothingness," in the simplest possible state — quantum physics finds irreducible activity.

However, caution must be exercised in this reading. What is measured — Casimir effect, Lamb deviation — are marks of the concrete regime, of the real made legible through inscription. The "nothing" that underlies, the regime of the real prior to any measurement capture, remains structurally inaccessible. It is nothing in the sense of absolute negation; it is simply insurmountable to inscription — not because it is an unfathomable mystery, but because inscription is always an operation that cuts and stabilises. What the inscription captures is always already relational conformity, material compatibility inscribed in a regime of legibility. What precedes this inscription — pure material difference — is not unknowable through ignorance: it is unknowable through structure, because knowledge is always already an inscriptive regime. This does not reduce the importance of the conclusion: a vacuum is fluctuation, not rest. It just qualifies: this fluctuation is what physics describes with precision in the domain of the inscribable; the material difference prior to inscription remains accessible only through its inscribed effects.

In addition to the fluctuation originating from the vacuum, material instability manifests itself in dynamic processes that structure observable reality. Radioactive decay offers a first lesson. A uranium-238 nucleus, placed under observation, decays at an intrinsically indeterminate moment — it transforms into thorium-234 by the emission of an alpha particle. There are, according to all experimental evidence — extensively reviewed through Bell's theorem and Aspect's experiments — no local hidden variables that determine the precise moment of decay. The process is genuinely random in the sense that no prior information can specify when it will occur. But this randomness is not indifference. The nucleus decays with a half-life of about 4.5 billion years; there is a law in frequency even if there is no determinism in the singular event. Temporal instability is not chaos without pattern; it is a pattern without individual determination, a regularity that does not require prior fixation of each result.
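
The coexistence of singular indeterminacy and statistical law can be illustrated with a short Monte Carlo sketch. It assumes only the exponential decay law and the published half-life of uranium-238; the sample size and random seed are arbitrary.

```python
# Radioactive decay: no hidden schedule for the single nucleus, a strict law for the ensemble.
# Assumes U-238's half-life of ~4.468 billion years and the exponential decay law.

import math
import random

HALF_LIFE_YEARS = 4.468e9
TAU = HALF_LIFE_YEARS / math.log(2)  # mean lifetime

def sample_decay_time(rng: random.Random) -> float:
    """Draw one nucleus's decay time (years) from the exponential distribution."""
    return rng.expovariate(1.0 / TAU)

if __name__ == "__main__":
    rng = random.Random(0)
    times = [sample_decay_time(rng) for _ in range(100_000)]
    # Each value is individually unpredictable...
    print("first five decay times (yr):", [f"{t:.2e}" for t in times[:5]])
    # ...yet the fraction surviving past one half-life converges on 1/2.
    surviving = sum(t > HALF_LIFE_YEARS for t in times) / len(times)
    print(f"fraction surviving one half-life: {surviving:.3f} (expected 0.500)")
```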

Quantum tunneling offers a second, equally decisive manifestation. A particle faced with a classically insurmountable potential barrier — a wall of energy greater than its kinetic energy — has a non-zero probability of being found on the other side. Classically, this is impossible. In quantum mechanics, the wave function does not cancel out abruptly; it extends across the barrier and remains non-zero on the other side. The consequences are constitutive of the observable universe: nuclear fusion inside stars operates through tunneling. Nuclei in stellar cores, at temperatures lower than those classically required, do not have enough energy to overcome the electrostatic repulsion between protons. They should remain separate. They tunnel, merge, release energy. Without this instability effect, the Sun would not shine. Organic matter would have no energy source. The observable universe would not exist in its current form. Instability — the ability for barriers to be overcome by the indeterminate nature of quantum matter itself — is not an accident in which reality eventually found itself: it is a condition without which nothing observable would be possible.
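
A rough estimate shows how a classically forbidden crossing becomes merely improbable. The sketch below uses the simplest rectangular-barrier approximation with illustrative parameters (an electron, a 10 eV barrier, 1 nm wide); stellar fusion proper involves a Coulomb barrier and the Gamow factor, which are not modelled here.

```python
# Tunnelling through a rectangular barrier: a minimal WKB-style estimate.
# The point is only that the transmission probability is small but strictly non-zero.

import math

HBAR = 1.054571817e-34   # J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electronvolt

def transmission(energy_ev: float, barrier_ev: float, width_m: float, mass_kg: float = M_E) -> float:
    """Approximate transmission probability T ~ exp(-2*kappa*L) for E < V."""
    kappa = math.sqrt(2 * mass_kg * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

if __name__ == "__main__":
    # A 5 eV electron facing a 10 eV barrier 1 nm wide: classically impossible,
    # quantum-mechanically merely improbable.
    print(f"T ~ {transmission(5.0, 10.0, 1e-9):.3e}")
```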

1.3 Limit tests: instability on a cosmic scale

The quantum instability processes that structure atoms and molecules are not confined to microscopic domains. Twentieth-century cosmological physics has produced converging evidence that instability operates in the largest-scale events — in those that fix the very possibilities of matter and structure in the observable universe. These are not theoretical curiosities or exotic anomalies that decorate the picture without altering it. These are fundamental operations — structuring the matter, energy and geometry of the universe — whose elucidation makes visible that instability is a regime without exception of scale. If instability were just a property of the quantum domain, one could argue that it is an artifact of scale, that microscopic phenomena do not necessarily translate into macroscopic consequences. The following three limit tests refute this restriction: instability operates in the processes that define the very possibility of a universe with diverse matter and observable structure.

Cosmic inflation offers a decisive illustration. In the first moments after the initial expansion, according to the consensus models of Guth and Linde, the universe was caught in a metastable false-vacuum state. The inflaton field occupied a local energy minimum, an elevated valley of the potential landscape. This state is characterised by dynamic instability: it retains internal tension, a fundamental incompatibility between the present configuration and the minimum-energy configuration. The material dynamics are inexorable: the metastable configuration contains an internal incompatibility that exceeds its capacity to sustain itself. The inflaton field rolls towards the true vacuum, towards the global minimum. The geological metaphor is inevitable: like a stone on a flat summit that a minimal breath moves onto the slope, the false vacuum is a configuration that any infinitesimal disturbance — and quantum fluctuations guarantee that infinitesimal disturbances always exist — precipitates into transition. In this transition, a colossal amount of energy is released and converted into an exponential expansion of space-time. During inflation — a period that lasted perhaps 10⁻³⁶ seconds — the universe expanded from subatomic to macroscopic size. Inflation — the most energetic event in known cosmic history — is an episode of instability. It is not a collision between bodies, it is not an external disturbance: it is the material configuration itself that exceeds its compatibility and reorganises itself. The operative excess — what the metastable configuration holds in tension — is the engine. There is no "failure" in this transition: there is a superior material compatibility that the previous configuration could not contain.

The Higgs mechanism offers a second, equally fundamental case. In the first fractions of a second after the initial expansion, when thermal energies exceeded the teraelectronvolt scale, the Higgs field had a zero expectation value. In this symmetric configuration, all fundamental particles were massless, all equally coupled. But this symmetry was unstable — not because an external law condemned it, but because its very quantum structure contained the incompatibility. As the universe cooled, the shape of the Higgs potential changed. What had been a stable minimum became a local maximum. The field spontaneously reorganised into a new minimum, acquiring a non-zero expectation value. This transition — spontaneous symmetry breaking — had radical consequences: particles coupled differently to the field, acquired different masses, and diversity emerged where there had previously been symmetric uniformity. The entire material universe — everything that has mass and makes chemistry possible — emerges from this instability. The symmetrical configuration does not "fail" through deficiency; it exceeds what the symmetrical form can sustain in a cooling universe. The transition is the execution of what instability allows.

Primordial density fluctuations, amplified by inflationary expansion and frozen into the structure of the cosmic microwave background, offer a third confirmation on an even larger scale. Density variations of approximately one part in a hundred thousand, originally quantum fluctuations of the inflaton field, have been expanded by inflation to scales that today reach millions of light years. These fluctuations were frozen when inflation ended and left a mark on the cosmic radiation background, emitted when the universe became transparent, about 380,000 years after the initial expansion. Without these fluctuations — without this primordial instability — the early universe would be completely homogeneous. No structure would emerge. No galaxies, no stars, no possibility of observers. The existence of all observable cosmic structure depends on primordial instability. The development of this relationship — from primordial fluctuation to observable structure — belongs to later chapters. Here the limiting statement suffices: galaxies exist because the vacuum was unstable.

1.4 Metastability: the operator that dissolves the dichotomy

The progression so far installs a seemingly irreconcilable tension. If instability is a constituent condition, if not even a vacuum is rest, if every observable structure is a local and provisional pause — how is it possible for something to persist? How does the universe not disintegrate into an undifferentiated plasma? The answer is not "nothing truly persists" — it would be nihilism that observation belies. The answer is also not "everything is determined and stable" — it would be a return to classical mechanics. The answer is metastability: a regime of material compatibility where apparent stability is compatibility between non-actualised tensions, where the form persists precisely because it retains internal differences that keep it in a state far from absolute rest.

Gilbert Simondon develops this concept in L'Individuation à la lumière des notions de forme et d'information. A metastable system is neither in absolute equilibrium nor in total disequilibrium. It is a configuration that contains potential energy, has not reached the state of minimum global energy, and still retains the capacity for transformation. A simple example helps. Supercooled water, at a temperature below zero degrees Celsius, does not freeze even though it is not in the thermodynamic minimum. The liquid configuration persists because crystallisation requires a fluctuation—an ice nucleus capable of catalysing the transition. Until such a nucleus appears, the water remains liquid—not because it is stable, since it lies outside the thermodynamic minimum, but because it is metastable. It already contains the power of crystallisation; it has simply not actualised it. Supercooled water is an exemplary image: it persists in a compatibility that is not minimal, is precarious, can disappear with any disturbance, and for precisely that reason persists as what it is, differentiated from ice.
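
The logic of the supercooled state can be caricatured in a toy simulation: a walker in a potential with a shallow local well and a deeper global one persists in the shallow well until a sufficiently large fluctuation arrives. The potential, noise levels and time step below are arbitrary illustrative choices, not a model of water.

```python
# Metastability as a landscape: V(x) = x^4 - 2x^2 - 0.3x has a shallow well near
# x = -1 (the "supercooled" configuration) and a deeper one near x = +1 (the
# "crystal"). An overdamped walker started in the shallow well stays there under
# weak noise and eventually crosses the barrier under stronger noise. Toy model only.

import random

def grad_v(x: float) -> float:
    """Gradient of the tilted double-well potential."""
    return 4 * x**3 - 4 * x - 0.3

def first_crossing(noise: float, max_steps: int = 200_000, dt: float = 1e-3, seed: int = 1):
    """Noisy overdamped relaxation from x = -1; return time of first barrier crossing, or None."""
    rng = random.Random(seed)
    x = -1.0
    for step in range(max_steps):
        x += -grad_v(x) * dt + noise * rng.gauss(0.0, dt ** 0.5)
        if x > 0.0:
            return step * dt
    return None

if __name__ == "__main__":
    for sigma in (0.2, 0.8):
        t = first_crossing(noise=sigma)
        outcome = f"crossed the barrier at t ~ {t:.1f}" if t is not None else "never crossed (still metastable)"
        print(f"noise {sigma}: {outcome}")
```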

Simondon, however, builds his theory of individuation on assumptions that the perspective of this work cannot accept. First: Simondon postulates a pre-individual — an energetic background prior to individuation, a reserve of potential prior to the operation of form. Individuation, for Simondon, is the resolution of tensions, a process in which the pre-individual is cut, structured, transformed into an individual. Second: he conceives individuation as resolution — as if tensions had a destination, a purpose, a final configuration that would be the harmonisation of the conflict. The perspective that governs this path rejects both assumptions. There is no pre-individual as prior background — there are only material configurations present, always already in operation, always already retaining differences and compatibilities. The "before" of form is not a reserve that awaits; it is ongoing material difference that does not have form as its destiny. And emergence is not the resolution of tensions — it is local compatibility that no purpose governs, that can disappear as material circumstances rearrange themselves, that tends toward nothing.

The distinction is critical. For Simondon, metastability is a provisional state, a path to the individuation that is its resolution. The individual is the result: the tension has disappeared, the potential has been actualised, the transformation has been completed. Metastability, in this logic, is transitory — it is valid until resolved. Conversely, according to the ontology operative here, metastability is a permanent regime. There is no final resolution. There is no configuration where all tensions disappear. What there is is a continuous reorganisation of material compatibilities, some more persistent than others, some further from equilibrium than others — but none reaching absolute rest. Metastability is not a phase to be overcome: it is a condition with no end. The configurations — atoms, molecules, stars — persist because they internally retain material differences that keep them active, organised, away from thermal death. When these differences are neutralised, the form disappears. But this is not completion; it is simply the end of a particular compatibility. The matter that made it up reorganises itself. No final harmony is achieved — only another compatibility emerges in its place.

Metastability is, therefore, an operator that dissolves the false dichotomy between stable and unstable. The regime of reality is neither classical rest (where everything converges to final equilibrium) nor indeterminate chaos (where nothing remains). It is metastable: persistence in compatibilities that retain unactualised tensions, that can reorganise as new material factors create previously absent possibilities, and that have no end or destination. Constituent instability operates through metastability — through configurations that internally contain the very incompatibilities that allow them to persist differentiated.

Metastability is key: apparent stability that retains internal tensions and potential for transformation.

Axis 2 — Random as structure

2.1 Genealogy of chance exclusion

The Western tradition was systematic in its rejection of chance as an ontological category. Not that it ignored the phenomenon — it constantly recognised it as a fact of experience. But it drew a radical separation between what appears accidental and what is truly necessary, between the limited perspective of an observer and the prevailing order that governs without exception. The salient feature is that the most rigorous materialism — the atomism of Democritus — emerged as a pure negation of chance. A matter that is truly matter, inert and self-identical, could not know indeterminacy. Genuine matter is pure necessity. This axiom structured centuries of natural philosophy and, later, science.

Democritus formulated the thesis with exemplary economy: nothing happens without a cause, everything is due to reason and necessity. Atoms move not because they decide to do so — they have no will — but because previous movements determine them. One necessary collision follows another, chained together without a gap. The universe is a perfect mechanism in which no particle deviates without that deviation itself being made necessary by previous movements. Founding paradox: the genuine materialist is also the most rigorous determinist. The consequence is immediate: if everything is necessary from the beginning, there is no genuine novelty in the cosmos — only the unfolding of complexities that were already enfolded. Chance would then be a failure in our understanding, never a property of reality.

When Epicurus intervened with the clinamen — the minimal, spontaneous swerve, determined by neither time nor place — he was introducing a radical indocility into matter itself. Lucretius captured the gesture in De Rerum Natura: "at an indeterminate moment, in an indeterminate place, the atoms deviate a little — just enough so that it can be said that their movement has changed." This gesture opened another ontological regime. It would allow atoms to combine in non-predetermined ways, creating variations that no previous causal sequence prescribed. But subsequent tradition rejected Epicurus with remarkable consistency. The clinamen was seen as an open display of irrationality in matter. Democritus prevailed ideologically while Epicurus was relegated to heterodoxy. Chance remained marginal, suspected of anthropomorphism, incompatible with true science.

Aristotle consolidated the exclusion through the sophisticated concepts of tyche and automaton. He did not deny that events occur that seem accidental — the builder and the client meeting in the agora without prior arrangement. But he refused to grant that chance is a genuine cause in the ontological sense. Each series — the builder's errand, the client's errand — has its own determinate cause, inscribed in a purpose or a chain of necessity. The collision between them is accidental only from the point of view of those who observe it; in reality, it is the intersection of two completely determined causal series, each following its own chain. Chance is perspective, not a property of reality. The operation is philosophically sophisticated: it recognises phenomenologically what experience offers, but dissolves it ontologically. The accidental is true for us, not in being. This strategy — accepting the phenomenon while denying its reality — would become standard in the tradition.

Laplace inaugurated the formulation that would best translate modern intuition and would remain hegemonic until the 20th century. The Essai Philosophique sur les Probabilités (1814) presented the vision through the figure of the demon: if someone — an infinite intellect — knew all the forces of nature and the positions of all the molecules, it could calculate the entire future, as well as the entire past, with perfect accuracy. In this framework, probability does not measure the indeterminacy of reality; it measures the observer's knowledge deficit. Randomness is purely epistemic. There is no true contingency in nature, only ignorance about initial conditions. Chance is a shadow of our cognitive incompleteness, never a property of matter. It is the purest formulation of deafness to the random — and simultaneously the most transparent in its refusal. Laplace's demon presupposes that, if we could see as the demon sees, we would understand that nothing was accidental. The infinite perspective would make randomness disappear. This assumption — that complete knowledge would reveal total necessity — governed physics and philosophy for centuries.

The Laplacian construction is impressive for what it reveals: it is not just a thesis about reality — it is a thesis about the relationship between knowledge and reality. Laplace's assumption is that complete knowledge would dissolve randomness, that probability measures only the gap between what we know and what is. This equivalence between determinism and total knowability would be the orthodoxy of physics up to quantum mechanics. But it is itself a metaphysical assumption — it assumes that the real is structured in such a way as to be completely accessible to calculation, that there is no aspect of reality that structurally resists determination.

Boltzmann received Laplace's legacy but was forced by statistical thermodynamics to modify it. The second law — entropy in an isolated system never decreases — is not absolutely necessary; it is extremely likely. Every collision between molecules respects Newton's laws; molecular dynamics is deterministic in principle. But the macroscopic behaviour we observe emerges from ignorance about microstates. High-entropy states are vastly more numerous than low-entropy states; the system is overwhelmingly more likely to be in one of them simply by combinatorial counting. Boltzmann thus preserved a double truth: microscopic determinism, apparent macroscopic indeterminism. Chance remained epistemic, an attribute of ignorance, not of being. Matter remained pure necessity; only we were unable to see it, because the combinatorial complexity of molecular positions exceeds any computational capacity. But this was a difference of degree, not of kind. Laplace's demon, if it knew the exact microstates, could predict the macroscopic future with precision. Probability was a human calculation tool, not a description of reality.
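
The combinatorial point is easy to make explicit. The sketch below counts microstates for a toy system of 100 particles distributed between two halves of a box; the number 100 is illustrative, chosen only to show how violently the balanced macrostate outnumbers the ordered one.

```python
# Why the second law is "extremely likely" rather than logically necessary:
# a counting sketch with N distinguishable particles split between two halves of a box.

from math import comb, log

N = 100  # illustrative number of particles

if __name__ == "__main__":
    all_left = comb(N, 0)        # every particle on the left: exactly 1 microstate
    balanced = comb(N, N // 2)   # an even split: ~1e29 microstates
    print(f"microstates, all on the left : {all_left}")
    print(f"microstates, 50/50 split     : {balanced:.3e}")
    print(f"ratio                        : {balanced / all_left:.3e}")
    # Boltzmann entropy S = k_B ln W, expressed here in units of k_B:
    print(f"S/k_B (ordered)  = {log(all_left):.2f}")
    print(f"S/k_B (balanced) = {log(balanced):.2f}")
```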

The hegemony of epistemic determinism — from Democritus to Boltzmann, passing through Laplace — built a conceptual fortress lasting two millennia: chance is ignorance, randomness is perspective, necessity is absolute. The fortress is impressive for its internal consistency: within its assumptions, it is irrefutable. But presuppositions are, precisely, presuppositions — and quantum mechanics would demolish its foundations by showing that indeterminacy is not a product of ignorance but an irreducible property of the most fundamental inscriptional regime. Before the quantum revolution, however, a lone voice anticipated the rupture.

Peirce remained the modern exception that proves the rule of the deterministic tradition. His tychism — the doctrine that chance is real, that there is genuine spontaneity in nature, that laws evolved from a more irregular primitive state to the deterministic form we observe today — challenged the Laplacian and Leibnizian hegemony. Peirce argued that uniformity is not absolute but historical, achieved through progressive habit. The young universe would be chaotic; only over time did the laws crystallise into regularity. But Peirce was a lone voice, marginal, until quantum mechanics. After 1925, Peirce's position gained ontological relevance, although for radically different reasons. It was not that Peirce got the metaphysical foundations right. It was that physics discovered an indeterminacy that no Laplacian epistemics could dissolve. The question that Peirce posed in speculative terms — is chance real? — became an experimental question with a measured answer.

2.2 Bell's theorem and experimental proof

The EPR paradox — formulated by Einstein, Podolsky and Rosen in 1935 — articulated an aporia that seemed insurmountable within orthodox quantum mechanics. Two entangled photons, once separated in space, maintain a necessary relationship: measuring the polarization of one instantly determines the polarization of the other, regardless of distance. Einstein called it "spooky action at a distance" and offered a scathing philosophical diagnosis: this was a sign that quantum mechanics was incomplete. If quantum mechanics were a complete theory and faithful to reality, reality itself could not depend on a measurement carried out hundreds of kilometers away. No influence can cross space without contiguous causal intermediation. There had to be hidden variables — properties intrinsic to each photon — already determined before any measurement, which stored local information. Each photon would carry with it, independently, its complete state, like recorded instructions just waiting to be read. The measurement would only reveal what was already there. This argument appealed to the deepest sensibility of classical physics: locality is sacred.

John Bell, in 1964, made a discovery that would transform the metaphysical question into an experimental one. He demonstrated mathematically that if there were local hidden variables — if each photon carried with it, in a determined way and independently of the other's configuration, its own information — then certain correlations between entangled pairs would respect a precise upper limit. These are Bell's inequalities. But quantum mechanics predicts systematic violation of this limit. Bell did not prove that hidden variables existed or did not exist; he did not decide between interpretations. He proved a logical fact: if locality is true, then there cannot be hidden variables that reproduce the quantum predictions. Either there are hidden variables that are fundamentally non-local, or there are no hidden variables at all and the indeterminacy is genuine. Bell transformed a philosophical question into an experimentally testable inequality.
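
The content of the theorem can be condensed into the CHSH form of the inequality. The sketch below brute-forces the best any local deterministic strategy can achieve (the bound of 2) and evaluates the quantum singlet-state prediction at the standard analyser angles (2√2 ≈ 2.83); the correlation function E(a, b) = -cos(a - b) is the textbook singlet result, used here purely for illustration.

```python
# CHSH: the sharpest form of Bell's inequality. Local hidden-variable models are
# bounded by |S| <= 2; the quantum singlet-state prediction reaches 2*sqrt(2).

import math
from itertools import product

def chsh_local_maximum() -> float:
    """Brute-force the best any deterministic local strategy can do."""
    best = 0.0
    for a0, a1, b0, b1 in product((-1, 1), repeat=4):
        s = a0 * b0 - a0 * b1 + a1 * b0 + a1 * b1
        best = max(best, abs(s))
    return best

def chsh_quantum(a: float, a2: float, b: float, b2: float) -> float:
    """Quantum CHSH value for the singlet state at the given analyser angles (radians)."""
    def E(x, y):
        return -math.cos(x - y)
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

if __name__ == "__main__":
    print(f"local hidden-variable bound : {chsh_local_maximum():.3f}")   # 2.000
    angles = (0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
    print(f"quantum prediction          : {chsh_quantum(*angles):.3f}")  # 2.828
```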

Alain Aspect, in 1982, carried out the experiment that would directly test Bell's inequality. Two entangled photons were produced from an excited atom and were separated by 12 meters. Before each measurement, the analyzer's orientation was changed — and this was done at intervals of time sufficiently short so that no light signal could travel between the two measurement zones. This eliminated the possibility that one measurement would communicate to the other which orientation to use. The result was unambiguous: violation of Bell's inequalities with statistical significance of several standard deviations. The correlations that quantum mechanics predicted were observed. This meant that one of three things had to be true: either realism was false (there are no properties defined before measurement), or locality was false (there are non-local influences), or determinism was false (the outcome is not predetermined even given all the facts of the past). Anton Zeilinger and collaborators refined the experiment with measurement choices determined by photons from quasars emitted billions of years ago — ensuring absolute causal independence of the source of randomness from any conceivable terrestrial influence. This work was recognised with the Nobel Prize in Physics in 2022.

What this experimental sequence establishes is not just a technical result within particle physics — it is ontological displacement. For three centuries, the structure of natural thought was based on the conviction that reality is locally determined: each point in space contains enough information to specify what happens there, without instantaneous reference to distant points. Bell showed that this conviction is incompatible with experimental facts. The question is no longer philosophical in the sense of being disputed by conceptual arguments; it became empirical, decided by measurement. And measurement decided against local determinism. Whatever interpretation is chosen — and the following section examines the main ones — none of them restores the classical picture intact. The price of each interpretation is different, but they all share the same minimum payment: the renunciation of Laplace's universe, the perfectly predictable machine, the reality that is entirely calculable in principle. The devil who would predict everything, if he knew all the positions and forces, is refuted not by argument but by experience. This result is irreversible — not in the rhetorical sense, but in the sense that any future theory that seeks to restore local determinism will have to contradict experimental facts established with increasing precision over four decades.

The Copenhagen interpretation, the orthodox one until the 1960s, offered an answer of unusual epistemological modesty. The wave function does not describe the real itself; it is a predictive tool. On reality before measurement, it remains silent. There are no pre-existing facts to the measurement; there is only the ability to predict with probabilistic precision what will occur when measured. The collapse of the wave function at the time of measurement is not a physical process; it is a change in available knowledge. This resolves the EPR paradox by radically denying it: if there are no properties before measurement, there is no question about how distant measurement affects them. The interpretation is philosophically modest because it renounces saying anything about the real itself; it deals only with what can be registered. Bohr formulated this position with remarkable rigour: the classical concepts — position, momentum, trajectory — are conditions of description, not intrinsic properties of the system. Quantum mechanics does not reveal reality as it is; it reveals the real as it responds when questioned in a determinate way. The choice of measuring device is not a neutral accessory — it is a constitutive part of the phenomenon. Without measurement, there is no phenomenon; there is only indeterminate potentiality. Copenhagen's modesty is not weakness; it is recognition that the inscription regime is constitutive of the reality it describes.

David Bohm, on the contrary, refused epistemological modesty. He proposed that there are, in fact, well-defined trajectories for each particle, guided by a non-local "pilot wave". Each electron follows a determinate trajectory, guided non-locally, yet in a way that permits no signalling between separated points. Bohmian mechanics reproduces all quantum predictions while maintaining strict microscopic determinism. The price is radical non-locality: the pilot wave is a physically real entity that permeates the entire universe, acting instantaneously. From the perspective of strict immanence — the real is only that which has a local presence — the pilot wave raises serious difficulties. It behaves less like a description of how nature operates and more like a theoretical assumption introduced to save determinism at any cost.

Everett proposed a radically different solution: there is never a collapse, because all results are actualised in different branches. The wave function never collapses; the observer simply finds themselves in the branch where a specific result obtains. This eliminates the measurement problem by saying that the problem is illusory — all outcomes occur, just in different "universes" inaccessible to each other. The formal elegance is undeniable: quantum mechanics applies universally, without exception for observers, without ad hoc collapse. But the metaphysical difficulty is profound: an infinite multiplication of realities is postulated, each inscribable for itself, but mutually inaccessible. Inscriptional inaccessibility between branches does not necessarily imply independent ontological existence — it may just mean that certain aspects of the real have no status for this regime of inscription. The proliferation of worlds is an interpretation of the formalism, not a direct reading of the physics. The Schrödinger equation describes unitary evolution; that this evolution corresponds to a branching of universes is a metaphysical addition that the formalism does not require. Furthermore, probability — a central concept in quantum mechanics — becomes difficult to establish in Everett: if all outcomes occur, what does it mean to say that one is more likely than another? The question remains open and constitutes a serious technical objection that defenders of the interpretation acknowledge without resolving it in a consensual way.

GRW (Ghirardi, Rimini, Weber) and its successor CSL (Continuous Spontaneous Localization) offer a third way: collapse is a genuine physical process, a modification of quantum dynamics that introduces a spontaneous localization mechanism. The wave function collapses spontaneously, with a frequency proportional to the mass of the system — very rare for an individual particle (once every hundred million years, approximately), but frequent for macroscopic systems composed of billions of particles. The consequence is that macroscopic objects are always localized while individual particles maintain quantum behaviour — the boundary between quantum and classical emerges naturally from the parameters of the theory, without the need to postulate observer or measurement as fundamental concepts. It is theoretical reform, not reinterpretation: it modifies the equations of quantum mechanics, adding a non-linear stochastic term. It has singular merit: it suggests that the measurement problem may require theoretical modification of quantum mechanics itself, not just reinterpretation of the existing formalism. The experimental evidence has not yet decided between GRW/CSL and orthodox quantum mechanics — the proposed parameters generate differences observable only in extreme conditions that are at the limit of current experimental capabilities.
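
The scaling argument behind GRW can be stated in one line of arithmetic. The sketch assumes the commonly quoted collapse-rate parameter of roughly 10⁻¹⁶ hits per second per particle; the particle counts chosen for the example objects are rough illustrative figures.

```python
# Why GRW localises tables but not electrons: the collapse rate scales with the
# number of constituents. Back-of-envelope sketch with the commonly quoted GRW
# parameter lambda ~ 1e-16 per second per particle (an assumption of the model).

LAMBDA_GRW = 1e-16          # spontaneous hits per second, per particle
SECONDS_PER_YEAR = 3.156e7

def mean_time_between_hits(n_particles: float) -> float:
    """Mean time (seconds) before some particle in the system suffers a hit."""
    return 1.0 / (LAMBDA_GRW * n_particles)

if __name__ == "__main__":
    single = mean_time_between_hits(1)
    dust_grain = mean_time_between_hits(1e18)   # a few micrograms of matter
    macroscopic = mean_time_between_hits(1e23)  # roughly a tenth of a gram
    print(f"single particle : ~{single / SECONDS_PER_YEAR:.1e} years between hits")
    print(f"dust grain      : ~{dust_grain:.1e} s")
    print(f"tenth of a gram : ~{macroscopic:.1e} s (localised effectively at once)")
```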

None of these four strategies is satisfactory in absolute terms. Copenhagen renounces ontology — but the renunciation may be premature, confusing the limit of the inscriptional regime with the limit of the real. Bohm preserves determinism — but at the cost of radical nonlocality that challenges immanence. Everett eliminates collapse — but multiplies realities without empirical justification. GRW modifies the theory — but without decisive experimental confirmation of the spontaneous collapse parameters. The situation is instructive: there is no consensus because the problem resists solution within the inherited conceptual framework. Perhaps the problem is not technical but ontological — perhaps it requires not a better interpretation of formalism but a better understanding of what the relationship between inscription and reality means.

The position that governs this route does not choose between Copenhagen, Bohm, Everett or GRW. All these strategies converge in a refutation that constitutes perhaps the most important philosophical result of 20th century physics: local determinism is refuted. The convergence is remarkable: interpretations that disagree on almost everything — on the nature of the wave function, on the existence or not of collapse, on the number of worlds, on the modification or preservation of the formalism — agree on this single, decisive point. And this point is sufficient for the argument of this chapter. It is not necessary to choose between them to recognise that Laplace's universe, the perfectly calculable universe, the universe where chance is ignorance, has been experimentally refuted. What Aspect's experiment showed is that the inscription regime by which the real becomes legible does not capture certain traits in a single measurement. There is indeterminacy in the outcome. But — essential inscriptional caution — this does not resolve the ontological question. Saying that the result is indeterminate in the registration regime is not the same as saying that the reality is indeterminate in itself. The transition from "inscriptive indeterminacy" to "ontological indeterminacy" is an interpretative operation. It is not binding on the evidence. Physics allows us to keep this question open: there is something in the real that precedes what can be inscribed, and the indeterminacy that we find may be a shadow of this inaccessibility — or it may be genuine. Both positions respect physics. Both reject local determinism. Both recognise that what Laplace imagined as completely calculable is, in reality, permanently open.

2.3 Constrained indeterminacy ≠ arbitrariness

Chance in quantum mechanics is not disorganisation — it is not random noise in which anything could occur with equal probability. It operates within rigorous constraints that define the space of possibilities. The wave function precisely describes this space; not every change is permitted by the structure of the system. The conservation laws — energy, angular momentum, electric charge — strictly limit which results can occur. An energy measurement will always produce one of the eigenvalues of the Hamiltonian operator; no measurement will produce an arbitrary energy. An atomic transition will obey selection rules defined by the symmetries of the theory; certain jumps between levels are forbidden. Chance operates within a web of material constraints. The formula that condenses this: there is abundant causality in quantum mechanics; what is indeterminate is which specific result a given measurement will yield when the measurement event occurs. Quantum chance is not the absence of structure; it is openness within a rigorously defined structure.
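
The simplest possible case makes the structure visible: a spin-1/2 measurement admits exactly two outcomes, and their frequencies follow the Born rule. The preparation angle, seed and sample size in the sketch below are illustrative assumptions.

```python
# Constrained indeterminacy in its simplest case: a spin-1/2 measurement along z
# on a state prepared at angle theta from the z-axis. Only two outcomes are ever
# possible (+hbar/2 or -hbar/2); which one occurs in a given run is not fixed,
# but their frequencies follow the Born rule exactly.

import math
import random

def measure_sz(theta: float, rng: random.Random) -> int:
    """One measurement outcome (+1 or -1, in units of hbar/2) for the state |theta>."""
    p_up = math.cos(theta / 2) ** 2   # Born rule: P(+) = cos^2(theta/2)
    return +1 if rng.random() < p_up else -1

if __name__ == "__main__":
    rng = random.Random(42)
    theta = math.pi / 3               # state tilted 60 degrees from z
    results = [measure_sz(theta, rng) for _ in range(100_000)]
    # No run ever yields anything but +1 or -1 ...
    print("distinct outcomes:", sorted(set(results)))
    # ... and their frequencies converge on cos^2(30 deg) = 0.75 and 0.25.
    print(f"frequency of +1 : {results.count(+1) / len(results):.3f} (Born rule: 0.750)")
```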

The analogy with dice is useful but must be immediately corrected. A common die has six equiprobable sides; the outcome is indeterminate but the space of possibilities is simple and uniform. Quantum randomness is not like that: the space of possibilities is geometrically complex, defined by the system's wave function, shaped by symmetries and material constraints. The probabilities are not equal for all outcomes; they are specific distributions, calculable with extraordinary precision, dependent on the system configuration. The wave function of the electron in the hydrogen atom defines a precisely shaped probability cloud — spherical, lobed, toroidal orbitals — that determines where the electron can be found and with what probability. Quantum "chance" is, therefore, chance within a rigorous geometry. Indeterminacy is not emptiness; it is a structured space of possibilities.
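
The "rigorous geometry" of the hydrogen ground state can be sampled directly: the radial probability law P(r) ∝ r² exp(-2r/a0) is exact for the 1s orbital, while the sampling scheme and sample size below are illustrative.

```python
# Chance within rigorous geometry: the electron's radial position in the hydrogen
# 1s state is indeterminate, but drawn from a precisely shaped distribution,
# P(r) ~ r^2 * exp(-2r/a0), whose most probable radius is the Bohr radius a0.

import random

A0 = 5.29177210903e-11  # Bohr radius, metres

def sample_radius(rng: random.Random) -> float:
    """Draw r from P(r) ~ r^2 exp(-2r/a0), i.e. a Gamma(shape=3, scale=a0/2) law."""
    return rng.gammavariate(3.0, A0 / 2)

if __name__ == "__main__":
    rng = random.Random(7)
    radii = [sample_radius(rng) for _ in range(200_000)]
    mean_r = sum(radii) / len(radii)
    print(f"mean radius : {mean_r / A0:.3f} a0 (theory: 1.5 a0)")
    frac_inside = sum(r < 2 * A0 for r in radii) / len(radii)
    print(f"P(r < 2 a0) : {frac_inside:.3f}  (shaped, not uniform, not arbitrary)")
```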

It is not that "everything is possible". It is that, within what is materially possible — defined by laws and constraints — the specific result is not predetermined by any prior accessible state. There is room for genuine novelty, but novelty confined to the limits of physical possibility. This is completely different from radical arbitrariness. Indeterminacy is structured, geometrically confined, operationally precise. One can calculate exactly which outcomes are possible; one simply cannot predetermine which of them will occur on a specific occasion.

Crystallization offers a prebiotic example of constrained indeterminacy in operation. When a saturated solution cools, nucleation — the formation of the first crystalline nucleus — depends on local thermal fluctuations that are genuinely indeterminate: where exactly the first nucleus forms, at what time, in what orientation, is not prescribed by the macroscopic conditions. The indeterminacy is real. But it is not arbitrary: the chemical properties of the solute — bond angles, interatomic distances, lattice energies — strictly constrain which crystal configurations are compatible. Sodium chloride crystallizes in a cubic lattice, not a hexagonal one; quartz crystallizes in a trigonal system, not a cubic one. The type of crystal is determined by chemistry; the time, location and specific orientation of nucleation are not. Indeterminacy operates within constraints that define the space of the possible without prescribing the singular outcome. The same structure governs star formation: molecular clouds collapse gravitationally, but the exact point of collapse, the precise mass of the resulting protostar, the timing of nuclear ignition — all of this depends on density fluctuations that are indeterminate within the framework of the initial conditions. The constraints — gravity, radiation pressure, metallicity — define the space of viable configurations; the fluctuations decide which singular configuration takes place. The novelty is genuine — it was not inscribed before the event — but it is not arbitrary, because it operates within material constraints that delimit the regime of the possible.
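The same two-layer structure can be caricatured in a toy sketch (hypothetical waiting-time constant, purely illustrative): the lattice type is fixed deterministically by the substance, while the nucleation time is drawn at random, here from an exponential distribution standing in for thermal fluctuations.

```python
import random

# Lattice type is fixed by chemistry (a deterministic constraint).
LATTICE = {"NaCl": "cubic", "SiO2 (quartz)": "trigonal"}

def nucleate(substance, mean_wait_s=5.0, seed=None):
    """Toy model: *when* nucleation happens is random (exponential waiting
    time standing in for thermal fluctuations); *what* crystallises is not."""
    rng = random.Random(seed)
    t = rng.expovariate(1.0 / mean_wait_s)   # indeterminate timing
    return t, LATTICE[substance]             # constrained outcome

for trial in range(3):
    t, lattice = nucleate("NaCl", seed=trial)
    print(f"trial {trial}: nucleated after {t:5.2f} s  ->  {lattice} lattice")
```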

The distinction between constrained indeterminacy and arbitrariness is decisive for the entire course of this book. Pure arbitrariness — any outcome with equal probability, no operative law — would be incompatible with the emergence of structure. If chance were truly blind, without constraints, the universe would be uniform noise, without configurations, without regularity, without history. Conversely, complete determinism — every outcome inscribed in the initial conditions — would eliminate novelty: the future would be an unfolding of the past, no genuinely new configuration could emerge. Constrained indeterminacy occupies a precise position between these extremes: there is enough openness for the new to erupt, and enough constraint so that the irruption does not dissolve the entire existing structure. This position is not a compromise between two absolutes — it is not "a little chance and a little determinism". It is its own regime, irreducible to either extreme: the real is structurally open within structurally defined limits.

The cosmic scale offers confirmation of the same structure. If constrained indeterminacy operates at the quantum level and manifests itself in processes such as crystallization and star formation, it should also manifest itself in the cosmological processes that link the microscopic to the macroscopic. It does. Primordial quantum fluctuations — density differences on the order of one part in a hundred thousand — were stretched to cosmic scales by inflation and then amplified by gravity over billions of years. What was microscopic indeterminacy became the seeds of cosmic structure. The anisotropies of the cosmic microwave background that the COBE, WMAP and Planck missions have mapped are images of these amplified fluctuations. The entire structure of the universe — galaxies, clusters, filaments, voids, the fabric in which cosmic history unfolds — descends from constrained quantum chance. The causal chain is verifiable: quantum fluctuation → inflationary stretching → cosmic background anisotropy → gravitational collapse → structure formation. Each link is understood theoretically and confirmed observationally. There is no gap where an ordering principle could be inserted. The equations of general relativity, combined with the initial conditions provided by the amplified fluctuations, reproduce the observed distribution of matter. The cosmic structure is no mystery; it is the consequence of constrained chance operating under gravity over 13.8 billion years. There is no prior plan that these fluctuations carry out; there are only constraints that allow these variations to occur. The result is a singular, non-repeatable universe. If the universe retraced its history since the Big Bang with exactly the same laws but different values for those quantum fluctuations, a completely different cosmos would emerge — different structures, different scales. Cosmic history is contingent in its foundation.
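The last links of that chain can be caricatured numerically. The toy sketch below uses illustrative numbers throughout (a Gaussian field of amplitude 1e-5, an arbitrary linear growth factor, and the standard spherical-collapse threshold of roughly 1.686) to show how tiny constrained fluctuations decide which regions later exceed the threshold for collapse.

```python
import numpy as np

rng = np.random.default_rng(42)

# Primordial density contrasts: tiny, Gaussian, roughly 1 part in 100,000.
delta_initial = rng.normal(loc=0.0, scale=1e-5, size=1_000_000)

# Illustrative linear growth factor (toy value; real growth depends on the
# full cosmological model and on scale).
growth_factor = 1.0e5

delta_today = delta_initial * growth_factor

# Spherical-collapse heuristic: regions whose linearly evolved contrast
# exceeds ~1.686 go on to collapse into bound structures.
collapsed_fraction = np.mean(delta_today > 1.686)

print(f"initial rms contrast : {delta_initial.std():.2e}")
print(f"evolved rms contrast : {delta_today.std():.2f}")
print(f"fraction of regions above the collapse threshold: {collapsed_fraction:.3f}")
```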

What's random is not what we don't know — it is what is not decided.

2.4 Chaosmos: neither cosmos nor chaos

The two previous axes leave a tension unresolved. If instability is a condition (Axis 1) and randomness is a structure (Axis 2), what name should we give to the regime that precedes all legibility? It is not cosmos — cosmos presupposes imposed or recognised order. It is not chaos — chaos, in the common sense, presupposes disorder evaluated as such. The material real prior to any legibility is neither of the two, because both terms already presuppose criteria of symbolic evaluation. Deleuze and Guattari introduced the term Chaosmos to designate a regime that is neither cosmos nor chaos — neither total order nor total disorder. Cosmos is cosmos because something — an intelligence, a form, a plan — imposed order on reality. Chaos is chaos because complete disorder reigns: no regularity, infinite speeds, no capture. Chaosmos refers to the material operation prior to any stabilisation of a legibility regime. It is difference in progress, material hesitation operating without a plan but with presence. The concept works as a limit operator: where tradition saw cosmos or chaos, a third category is installed. Hence the dignity of the formless: that which has no form is not confusion; it is a difference that did not result in legible organisation. This is not to say that there is something between order and disorder — to say so would be to keep them as poles and postulate a continuum. The point is that the real prior to any legibility is neither one nor the other, because both categories presuppose an already operative legibility regime. Chaosmos is what precedes that regime.

What this path takes from the Deleuzian operation is precisely the rupture with the order-disorder dichotomy. What it refuses — and this is decisive — is the complementary architecture that Deleuze builds around the notion. The divergence is not one of emphasis; it is structural, and it is important to set it out step by step.

Deleuze postulates the virtual as an ontological plane separate from the actual — a field of potentials as real as the actualised, but different in nature. In Difference and Repetition, the virtual is not a logical possibility waiting to be realised; it is a real multiplicity that differentiates itself by actualising itself, without resemblance to what it produces. The thesis is strong: there is a plane of consistency — populated by singularities, intensities, differential relations — that subsists beyond any actualised configuration and that constitutes the genetic condition of all actualisation. The ontological cost of this architecture is threefold. First, it doubles the real: for each effective material configuration, it postulates a complementary field of potentials that exceeds it and that has its own ontological status. But if the virtual is as real as the actual, the question arises: what criterion allows us to distinguish real virtuality from projected possibility? The Deleuzian answer — the virtual differs in nature from the possible because it does not resemble what it produces — is a distinction internal to discourse, not to matter. Nothing in the material operation of quantum fluctuations, crystallization, or star formation requires two planes; it requires only present material compatibilities and constraints that delimit the space of what can occur. The virtual thus functions as a projection of a temporal category — what could have been, what could still be — onto the material present. It is a symbolic artefact that confuses the regime of possibility (proper to discourse) with the regime of being. Second cost: by postulating a field of potentials that subsists independently of actualisation, Deleuze reintroduces transcendence under an immanent name. The plane of consistency, although declared immanent, functions as a prior foundation that conditions without being conditioned — a structure that tradition would recognise as transcendental. Third cost: the virtual, as a field of pre-individual singularities, operates as an inexhaustible ontological reservoir — and this reservoir requires that matter be less than it is, a mere case of actualising potentials that exceed it. The perspective that operates here inverts the relationship: matter is not a case of the virtual — it is all of reality. There is genuine indeterminacy in the real; there is no separate virtuality that grounds it. Indeterminacy does not need a complementary ontological plane; all that is needed is the constitutive openness of material compatibilities that no configuration exhausts.

The body without organs — a central concept in the ontology of Deleuze and Guattari — designates a surface of pure intensities without functional organisation, liberation from structure, a limit to which all desire tends as a dissolution of the imposed stratification. The notion works as a critical operator against the rigidity of organisms, State apparatuses, and codifications that fix flows. But the ontological consequence is problematic: if the body without organs is a desirable horizon, functional organisation is a constraint to be overcome — and matter, ultimately, would be freer the less organised it is. This route rejects the premise: matter is always constrained by its own conformations, and these constraints are not impediments to its operation — they are the condition of that same operation. Total disorganisation is not freedom; it is dissolution. Material constraints — the chemical properties of nucleotide bases, coupling constants, thermodynamic thresholds — are constitutive of the capacity for reorganisation, not obstacles that limit it. Without constraints, there is no reorganisation; there is only dispersion.

Third divergence, and the most consequential for the anti-teleological regime: Deleuze preserves a minimal teleology. The virtual "tends to" actualise itself; there is a "pressure" of differentiation; actualisation is described in Difference and Repetition as a process by which virtual singularities "incarnate" in actual configurations, as if the virtual exerted a solicitation upon the actual. In Anti-Oedipus and A Thousand Plateaus, the notion of desire as production — desire that lacks nothing, that is machinic activity — preserves an immanent directionality: desire produces, invests, forms rhizomes, deterritorializes. This directionality, however subtle, is teleology: it implies a constitutive orientation, even if not finalistic in the classical sense. This path categorically rejects any teleology, including this minimal one: material reorganisation is contingent, not oriented. There is no program of increasing complexity, no pressure for differentiation — there are only operations that can produce greater or lesser complexity as a side effect of material compatibilities under specific constraints. The Chaosmos that matters here is the one that strictly designates the material regime of the real before any inscription — non-transcendental, non-virtual, non-oriented. Just material difference operating within its own constraints.

Chaosmos, thus stripped of its Deleuzian components — without the virtual, without the body without organs, without minimal teleology — preserves what is ontologically decisive: the rupture with the order-disorder dichotomy. Tradition offered two possibilities: either reality is ordered (cosmos) or it is disordered (chaos). Chaosmos installs a third position: the real, before any regime of inscription, is neither ordered nor disordered — it is material difference in operation, below any judgement about organisation. Disorder is as symbolic as order: both presuppose evaluation criteria, a legibility grid, a regime that decides what counts as organised and what does not. The material real precedes this decision.

The operational recapitulation is direct: instability (Axis 1) plus indetermination (Axis 2) form Chaosmos — the material regime that is the condition of all emergence without itself being a form. Within this regime, differences occur — quantum fluctuations, reorganisations, transient compatibilities. Some of these differences precipitate into configurations stable enough for an inscription regime to capture them. These captures — the atoms, the stars, the cosmic structures — appear as if they had always been oriented toward their current form. The illusion is understandable: to anyone looking at the completed story from the outside, there appears to be direction. But the direction is retroactive. It was built by capture, not prior to it.

Axis 3 — Order as an effect

3.1 Genealogy of the ordering principle

The question about the origin of order is as old as Western philosophy and equally persistent: if the universe is unstable, if randomness is the structure of reality, how does order emerge? The traditional answer has been univocal for two and a half thousand years: order requires an orderer. This orderer took different forms — cosmic mind, demiurge, immanent reason, divinity, intelligent designer — but the structure of the argument remained identical. The genealogy that follows is not intended to refute these positions, but to show that the pressure to subordinate order to a principle is itself undemonstrated, that it rests on metaphysical intuition, and that modern science dissolves it not through counterargument, but through the demonstration of blind mechanisms that produce principleless order.

Anaxagoras of Clazomenae introduced, in the fifth century BCE, the figure of Nous—the cosmic mind that orders the undifferentiated primordial mixture. Before Nous there is only a confusion of infinitesimal seeds, a chaos in which nothing is distinguished and all opposites are entangled without separation. Nous does not create from nothing; it separates, differentiates, and orders. Aristotle praised Anaxagoras as 'a sober man among drunkards' because he introduced an intelligent cause into cosmology. The operation is clear: order cannot emerge from matter if matter is understood as passive, inert substrate; it requires an external, transcendent, intelligent cause. Nous is distinct from matter and irreducible to material process. It remains exterior, directing from without what is otherwise blind material necessity. The consequence for the subsequent tradition is decisive: matter, without an intelligent principle, does not organise itself—it simply dissolves. Organisation is imported, never generated. The basis for this presumption is not empirical—Anaxagoras performed no experiment proving matter incapable of self-organisation. It is metaphysical. It begins from an image of matter as inert substrate whose only dignity lies in being organisable. It is precisely that image that this chapter sets out to dismantle.

Plato formalises the same scheme in the Timaeus: the Demiurge contemplates the eternal Forms and imposes them upon the chôra, the chaotic and undifferentiated receptacle of matter—'he brought order out of disorder, considering order in every way superior' (30a). Form does not emerge from matter; it is imposed upon it. This scheme—passive matter, eternal model, external agent—passes into Christian theology (Augustine, Thomas Aquinas, Bonaventure) with variations in terminology but an unchanged inner logic: matter alone does not produce order; order is imported from elsewhere.

Aristotle rejects the Demiurge and the separate model, but preserves the principle: order is immanent telos, a purpose that the nature of things contains as internal guidance. 'Nature does nothing in vain' (Politics I, 1253a9). Aristotelian teleology is more subtle — it does not postulate an external agent — but it is equally implacable: matter has direction, order is not blind. The Stoics radicalise: the cosmic Logos governs everything without exception, every event is providential and necessary, chance is ignorance. Instability becomes literally unthinkable — if everything is rational and necessary, nothing escapes the rule of reason.

Leibniz, already in modernity, reformulates the scheme in line with Christian theology: God created — out of infinitely many possible worlds — the 'best of all possible worlds'. The monads are pre-harmonized from the creative act, each one expressing the whole in its own way, without real interaction. The order is global, pre-programmed, inscribed before any contingent event. Nothing is accidental; everything is a consequence of divine selection. The harmonization is invisible — the monads do not communicate — but it is perfect.

Leibnizian harmonization is simultaneously the most elegant and the most vulnerable form of the ordering principle: if everything is pre-harmonized, if nothing is accidental, if each event is the consequence of optimal divine selection, then suffering, catastrophe and destruction are necessary moments of the 'best of all possible worlds'. Voltaire, in Candide (1759), ridiculed the consequence: was the Lisbon earthquake of 1755, which killed tens of thousands of people, a necessary component of the best possible world? The objection is not merely sentimental; it is logical: global optimality presupposes that there is something to optimize — a criterion, a value function, a measure of goodness. But the existence of such a criterion is precisely what is at stake. To postulate it is to assume what was to be demonstrated.

William Paley, in his Natural Theology of 1802, offers the most explicit and most vulnerable formulation of the ordering argument. If we find a watch on the beach, we necessarily infer a clever watchmaker. Similarly, organs such as the human eye, the arm, the ear — machines of unparalleled functional complexity — would force us to infer an infinite, omniscient, omnipotent Designer. Biological order is proof of cosmic intelligence. Paley does not invent the argument; he synthesizes the intuition that runs through modern theology. But its analogical precision simultaneously reveals its weakness: the analogy rests on experience of manufactured objects, not on an analysis of how order emerges in general. Darwin would show that functional complexity can emerge without a designer, through a blind mechanism.

Hegel, whose operation Axis 1 has already analyzed, represents the limit case: he welcomes instability as a genuine dialectical engine and admits negativity as a productive operator. However, the entire dialectic remains subordinate to the Absolute — instability is a necessary moment in a teleological drama, not an autonomous condition. The Hegelian order is dynamic and conflictual, but inevitable, necessary, oriented towards final reconciliation. The ordering principle, here, is no longer external to matter: it is the rationality of the process itself. Sophistication does not dissolve the structure — refinement only makes it harder to detect.

The genealogy reveals a constant across two and a half thousand years: order requires an ordering principle. This principle can be transcendent (Nous, Demiurge, Creator God) or immanent (Aristotelian telos, Stoic Logos), personal (Paley's Designer) or impersonal (Hegelian Absolute), but the structure is invariable. The shared assumption — rarely examined, almost never justified — is that matter, left to itself, does not produce order. Matter is inert, passive, chaotic; it requires direction. Without principle, without government, without guidance, matter disperses, disintegrates, dissolves into indistinction. This assumption was not demonstrated by any of the thinkers who invoked it. It is metaphysical intuition transmitted as an axiom, so deeply installed in Western thought that it is rarely identified as a presupposition — it functions as evidence, as truth that requires no demonstration. Its strength does not lie in arguments; it lies in ubiquity: from Anaxagoras to Hegel, from Plato to Paley, everyone agrees on the fundamental point, and agreement functions as confirmation. But agreement is not demonstration. The pressure to subordinate order to an external principle is itself inherited, not deduced.

The section that follows dissolves this pressure not through philosophical counterargument — it does not replace one ordering principle with another — but through the demonstration of three blind mechanisms that converge to show that matter, under specific conditions, produces order without principle, without intelligence, without purpose.

3.2 Dissolution by blind mechanism

Darwin is relevant here exclusively as a genealogical counterargument: the previous section ends with Paley, and Paley argued from biological complexity. Darwin dissolves Paley's argument in the domain where Paley formulated it — showing that random variation and differential selection, iterated over generations, produce functional complexity without a designer. The eye that Paley saw as proof of cosmic intelligence emerges from a sequence of incremental modifications, each advantageous in its context, none requiring anticipation of the final result. Jacques Monod, in Chance and Necessity (1970), distilled the consequence: teleology is the genuine purpose of a conscious agent; teleonomy is the appearance of purpose without intention. Darwin's strength is that he demonstrated that the appearance of purpose can emerge from an entirely blind mechanism. But — and this is the decisive point — the Darwinian mechanism operates exclusively in biotic territory: it presupposes reproduction, heredity, differential selection between organisms. None of these categories applies to the prebiotic regime that constitutes the territory of this book. Darwin refutes Paley, but the dissolution of the ordering principle in the prebiotic field requires mechanisms that operate without life, without code, without selection — and that is where the decisive demonstration begins.

Ilya Prigogine dissolves the ordering principle precisely where it matters most: in systems without genetic code, without selection, without life. Bénard cells provide the experimental paradigm. A fluid heated from below remains stratified and inert until a critical threshold is crossed. Once the thermal gradient exceeds that threshold, a transition occurs: spontaneous geometric order emerges without external instruction. Convection cells, often hexagonal, circulate heat with near-crystalline precision. No intelligence directs them. No plan is imposed. The fluid receives a gradient and, above the threshold, the hexagonal configuration acquires causal efficacy—not because it aims to dissipate heat, but because it is the configuration through which the forces accumulated in the gradient can be redistributed. Order is an effect, not a purpose. The explanation invokes material compatibility and thermophysical constraint, never intelligence or design.
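The threshold in question can be stated quantitatively. A minimal sketch (with illustrative fluid parameters, not data from any particular experiment) computes the Rayleigh number Ra = g·α·ΔT·d³/(ν·κ) and compares it with the classical critical value of about 1708 for a layer between rigid plates: below it the fluid conducts heat while remaining stratified; above it, convection cells appear.

```python
def rayleigh_number(delta_T, depth, alpha, nu, kappa, g=9.81):
    """Ra = g * alpha * delta_T * depth^3 / (nu * kappa)."""
    return g * alpha * delta_T * depth ** 3 / (nu * kappa)

RA_CRITICAL = 1708.0  # classical value for rigid-rigid boundaries

# Illustrative parameters for a thin layer of a water-like fluid.
params = dict(alpha=2.1e-4,   # thermal expansion coefficient [1/K]
              nu=1.0e-6,      # kinematic viscosity [m^2/s]
              kappa=1.4e-7,   # thermal diffusivity [m^2/s]
              depth=5e-3)     # layer depth [m]

for delta_T in (0.1, 0.5, 2.0, 10.0):
    ra = rayleigh_number(delta_T, **params)
    regime = "convection cells" if ra > RA_CRITICAL else "stratified conduction"
    print(f"dT = {delta_T:5.1f} K  ->  Ra = {ra:9.0f}  ({regime})")
```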

The Belousov–Zhabotinsky reaction displays an equivalent pattern in pure chemistry. A mixture of oxidants, reducers, and catalyst undergoes periodic oscillations of colour—blue, red, transparent—in rhythms measured in minutes. Without life, without code, without selection, the system behaves temporally like a wave. In thin layers, spatial patterns also emerge: concentric fronts and spirals of remarkable symmetry. This is pure chemistry exhibiting collective order without external cause. Prigogine's formulation is exact: dissipative structures are forms of order that emerge in systems far from thermodynamic equilibrium. Far from equilibrium, dissipation—the flow of energy or matter through a system—is not destruction but a condition of organisation.
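These oscillations are commonly reproduced numerically with the two-variable Oregonator, a standard reduction of the BZ kinetics. The sketch below (dimensionless parameter values chosen only to sit in the oscillatory regime) integrates it with SciPy and estimates the oscillation period.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-variable Oregonator: a standard dimensionless reduction of BZ kinetics.
EPS, Q, F = 0.04, 8e-4, 1.0

def oregonator(t, y):
    u, v = y  # u: activator-like intermediate, v: oxidised catalyst
    du = (u * (1.0 - u) - F * v * (u - Q) / (u + Q)) / EPS
    dv = u - v
    return [du, dv]

t_eval = np.linspace(0.0, 60.0, 20001)
sol = solve_ivp(oregonator, (0.0, 60.0), [0.1, 0.1],
                method="LSODA", t_eval=t_eval, rtol=1e-8, atol=1e-10)

u = sol.y[0]
# Count upward crossings of an arbitrary threshold to estimate the period.
crossings = t_eval[1:][(u[:-1] < 0.5) & (u[1:] >= 0.5)]
print("oscillations observed:", len(crossings))
if len(crossings) > 1:
    print(f"mean period (dimensionless time): {np.diff(crossings).mean():.2f}")
```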

A decisive differentiation is needed here. The ontological cost of Prigogine's position lies in its own virtue: thermodynamic precision. The Prigoginian bifurcation is mathematically localizable — a singular point at which the system transitions between regimes — and can only be defined in a system with identifiable control parameters (thermal gradient, concentration of reactants, energy flow) and calculable thresholds. Where there is no isolable system, where reorganisation operates before any regime of measurement, Prigogine does not respond — nor does he intend to respond — because his ontology is regional. The consequence is twofold. First: if self-organisation constitutively depends on bifurcation in the technical sense, then the material real prior to any regime of measurement would be beyond its reach, and matter would, by default, be inert. Second: bifurcation is a discrete event, but matter's power of reorganisation is continuous, immanent, prior to any threshold that makes it visible — matter is always in excess over itself, always capable of configurations that it does not realise, and this excess does not wait for a bifurcation in order to operate. What this path proposes is not an extension of Prigogine's framework to new domains — it is the recognition that what Prigogine demonstrates in thermodynamics is a local manifestation of a more general ontological property: matter is not a passive substrate, it is a permanent power of reorganisation. Bénard cells are the case in which this power becomes experimentally visible; the convergence of multiple domains (thermodynamic, cosmological, chemical) grounds the generalisation.

Pay attention to the language: Bénard cells do not "organise themselves to" dissipate heat. Heat dissipation is not a goal, it is not telos. The hexagonal configuration is material compatibility that emerges when the gradient passes the threshold. The result was not intended; it is contingent. The configuration could have been different if the ratio between viscosity and thermal conductivity had been different. Order is material compatibility, a blind consequence of material constraints, not an objective pursued.

Stuart Kauffman, in the 1980s and 1990s, enriched this convergence with the theory of self-organisation in complex networks of multiple interacting components. He shows rigorously that in systems of sufficient complexity — elements with sufficient connections, with positive and negative feedback between them — order emerges spontaneously without any natural selection, without a designer, without an intelligent agent. The generic statistical properties of the components — average number of connections per element, type and strength of interactions, presence of feedback — generate collective order. Order is not a rare exception requiring special explanation; it is a generic property of systems located in the critical complexity regime — neither too simple nor too chaotic, but on the border between simplicity and chaos. Kauffman calls this "order for free" — order that costs nothing in terms of selection or design, that emerges from the statistical properties of the components themselves without anyone prescribing it.
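A minimal sketch of the kind of model Kauffman studied (a random Boolean network with K = 2 inputs per node; size, wiring and seed are arbitrary) shows the typical behaviour: from a random initial state the dynamics settle quickly onto a short attractor cycle, collective order that nobody specified.

```python
import random

def random_boolean_network(n=20, k=2, seed=0):
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]                     # random wiring
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]  # random rules
    def step(state):
        return tuple(tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
                     for i in range(n))
    return step

def find_attractor(step, n=20, seed=1):
    rng = random.Random(seed)
    state = tuple(rng.randint(0, 1) for _ in range(n))
    seen, t = {}, 0
    while state not in seen:          # iterate until a state repeats
        seen[state] = t
        state = step(state)
        t += 1
    return seen[state], t - seen[state]   # (transient length, cycle length)

step = random_boolean_network()
transient, cycle = find_attractor(step)
print(f"settled after {transient} steps onto an attractor of length {cycle}")
```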

Alan Turing, as early as 1952, identified a fundamental mechanism of pattern formation by reaction-diffusion: two chemical substances that produce and degrade one another, interacting through local diffusion, spontaneously generate complex spatial patterns from an initially homogeneous state. Homogeneity is not stable when two chemical agents with different diffusion speeds interact — the instability leads to spatial differentiation, to the formation of a regular pattern. The mechanism is purely chemical and mathematical: reaction-diffusion equations produce bands, spots and spirals in entirely abiotic systems — chemical reactions in gels, waves of oxidation on metal surfaces, distributions of minerals in sedimentary rocks. The pattern's geometry is not programmed; it emerges from the differential properties of the reactants. It is a consequence, not a program.
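A compact sketch of the mechanism (using the Gray-Scott reaction-diffusion model rather than Turing's original equations, with parameter values known to produce spot patterns) integrates two diffusing, mutually reacting concentrations on a small periodic grid and measures how far the field departs from homogeneity.

```python
import numpy as np

# Gray-Scott reaction-diffusion on a periodic grid (illustrative parameters
# in a spot-forming regime).
N = 100
Du, Dv, F, k = 0.16, 0.08, 0.035, 0.065

u = np.ones((N, N))
v = np.zeros((N, N))
# A small perturbed square seeds the instability.
u[45:55, 45:55], v[45:55, 45:55] = 0.50, 0.25

def laplacian(a):
    return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
            np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

for _ in range(10000):
    uvv = u * v * v
    u += Du * laplacian(u) - uvv + F * (1 - u)
    v += Dv * laplacian(v) + uvv - (F + k) * v

# A homogeneous field would have zero spatial variation; pattern formation
# shows up as persistent, structured variance in v.
print(f"std of v after relaxation: {v.std():.4f}")
print(f"v ranges from {v.min():.3f} to {v.max():.3f}")
```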

The convergence of prebiotic mechanisms — dissipative structures in physical and chemical systems, self-organisation in complex networks, pattern formation by reaction-diffusion — shows that non-teleological, entirely blind mechanisms are sufficient to produce order in territory where no biotic category operates. The convergence is more significant than any of the mechanisms taken alone: each operates in a different domain (thermodynamics, formal networks, chemistry), with a different internal logic (dissipation far from equilibrium, generic statistical properties, differential instability of reactants), and they all arrive at the same result — order without a principle. Darwin had demonstrated the same in biotic territory, refuting Paley in the domain he had chosen; but the ontological dissolution that matters for this journey is the one that operates before any life, before any selection, before any code. Prigogine, Kauffman and Turing demonstrate precisely this: matter, under conditions of sufficient instability, spontaneously produces organised configurations without the organisation requiring reproduction, heredity or differential selection. Organisation is not an exception that requires special explanation; it is a generic property of material systems sufficiently far from equilibrium.

It is important to note that these mechanisms are not variations of the same principle — they are genuinely distinct, operating in different domains with different internal logics. Prigogine operates by dissipation far from equilibrium in a single system; Kauffman operates by generic statistical properties of networks of interacting components; Turing operates through differential instability between reactants with different diffusion speeds. The convergence is not trivial: independent mechanisms, discovered by researchers from different traditions (non-equilibrium thermodynamics, network theory, applied mathematics), working with different objects (fluids, Boolean networks, chemical systems), reach the same structural conclusion. The redundancy is itself an argument: if a single mechanism showed order without a principle, one might suspect an artefact; independent mechanisms make the conclusion robust.

Occam's razor operates with particular force: if blind mechanisms, demonstrated experimentally in entirely abiotic systems, are sufficient to produce the observed order — from convection hexagons to chemical patterns, from self-organising networks to crystalline configurations — there is no epistemic reason to postulate intelligent causation. Matter does not need to be shaped from the outside — it organises itself from within, under specific conditions, without purpose. The ordering principle of two and a half millennia dissolves — not by philosophical refutation, but by demonstration of mechanism.

3.3 All stability is a pause

The order we observe is not the result of some global cosmic harmony. It arises from transitional compositions of material tensions and constraints that remain in conflict. Relative stability—atomic and molecular structures, planetary orbits, living cells—is a local pause in the entropic flow of the universe, a provisional interruption that endures only so long as the relevant conditions endure. The second law of thermodynamics states an ontologically fundamental truth: every local order is purchased at the cost of entropy exported elsewhere, and every local order is finite in time. No form persists without the material conditions that support it—and no such condition is eternal. The hydrogen atom can be ionised; the crystal fractures under sufficient pressure; the Earth's orbit changes over cosmological timescales. Stability is always conditional: it lasts while conditions last, and disappears when they change. A vortex exists only while the river continues to flow. A Bénard cell remains hexagonal only while the thermal gradient persists; stop heating and it dissolves into homogeneous fluid. Life endures only by drawing order from its environment—food, light—and exporting entropy as heat and waste. There is no global cosmic order. Order is a temporary island in an ocean of increasing entropy.

Matter is not a passive substrate upon which these processes operate from without. Matter is what operates, what reorganises itself, what produces order as an effect. This formulation reverses an image inherited from two and a half millennia: matter is not an inert substrate waiting for form, logos, or intelligence to shape it. Matter is an operator. It produces configurations, sustains tensions, and exceeds its own present form. Tradition treated matter as deficient; non-equilibrium physics and thermodynamics reveal its power. Matter is active, fertile, productive—but not oriented. Its fertility is not teleological; it is the contingent capacity for reorganisation. What produces order is not an external principle, not directing intelligence, not guiding purpose, but matter in excess of itself. Operative excess is the engine. Matter contains more configurative power than the form it presently occupies. Reorganisation follows from that excess; it is never the goal that guides it. Hence the condensed thesis of this chapter: matter is not chaotic and formless, not passive and inert, not deficient and awaiting moulding. It is unstable, metastable, fertile, possessed of genuine potency. It does not need to be shaped, directed, or organised by anything that transcends it. It organises itself without plan and without destination. What we commonly call 'order' is a temporary folding of material power. No form is absolute; every form is a configuration under constraints.

The implication is decisive for the entire course of this book: if order is a local and transitory effect, then no form has the status of a foundation. Every form — atom, star, galaxy — is a provisional configuration that material conditions sustain while they sustain them. This is not relativism: each form is real as long as it exists, its properties are measurable, its relationships are regular. But its reality is conditional, not absolute. It is part of a material compatibility regime that may cease. Stars die. Atoms decay. Structures dissipate. What tradition has identified as permanence — the Platonic Forms, the universals, the principles — is, seen in the light of metastability, a projection of local persistence into an eternity that matter does not sustain.

The refusal of teleology is an ontological requirement, not a stylistic choice. Relational consistency does not "aim" at anything. Self-organisation is not progress. Complexity does not "increase" with direction or meaning. Structure formation is contingent — it could have been otherwise, or not at all. That gravity amplified primordial quantum fluctuations, that structures formed, that galaxies emerged — all of this is real, all of this is explainable through mechanisms; none of it is necessary. If the initial conditions had been slightly different, no structure would exist. The universe would be a uniform fluid, expanding in darkness. There is no hidden meaning that analysis must discover. There is no telos that must be recovered. The refusal of teleology is not nihilism; it is ontological rigour. Nihilism would assert that nothing has value, that existence is absurd, that order is an illusion. The position developed here affirms none of this. Order is real — Bénard cells are real, molecules are real, galaxies are real. Order is intelligible — the mechanisms that produce it are understandable, describable, in many cases quantifiable with extraordinary precision. It is simply not oriented. It goes nowhere. It expresses no one's intention. It executes no plan. The difference between nihilism and ontological rigour is that nihilism denies the value of order; ontological rigour denies only that order must have been intended in order to have value.

Order is a temporary island in an ocean of increasing entropy.

3.4 Transition: from condition to event

The course of this chapter built, section by section, a triple foundation that the following chapters inherit as a starting point and from which they cannot retreat without losing coherence. Instability is not deviation — it is a primordial condition of reality, a fundamental structure prior to any form. Randomness is not ignorance — it is irreducible structure at the level of the most fundamental inscriptional regime we possess. Order is not a principle — it is a blind effect of mechanisms, local, contingent, provisional.

Taken together, these three conditions redraw the whole field of questions the book pursues. If instability is constitutive, rest is not a starting point but an exception. If randomness is structural, predictability is not a property of reality but a limit of the inscriptional regime. If order is an effect, organisation is not a principle but a contingent result. In this framework, emergence does not happen to something, nor is it directed by something—it breaks out. Instability provides the condition: the universe is always open to transformation. Randomness provides the openness: the path is not predetermined. Self-organisation provides the mechanism: once constraints exceed critical thresholds, order can emerge without a plan. Yet these three conditions still do not explain the concrete singular event. That conditions are unstable does not explain why irruption occurs here rather than there, in these forms rather than others. Instability is necessary but not sufficient. Randomness opens the field, but is not itself a mechanism. Self-organisation is demonstrable, but does not by itself explain how novelty is constituted before any regime of capture. The real continually reorganises itself; no form retains its inaugural configuration. But how is reorganisation possible? What is the structure of the event through which the new emerges? Condition is the map; event is the territory. The tension between condition—instability—and event—the irruption of the new—carries the analysis into the next chapter: how the universe moves from instability to structure, and what that transition means for our understanding of reality.

No form is origin — all form is fold.

The universe does not retain its original configuration — every form is a reconfiguration.