Introduction
Emergence does not happen — it breaks out. There is no before. There is only what insists.
Asking about origin appears to be thought's most natural gesture: Where does this come from? Who did this? When did it begin? Three elementary questions structure our curiosity. Yet every question carries an assumption, and assumptions are never neutral: they function like doors that admit certain answers while barring others. 'Where does this come from?' presupposes something prior, a substrate from which the thing emerges. 'Who did this?' presupposes an agent who effects the transformation. 'When did it begin?' presupposes a fixed point at which non-being gives way to being, or at which what was not yet there comes into existence. These questions seem constitutive of rational thought—the very way Western reason has organised itself for two and a half millennia. They appear neutral, merely natural ways of asking. But that is the illusion. The form of the question already contains the range of answers it will permit and excludes the rest. Each assumption operates as a filter: it admits only answers compatible with its own structure. To ask 'where from' is to demand provenance; to ask 'who' is to demand agency; to ask 'when' is to demand a datable instant. Questions are not neutral windows onto reality; they are frames that delimit what can appear as an answer.
This chapter does not deny that the universe emerged—contemporary cosmology gives us inescapable evidence of radical transformations in material configuration. Nor does it deny the legitimacy of the question. What it does is suspend the assumptions built into the question and show that only through such suspension can emergence be thought differently. The demand for a prior substrate is not a natural deliverance of reason; it is a conceptual habit, consolidated by tradition and preserved through repeated translation. The demand for an operating subject is equally hypostatised: grammar has imposed a structure of agency that no ontological necessity requires. The demand for a fixed point in time is a retrospective projection: the stable present projects backwards a sharpness that the process itself did not possess.
Main text
Axis 1 — Substrate dissolution
1.1 Genealogy of substantialism
The substantialist intuition seems constitutive of Western thought: if something emerges, it must emerge from something; if there is a beginning, there must be a prior condition that makes it possible. This intuition is not discovered — it is a historical construction, sedimented in layers of thought that overlapped and reinforced each other. The genealogy that follows is instrumental: it serves to bring out the pattern that persists beneath the translations, not to offer an encyclopedic overview.
Thales stated that water was the arché — that from which all things derive and to which they return. Water transforms continuously: it can be vapour, liquid, ice — without losing its substantial identity. The gain is evident: unity of substance explains multiplicity of manifestations. The cost is its mirror: multiplicity is reduced to derivation from something prior that remains. Water persists under all changes; the differences are merely superficial modifications of a single substrate. Anaximander refuses water — it is too determinate. He proposes the apeiron, the unlimited, the indeterminate. Indeterminacy is more fundamental than any determination that emerges from it; it is an infinite reserve of potential. Again the pattern: something precedes, something contains, something sustains. Anaximenes returns to the determinate: air. When rarefied it becomes fire; when condensed it becomes water, then earth. Air is arché because the variation from rare to dense shows in simple terms how multiplicity proceeds from unity. Heraclitus breaks with substantial permanence but not with the requirement of arché: fire is a constant flux, but a flux with measure, with reason — there is a law immanent to the process. Even Heraclitus, who insists that everything flows, presupposes that there is flow, that there is an operative basis — and that change requires that from which it changes. The Heraclitean logos — the reason that governs the flux, the measure that regulates the transformation — is already an ordering principle, arché in dynamic form. The difference between Heraclitus and his predecessors is not the denial of the foundation; it is the form the foundation takes (process rather than substance). The foundation persists.
Common to these diverse proposals is the presupposition that there must be something permanent beneath the change, a substrate that persists while forms transform. The arché is not only the first in the temporal order — what existed before everything — but the ontological foundation that sustains reality at each moment. Multiplicity derives from unity; change presupposes permanence; becoming rests on being. The very structure of the question — "what is the world made of?" — already contains this presupposition: the verb "to be made" presupposes the material from which it is made. It is not possible to answer "from nothing" without seeming to deny the very existence of the world. The question compels us to look for a substrate, and the substrate found then legitimises the question. Circularity is invisible but operative. This conceptual structure — a permanent substrate that sustains modifications — defines the substantialism that would dominate the Western tradition for two and a half millennia, shaping both philosophy and the science that emerged from it.
Plato does not abandon the archē; he doubles it. There are the eternal Forms—the immutable archetypes—and the khôra, the receptacle, the 'third kind' between being and becoming. In the Timaeus, the Demiurge contemplates the eternal Forms and impresses them upon the khôra as seals upon wax. The khôra receives every form yet possesses none of its own: it is passive, receptive, available. Plato describes it through negations: it has no proper form, receives all forms, and is not identical with any of them. It is 'hardly credible' and can be grasped only through a bastard reasoning. It is a substrate in the sense of a receptacle, not a transforming matter, yet it remains a substrate all the same, a precondition of determination. The gain is obvious: it explains how multiplicity can receive order, how the chaotic can be informed. The cost is equally obvious: khôra remains a passive precondition making all determination possible. The instability it seems to acknowledge—the 'formless agitation'—is immediately subordinated to the Forms. Multiplicity does not emerge; it is received by what waits for it.
Aristotle radicalises the requirement: hylê, prime matter, is an absolutely indeterminate substrate. No quality belongs to it; it is pure potentiality — that which is capable of receiving every form without already possessing any. Substantial change — when one thing ceases to be and another comes to be — is explained by postulating something that remains: hylê. Without prime matter, change would be the absolute destruction of one thing and the absolute creation of another, a rupture without continuity. Hylê restores continuity; it allows us to say that what something was is no longer what it is, while what it is made of continues under a different determination. Gain: it explains change without descending into the absurdity of successive annihilations and creations. Cost: ontological anteriority becomes necessary. Without something that precedes, without matter that supports the forms, there is no change — there is only appearance and disappearance. The form/matter scheme subordinates all change to an underlying permanence — and this subordination becomes, in Aristotle, a condition of intelligibility. To think change without a substrate is, for the Metaphysics, to think the absurd. What Aristotle does not consider — and what the later tradition will inherit as unthought — is the possibility that change does not require substantial continuity: that reorganisation may be primary, and that the permanence change seems to require is the retrospective effect of a stabilisation that lasts long enough to look like a foundation.
The pattern installed by these four moments—Thales, Anaximander, Plato, Aristotle—is not one pattern among many. It is the founding pattern of Western ontology: all change requires permanence; every emergence presupposes anteriority; all becoming rests upon being. The issue is not whether each philosopher supplied the right answer—water, the apeiron, the Forms, and hylē are radically different answers—but that the form of the question remained the same. So long as the question is 'What is the world made of?' or 'From what substrate does multiplicity emerge?', the answer will always be some prior foundation. Ungrounded emergence is not simply a disallowed answer; it is an answer the question, as framed, cannot even articulate.
Modernity does not abandon this pattern; it retranslates it in new terms while preserving the structure that defines it. The instructive point is precisely that the pattern proves not to belong to a single school of thought but to a deep habit of Western reason. Descartes posits res extensa: matter as substance defined by extension, filling space without remainder. Cartesian matter is no longer Aristotle's indeterminate substrate; it is geometrically defined, describable by figure and motion. Yet its ontological function is identical: something remains beneath change, something sustains variation, something of which transformations are merely modifications. Res extensa becomes the modern name for hylē—a translated substrate whose presuppositional status tradition never questions. Newton pushes the structure further: space and time become absolutes, empty containers existing independently of matter. The universe becomes a theatre in which matter moves upon a fixed stage. In the Scholium to the Principia, Newton distinguishes absolute space and time (mathematical, true) from relative space and time (apparent, measured by clocks and rulers). Absolute space remains always similar and immobile; absolute time flows uniformly without relation to anything external. The nineteenth-century luminiferous ether serves the same function for the propagation of light: if light is a wave, there must be a medium in which it waves. Michelson and Morley failed to detect the Earth's motion through the ether; Einstein abandoned the hypothesis as unnecessary. Water, air, fire, apeiron, Forms, hylē, res extensa, absolute space, ether: each era proposes a different answer, yet each answer obeys the same logic. There must be something prior, something containing, something sustaining. The demand for anteriority is not a rational discovery of a pre-existing pattern; it is a conceptual operation that has sedimented until it appears natural. 
To question that assumption is not to deny that the world has a history; it is to recognise that ontological anteriority is a conceptual operation, not a logical necessity. Twentieth-century physics makes that recognition unavoidable—not by arbitrary philosophical decree, but through the transformation of theories that describe the world with extraordinary precision.
1.2 Physical dissolution of the substrate
Twentieth-century physics progressively dissolved this pattern at three interconnected levels. Each level rendered untenable a layer of the substantialism that fuelled the requirement of a prior foundation.
Relativity dissolved the Newtonian absolutes — and it did so not through philosophical speculation but through the internal requirements of empirical description. Special relativity (1905) emerged from the impossibility of detecting the luminiferous ether and from the constancy of the speed of light in all inertial frames of reference. The consequence is immediate and devastating for substantialism: if the speed of light is the same for all observers, then space and time cannot be absolute — simultaneity is relative to observers in relative motion; two observers disagree about the simultaneity of distant events, about the lengths of objects, about the duration of processes — and both are right in their own frames of reference. No frame of reference is privileged; none defines "true" space and time. General relativity (1915) radicalised the result: space-time is not a rigid and immutable stage — it is a dynamic structure that curves in the presence of mass and energy. Gravity is not a force that acts through empty space; it is curvature of space-time itself. Wheeler formulated it precisely: matter tells spacetime how to curve; spacetime tells matter how to move. The Newtonian vessel became an active participant. There is no empty theatre in which the actors move; there is coupling between form and structure. The ether was abandoned — special relativity requires no medium for light to propagate. Absolute space was eliminated — there is no fixed frame of reference. Absolute time was eliminated — there is no uniform flow independent of everything else. The three substrate-receptacles of classical physics were dissolved or dynamised. What remains is not substance but dynamic structure, not receptacle but relational field. Gravity — the paradigmatic force of classical physics, which Newton conceived as action at a distance through empty space — reveals itself to be the intrinsic curvature of space-time.
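The frame-dependence of simultaneity and duration, and the invariant that replaces the lost absolutes, can be stated compactly. In standard form, the Lorentz transformation (for frames in relative motion with velocity v along x) mixes time and space coordinates, while the spacetime interval alone is observer-independent:

```latex
% Lorentz transformation between inertial frames:
t' = \gamma\left(t - \frac{vx}{c^{2}}\right), \qquad
x' = \gamma\,(x - vt), \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}

% What survives the dissolution of absolute space and time
% is the invariant interval:
s^{2} = c^{2}\,\Delta t^{2} - \Delta x^{2} - \Delta y^{2} - \Delta z^{2}
```

Because t and x transform into one another, no frame's simultaneity is privileged; only the interval s² takes the same value for all observers.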
The apple does not "fall" because a mysterious force pulls it; it follows the geodesics of spacetime curved by the Earth's mass. Gravitation is not a bond between bodies in space; it is a geometric property of the spatio-temporal structure itself. The relational regime completely replaces the substantialist regime: distance is not a separation between two points in a fixed container; it is a property of the material configuration itself.
This dissolution is not loss — it is a profound transformation of ontological categories. Where classical physics assumed a fixed container and variable content, general relativity shows that container and content are inseparable: the geometry of space-time depends on the distribution of mass-energy, and the distribution of mass-energy depends on the geometry. The circularity is not a defect — it is the very structure of the relational regime that replaces substantialism. There is not first the stage and then the play; there is constitutive coupling between what the tradition thought of as "substrate" and what it thought of as "process". Einstein's field equations encode this circularity with mathematical precision: the Einstein tensor (geometry) is proportional to the energy-momentum tensor (matter-energy). Geometry and matter determine each other; neither is prior to the other. The substrate — the fixed container that precedes its content — does not survive this circularity. What emerges is a relational regime: the properties the tradition attributed to "space" (distance, geometry, curvature) are properties of the material configuration itself, not of an independent container.
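The circularity this paragraph describes can be written down exactly. In the Einstein field equations, the purely geometric side (the Einstein tensor, built from the Ricci curvature and the metric) is proportional to the energy-momentum tensor:

```latex
G_{\mu\nu} \;=\; R_{\mu\nu} - \tfrac{1}{2}\,R\,g_{\mu\nu}
\;=\; \frac{8\pi G}{c^{4}}\,T_{\mu\nu}
```

The left side describes curvature; the right side describes matter-energy. Because the metric g appears on both sides (T is defined on the curved geometry that T itself determines), the equations must be solved jointly: neither term is given first.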
Quantum mechanics brought about an even deeper transformation. Heisenberg's uncertainty principle states that position and momentum cannot be simultaneously determined with arbitrary precision — not because the instruments are insufficient, but because the quantum entity does not possess both properties simultaneously in a definite way. Wave–particle duality shows that quantum entities are neither waves nor particles in the classical sense — they behave like waves in some experiments and like particles in others. Superposition allows quantum systems to be in multiple states before measurement. Entanglement correlates particles so that the outcomes of measurements on one are correlated with the outcomes of measurements on the other, regardless of distance — a phenomenon Einstein called "spooky action at a distance" and that Aspect's experiments (1982) and their successors confirmed, violating Bell's inequalities and refuting local realism. The particle as "thing" — a localised object with properties defined independently of observation — dissolved. What existed before measurement was not "a thing with hidden properties" (the violation of Bell's inequalities rules out this interpretation under reasonable assumptions); it was something for which the category of "thing" is inadequate. What quantum mechanics describes are not things that have properties; they are measurement processes that produce results. The underlying ontology remains controversial among physicists and philosophers — the Copenhagen, Bohmian, Everettian, and GRW/CSL interpretations compete without resolution — but none of them restores classical substance. The quantum world is not made of "stuff" in the ordinary sense; it is constituted by something more relational and more processual, for which the inherited categories no longer provide an adequate name.
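The claim that indeterminacy is not instrumental can be made precise. In its standard (Kennard) form, the uncertainty principle is a theorem about the quantum state itself: the product of the standard deviations of position and momentum has an irreducible lower bound, set by Planck's constant:

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

The bound follows from the mathematical structure of the theory (the non-commutation of the position and momentum operators), not from any limitation of measuring devices; no improvement in instrumentation can evade it.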
What quantum mechanics has shown is not just that the world is "weird" — it is that the categories inherited from substantialism (localized thing, intrinsic property, defined trajectory) are not categories of the real in general; they are categories of a particular regime of stabilisation that works at the macroscopic scale and fails at the scale where material compatibilities are reorganised. Quantum dissolution is not relativization — it is categorical displacement: the properties that tradition attributed to "substance" present themselves, in the quantum framework, as effects of measurement regimes, not as intrinsic attributes of self-sufficient entities.
Quantum field theory took the dissolution to its conclusion, effecting an even more radical transformation. Particles are not things that inhabit space; they are excitations of quantum fields that permeate space-time. The electron is not a localised corpuscle; it is the quantum of the electron field. To create a particle is to excite the field; to annihilate one is to de-excite it. Two electrons are indistinguishable — not because they look alike, but because they are identical excitations of the same field, like two identical waves in the same ocean. The ontology of things-in-space gave way to the ontology of fields-that-constitute-space. There is not first empty space as a receptacle and then things; there are fields whose dynamics produce both what we call "particles" and what we call "space". The quantum vacuum is particularly revealing: in classical physics, a vacuum is pure absence; in quantum field theory, the vacuum is the state of minimum energy — not absence but a particular configuration. Virtual fluctuations continually create and annihilate pairs of particles; the Casimir effect (a measurable force between conducting plates, an effect of vacuum pressure) demonstrates that these fluctuations are real; the zero-point energy is non-zero even in the ground state. The physical "nothing" is no metaphysical nothing: it is a dynamic field, an operative configuration, a material regime with measurable properties. The quantum vacuum contains energy — and this energy is not metaphorical; it is calculable and measurable (the Casimir effect), and it constitutes one of the deepest problems in contemporary physics: the "cosmological constant problem", the discrepancy of a factor of roughly 10¹²⁰ between the vacuum energy calculated by quantum field theory and that observed cosmologically — the largest quantitative discrepancy in the history of physics. This conceptual transformation deserves emphasis: the ontology of things-in-space gave way to the ontology of fields-that-constitute-space.
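The reality of vacuum fluctuations is quantitative, not rhetorical. For two ideal parallel conducting plates separated by a distance a, the standard Casimir result gives an attractive force per unit area:

```latex
\frac{F}{A} \;=\; -\,\frac{\pi^{2}\hbar c}{240\,a^{4}}
```

The a⁻⁴ dependence means the force grows steeply as the plates approach, and it has been confirmed experimentally at sub-micron separations: the "empty" vacuum between the plates pushes them together.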
The classical substrate — empty receptacle containing substances — dissolves into an ontology in which fields are primary and "things" are derivatives, excitations, ways of being of the fields. The inversion is complete: where tradition thought of things that inhabit space, contemporary physics thinks of fields whose dynamics constitute both things and space. There is no outside; there is no container; there is no more fundamental level than the fields. The ontology is flat — there is no stratification between substrate and process, between receptacle and content. What tradition called "substance" is redescribed as a stabilizing effect — a local, provisional configuration of fields, not a permanent entity that subsists under change.
The dissolution operated by twentieth-century physics is therefore threefold and progressive. First level: relativity made the receptacles — absolute space, absolute time, the luminiferous ether — untenable. Spacetime is not a container; it is dynamic structure. Second level: quantum mechanics dissolved substances — the particle as a localised thing with intrinsic properties gives way to processual entities whose properties depend on the conditions of measurement. Third level: quantum field theory rendered the ontology of things-in-space inoperative — what there are are fields whose dynamics produce both "particles" and "space". At each level, a layer of substantialism was displaced. What remains is not nothing — it is a processual, relational, dynamic regime. But this regime has a property that the substantialist tradition did not conceive: it is immanent without being substantial. There is no exterior that grounds it; there is no substance that supports it; it operates from itself. It is at this point that a decisive differentiation from Spinoza emerges. Spinoza's radical immanence — Deus sive Natura, no transcendence, no external creator — seems to converge with the dissolution of substrates. If there is no external receptacle, production is immanent; Nature is not made, it makes itself. Natura naturans is an immanent productive power that expresses itself in infinite modes. But Spinoza's Substance is infinite, eternal, the cause of itself — causa sui. It functions as the ultimate foundation: the arché under a new name. The attributes (extension, thought) and the modes (particular things) derive from Substance with logical necessity. The Substance→modes relationship is derivation: the multiple comes from the one; emergence is a foundational effect. Spinoza destroys the transcendence of the creator but preserves the substance of the foundation.
The position supported here maintains radical immanence — no exteriority to reality, no transcendence, no creator operating from outside — but dissolves the substance that sustains it in Spinoza. Immanence does not mean resting on a foundation; it means a groundless operating regime. Spinoza's immanence is the immanence of Substance — everything is in God, everything is expressed by Substance. Immanence without substance is the immanence of process — everything operates from within itself, with nothing supporting it from below. There is no Substance from which modes derive with logical necessity. There is a material operation whose emergence is not grounded in anything prior to it. The difference is precise: where Spinoza writes necessity, the position supported here writes contingency. Spinozan modes derive from Substance as consequences derive from axioms — with logical necessity. Material emergence does not derive from anything — it reorganises itself, without the next configuration being contained in the previous one as a conclusion is contained in its premises. Spinozan immanence is necessitarian; immanence without substance is contingent. This difference is not a nuance: it is the border that separates a philosophy that maintains the foundation (in the form of logical necessity) from one that dissolves it without recourse.
1.3 Models of emergence without substrate
Contemporary physics proposes models of "quantum creation" that operate in this regime of groundless immanence. It is necessary to distinguish three categorically distinct positions on cosmic emergence, because confusion between them is a source of persistent errors. The first position is classical substantialism: the universe emerges "from" something — arché, hylê, absolute space, ether. Emergence is derivation; the substrate precedes and supports. The second position is theological creation ex nihilo: God creates from absolute nothing — the total metaphysical absence of any determination — by an act of free will. The "nothing" is radical: there is no matter, no space, no time, no laws. The third position is emergence without substrate: it refuses both the classical substrate and absolute nothingness. The three positions are irreducible to one another and cannot be collapsed into one another without losing meaning. Confusing the third with the first (as if physics simply replaced "ether" with "quantum field" while maintaining the derivation scheme) is an error that ignores the dissolution of anteriority — the quantum field is not the substrate from which the universe emerges; it is a theoretical framework within which emergence is described. Confusing the third with the second (as if "quantum creation" were theology without God) is an error that projects onto physics a narrative structure foreign to it — creation ex nihilo presupposes an act of will and an absolute nothing, categories that no physical model uses.
Alexander Vilenkin described in 1982 a process of quantum tunneling: the universe emerges "out of nothing" — but Vilenkin's "nothing" is not total metaphysical absence. It is the absence of classical geometry, not the absence of quantum structure or laws. The universe "tunnels" from a configuration without classical spacetime to a configuration with expanding spacetime. There is no temporal "before" — time emerges with the universe. There is no agent — the process is spontaneous, governed by quantum laws. The analogy with alpha tunneling is instructive: an alpha particle that escapes an atomic nucleus does not violate the laws of physics — it crosses a potential barrier that classical mechanics declares impenetrable but that quantum mechanics allows it to cross with calculable probability. Vilenkin applies the same logic to cosmology: the universe "tunnels" from a configuration without geometry to a configuration with geometry, without any classical mechanism "pushing" it there. The spontaneity of the process is the essential point — there is no engine, no efficient cause, no decision. Tunneling occurs because quantum mechanics allows it, not because something compels it. The distinction between "being permitted by laws" and "being caused by an agent" is ontologically decisive: Vilenkin's universe emerges not because it was caused but because nothing prevents it — the probability of tunneling is non-zero, and that is enough. Vilenkin's "nothing" is already described by a theoretical regime that has its own conditions of possibility — quantum field theory operates as a framework within which the model makes sense.
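The analogy with alpha decay rests on the standard WKB (semiclassical) estimate of barrier penetration: the probability of tunneling through a classically forbidden region, between the turning points x₁ and x₂ where the barrier V(x) exceeds the energy E, is suppressed but non-zero whenever the barrier is finite:

```latex
P \;\approx\; \exp\!\left(-\frac{2}{\hbar}\int_{x_{1}}^{x_{2}}
\sqrt{2m\,\bigl(V(x) - E\bigr)}\;dx\right)
```

Nothing "pushes" the particle across; the formula assigns a probability to a transition that classical mechanics forbids outright. Vilenkin's cosmological model transposes this logic: a non-zero amplitude suffices, and no efficient cause is invoked.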
Stephen Hawking and James Hartle proposed the "no boundary" condition in 1983: in imaginary time (a standard mathematical technique in physics), the initial singularity disappears. Space-time becomes a closed regular surface, like a sphere without edges or privileged points. The question "what was there before the Big Bang?" dissolves: it is like asking "what is north of the North Pole?" — it applies a category (the direction north) beyond the point at which the category gives out. The no-boundary condition is not a prior state that causes the universe; it is a condition of mathematical consistency that determines the wave function of the universe. There is no "before" because time is an internal coordinate of the universe, not an external coordinate in which the universe is located. The analogy with the spherical surface is precise and deserves development: just as the surface of a sphere is finite but without an edge — walking on it you never find a limit, never reach a "margin" — space-time in the Hartle-Hawking proposal is finite but without an initial boundary. There is no first point; there is a regular surface that closes on itself. The proposal does not eliminate the question of emergence — it radically reformulates it: instead of asking "from what previous state did the universe emerge?", the question becomes "what condition of consistency determines the regime of primordial emergence?" The question moves from causality to consistency, from anteriority to structure. The change is categorical, not merely technical: the Hartle-Hawking model does not answer "what was before" with "nothing" or with "something else" — it dissolves the question by showing that the category of "before" does not apply.
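Stated schematically, the no-boundary proposal defines the wave function of the universe as a Euclidean path integral over compact four-geometries whose only boundary is the three-geometry h being evaluated:

```latex
\Psi[h_{ij}] \;=\; \int_{\substack{\text{compact } g \\ \partial g \,=\, h}}
\mathcal{D}g \;\, e^{-S_{E}[g]/\hbar}
```

Here h_{ij} is the spatial three-metric on the "present" slice and S_E is the Euclidean action. The point of the construction is visible in the domain of integration: no initial slice appears anywhere, so the formalism simply has no place for a "before".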
Lawrence Krauss argued in A Universe from Nothing (2012) that physics explains the origin of the universe from nothing. David Albert objected that Krauss's "nothing" — a quantum vacuum with operative laws — is not the philosopher's "nothing" — the absolute absence of all determination. The criticism is partially fair: if "nothing" means total metaphysical absence, physical proposals do not explain creation ex nihilo in the philosophical sense. But the criticism presupposes that the notion of absolute nothingness is coherent — an assumption that can itself be questioned. The position held here aligns with neither camp: neither with Krauss (who claims too much — that physics has "solved" the question of origin) nor with Albert (who assumes too much — that the notion of "absolute nothingness" is coherent and operative). Quantum creation models are neither theological ex nihilo in disguise nor a demonstration of creation from nothing. They are formulations within the best available theoretical framework which show that the category of temporal anteriority does not apply to primordial emergence. They are models, not stabilised descriptions of the real itself. Caution is constitutive: these results are used as indicators that the requirement of a substrate or a "before" is a metaphysical presupposition, not a logical necessity — without canonising them as definitive descriptions. The position is not agnosticism — one does not suspend judgment for lack of information. It is an inscriptive discipline: the recognition that every model operates within a formulation regime that has its own conditions of possibility. Theoretical physics does not describe the "naked" real; it describes the real as formulable within specific mathematical frameworks. Recognising this does not reduce the scope of the models; it delimits them strictly. What can be rigorously stated is that the requirement of temporal anteriority — of a "before" in the substantialist sense — is not supported by the best theoretical framework available.
What cannot be said is that these models describe the primordial regime "as it is", independently of the framework that formulates them. The position is simultaneously strong and modest: strong because it uses the results of physics to dissolve an ancient metaphysical presupposition; modest because it recognises that those results operate within a specific theoretical regime and do not authorise statements about "the real itself". This double requirement — using physics without canonising it — is the inscriptive discipline that governs the entire analysis.
1.4 Emergence as primary event
If the substrates have been dissolved — receptacles, substances, "nothing" — what remains? Emergence as primary event. The cosmos does not emerge "from" something; it emerges — and that emergence is the fundamental ontological fact. Emergence is not derived from a prior foundation; it is the basic regime from which all stabilisation becomes intelligible. What appears as "substance" is late, local, provisional stabilisation — a configuration that lasts as long as the compatibilities that sustain it persist. A proton is stable — it persists for immensely long times — but its stability is not a substantial property; it is the effect of compatibilities between quarks, gluons and the dynamics of the strong field. If those compatibilities change (under extreme conditions of temperature and density, as in the first fractions of a second after the Big Bang), the stability dissolves and the configuration that seemed permanent turns out to be transient. "Substance" is a retrospective name for stabilisations that, while they last, look like foundations.
Dissolving the substrate is not denying the real—it is recognising that the real does not need a foundation in order to operate. Emergence is not nothing; it is primary operation. The fear that dissolving the substrate leads to nihilism—that 'without foundation, there is nothing'—is understandable but misplaced. What this dissolution shows is that material operation is more original than any foundation that later seeks to justify it. Matter operates before any such justification, and foundation is itself a late construction imposed on an operation already underway.
Two consequences follow from this result. First: there is no ontological "before" in the sense that tradition presupposed. The category of temporal anteriority does not apply to primordial emergence. To ask "what was there before?" is to ask that a structure be applied where it did not yet operate. The analogy is exact: asking what is north of the North Pole applies the notion of a northward direction to the point where the notion ends. Second: the substrate is a retrospective fiction — a backward projection of the stability experienced in the present. The present is stable: there is solid matter, there are objects that persist, there are regularities that repeat themselves. It is tempting — and tradition has yielded to the temptation for millennia — to project this stability backwards and assume that the "beginning" was also stable, that there was something solid, permanent, foundational. But the "beginning" was not stable; it was process, fluctuation, unprecedented reorganisation. The stability we experience now is a delayed result of the processes that produced it — not a precondition from which those processes derived. The rock that seems eternal is a provisional configuration of nuclear and electromagnetic interactions; the atom that seems indestructible is the stabilisation of quantum constraints that, under different conditions, would reorganise themselves. What appears to be a foundation is an effect; what seems permanent is merely durable.
If there is no prior substrate, emergence has no ground. But is there a subject? The question shifts: the requirement for an agent is as deep-rooted as the requirement for a substrate — and equally unjustified.
The cosmos does not emerge from something — it emerges, and that emergence is all there is.
Axis 2 — Dissolution of the subject
2.1 Genealogy of the agent requirement
The dissolution of the substrate leaves a gap that tradition immediately fills: if there is no permanent substrate, at least there is an agent — someone or something that brings about change. The demand is equally entrenched and equally unjustified. The Western philosophical tradition rests on an assumption that few dared to question until very recently: all change presupposes an agent. This presupposition is not innocent. It stabilises thought in a structure that spans centuries, from Aristotle to Kant, subordinating the question "what changes?" to a prior one: "who changes?" The efficient cause question — who did it? — precedes and conditions all investigation into change. Without an answer to this question, change remains unintelligible to tradition. Only by understanding this genealogy — recognising the historical sedimentation that produced it — is it possible to dissolve the dependence between change and subject that tradition has consolidated as natural evidence.
Aristotelian causal architecture is the foundation: every change requires an efficient cause, every efficient cause rests on an agent. The chain ascends to the first mover — pure actuality without potency, thought that thinks itself. Thomas Aquinas carried this logic to its highest theological degree: the five ways converge in the requirement of a first agent — unmoved mover, uncaused cause, intelligent organiser. Without an agent, there is not just an absence of movement: there is an absence of intelligibility.
Descartes shifts the axis: the subject is no longer a celestial mover; it is reflective consciousness. The cogito establishes the subject as an irrefutable foundation and as a condition for all possible representation — without a subject, there is no legible world.
Kant takes the argument to its most sophisticated — and most difficult to dissolve — transcendental form. The subject is not just the cause of representation; it is what makes all experience possible. The categories of understanding — causality, substance, unity — are not properties of things in themselves; they are structures of the transcendental subject that shape experience. Without these structures, without this transcendental architecture, there would be no phenomenal world. The requirement for an agent reaches its peak: it cannot be demonstrated that the subject does not exist, as every demonstration presupposes it. Kant elevates the subject to a transcendental condition: it is not only an agent that acts in the empirically given world, but a condition without which there would be no experienceable world. The transcendental subject is not in the world; it is a condition of possibility for the world as ordered experience. This elevation seems to make it immune to all empirical criticism: it can be shown that a certain empirical subject does not exist, but how can we show that the transcendental subject does not exist, if it is the presupposition of all demonstration?
Tradition has therefore consolidated the requirement of an agent in three registers: ontological (Aristotle — the mover that initiates all change), epistemological (Descartes — the subject without which there is no representable world), transcendental (Kant — the structure without which there is no possible experience). In each register, the same implication: without a subject there is no intelligible operation. Genealogy is not a historical curiosity; it is a diagnosis of a habit that has become entrenched until it seems necessary. Recognising the sedimentation is a condition for recognising the contingency.
It is at this apogee that Nietzsche introduces a suspicion that undermines the entire construction. In the Genealogy of Morals, he states the blow precisely: "There is no 'being' behind doing, acting, becoming; the 'doer' is simply added to the doing — the doing is everything." Grammar requires a subject for every predicate: "the lightning flashes", "the rain falls", "the wind blows". But the linguistic structure does not reflect the structure of reality — it imposes a separation between agent and action that does not exist in the real process. There is no lightning that first exists as a substance and then performs the act of flashing; the flashing is all the lightning there is. Tradition hypostatised a grammatical fiction into a metaphysical entity; it projected the structure of language onto the world, confusing grammatical necessity with ontological necessity. Nietzsche does not refute the subject; he exposes the grammatical origin of the metaphysical presumption. Lightning is paradigmatic: no one supposes that there is an entity that first exists in potency and then performs the act of flashing — but grammar forces us to say "the lightning flashes", subject and predicate. When the same scheme is applied to cosmic emergence — "something caused the universe to exist" — the grammatical fiction is less visible but equally operative. We need a grammatical subject to form sentences; we do not need an ontological subject for processes to occur. The metaphysical category of the subject — a permanent entity that subsists under its actions and founds them — is not a given of reality but a projection of grammar onto reality. The consequence is direct: the requirement for a founding agent is a projection of linguistic habit onto emergence. The question that arises is: what category captures material operation where there is no subject?
2.2 Gesture as an impersonal operation
If there is no ontological subject, what is there? The answer must emerge from the material processes themselves — from the phenomenon to the operator, not from the operator to the phenomenon. The insufficiency of the existing categories (intentional action and passive event) is what forces us to think about a third: the gesture — an operation that produces difference without emanating from an agent.
The collapse of a star offers an unequivocal paradigm. For billions of years, the star remains in an equilibrium configuration — the thermal pressure of the core compensates for the gravitational pull of the mass. As long as nuclear fuel exists, this balance persists. But fuel is finite. When it runs out, thermal support disappears. What happens then is not a failure or a defect — it is an excess of matter over the containment that held it back. Gravitational mass exceeds any form of constraint. The outer layers fall with violent acceleration. The density in the core reaches extreme values. In this process, there is massive production of difference: heavy elements — gold, platinum, uranium — are forged in the final nuclear reactions; shock waves propagate; neutrinos flood space; gravitational waves ripple through spacetime; possibly a black hole or neutron star emerges. Radical alteration of the local material configuration. And none of this requires an agent. The star did not choose to collapse. It did not intend to create heavy elements. It did not represent its own end. The operation occurs because matter exceeds all configuration — material potentiality overflows any form that held it. The heavy elements forged in the supernova will be incorporated into molecular clouds, where gravitational and thermodynamic constraints will produce new configurations — new stellar systems, new distributions of matter. The difference generated by the collapse propagates: not as the effect of an intentional cause, but as a reorganisation of material compatibilities. Each supernova redistributes the chemical composition of the interstellar medium, changing the conditions under which future configurations will form. The production of difference is cumulative without being oriented — it accumulates consequences without aiming at results.
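The claim that gravitational mass eventually exceeds any form of constraint has a standard quantitative counterpart, stated here only for the reader who wants the physics made explicit; the textbook formulation is the Chandrasekhar limit, and it is offered as reference, not as a premise of the argument:

```latex
% Chandrasekhar limit: the maximum mass that electron-degeneracy
% pressure can support against gravity; above it, no equilibrium
% configuration exists and collapse is unavoidable.
M_{\mathrm{Ch}} \;\sim\; \left(\frac{\hbar c}{G}\right)^{3/2} \frac{1}{m_{p}^{2}} \;\approx\; 1.4\, M_{\odot}
```

That the threshold is set by constants of nature (ħ, c, G, the proton mass m_p) rather than by any property of the individual star reinforces the point above: the failure of containment is structural, not accidental.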
Mineral crystallisation displays the same pattern with a simplicity that favours conceptual clarity. Let us consider a saturated solution of sodium chloride: the salt molecules are dispersed, surrounded by water molecules, kept in solution by electrostatic interactions with the solvent. No fixed geometry; no regular pattern. But the system is not neutral — it retains tensions: the attractive forces between Na⁺ and Cl⁻ ions compete with the solvation forces. The solution remains as long as certain conditions persist — temperature, pressure, concentration. No agent orders the molecules. But when conditions are disturbed — cooling, vibration, addition of a seed crystal — order erupts. The molecules organise themselves into precise geometric patterns: salt cubes, snow hexagons, quartz prisms. Each molecule responds to the constraints of its immediate surroundings — electromagnetic forces, distances, geometries. No molecule "knows" the final structure. None directs the construction. And yet order emerges, precise, reproducible. Conformity without a conformer. The result may be of extraordinary geometric beauty — the prisms of quartz, the plates of mica, the cubes of salt, the branches of snowflakes — but the beauty is not designed. It emerges from local constraints repeated millions of times, at each site of the crystalline network, without any level of the process anticipating the global pattern. The symmetry of the crystal is the effect of elementary interactions, not the execution of a plan.
Quantum vacuum fluctuations offer an even more radical example. In a vacuum, particle-antiparticle pairs continually emerge, exist for infinitesimal fractions of time, and annihilate each other. This process is not regulated by an external agent. It is a property of the quantum regime, a direct consequence of the uncertainty principle: energy can be "borrowed" from the vacuum for sufficiently brief intervals. The Casimir effect confirms this measurably — the force between conducting plates is the pressure of the fluctuating vacuum. Creation and annihilation occur without creator or annihilator — they simply happen. Hawking radiation follows an identical pattern: near the horizon of a black hole, virtual pairs split; one particle falls inside, the other escapes. Emission of energy without an emitter, reduction of mass without an agent to enact it. Matter operates; no one operates it. Hawking radiation is particularly instructive for the question of the gesture because it combines two dissolutions: the dissolution of the substrate (the vacuum is "nothing") and the dissolution of the subject (the emission is no one's act). The black hole does not "decide" to emit; the laws of quantum physics in curved geometry determine the process without any instance initiating or controlling it. The result — slow evaporation, loss of mass, a fate of dissolution on immense time scales — is a consequence of compatibilities between quantum mechanics and gravitation, not the execution of a program.
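For precision, the three physical claims of this paragraph have standard textbook formulations; they are given here as reference, not as part of the ontological argument:

```latex
% Energy-time uncertainty: energy \Delta E can be "borrowed" from the
% vacuum for intervals \Delta t short enough that
\Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2}

% Casimir pressure between parallel conducting plates at distance a:
\frac{F}{A} \;=\; -\,\frac{\pi^{2}\hbar c}{240\, a^{4}}

% Hawking temperature of a black hole of mass M:
T_{\mathrm{H}} \;=\; \frac{\hbar c^{3}}{8\pi G M k_{\mathrm{B}}}
```

None of the three expressions contains a term for an agent: each is fixed entirely by constants of nature and by the configuration itself (the plate separation a, the mass M).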
Prebiotic chemical systems reveal dynamics of self-organisation in which order emerges from local interactions. In a fluid heated from below — Henri Bénard's classic experiment — when the temperature gradient reaches a certain threshold, convection ceases to be disordered and spontaneously organises itself into hexagonal cells, each circulating in a regular pattern. Below the threshold, homogeneity; above it, hexagons. The transition is abrupt. No agent shapes the cells. It is the local interactions themselves — rising heat, viscosity, gravity — that produce this conformation. The Belousov-Zhabotinsky reaction offers even more remarkable dynamics: simple chemical compounds, when interacting, generate rhythmic oscillations — alternating colours, spirals propagating in a thin layer with geometric beauty. Spatio-temporal patterns in a purely chemical system, without external coordination — waves that propagate and reorganise, spirals that rotate and interact, in a dance that no choreography governs. Prigogine called these systems "dissipative structures": ordered configurations that maintain themselves precisely because they are far from equilibrium, fed by a continuous flow of energy. Order is not rest; it is a process sustained by dissipation. The pattern is identical: simple components produce configurations that none of them designed. The formation of large-scale cosmic structures follows the same logic: after the Big Bang, the universe was almost perfectly uniform. Small density heterogeneities, amplified by gravity over hundreds of millions of years, led to the formation of galaxies, clusters, filaments of matter. Where the density was slightly higher than average, gravity attracted more matter, increasing the density further; where it was lower, matter was drained away. The gravitational instability was not directed by anyone — it is a property of gravity to operate attractively on any density heterogeneity, amplifying it.
Where density exceeds the average, gravity concentrates more matter; where it is lower, it drains it away. The process is blind, local, inevitable given the initial conditions — but it is not teleological, because the initial conditions do not "contain" the outcome. There was no architect of the galaxies, no engineer of the stars. The Milky Way, with its one hundred billion stars, is the result of impersonal processes that unfolded over billions of years without plan or intention.
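The two thresholds invoked above — the onset of Bénard convection and the onset of gravitational instability — have standard quantitative forms, given here for reference only:

```latex
% Rayleigh number: convection in a fluid layer of depth d under a
% temperature gradient \Delta T becomes ordered (hexagonal cells) once
% Ra exceeds a critical value (about 1708 for rigid boundaries):
\mathrm{Ra} \;=\; \frac{g\,\alpha\,\Delta T\, d^{3}}{\nu\,\kappa}

% Jeans length: density perturbations larger than \lambda_J in a medium
% of density \rho and sound speed c_s grow under their own gravity:
\lambda_{J} \;=\; c_{s}\,\sqrt{\frac{\pi}{G\,\rho}}
```

In both cases the threshold is a relation among local quantities (gravity g, thermal expansion α, viscosity ν, thermal diffusivity κ; density ρ, sound speed c_s) — no term in either expression designates a coordinator.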
The distinction imposed by these phenomena needs to be stated with care. The philosophical tradition inherited only two broad categories for change: intentional action (performed by an agent with a purpose and a representation of the end) and passive event (something suffered from outside by an inert entity). A supernova is not intentional action—no one executes it and no one aims at it. Yet neither is it a merely passive event, a blow received from without. It is a transformation arising from the internal compatibilities of a stellar configuration. Crystallisation is not the work of a crystalliser, but neither is it an externally imposed disturbance; it is a reorganisation that appears when local constraints cross a threshold. Quantum fluctuations are neither acts of a creator nor imposed disturbances: they are the permanent regime of a field in its ground state. Each phenomenon occupies the same quadrant: active yet non-intentional. Difference is produced, but without a subject who intends it. The term that best captures this quadrant is gesture: a material operation that produces difference without emanating from a subject and without reducing to inert passivity. Matter is neither inert nor intentional; it operates, transforms, and produces difference without being either passive or conscious. The gesture therefore dissolves the inherited active/passive dichotomy and installs a third regime: impersonal operation. That is not merely a classificatory gain. It has consequences for every thought of origin. If the active/passive dichotomy were exhaustive, then cosmic emergence would have to be either an act of creation or an accident suffered by a substrate. The gesture opens a third path: emergence as a material operation that is neither creative act nor passive accident, but the impersonal production of difference prior to both intention and passivity.
Excess is a fundamental ontological driver — and this point requires particular attention because it determines how the relationship between configuration and transformation is understood. Matter contains potentialities that exceed any actualised configuration. Each gesture actualises some possibilities and leaves others unactualised. Carbon can form diamond or graphite or fullerene or graphene — same elemental composition, radically different configurations, incomparable physical properties. Water can crystallise into at least fifteen different structures, depending on pressure and temperature conditions. Silicon can organise itself as sand, as quartz, as glass. In each case, the configuration realised is a selection among many possible ones — and the unrealised ones have not disappeared; they remain as potentialities of the material, available for future reorganisation if conditions change.
This excess is not a lack or waste — it is a condition for reorganisation. Matter is excessive because it gestates: it always produces more than any fixed determination can retain. Stellar collapse is not a failure of the star — it is an excess of matter over the form that held it. The nuclear fusion that sustained the balance was itself a manifestation of excess: nuclear power that, as it operated, transformed hydrogen into helium, helium into carbon, carbon into oxygen — each transformation producing a new difference. Quantum fluctuations are not vacuum anomalies; they are an expression of the fact that the ground state of the fields contains a generative power that no configuration can exhaust. The Casimir effect measures this power: the vacuum energy is real, measurable, operative. Excess produces; failure describes. The distinction is constitutive and must not be inverted under any circumstances: to describe stellar collapse as a "failure" of balance is to describe retroactively — from the standpoint of the previous configuration — what is, ontologically, an excess of matter over containment.
2.3 Philosophical affinities and deviations
If the gesture dissolves the subject's need, the question remains: does this concept have philosophical history? The answer is affirmative — and the genealogy is instructive because it shows, in each case, the affinity and the cost.
Spinoza provides a first approximation, and his mobilisation is inevitable because the Ethics constitutes the most systematic attempt to think production without transcendence. Natura naturans is pure immanent operation — a productive power that does not emanate from an external agent but is an expression of the nature of Substance itself. Nature is not made by anyone; it makes itself, necessarily, according to the infinity of its attributes. Proximity to the gesture: the operation is immanent, it does not emanate from an external agent. Cost — already identified: the Substance remains as ultimate foundation, cause of itself, ontological guarantee. Spinoza's immanent operation is the operation of something — of a Substance that necessarily produces according to its infinite nature. The gesture, as understood here, is not grounded in anything — there is no Substance that guarantees production, there is no infinite nature that determines it. There is contingent local compatibility, not substantial necessity. The difference seems subtle but is ontologically decisive: in Spinoza, production is necessary (Substance cannot not produce); here, production is contingent (reorganisation may or may not occur, depending on local compatibility).
Heidegger introduces Ereignis — the appropriative event that shifts the question of origin from "what changes" to "how change occurs." The gain is real: it opens up conceptual space for operations that do not require a subject in the classical sense — the happening is more original than the agent. The cost is severe: Ereignis remains tied to Dasein, to the horizon of understanding. In the pre-symbolic field there is no Dasein — there is no experiential horizon where being "gives itself", there is no understanding that receives appropriation. Heidegger opens the door to subjectless operation but is unable to cross it: his thought maintains the correlation between being and understanding as indispensable. The gain is incorporated (the origin as happening, not as the cause) and the cost identified (the dependence on Dasein) without the Heideggerian position being adopted as a framework.
Deleuze offers maximum affinity: the event is impersonal, singularity exists before any individuation. In Logic of Sense, Deleuze distinguishes between states of things (mixtures of bodies, bodily causes) and events (incorporeal effects, singularities distributed on a surface). "It rains" — the event of rain does not belong to anyone; it is not the property of a subject. Grammar forces the illusion that "rain" is something that "falls"; but the original event is pure becoming, without owner, without centre. In Difference and Repetition, the position becomes radical: individuation is primary in relation to constituted individuals; pre-individual singularities are distributed in fields of intensity that precede every stabilised form. Affinity with the gesture: both operate without a subject, both produce difference, both precede all representation. The cost — the postulation of the virtual as a separate ontological plane — will be taken up later, when the issue of representation and its limits is made explicit.
Simondon deserves particular attention because he represents the twentieth century's most rigorous attempt to think individuation without an individuator, and the closeness of his position to the present argument makes the difference all the more important. In L'individuation à la lumière des notions de forme et d'information (1958), Simondon rejects the hylomorphic scheme and treats individuation as the primary operation from which individuals emerge as partial and provisional results. The gain is decisive: individuation does not presuppose an already constituted individual agent; the pre-individual field retains tensions and incompatibilities that individuation partially configures. Yet two costs remain. First, Simondon treats the pre-individual as a real field with its own structure prior to individuation, an energy reserve awaiting resolution. The view advanced here refuses that move: the 'pre-individual' is itself a retroactive designation, a cut imposed by the individuated regime rather than a description of a pre-existing field. Second, Simondon conceives individuation as a resolution of tensions. The account given here refuses that language as well: emergence is not resolution. What occurs is contingent local compatibility. Tensions are not solved; they yield configurations that may, by contingency, endure. Individuation is not an answer to a prior problem; it is reorganisation without purpose. If individuation is resolution, a minimal teleology re-enters the picture: the field appears to 'seek' equilibrium. If individuation is local compatibility, radical contingency remains intact. That is not a terminological nuance but an ontological difference.
2.4 Late representation and refusal of the virtual
If the gesture is a material operation that produces difference without a subject, then representation — which presupposes a system already sufficiently differentiated to refer to its own operations — is necessarily subsequent to the gesture. The point is established logico-ontologically, without recourse to biotic examples. The star collapses without representing itself as collapsing. The crystal forms without a model of what it will be. The quantum fluctuation emerges without the vacuum "knowing" that it fluctuates. The blindness of the gesture is constitutive, not a defect. If the gesture knew what it did, it would be intentional action — and the subject requirement would be justified. Every representation is itself a late and complex gesture — a material operation that, through sufficient complexity, became capable of referring to other operations. Representation does not found the gesture; it emerges from it. Recognition is always a posteriori reconstruction, never direct access to the operation that made it possible. The blindness of the gesture is not a limitation that future evolution can overcome — it is a constitutive condition of the relationship between operation and representation. Representation requires distance between what represents and what is represented; this distance presupposes an already accomplished differentiation; and the accomplished differentiation is itself a gesture prior to the representation that cuts it out. The circle is not vicious; it is constitutive.
Here the differentiation from Deleuze that the earlier affinity made urgent takes place. The Deleuzian impersonal event dissolves the requirement of a subject — the affinity with the gesture is maximal. But Deleuze postulates the virtual as an ontological plane distinct from the actual: a field of potentials that is no less real than the actualised, which "insists" without actually existing. The position supported here refuses this move: it works only with present material configurations, in their local compatibilities and incompatibilities. There is no plane of transcendental immanence separate from material operations. The Deleuzian virtual can function as a new substrate — a reserve of power that underlies actualisation. Dissolving the substrate and then installing a virtual one that acts as an arché would be a contradiction. Second cost: Deleuze preserves a minimal teleology — the virtual "tends" to actualise itself, there is a "pressure" of differentiation. The position supported here rejects all teleology: reorganisation is contingent, not oriented. The gesture, as understood here, is strictly material and present — it does not refer to a transcendental plane or a virtual reserve. It generates here, now, in this configuration — without reference to a plan that precedes or guides it. The difference is operationally decisive: if the gesture refers to a virtual that "insists", the substrate returns under a new name; if the gesture is strictly material and present, the substrate remains dissolved. The refusal of the virtual is not an impoverishment — it is a condition of consistency. The gesture does not need an ontological reserve; it needs only local material compatibilities that, by contingency, produce configurations. The supernova does not refer to a virtual plane; it refers to gravitational, thermodynamic, nuclear constraints — material, present, measurable. The crystal does not actualise the potential of a transcendental field; it reorganises local molecular interactions.
If emergence has no substrate or subject, how is it recognised as an "origin"? The question shifts: the origin is not given — it is instituted. What follows shows that "origin" is the retroactive construction of stabilised systems that needed a beginning to become intelligible.
The gesture does not await an author. It operates—and in its operation, before any name, difference is already there.
Axis 3 — Origin retroactivity
3.1 Fixed point genealogy
Tradition offers a last resort: if emergence has no substrate or subject, at least the origin is a fixed point in the past — an event that was recognised as inaugural at the moment it occurred. The genealogy of this last resort reveals that it too is a construction.
Mythical cosmogonies installed the inaugural moment as a paradigm: the moment in which the world passed from non-being to being, from chaos to order, from darkness to light. Creative agent, founding act, clearly demarcated before and after. The illud tempus — the primordial time — functions as an ontological foundation: what happened "back then" founds and legitimises what happens now. The origin is a presence repeatable by ritual — recoverable, accessible, paradigmatic. The biblical Genesis narrates creation in six days; the Babylonian Enuma Elish describes Marduk's victory over Tiamat and the formation of the world from her body; the Indian Rig Veda sings of the primordial sacrifice of Purusha, whose dismembered limbs constitute the parts of the cosmos. In each case, there is an absolute inaugural moment — a clearly demarcated before and after, a creative agent and a founding act that separates what exists from what does not. The ontological cost is the presumption that the origin is a full, accessible, repeatable presence — that the "inaugural moment" was recognised as such at the moment it occurred and that it can be recovered.
The pre-Socratic arché works differently but with a similar result. Already genealogised in terms of substrate, it must now be re-examined in terms of temporality. The arché is not just the first in the chronological sequence; it is the permanent foundation, ontologically prior at each moment. Thales' water is not that which existed "before" and then disappeared — it is that which continues to sustain everything that exists now. Anaximander's apeiron does not precede in time; it is an inexhaustible reserve that operates permanently. The presence of the foundation is not past; it is constant. The cost, in the register of retroactivity, is distinct from the substantialist cost: here, what is lost is not the possibility of thinking without a substrate (already dissolved) but the possibility of thinking the beginning as construction — if the foundation is permanent, the beginning is not an event; it is an eternal condition. The fixed point disappears, but permanent presence takes its place with identical result: the origin is given, not constructed.
Christian theology radicalised the rupture. Creatio ex nihilo — God creates everything from absolutely nothing, including matter and time. Augustine, in the Confessions (book XI), argued that time was created together with the world: asking "what did God do before creating the world?" is a poorly formed question, as there was no "before" before creation. The origin is absolutised — a radical ontological rupture between nothingness and being, an act of divine will that founds ex nihilo the totality of what exists. Nothing is more original than creation; before it there was literally nothing. Augustine anticipates Hartle-Hawking in formal structure — time does not precede creation, any more than time precedes the universe — but for theological, not cosmological, reasons. Structural convergence does not authorise equivalence: Augustine aims to protect divine omnipotence (if there had been a "before", God would have been idle, which is incompatible with perfection); Hartle-Hawking aims to eliminate the singularity as a mathematical artefact. The reasons differ radically, even when the formal conclusion coincides. What matters is the common structure: in both cases, the question "what was there before?" is dissolved — not because the answer is "nothing" but because the question presupposes a category (temporal anteriority) that does not apply to what it tries to describe. The dissolution of the question is more radical than any answer.
Modern science breaks with theology while retaining the narrative structure of origin as a fixed point—and this is perhaps the most instructive case precisely because the retention is unintentional. Big Bang cosmology, developed through the work of Georges Lemaître (whose 'primeval atom' already reproduces the structure of an archē), Edwin Hubble, and George Gamow, proposes that the universe emerged roughly 13.8 billion years ago from a state of extreme density and temperature. Structurally, the Big Bang functions as a secularised archē: singularity as zero point, 'birth' of the universe, 'first moments', 'moment of creation'. The language is revealing. 'Birth' presupposes a prior non-existence; 'first moments' presuppose counting from an absolute zero; 'moment of creation' carries theological structure into cosmological vocabulary. Even when physicists insist that singularity is a mathematical limit rather than an observable event, the narrative of origin as fixed point remains operative in both language and popular exposition. Contemporary physics, however, makes that narrative increasingly unstable. Quantum-gravity models suggest that singularity may be an artefact of classical equations. Bounce cosmologies, multiverse models, and scenarios of quantum fluctuation all dissolve the singular origin. Science thus approaches retroactivity without always naming it: origin is not given at a fixed point but constructed retroactively through models that organise present material conformities. The striking fact is that contemporary physics internally dissolves what its own public language continues to preserve.
Derrida has exposed the mechanism underlying this persistence of the fixed point — and his analysis is mobilised here for its diagnostic precision, not through adoption of his conceptual framework. In De la Grammatologie (1967), Derrida demonstrated that the Western philosophical tradition has systematically privileged presence — the original over the copy, speech over writing, the signified over the signifier, the immediate over the mediated. The origin is a paradigm of this full presence: the moment in which the thing occurs entirely, without mediation, without difference. Différance — the simultaneous play of differing and deferring — logically precedes every absolute origin. Every origin is already deferred, supplemented, inscribed in a chain of endless referrals. There is no simple origin; every presence is already a difference. The divergence must nevertheless be strictly maintained: différance operates in the regime of signification — on the impossibility of full presence in the symbolic-textual register. The position supported here operates on the ontological-material plane — on real material conformities whose organisation as "origin" is retroactive. The two planes must not be confused, and maintaining this separation is not a mere academic scruple; it is a condition of rigour that determines what each analysis can and cannot state. Derrida shows that in the symbolic regime there is no full presence — all meaning is deferred, supplemented, inscribed in a chain of referrals. The position supported here shows that in the material regime there is no fixed point — every "origin" is a retroactive construction based on present conformities. Material retroactivity is not différance, and différance is not material retroactivity; the separation cannot be relaxed for the sake of argumentative convenience.
Transferring categories from one plane to the other — reading différance as a material operation, or reading material retroactivity as a textual game — would be to confuse registers that function at different levels and with different scopes. The temptation is real: the formal convergence between the Derridean dissolution of full presence and the dissolution of origin as a fixed point could suggest a deep identity. But formal convergence is not ontological identity — the operations are distinct, the planes are distinct, the scopes are distinct.
Foucault reinforces this in another way. If Derrida dissolves the fixed point in the symbolic regime (there is no full presence), Foucault dissolves it in the historical regime: at the beginning of things there is not the pure identity of the origin; there is discord, nonsense, the contingency of conflicting forces. The Ursprung — the metaphysical search for the first essence — is retrospective fiction. The genealogist finds that the beginning is neither solemn nor grand; it is modest, accidental, the product of contingent encounters that only retrospectively organise themselves as a "foundation". The pure origin — the immaculate moment in which the thing occurs entirely — is a construction that masks the constitutive discord. Foucault complements Derrida: if there is no full presence in the symbolic regime, and if at the historical beginning there is discord rather than purity, the fixed point dissolves on two complementary fronts — symbolic and historical. Dissolution is not destruction: the claim is not that there is "no origin" (which would simply be false — the material conformities are real). The claim is that the origin as a pure fixed point — full presence, an inaugural moment recognised as such — is a retrospective construction of stabilised systems that need a beginning in order to organise their intelligibility.
3.2 Cosmology as retroactive reconstruction
If philosophical genealogy dissolved the fixed point as a category, contemporary cosmology exemplifies — without fully knowing it — the retroactive structure that the dissolution reveals. Cosmology is a forensic science: it reconstructs retroactively from present material conformities; it does not directly observe the "beginning". The analogy with criminal investigation is instructive and can be taken far: the detective arrives at the crime scene after the crime has occurred and reconstructs what happened from present conformities — the position of objects, stains, temperatures, testimonies. The "crime scene" did not appear as such at the time of the crime; it is a retroactive construction by the investigator. The cosmologist operates in a similar way: arriving at the present universe, he reconstructs its past from material conformities observable now. The Big Bang was not observed. No one was present 13.8 billion years ago to record the primordial emergence. What we have are material conformities — present, measurable, reproducible configurations — from which we can retroactively infer what happened. The inference is not free speculation; it is constrained by multiple observational convergences and testable mathematical models. But it remains inference — reconstruction, not observation. The origin, as an object of knowledge, is a construction, not a direct observation of a past event. This retroactivity is not a methodological weakness — it is the only possible way to know what is no longer observable. The past, by definition, is past; only its effects remain in the present to be interpreted.
The cosmic background radiation is the most eloquent conformity. Accidentally discovered by Arno Penzias and Robert Wilson in 1965, while they were tracking down sources of noise in a communications antenna, this microwave radiation uniformly fills all of space at a temperature of approximately 2.7 Kelvin. Its existence was predicted by George Gamow and collaborators in the 1940s: if the universe began in a hot, dense state, the radiation emitted when the first atoms formed — about 380,000 years after the Big Bang, at the time of recombination — should be detectable, stretched by expansion to microwave wavelengths. We do not see the Big Bang; we see a present material conformity — radiation measured now — whose interpretation requires an initial scenario. The "photograph of the young universe" is a retroactive reconstruction, not a direct record. The anisotropies — tiny temperature variations, on the order of one part in a hundred thousand, mapped with increasing precision by the COBE, WMAP, and Planck satellites — are conformities whose distribution of possibilities is constrained by the large-scale structures now observed. The galaxies and clusters we see retroactively constrain the field of what the anisotropies could have been. The primordial heterogeneities were not "seeds" in the sense of containing within themselves the plan of what they would produce — they were differences in density that, amplified by gravitational constraints over billions of years, led to the observed structures. The narrative that organises them as "seeds of the galaxies" is a retroactive operation of the cosmological present on past configurations.
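The "stretching by expansion" can be made concrete with a back-of-the-envelope calculation. A minimal sketch follows; the round values used (about 3000 K at recombination, 2.725 K today) are standard textbook figures introduced here as assumptions, not numbers taken from this chapter:

```python
# Back-of-the-envelope: how much has expansion stretched the relic radiation?
# Assumed textbook values:
T_recombination = 3000.0   # K, roughly when the first atoms formed
T_today = 2.725            # K, CMB temperature measured now

# Photon temperature falls inversely with the expansion factor,
# so the stretch factor (1 + redshift z) is simply the temperature ratio:
stretch = T_recombination / T_today
z = stretch - 1
print(f"stretch factor ~ {stretch:.0f}, redshift z ~ {z:.0f}")

# Light emitted at micrometre scales arrives stretched ~1100-fold,
# landing in the millimetre (microwave) band:
wavelength_emitted_m = 1e-6
wavelength_now_m = wavelength_emitted_m * stretch
print(f"~1 um light arrives as ~{wavelength_now_m * 1e3:.1f} mm microwaves")
```

The calculation runs only forwards from a measured present value (2.725 K) plus a model assumption (3000 K at recombination) — an illustration, in miniature, of the retroactive inference the paragraph describes.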
The abundance of light elements exhibits the same pattern. Primordial nucleosynthesis — in the minutes following the Big Bang — produced a specific proportion of hydrogen (~75% by mass) and helium (~25%), with traces of lithium and beryllium. The ratio was predicted theoretically by Alpher, Bethe and Gamow — in the famous 1948 "αβγ" paper — and confirmed observationally. The abundances are current material configurations: we measure isotopes present now and infer the conditions of the "first three minutes" — temperature, density, photon-to-baryon ratio. The origin is not given; it is calculated from present distributions. The agreement between theoretical prediction and observation is remarkable — it constitutes one of the most robust confirmations of the standard cosmological model. But the robustness of the agreement does not alter the epistemic structure: the knowledge is retroactive, inferred from current configurations, not obtained by direct observation of the "first three minutes".
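The ~25% helium figure is itself the product of a simple retroactive calculation whose core arithmetic can be reproduced. The sketch below uses the standard textbook freeze-out ratio of roughly one surviving neutron per seven protons — an assumed input, not a figure stated in the text above:

```python
# Sketch of the standard helium mass-fraction estimate.
# Assumption (textbook value): at freeze-out, about 1 neutron
# survives for every 7 protons.
neutrons, protons = 1.0, 7.0

# Essentially all surviving neutrons end up bound in helium-4,
# each nucleus pairing 2 neutrons with 2 protons, so the mass
# locked into helium is twice the neutron mass (treating neutron
# and proton masses as equal):
Y_helium = 2.0 * neutrons / (neutrons + protons)
Y_hydrogen = 1.0 - Y_helium

print(f"helium mass fraction ~ {Y_helium:.2f}")     # ~0.25
print(f"hydrogen mass fraction ~ {Y_hydrogen:.2f}") # ~0.75
```

The "first three minutes" enter only through the assumed neutron-to-proton ratio; everything else is present-tense bookkeeping.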
The expansion of the universe displays the same retroactive structure with particular clarity. Hubble observed in 1929 that galaxies are moving away from us — and that the speed of recession is proportional to the distance. The further away a galaxy is, the faster it recedes. The relationship is linear: the proportionality factor (the Hubble constant) measures the expansion rate. Extrapolating backwards — reversing the expansion — all galaxies converge in the remote past, about 13.8 billion years ago. The "moment" of the Big Bang is the limit of this extrapolation. But extrapolation is precisely retroactive inference — one projects backwards the movement observed now and constructs an origin. What is measured is present movement; what is inferred is the past state. The initial singularity — the point of infinite density and temperature that the equations of general relativity predict — is the limit of the model's extrapolation, not a witnessed event. Where the classical equations predict a singularity, what is actually found is the limit of the theory — the point where general relativity ceases to apply and a quantum theory of gravity (not yet experimentally confirmed) would be required. The "origin" that extrapolation produces is inference, not observation — a construction from present movements, not a record of a past moment.
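The backwards extrapolation can be sketched numerically. Inverting the Hubble constant gives a characteristic "Hubble time"; with an assumed round present-day value near 68 km/s/Mpc, the naive constant-rate extrapolation lands near 14 billion years, close to but not identical with the 13.8 billion of the full model, which integrates the whole expansion history instead of assuming a constant rate:

```python
# Naive backwards extrapolation: if every galaxy's recession speed
# v = H0 * d had always been what it is now, all separations shrink
# to zero after t = d / v = 1 / H0 (the "Hubble time").
H0_km_s_Mpc = 68.0          # assumed round value of the Hubble constant
KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7

H0_per_s = H0_km_s_Mpc / KM_PER_MPC            # convert to 1/seconds
hubble_time_yr = 1.0 / H0_per_s / SECONDS_PER_YEAR
print(f"Hubble time ~ {hubble_time_yr / 1e9:.1f} billion years")
```

The gap between this crude figure and the model's 13.8 billion years illustrates the point in the paragraph: the "age of the universe" is not read off the sky but produced by a model-dependent inference from present motion.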
Gravitational waves, detected for the first time in September 2015 by the LIGO observatory (Laser Interferometer Gravitational-Wave Observatory), complete the picture with an entirely new observational channel. They are ripples in the very fabric of space-time — distortions that propagate at the speed of light, compressing and stretching the space they pass through. Predicted by Einstein in 1916 as a consequence of general relativity, they remained for almost a century a theoretical prediction without direct confirmation. The first detection — GW150914, produced by the merger of two black holes about 1.3 billion light-years away — measured distortions of space on the order of a fraction of the diameter of a proton. Gravitational waves carry information about remote events — black hole mergers, neutron star collisions — that no other observational channel allows us to access. The waves we detect were emitted millions or billions of years ago; they arrive now as messengers of distant processes. The event that emitted them only becomes a "datable event" for us when we detect them — when the interferometers register the compression of space. Detection retroactively establishes the event as a significant moment in cosmic history. The event occurred independently — the black-hole merger happened whether or not there were interferometers to record it. But its organisation as a narratable event — as "a merger that occurred 1.3 billion years ago in such-and-such a constellation" — is an operation of the present on the past. Detection does not create the event; it retroactively establishes it as a datable, localisable event that can be inserted into a cosmic narrative. Without detection, the event would still have been — but it would not have been "for us". Cosmology does not observe origins; it reconstructs them.
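The "fraction of the diameter of a proton" can be checked with elementary arithmetic. A sketch under stated assumptions: the peak strain of roughly 1e-21 for GW150914 and the 4 km interferometer arm length are standard published figures, used here as round inputs:

```python
# How small is the distortion LIGO measured for GW150914?
strain = 1e-21           # assumed round peak fractional strain h
arm_length_m = 4000.0    # length of one LIGO interferometer arm

delta_L = strain * arm_length_m     # absolute change in arm length (metres)
proton_diameter_m = 1.7e-15         # approximate proton diameter

print(f"arm length change ~ {delta_L:.1e} m")
print(f"i.e. roughly 1/{proton_diameter_m / delta_L:.0f} of a proton diameter")
```

What the instrument registers is a present, metre-scale-apparatus quantity; the "1.3 billion years ago" attached to it is inference layered on top of that measurement.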
The pattern is identical in all four cases: present conformity (radiation, abundances, movement, waves) → retroactive inference → "origin" scenario. Each conformity is real — not an arbitrary construction. But the organisation of the conformities into a narrative of beginning is a subsequent operation, not a record of something that presented itself as such at the moment it occurred. This distinction between the reality of the conformity and the retroactivity of the narrative is constitutive and cannot be elided without consequence. Eliding it in one direction (denying the reality of the conformities) leads to radical constructionism — fiction without anchorage. Eliding it in the opposite direction (affirming that the origin narrative is as "given" as the conformities) leads to naive realism — recoverable full presence.
Retroactivity is not an epistemological defect — it is the very condition of knowledge of the origin. What is retroactive is not the conformity; it is the cut that organises it as "origin". Material conformities exist independently of being organised; their organisation into a narrative of beginning depends on present inferential conditions. In each case — radiation, abundances, expansion, waves — real conformity and retroactive narrative coexist without merging. The confusion of the two is precisely the error that the analysis dissolves.
3.3 Posteriority, narrative and clipping
Every origin narrative is subsequent to the origin it narrates. Genesis was written centuries after the events it describes; the Big Bang was theorised 13.8 billion years after the event it postulates; the cosmogonic narratives of all cultures were produced by already constituted civilisations that projected their own structure backwards. This posteriority is not contingent — it is not an accident ("we weren't there") — it is structural: there is no way to narrate an origin except afterwards. The origin, as narrated, is always a retroactive construction — what the narrative arrives at when it works backwards, not what it starts from. The "beginning" is the end of the investigation, not its starting point. Posteriority is irreducible: even if there were a machine capable of "travelling" to the moment of primordial emergence, the act of observing and narrating it would be a subsequent operation — the origin, to be described, always requires the regime of description that only emerges after it. The asymmetry is constitutive: material processes occur; their organisation as a "beginning" is a late operation of systems that needed a beginning in order to understand themselves.
Freud introduced the concept of Nachträglichkeit to describe the peculiar temporality of signification: an event is not traumatic at the time it occurs; it becomes traumatic retroactively, when a second event activates and gives new meaning to the first. The second event does not physically alter the first — it alters the regime of meaning in which the first begins to operate, retroactively reconfiguring its status. Lacan radicalised this logic: all meaning is retroactive — the meaning of a sentence is completed only at its end; each word gives new meaning to the previous ones. The quilting point (point de capiton) retroactively rearranges the sequence. Retroactivity is not a peculiarity of cosmology: it is the general structure of the relationship between processes and the narrative that organises them. Freudian Nachträglichkeit and the Lacanian quilting point show that, even in the domain of human experience, meaning is not given at the moment the event occurs — it is constructed retrospectively, when later conditions reorganise the field of meaning. If this is so in the domain of experience (where there is a subject, consciousness, language), a fortiori it will be so in the cosmological domain (where none of these conditions exist at the time of the processes to be narrated).
Wheeler showed that the narrative of a system is not fixed until later experimental conditions stabilise it. In the delayed-choice experiment, the decision about how to measure can be taken after the photon has passed through the apparatus, and that delayed choice determines the kind of narrative that can be told about the photon's 'path'. The quantum eraser likewise shows that which-path information can be erased and interference patterns restored. This is not physical retrocausality — no information is sent into the past. It is descriptive retroactivity: present measurements select the class of description compatible with the observed correlations. The existence of the universe does not depend on observers; yet the organisation of its past as a narrative of 'origin' depends on present stabilising conditions. Wheeler provocatively asked whether present observations retroactively select the class of description compatible with the universe's history. The decisive point is not retrocausality but the dependence of narrative facticity on posterior inferential conditions. If even in a laboratory interferometer the narrative of the photon's path depends on later measurement, then the narrative of the universe's 'beginning' — dependent on present material conformities and current theoretical models — is, a fortiori, retroactive. The universe did not narrate itself at the moment of emergence; it is narrated by complex systems that arose billions of years later and require a beginning narrative for their own intelligibility.
Badiou conceived the event as an irruption that exceeds the situation: the radically new is not deducible from the previous state. The event brings to light what did not exist in the field of possibilities counted by the situation. The affinity: the Badiouan event shares with the gesture its irreducibility to what precedes it. The cost: Badiou requires a faithful subject — one who recognises the event and remains faithful to it through a truth procedure. An event without a faithful subject has no consequences — it is as if it had not occurred. This requirement cannot operate in a pre-symbolic regime: there is no faithful subject before there is a symbolic regime. The position supported here retains the irruption but dissolves the fidelity: the gesture operates and produces difference without anyone being faithful to it.
Naming an origin is always silencing what was left out. Every origin is a selection — a choosing, constrained but still a choosing, among multiple possible configurations. The Big Bang is "the" origin of the universe, but one could go back further (what if there was something before the Big Bang? what if the Big Bang is just a transition in a larger cycle?) or stop earlier (why not the formation of galaxies? of the first stars? of the first molecule?). Choosing a point as "origin" is always excluding other possible points, always silencing other equally legitimate narratives. The term "Big Bang" itself — coined by Fred Hoyle in 1949, ironically by an opponent of the theory — is already a clipping operation: it names an instant as if it were an explosion, when cosmology describes continuous expansion without centre or surface. The narrative selects; selection excludes; exclusion is constitutive of intelligibility. Every origin is, in this precise sense, symbolic violence — it imposes order on the continuum, draws boundaries where there are transformations, declares beginnings where there are processes. But this violence is necessary: without it, the continuum would remain undifferentiated, without cut, without intelligibility. Recognising symbolic violence is not denouncing it — it is understanding it as an operative condition of knowledge. Every cut is partial; all partiality is inevitable; and this inevitability is a condition of possibility, not a defect. What matters is to distinguish between necessary symbolic violence (without a cut there is no intelligibility) and the naturalisation of that violence (assuming the cut is "given" rather than an operation). The origin is a necessary cut — but it is not a natural datum of reality.
Closing
Emergence has no substrate, no subject, no fixed point. The three classic pillars of origin-thought are dissolved — ontological anteriority, founding agent, inaugural moment. The dissolution is an interlocking one — the three pillars implicate each other; it is not accumulation but reciprocal implication. Without substrate, there is no subject of the substrate — the permanent entity that could be an agent of change disappears. Without a subject, there is no datable intentional event — the moment at which someone or something decides to act disappears. Without a datable event, the "origin" is a retroactive construction — a subsequent organisation of present conformities into a narrative of beginning.
Material processes do not have absolute beginnings; they have transformations, reorganisations, critical thresholds. The Big Bang is not a point; it is an extrapolation that becomes increasingly problematic as one approaches the singularity. Primordial nucleosynthesis is not a founding instant but a regime of material reorganisation lasting several minutes—from the first second to roughly the twentieth minute—whose temporal boundaries are conventional, fixed by criteria of temperature and density that are themselves model-dependent. Origin is not at the beginning; it is in the middle—at the moment when a stabilised process needs a beginning in order to understand itself, when a sufficiently complex configuration turns back and organises what precedes it as a 'beginning'. Matter does not begin; it persists. That persistence—the fact that material processes operate without an absolute beginning, transforming without a first transformation—is the ontological datum released by the triple dissolution.
The triple dissolution does not eliminate the real; it eliminates the metaphysical categories projected onto it. The distinction is crucial: to dissolve substrate, subject, and fixed point is not to dissolve reality. Material conformities are real—the cosmic microwave background exists, the expansion of the universe is measurable, heavy elements were forged in supernovae, vacuum fluctuations are detectable. What dissolves is not the real but the categories through which tradition has organised it. Emergence is not nothing; it is a primary operation, a gesture without author, a configuration that reorganises itself without foundation or purpose. Origin is real as an effect: real as constructed, built upon the real. The position is therefore neither naïve realism, as though origin simply lay there waiting to be found, nor radical constructionism, as though origin were fiction without reality. Origin is an operation: a functional cut that organises real material conformities into a narrative of beginning, an operator that produces intelligibility from present configurations rather than a substance that subsists independently of its effects.
The triple dissolution is complete. The reader cannot resort to any of the three pillars: neither pre-existing time, nor disturbed equilibrium, nor founding substrate. The material irruption that inaugurates the real has no foundation, no subject, and its recognition as a "beginning" is a subsequent operation. But it works — and it is this "but" that prevents dissolution from converting into nihilism. Dissolution is not an end — it is a condition for posing the question otherwise. If tradition asked "from what substrate does the world emerge, who made it, when did it begin?", the question that now arises is different: how is reality effectively reorganised when none of these pillars survives? What mode of operation is possible without foundation, without subject, without fixed point? The question moves from dissolution to positive construction — and it is this construction that what follows aims to interrogate.
The origin was never there. It was there that we learned to say that it all began.