Introduction
The real does not emerge from nothing; nothingness emerges — belatedly — as a fiction of thought.
If all material reorganisation is by excess and all form is the functional remainder of compatibilities that precede it, what can we say about the concept of beginning itself? "Beginning" carries assumptions steeped in centuries of metaphysical tradition: an absolutely privileged point where something emerges from what was not, a temporal border that separates before from after in an insurmountable way, a passage from emptiness to fullness. Each of these assumptions was progressively dissolved along the previous journey — time ceased to be a fixed stage, order ceased to be a primordial rest, the foundation ceased to be an immovable base. The operative vocabulary that supports this dissolution is already in place: material difference as an operation prior to any inscription, form as the functional remainder of compatibilities that does not exhaust the excess from which it emerges, the retroactivity of the institution of origin as a symbolic cut projected onto material conformities. If the real is a permanent excess and if the origin is a retroactively instituted event, then "beginning" belongs to the symbolic regime — it is the organisation of conformities into a founding narrative, not a property of the real prior to any regime of legibility. The category of "beginning" nevertheless persists as a habit of thought, as a way of thinking that structures the question of origin itself.
The beginning is a symbolic figure — a retroactive projection onto material conformities where change was simultaneously radical and continuous. What operates in reality is not a temporal cut that establishes absolute discontinuity between before and after. It is folding — reorganisation in which compatibilities that a previous configuration contained unrealised precipitate into a new form, in which being reorganises itself without ceasing to be. The universe did not begin, in the canonical sense — it did not leap from non-being to being. It folded. And if the origin is a fold and not a cut, then the origin does not determine the destination: it opens the field where possibilities are actualised without guarantee of which will be implemented.
The dissolution of the beginning requires three interdependent operations. The first is negative: demonstrating that the original "nothing" — the category that supports creatio ex nihilo, the Leibnizian question, the Heideggerian anguish — is a symbolic fiction without material anchoring. The real is excess, not lack; plenitude that differentiates itself, not emptiness that fills itself. The second is constructive: replacing the temporal cut with folding as the operator of transition. Where the cut installs absolute discontinuity between before and after, the fold describes continuous reorganisation of material compatibilities by excess — transition without prior annihilation, inheritance without transmission of experience, propagation without centre. The third is consequential: if the fold is contingent — if the reorganisations could have been different — then the origin is not a foundation that contains the destiny as a program. The origin is openness, not determination. Contingency is constitutive of the real, not a transitory defect of knowledge.
Main text
Axis 1 — The beginning is an excess
1.1 The fiction of nothing: genealogy of creatio ex nihilo
The question "why is there something rather than nothing?" is not just the oldest question in Western metaphysics. It is also the question that makes the least sense when examined rigorously, as it contains in its very formulation a premise that makes it circular: it presumes that nothingness is a possible condition, a previous state that could have been maintained, an alternative that reality has rejected. Nothingness is not an alternative. It is not a state that the real has left behind. It is a retroactive fabrication — a symbolic projection that exists only in the legibility regime that produces it.
Parmenides was the first to rigorously confront the logic of nothingness. He argued that non-being is unthinkable and impossible, and that therefore being is one, eternal, immutable. The consequence is catastrophic for any cosmology: change becomes illusion, becoming collapses into sensorial appearance. Parmenides was, however, correct in one crucial aspect — nothingness is logically impossible, a conceptual fabrication. The divergence lies in the nature of the solution. Parmenides refuses change because all change would imply passage through non-being, and non-being is unthinkable. The argument here dissolves nothingness in another way: there is no passage through non-being because there is excess, because material difference is a continuous operation in which no configuration requires the prior annihilation of the previous one. Change does not require passage through non-being; it requires only that being contain more than the present form stabilises. Where Parmenides saw unity without difference, contemporary cosmology shows difference without unity — constitutive heterogeneity, permanent metastability, excess that does not resolve into an ultimate form. Nothingness is dissolved without change being sacrificed.
The theological-metaphysical tradition that follows persists in the same fiction in successive forms. Christian creatio ex nihilo — formulated with systematic rigour by Thomas Aquinas — makes nothingness the precondition of a free divine act. Thomas argued that only God, as a necessary being and pure act, can produce being where there was nothing: creation is not a transformation of pre-existing material, it is a passage from non-being to being through infinite power. The ontological consequence is twofold: nothingness is preserved as an absence of being outside the creative will, and created being is contingent — it depends entirely on the act that sustains it. If God ceased to sustain the world, the world would revert to the nothingness from which it was taken. In this framework, nothingness is a permanent condition, an always present possibility beneath divine support. The criticism is not of the internal logic of the argument — it is of the premise: creatio ex nihilo presupposes that there is a state of "nothingness" from which being can be produced. If nothingness is not a real state, the entire construction loses its starting point.
Leibniz reconfigured the issue with the question that became emblematic of metaphysics: "why is there something rather than nothing?" His principle of sufficient reason requires that everything that exists has a reason why it exists this way and not otherwise. The existence of the world requires a sufficient reason, which Leibniz locates in the divine choice among an infinity of compossible worlds — God contemplates all coherent arrangements and selects the best. Nothingness subsists in this picture as a space of unrealised possibilities, as a logically available alternative that sufficient reason excludes. The cost is that the very formulation of the question installs nothingness as the background: it presupposes that the absence of everything is a real option, that the world might not have been, and that existence, by contrast, requires justification. If nothingness is a pseudo-concept — if it was never a real alternative — the Leibnizian question dissolves in its own formulation, not for lack of depth, but because the premise that supports it is a fiction.
Heidegger inverted the tradition without abandoning it. He made nothingness not the condition prior to being, but the Abgrund — the foundationless abyss on which all foundation rests. In anguish, according to Heidegger, beings as a whole retreat and nothingness reveals itself as a permanent possibility: not as an absence that precedes being, but as a horizon that crosses it. Dasein exists in the face of nothingness, which is not logical negation but the fundamental experience that beings as a whole might not be. The Heideggerian gain is real: it moves nothingness from the past to the present, from cosmogony to existence. The cost, however, is that it preserves nothingness as an operative ontological structure — as that which makes transcendence, the question of being, the opening of Dasein possible. The dissolution proposed here is more radical: nothingness has no ontological function, neither as a precondition (Thomas), nor as a logical alternative (Leibniz), nor as an existential horizon (Heidegger). It is symbolic fabrication in every register. The difference between necessity and contingency does not need nothingness in order to be articulated — it needs the immanence of the real, the absence of necessity as an intrinsic material property, excess as the condition that makes contingency thinkable without recourse to a background of non-being.
Bergson diagnosed the error precisely: nothingness is a pseudo-concept, manufactured by regressive negation from being. "Non-being" is a symbolic operation that already presupposes a being at hand — it is the denial of a presence. Intelligence models nothingness on existence: it says "where there was something, now there is nothing", when in fact where there was something there is now the rest of the universe without that something. If you deny this, then that, then everything else, the result is not nothing — it is simply the absence of all these particular things. Being is the primitive condition; nothingness was never a real condition. The question "why is there something rather than nothing?" dissolves — it presupposes that nothingness is a real alternative to being, when it is a fiction constructed by negation. The argument is foundational. Where Bergson diverges is in the solution: he introduces the élan vital, an immanent creative force that generates difference and duration. The constitutive temporality is correct — the real proceeds in openness. The élan vital, however, reintroduces an orientation into matter, an intrinsic tendency to create increasingly complex forms, which reinstates the teleologism it was meant to avoid. From the Bergsonian criticism the essential is retained — the dissolution of nothingness as a fiction of thought, being as primitive. The notion of a directed creative impulse is rejected outright. Excess is not vital, it is not goal-oriented. It is a continuous condition of material difference — without author, without direction, without any configuration being privileged in advance.
1.2 Excess as a permanent condition of the real
The central operator that allows us to abandon the fiction of nothingness without falling into the vital impulse is excess — understood not as a lack, not as an ontological defect, but as a permanent and structural characteristic of any material configuration. There are always more compatibilities than the present form can simultaneously stabilise. There are always unresolved tensions, relationship potentials that are not actualised under the current regime. This is the invariant condition of the real — not an accidental property, but an ontological structure that persists at any level of description.
The excess is manifested in three registers that should not be confused with separate categories — they are sides of the same operation. The first is concentration: the primordial state of the universe is not emptiness that is filled, it is fullness compressed under extreme constraints. The singularity that general relativity describes is not a point where nothing exists — it is a regime where density, temperature and curvature exceed any available model. The energies involved are not properties of something that "appeared" — they are constraints of a material field whose concentration exceeded all stable form, forcing continual reorganisation. The second is potency — not in the Aristotelian sense of realisation of essence, but rather as a positive capacity for differentiation. Primordial matter contained compatibilities for configurations that no present form exhausted: electroweak symmetry contained the possibility of its own breaking, plasma contained the conditions for recombination, homogeneous gas contained the gravitational heterogeneity that would generate structure. These compatibilities are not latent ends — they are relational properties of specific material regimes that, under specific constraints, allow for specific reorganisations. The third is differentiation: excess is not an inert reserve awaiting mobilization, it is an active operation that produces multiplicity. Each cosmological phase transition is a demonstration of differentiation by excess — the quark-gluon plasma did not transform into hadronic matter because the plasma "lacked" something, but rather because cosmological cooling altered the constraints and compatibilities of the previous regime exceeded the previous stable form, precipitating new configurations. Differentiation is the production of multiplicity from plenitude, not the division of previous unity nor the realisation of a plan contained in the previous form.
Deleuze insisted that difference is primary — that identity is a secondary effect of difference, not the other way around. The gain is real and decisive: it dissolves the primacy of the identical that has governed Western metaphysics since Plato, and allows the production of the new to be thought without recourse to a prior essence. However, Deleuze postulates two ontological planes — the virtual and the actual — that maintain a duality in which the virtual functions as an ontologically distinct genetic field: a reserve of intensive differences that are "actualised" in extensive forms. Deleuzian actualisation is creation, not realisation of the possible — the actual does not resemble the virtual that produced it. The argument here retains this refusal of the possible as an anticipated image of the real, but refuses the ontological duality. There are no two planes. There is only the real, which is material, which is permanent excess — not latent virtuality housed in another regime, but difference in action, immediately manifested as reorganisation. When the quark-gluon plasma reconfigured itself into hadronic matter, there was no actualisation of a virtual plane — there was a reorganisation of material compatibilities under new constraints. The real/concrete distinction, already established, replaces the virtual/actual distinction without losing what was productive in it: the thesis that the transition is not the copy of a prior model.
Metastability — the state where forces are in provisional balance even though any disturbance triggers reconfiguration — is the constitutive condition of reality. There is no absolute stability; there are only situations where differences have been channeled into repeated cycles.
The Aristotelian tradition thought of stability as a natural state and change as disturbance. Metastability reverses this: the normal state of matter is not rest, but unresolved tension. True balance is a rare limiting case where the excess has been entirely channeled. In the early universe, each symmetry regime was metastable — it appeared stable, but contained structural potential whose permanence was a physical impossibility under cosmic constraints. The false vacuum exemplifies it precisely: a configuration of local equilibrium that seemed stable, but whose decay was a structural inevitability — the inflationary field occupied a local minimum of energy, not the global minimum; the system could remain in this state for a finite period, but the transition to the lower energy state was a constraint that no local stability could indefinitely suppress. The decay of the false vacuum was neither an accident nor an intervention: it was a consequence of the excess contained in the metastable regime exceeding the capacity of that regime to contain it indefinitely. The quark-gluon plasma was a stable configuration only as long as sufficient energy suppressed confinement; below that threshold, reconfiguration into nucleonic structures was an inevitable material constraint. The same occurred in recombination: when the radiation density dropped below a certain threshold, atomic bonding became an event whose impossibility of postponement was structural. Consequence: no form is definitive. Every observed balance is provisional — it remains as long as constraints allow, it changes when it is reconfigured. The question "why does it change?" is inverted into "why does the appearance of stability persist?" The answer: stability is a relational effect that emerges when multiple constraints align in a zone where variation is dampened.
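The logic of the false vacuum — a configuration that is locally stable yet sits above a lower-lying global state — can be made concrete with a toy potential. The sketch below is illustrative only: the polynomial V(φ) is invented for the example and is not the inflationary potential; it simply exhibits the structure the paragraph describes, a local minimum (stable against small perturbations) above a global one (toward which decay is structurally available).

```python
# Toy tilted double-well: V(phi) = phi^4 - 2*phi^2 + 0.3*phi has a local
# minimum (a "false vacuum") sitting above a global minimum (the "true vacuum").
# This potential is a made-up illustration, not a physical model.

def V(phi):
    return phi**4 - 2.0 * phi**2 + 0.3 * phi

def local_minima(lo=-2.0, hi=2.0, n=4000):
    """Scan a grid and return the phi values where V dips below both neighbours."""
    step = (hi - lo) / n
    pts = [lo + i * step for i in range(n + 1)]
    return [p for a, p, b in zip(pts, pts[1:], pts[2:])
            if V(p) < V(a) and V(p) < V(b)]

minima = sorted(local_minima(), key=V)
true_vac, false_vac = minima[0], minima[-1]

# The false vacuum is a genuine minimum (locally stable), yet a lower-energy
# configuration exists: its persistence is provisional, not definitive.
assert V(false_vac) > V(true_vac)
print(f"false vacuum: phi ≈ {false_vac:+.2f}, V ≈ {V(false_vac):.2f}")
print(f"true  vacuum: phi ≈ {true_vac:+.2f}, V ≈ {V(true_vac):.2f}")
```

The point of the sketch is structural: nothing about the local minimum signals, from within, that it is provisional — the lower state is only visible from the shape of the whole potential.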
Simondon articulated this through the pre-individual — that which precedes individuation, the condition prior to the emergence of stable identities. The pre-individual is more-than-one: heterogeneous compatibilities that cannot be simultaneously realised in a single form, because multiplicity exceeds any unity. However, the pre-individual is not a separate real field with its own ontological structure — a plane that exists prior to any manifestation. It is a retroactive designation — what thought names when, from a crystallised form, it recognises "there were compatibilities that preceded this." These compatibilities do not exist on a separate plane; they subsist in the material operation, in the friction between differences that the form contains and cannot exhaust. The pre-individual is the name for the permanent condition in which every form contains reorganisation potentials that it cannot simultaneously stabilise. Excess is not a property that individuation resolves — it is a condition that persists through all individuations, because no particular form exhausts the field of material compatibilities from which it emerges.
Tradition mobilises possibility (Aristotle: dynamis, the realisation of an essence) and virtuality (Deleuze: an autonomous genetic field, surplus to any actualisation) to think what is not and could be. The argument rejects both. Possibility projects onto the real a purpose that it does not contain; virtuality projects a separate plane of pure potential that is ontologically unnecessary. In their place: material compatibilities — structural conditions that, under specific constraints, allow certain configurations and prohibit others. Compatibilities do not have a structure separate from the real (they are not a virtual plane), and they do not pre-figure a form to be realised (they are not Aristotelian potency). They are properties of the field of relations in a specific material regime. The fact that protons and neutrons bind into nuclei under specific energetic constraints is a material compatibility — it exists where the matter that supports it exists; it does not reside on a transcendent plane, and it does not await manifestation. It is an operation that occurs or not depending on whether constraints align. Compatibility is immanent to the material real; possibility and virtuality project structures that the real does not contain.
In the primordial universe, in a regime of maximum symmetry where the notion of form still had no meaning, there was within this indeterminate regime a vastness of conformities contained in the structure of the quantum vacuum — plenitude of energy, continuous vibration of fields, tension without rest. This is not chaos; it is potential-laden metastability. And it is precisely because the early universe was metastable that each phase transition precipitated new forms. Not because nature had a plan for them — the forms that emerged depended entirely on parameters that could have been different — but because at each moment the compatibilities of matter exceeded the previous form, forcing reorganisations where a configuration of maximum symmetry was converted into a configuration of broken symmetry.
1.3 Cosmology against creatio ex nihilo
Contemporary cosmological models offer descriptions of how the early universe behaved under extreme conditions. They are theories — symbolic constructions that thought creates to organise observable data into an intelligible pattern. This must be stated with absolute precision, because the confusion here is systematic: no cosmological model is a direct window to reality itself. They are symbolic organisations of structures that operated. The effectiveness of the model reveals that there is structure in reality that the description can capture; however, capturing something true is not equivalent to being transparent — it is being a precise symbolic representation, a map that corresponds to the territory without being the territory.
The classical extrapolation of general relativity leads to a singularity — a regime where density, temperature and curvature tend towards infinity. This result marks precisely the border where the classical description ceases to suffice, not a directly accessible physical state. Known laws break down at this limit. The very notion of time presupposes the space-time structure that only becomes describable beyond this boundary. What happened before this border is therefore indeterminate not through accidental ignorance, but through the structure of known physics.
Hartle and Hawking proposed, within the framework of Euclidean quantum gravity, a description where imaginary time replaces real time in the initial state of the universe. At the Planck scale, close to the threshold, the geometry of space-time becomes so curved, so densely concentrated, that one can no longer speak of "before" and "after": time stops being a dimension in which things evolve and becomes a dimension analogous to space. In Hawking's analogy, the surface of a sphere has no point of origin. Whoever walks along the surface towards the North Pole never finds a point where the sphere begins — the North Pole is the northernmost point, not a beginning; it is just the limit where the very notion of "north" ceases to make sense. Finite but unbounded: the question "what was there before?" dissolves, theoretically and formally, just as "what is north of the North Pole?" does. This does not mean that the real is "boundaryless" — it means that the available model does not require a temporal boundary. The statement is about the theory, not about the real before capture.
Vilenkin reformulated the question by proposing that the universe emerged by quantum tunneling from a state of zero geometry. This sounds as if there were a prior nothingness from which the universe emerged — precisely what creatio ex nihilo designated. It is, however, a misreading. What Vilenkin proposes is that quantum laws exist logically prior to the physical existence of the universe, and Vilenkin's "nothing" — a quantum state of zero geometry — is not the nothingness of the philosophical tradition: it is a physical state describable by a wave function and governed by quantum laws. If laws apply, it is not nothing: absolute nothingness would not be governed by any law. Tunneling is a process described by theory, not a revelation of what the real does outside all description. The operational "nothing" is an imaginary extrapolation backwards from the formalism — the point where the energy of the universe would be zero, where there would be no matter and no motion. No observer occupied this state; no truth about it is possible outside the mathematical language that designates it.
The Casimir effect offers a measurable illustration that the vacuum is never absence. Two metal plates, positioned very close to each other in a vacuum, experience mutual attraction. The reason is that the quantum vacuum is not empty: between the plates, only a subset of the zero-point energy fluctuations can exist; outside, the full spectrum of fluctuations occurs. There is therefore an asymmetry of pressure that pushes the plates towards each other. This is measurable, and it has been confirmed experimentally. The vacuum is never an absence of energy — it is always excess potential, always continual reorganisation of material differences, always ceaseless redistribution of intensities. Calling it "vacuum" is a linguistic error that carries the connotation of emptiness; it would be more accurate to call it zero-point energy.
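The idealised parallel-plate result can be checked numerically. The sketch below uses the standard formula P = π²ħc / (240 a⁴) for perfectly conducting plates; the separations chosen are illustrative, and real experiments must correct for finite conductivity, roughness and temperature.

```python
import math

# Standard Casimir result for two ideal parallel plates separated by a:
#   |P| = pi^2 * hbar * c / (240 * a^4)
# (the sign is attractive: the plates are pushed together).

HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s (CODATA)
C = 2.997_924_58e8        # speed of light, m/s (exact)

def casimir_pressure(a_m):
    """Magnitude (Pa) of the attractive pressure between ideal plates a_m metres apart."""
    return math.pi**2 * HBAR * C / (240.0 * a_m**4)

# The a^-4 dependence is why the effect only becomes measurable at sub-micron
# gaps: halving the separation multiplies the attraction by sixteen.
p_100nm = casimir_pressure(100e-9)
p_10nm = casimir_pressure(10e-9)
print(f"gap 100 nm: {p_100nm:.1f} Pa")
print(f"gap  10 nm: {p_10nm:.2e} Pa (roughly one atmosphere)")
```

At a 10 nm gap the "empty" vacuum presses the plates together with a pressure on the order of atmospheric pressure — a concrete measure of how far the vacuum is from absence.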
What cosmology therefore exposes is that the entire question of nothingness — the entire genealogy running from creatio ex nihilo to Heidegger and Bergson — suffered from a category error: it confused the limit of legibility with the limit of reality. The Big Bang singularity is not an event in the universe; it is the point where the description we have stops working — where the formalisms stop converging on stable values. The singularity is a property of our language, not a property of reality. The real prior to that border remains inaccessible in itself. It is not nothing — it is that which is not describable in the terms we have. It is not empty — it is excess that does not find an adequate regime of capture.
1.4 The primordial asymmetry: excess that differentiates
What can be said in the regime of description that begins just beyond that limit is a radically contingent reorganisation of excess. It is not the gradual determination of form from formless matter; it is the differential channeling of intensities. The primordial universe, in the first infinitesimal fraction of a second after that threshold, is a plasma of very high symmetry — theory predicts that at sufficiently high energies the electromagnetic and weak interactions merge into a single electroweak interaction, and grand unification models (not yet experimentally confirmed) postulate that the strong force also converged with the electroweak. Matter and antimatter exist in practically identical proportions. Symmetry is maximal in the sense that few differentiations have stabilised — and it is precisely because few differentiations have stabilised that the excess is maximal: there is no channeling, there is no partial stabilisation, compatibilities coexist in a regime of indistinction.
The asymmetry between matter and antimatter — baryogenesis — exemplifies the contingency of this reorganisation. Sakharov identified three necessary conditions for such an asymmetry to emerge: processes that violate the conservation of baryonic number, violation of the C and CP symmetries, and interactions outside of thermal equilibrium. These conditions are theoretically possible at sufficiently high energies, although the concrete mechanism remains an open problem in particle physics. The observed result is an infinitesimal asymmetry — for every billion quark-antiquark pairs that annihilate each other, one particle of matter survives. It is in this asymmetry that all matter in the present universe resides. The exact mechanism remains undetermined; the laws govern which asymmetries are possible, and which asymmetry is realised is a matter of differential reorganisation of excess in an environment of permanent metastability.
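The "one particle in a billion" figure supports a simple back-of-envelope check. The sketch below is illustrative arithmetic only — the two-photons-per-annihilation count is a simplification, and the observed baryon-to-photon ratio (η ≈ 6×10⁻¹⁰) is a measured quantity, not derived here — but it shows that a one-part-in-10⁹ surplus of quarks yields a surviving-baryon-per-photon ratio of the observed order of magnitude.

```python
# Back-of-envelope: one extra quark per billion quark-antiquark pairs.
antiquarks = 1_000_000_000
quarks = antiquarks + 1            # the infinitesimal primordial asymmetry

annihilated_pairs = min(quarks, antiquarks)
survivors = quarks - antiquarks    # matter left over after annihilation
photons = 2 * annihilated_pairs    # simplification: ~two photons per annihilation

# Surviving baryons per photon: same order of magnitude as the measured
# baryon-to-photon ratio (eta ~ 6e-10).
eta_toy = survivors / photons
print(f"survivors per photon: {eta_toy:.1e}")
```

The arithmetic makes the paragraph's point tangible: everything presently material is the residue of an asymmetry too small to register at the level of the pairs themselves.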
The primordial nucleosynthesis that follows — the period between one second and three minutes after the start of the expansion, when energies are sufficient for protons and neutrons to combine into helium and lithium nuclei, but insufficient to break these nuclei — is local stabilisation of the excess. The excess is channeled into a specific structure: the strong bond that holds the nucleons together. If temperatures were slightly higher, everything would dissolve again. If they were slightly lower, all matter would have frozen into different shapes. What sets nucleosynthesis on a trajectory is that the expansion of the universe cools matter at a very precise rate — a rate that was contingent, not determined by anything previous, only accomplished because the geometry of the early universe was such that density and temperature fell in specific ways. Each step is local stabilisation of excess, not void filling.
It is now important to clarify a distinction that the previous argument presupposes without yet having formulated it: the difference between temporal priority and operational priority. To say "excess precedes form" is not to say that there was a time when excess existed and form did not yet exist — as if excess were an earlier state on the timeline and form a later one. There is no time "before form" in the chronological sense: as soon as the physical description has scope, there is already a material configuration, there are already constraints, there is already a regime of operation. The priority is operational: excess is the condition from which form precipitates, not a state that existed before form in a temporal sequence. The most precise analogy is geometric, not chronological: just as the topography of a terrain is a condition of the trajectory of the water that flows across it — without the topography "existing before" the water in a temporal sense — so the excess of compatibilities is a condition of the form that stabilises within it. Form is the relational effect of excess, not its temporal successor.
This distinction dissolves a recurring confusion in thinking about origin. The question "what was there before the beginning?" assumes temporal priority — a time axis where states follow one another and where "before" designates the previous position in the sequence. If excess is operationally prior to form, the question dissolves: there was no "before" in the temporal sense, because time is itself the product of material relations that only exist when there is configuration — when there is already form, already constraint, already an operative regime. Excess is not "before" time; it is at the basis of every moment, as a permanent condition of every configuration. Each form that stabilises coexists with excess that it has not absorbed — compatibilities that remain unactualised, tensions that the current form does not resolve, differences that press for reorganisation. The operational priority is simultaneous, not sequential: the excess is present in each instant as a structure of the real, not as the memory of a past state. When the inflationary vacuum collapses, there is no passage from "pure excess" to "form" — there is passage from one metastable configuration (which contains excess relative to the compatibilities that will open up) to another configuration (which contains its own excess, different but equally irreducible). Excess never runs out; the priority never ceases. It is this permanence that prevents any configuration from being thought of as a final state, as a definitive balance, as a form that has absorbed all available excess.
If the beginning is excess and not lack, the next question arises: how to think about the transition from one regime to another? Not as cutting — temporal separation — but as folding — material reorganisation.
Axis 2 — What is born does not begin: fold vs. cut
2.1 Operative continuity: nothing comes from nothing
The scholastic principle ex nihilo nihil fit — nothing arises from nothing — is not a metaphysical truth that the universe would obey as a cosmic moral law. It is an operational constraint. Matter does not contain true emptiness. Where there was energy, reconfiguration occurs. Where there was a field, what the description does not capture persists — it does not disappear. Leibniz formulated this as a principle of continuity — natura non facit saltus, nature does not make leaps — and quantum physics nuances it without refuting it: there are quantum discontinuities (jumps between energy levels, quantization), but these are discrete transitions between available states, not emergences from nothing. Quantization refutes the mathematical continuity of certain quantities, not the ontological continuity — there is no emergence of being from non-being.
The primordial universe offers constant examples of this continuity. When the plasma of the first moments cooled below a certain temperature, protons and electrons were not "created out of nothing": they already existed as elementary particles in a state of maximum agitation. Cooling — a mere change in temperature — meant that photons no longer carried enough energy to keep matter ionised. Electrons approached protons. Atoms formed. Nothing came out of nowhere. A previous incompatibility — between free electrons and the falling ambient temperature — was resolved through reorganisation.
Primordial nucleosynthesis operated within this absolute constraint — conservation and contingency on two levels. In the first minutes of the universe's expansion, temperature and density were so extreme that isolated nuclei could not exist — energetic collisions at relativistic speeds instantly disrupted any aggregation before it could stabilise. As the universe cooled, the kinetic energy of particles progressively decreased. At a critical point — a few minutes after the Big Bang — the average energy of collisions dropped enough that protons and neutrons were able to bind as deuterons, bound proton–neutron pairs, without being immediately disrupted. These deuterons were, in turn, able to capture other nucleons, forming helium-3 and, eventually, helium-4 — stable nuclei with two protons and two neutrons. Lithium-7 was formed in minimal proportions because the Coulomb barrier for incorporating a third proton was high. However, a phenomenon occurred simultaneously: as quickly as cooling allowed nuclei to form, the density of the universe decreased — the universe expanded, material spread out. In a more diluted medium, collisions between particles became rarer. At a second critical point — about twenty minutes after the initial event — the density was so low that collisions virtually ceased. Nucleosynthesis stopped abruptly. The proportion we observe today has been frozen: seventy-five percent hydrogen, twenty-five percent helium by mass, minimum proportions of lithium-7 and beryllium-7.
None of these proportions were created out of thin air. Each is a consequence of a precise operative window: a period where temperature and density permitted a specific nuclear reaction, during which that reaction occurred with a frequency determined by the rate of collisions and the cross section of the reaction, after which conditions changed and the reaction ceased. The constraint is on two separate and equally fundamental levels. First level: absolute conservation. No particles are destroyed — what happens is that protons and neutrons, which existed freely in the primordial plasma, gradually bind together into compact nuclear configurations. The total number of protons and neutrons remains practically constant at nucleosynthesis temperatures — conservation of baryonic number is respected at these energies, although it was violated at much higher energies during baryogenesis. The number of electrons remains constant — lepton conservation. The total electrical charge remains zero. The total energy remains constant — what changes radically is only the way in which the energy is distributed: the kinetic energy of the free particles is transformed into the binding energy of the nucleus. Nothing is created. Nothing disappears. There is only continuous redistribution of matter and energy according to physical constraints that regulate how fields and particles interact. Second level: irreducible material contingency. The specific proportion that primordial nucleosynthesis produces was neither absolutely inevitable nor determined by the logic of laws. It was contingent on specific parameters of the early universe that are not themselves constrained — the expansion rate, the initial density, the ratio of baryons to photons. If any of these parameters had been slightly different, nucleosynthesis would indeed have occurred, but it would have produced completely different proportions of elements.
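The first level, conservation as pure redistribution, can be checked directly. A sketch with standard reference masses (the numerical values are assumptions taken from nuclear data tables, not from the text):

```python
# Hedged sketch of the first-level constraint. Masses in MeV/c^2 are
# standard nuclear-data values (assumptions external to the text);
# they are bare-nucleus masses, not atomic masses.

M_PROTON = 938.272    # MeV/c^2
M_NEUTRON = 939.565   # MeV/c^2
M_HELIUM4 = 3727.379  # MeV/c^2, bare helium-4 nucleus

before = {"baryons": 4, "charge": 2, "mass_energy": 2 * M_PROTON + 2 * M_NEUTRON}
after = {"baryons": 4, "charge": 2, "mass_energy": M_HELIUM4}

# The "missing" mass-energy is not destroyed: it is carried away as
# radiation and kinetic energy -- redistribution, not disappearance.
binding_energy = before["mass_energy"] - after["mass_energy"]

assert before["baryons"] == after["baryons"]   # baryon number conserved
assert before["charge"] == after["charge"]     # charge conserved
print(f"energy released on binding: {binding_energy:.3f} MeV")
```

The ledgers balance on every conserved quantity; only the distribution of energy changes.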
A universe with a faster expansion rate would have produced more helium, not less: the neutron-to-proton ratio would have frozen out earlier, at a higher value, leaving more neutrons available to bind before decaying; a slower expansion would have given free neutrons more time to decay, yielding less helium. A universe with a baryon/photon ratio ten times higher would have allowed fusion to begin slightly earlier in a denser medium, producing somewhat more helium and burning away far more of the intermediate deuterium.
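The operative-window mechanism itself, a reaction that runs while its rate exceeds the expansion rate and freezes when it no longer does, can be sketched as a toy model. All parameters here are illustrative assumptions; this is not a model of real nucleosynthesis, whose abundance trends also involve the neutron-to-proton freeze-out:

```python
# Toy operative-window model (all numbers illustrative): a two-body
# reaction converts free particles into bound ones while its rate
# exceeds the expansion rate; expansion dilutes the reactants until
# the window closes and the bound fraction freezes.

def frozen_bound_fraction(expansion_rate, steps=2000, dt=1e-3):
    n_free, n_bound = 1.0, 0.0   # densities in arbitrary units
    sigma_v = 10.0               # cross section x velocity (illustrative)
    for _ in range(steps):
        dilution = (1 + expansion_rate * dt) ** -3   # volume grows as a^3
        n_free *= dilution
        n_bound *= dilution
        rate = n_free * sigma_v                      # two-body reaction rate
        if rate > expansion_rate:                    # window still open
            dn = min(n_free, rate * n_free * dt)
            n_free -= dn
            n_bound += dn
    return n_bound / (n_bound + n_free)

slow = frozen_bound_fraction(expansion_rate=0.5)
fast = frozen_bound_fraction(expansion_rate=5.0)
print(f"bound fraction, slow expansion: {slow:.3f}")
print(f"bound fraction, fast expansion: {fast:.3f}")
```

In this toy, faster expansion closes the window sooner and freezes a smaller converted fraction; the point is the mechanism of freezing, not the sign of any real abundance trend.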
This pattern of operative continuity is repeated throughout cosmology. When recombination occurred — about three hundred and eighty thousand years after the Big Bang — electrons bonded with nuclei in a process that is, structurally, the opposite of initial ionization. There was no "creation" of atomicity out of nothing. Atomicity as the ability of nuclei to carry electrons in a stable bound state already existed as material compatibility — however, compatibility does not mean a transcendent plane awaiting manifestation. It simply means that matter, given certain conditions, behaves in specific ways. What was missing was an operative condition: a temperature low enough for the electromagnetic force to be able to keep electrons in a stable bound state without energetic collisions continually ejecting them. In ionized plasma, each electron and each proton were free particles — separated by thermal energy that prevented binding. As the universe cooled — a gradual process where the average temperature of space continually dropped — regions began to undergo recombination. It did not occur simultaneously: in colder regions, recombination occurred earlier; in warmer regions, later. Recombination is not the birth of a new form that appears out of nowhere. It is a transformation of a regime of compatibilities: a transition from a regime where protons and electrons were forced into continuous separation by thermal energy to a regime where electrons and protons were able to remain connected in a stable bound state. The universe did not "start being atomic" in an instant — it gradually folded from ionized plasma into neutral gas under continually changing thermal constraints.
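That recombination is gradual rather than instantaneous follows from the Saha equation. A minimal numerical sketch, assuming standard SI constants and a baryon-to-photon ratio of about 6×10⁻¹⁰ (both assumptions external to the text):

```python
import math

# Hedged sketch: the Saha equation gives the ionized fraction of hydrogen
# as a function of temperature. Constants are standard SI values; eta
# (baryon-to-photon ratio) and today's photon density are assumptions
# taken from standard cosmology.

K_B = 1.381e-23        # Boltzmann constant, J/K
H_PLANCK = 6.626e-34   # Planck constant, J s
M_E = 9.109e-31        # electron mass, kg
E_ION = 13.6 * 1.602e-19        # hydrogen ionization energy, J
ETA = 6.1e-10                   # baryon-to-photon ratio (assumed)
N_GAMMA_0, T_0 = 4.11e8, 2.725  # photons/m^3 today, CMB temperature (K)

def ionized_fraction(T):
    """Solve x^2 / (1 - x) = S(T) from the Saha equation."""
    n_baryon = ETA * N_GAMMA_0 * (T / T_0) ** 3
    prefactor = (2 * math.pi * M_E * K_B * T / H_PLANCK**2) ** 1.5
    S = prefactor * math.exp(-E_ION / (K_B * T)) / n_baryon
    return (-S + math.sqrt(S * S + 4 * S)) / 2

for T in (4500, 4000, 3500, 3000):
    print(f"T = {T} K  ->  ionized fraction x = {ionized_fraction(T):.4f}")
```

The fall from nearly full ionization to nearly none spans hundreds of kelvin: a statistical population crossing a threshold gradually, not a membrane.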
2.2 Folding as material reorganisation
The concept of the fold replaces the concept of the temporal cut. Folding is an operation that does not require cutting — the idea that at a certain moment one form ceases absolutely and another form begins from nothing. The fold is, on the contrary, inflection, a continuous curvature where what was in the previous regime persists but is transformed into a new regime.
The fold that matters is not the Deleuzian one — it is not an operation that mediates between a virtual plane (pure potential, non-actualised multiplicity) and an actual plane (precipitated form). This fold presupposes two separate ontological planes, and the passage from one to the other as genuine creation. For Deleuze, the virtual is an ontologically distinct genetic field that is "actualised" in forms; the Deleuzian fold (Le Pli) operates between virtual and actual. The fold that operates here is entirely internal to the real — there is no separate virtual plane, no transcendent region of potential. The virtual/actual distinction is replaced by the already founded real/concrete distinction: there are material (real) differences that reorganise themselves through excess, and there is the inscriptive (concrete) regime that then cuts these reorganisations into phases and narratives. The gain of the Deleuzian fold is retained — think of the transition without a cut, without a before/after separation. The ontological framework is different.
The philosophical tradition has another classic operator to think about the transition it preserves: the Hegelian Aufhebung. Aufhebung — simultaneously suppression, conservation and elevation — is the mechanism by which each moment of the dialectical process is negated, but retained in the next moment in transformed form. When immediate consciousness discovers that sensible certainty is insufficient, it does not simply abandon it — it suppresses it as ultimate truth and preserves it as a partial moment integrated into a broader truth. Aufhebung shares a formal structure with folding: transformation does not annihilate what precedes it; it reincorporates it under a different regime. In this strict sense, there is an affinity between what Hegel thinks of as sublation and what is here designated as folding — in both cases, what was persists but transformed. When the primordial plasma folds into neutral atomic gas, the matter that operated in the previous regime persists entirely in the next regime — protons and electrons are the same; what changes is their relational configuration.
However, the divergence is threefold and each point is decisive. First: the engine of Aufhebung is contradiction (Widerspruch) — every moment contains internal negation that forces it to surpass itself. The thesis generates its antithesis because it contains logical incompleteness, and the contradiction between the two impels the synthesis. The engine of the fold is not contradiction — it is excess. Matter does not reorganise itself because it contains internal negation; it reorganises itself because it contains more compatibilities than any form can simultaneously stabilise. Excess is not dialectical opposition — it is material overabundance. The form that precipitates is not a synthesis of opposites; it is partial channeling of differences that exceed any configuration. Second: Aufhebung is teleological — each stage is superior to the previous one because it approaches the Absolute, the full realisation of the Spirit that understands itself. Hegelian dialectics has direction: it goes from the abstract to the concrete, from the immediate to the mediated, from the partial to the total. The fold has no direction. No fold is "superior" to the previous one in any sense that implies progress, completeness, or approaching a final state. Nucleosynthesis is not "superior" to quark-gluon plasma; recombination is not "truer" than ionized plasma. They are contingent reorganisations where constraints have changed and different forms have become compatible. Third: Aufhebung requires a Subject — the Spirit (Geist) — that goes through the stages, which is that which denies itself and recovers, that which alienates itself and returns to itself. Without the Spirit as the subject of the process, Hegelian dialectics loses its engine and its intelligibility. The fold is strictly subjectless. No entity travels the cosmic folds; no Spirit recognises itself in the matter that reorganises itself.
Matter reorganises itself under material constraints; no interiority, no consciousness, no experience intervenes. The gain of Aufhebung — its attention to what transformation preserves — is retained; the metaphysical framework — contradiction as motor, teleology as direction, Spirit as subject — is entirely rejected.
This means that every material transformation is a fold: inflection of what already operated in a previous regime to a new regime, where compatibilities have changed but matter remains. When primordial nucleosynthesis occurs — when protons and neutrons bind into nuclei — matter does not undergo creation. It undergoes folding. The nuclear matter that emerges is a reconfiguration of matter that existed as free protons and neutrons. The nucleus is new in the sense that it has properties that isolated nucleons did not have — stability under cooling, the ability to capture electrons in bound states — but it is not new in the sense that it was created out of nothing. It is matter that folds in on itself, that finds new compatibility, that stabilises in a regime that was previously impossible. There is no time cut; there is no point where the non-nucleon instantly transforms into a nucleon. There is transition: as the temperature drops, the probability of nuclear collisions that result in bonding increases; as density decreases, the time between collisions increases, allowing bonds to form and consolidate.
Recombination offers another paradigmatic example. Electrons are not "created" to bind to nuclei when the temperature drops below the ionization threshold. Electrons already existed. What changes is that compatibility between electron and nucleus — which was impossible in the previous temperature regime — becomes possible and then necessary. The electron finds a stable bound state. Matter folds: what was ionized plasma becomes atomic matter. No matter was destroyed. None was created. There was reconfiguration — folding. What the naming will later describe as "the era of chemistry" — the period where atoms exist as stable entities capable of forming molecular bonds — is a symbolic back-projection onto this fold. Recombination is the material process; "the beginning of chemistry" is the retroactive name given to it by the inscription. In reality, there was no beginning — there was continual reorganisation of compatibilities under changing temperature constraints. The cosmic background radiation — the radiative conformity that recombination precipitated — is the material difference that has persisted since this fold. The cosmic microwave background radiation was not released in a single instant — it was emitted over a finite-width temperature interval, during which the fraction of free electrons gradually decreased. Each photon was released when, at its local point, an electron bound to a nucleus — an individual, contingent event, embedded in a statistical population of similar events that spanned thousands of years. The "last scattering surface" that observational cosmology defines is an idealization: a statistical surface with finite thickness, not an instantaneous membrane. These photons have since traveled throughout the expanding universe, the material conformity of a reorganisation that did not have a zero point.
Star formation exemplifies the same operation on a radically different scale. A molecular cloud is a metastable system — thermal pressure against gravity, precarious balance. When the mass exceeds the Jeans threshold — which depends on temperature, density and chemical composition — density perturbations grow exponentially under the action of gravity, and the region enters gravitational collapse. The previous compatibility ceases; a new compatibility emerges when the core temperature reaches about ten million Kelvin and nuclear fusion begins. Radiation pressure balances gravity — a star. No matter was created. Old compatibilities have ceased; new ones emerged. The fold is the same: continuous reorganisation without external agent, without creation, without cutting. The matter that collapses — primordial hydrogen, helium — continues to exist, continues to be governed by the same laws. What changes is the scale of cohesion and the configuration of how quantities are distributed.
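The Jeans threshold invoked above can be computed directly from temperature, density and composition; the cloud parameters below are illustrative assumptions for a cold molecular core, not data:

```python
import math

# Hedged sketch of the Jeans criterion: a cloud collapses when its mass
# exceeds M_J, which depends only on temperature, density and composition.
# Constants are standard SI values; the cloud parameters are illustrative.

G = 6.674e-11       # gravitational constant, SI
K_B = 1.381e-23     # Boltzmann constant, J/K
M_H = 1.673e-27     # hydrogen atom mass, kg
M_SUN = 1.989e30    # solar mass, kg

def jeans_mass(T, n, mu=2.33):
    """Jeans mass (kg) for temperature T (K), number density n (m^-3),
    mean molecular weight mu (2.33 for molecular gas with helium)."""
    rho = n * mu * M_H
    return (5 * K_B * T / (G * mu * M_H)) ** 1.5 * (3 / (4 * math.pi * rho)) ** 0.5

m_j = jeans_mass(T=10, n=1e10)   # cold, dense molecular core (illustrative)
print(f"Jeans mass ~ {m_j / M_SUN:.1f} solar masses")

# Warmer gas resists collapse: the threshold scales as T^(3/2)
assert jeans_mass(T=100, n=1e10) > 10 * m_j
```

The threshold is a relation among material conditions, not a property of any single particle: change the temperature and the same cloud ceases to be collapsible.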
There is an epistemological point that deserves precise clarification. The folding described here is not an observable process in the ordinary sense. The time scale is extreme: primordial cooling temperatures decrease exponentially, nuclear events occur on millisecond scales. For an observer hypothetically positioned in the primordial universe, what would be noticeable would be an abrupt change in the distribution of matter. This is an observational scale effect, not a denial of material continuity. At the level of the dynamics of individual particles, each collision is a specific event; at the collective level of countless collisions occurring simultaneously, the statistical distribution of this population of events produces an abrupt manifestation of a new material regime — phase transition. The folding is not continuous at the observational level — the manifestation is discrete; it is continuous at the material operative level: there is no zero point where matter ceases to exist. The form that emerges in a fold is not an essence that manifests itself — it is a contingent configuration, dependent on the constraints that allow it. An atomic nucleus exists when energetic constraints create compatibility between linked nucleons; when these constraints change, the core disintegrates. Form ceases without any "essence" surviving.
Refusing the cut has an ontological cost that is important to explain. If there is no true beginning, no irruption that absolutely separates the after from the before, then no form is genuinely new in the strong metaphysical sense — in the sense of creation ex nihilo, of something that had no material precedence at all. The novelty, under this regime, is always reconfiguration, reorganisation of pre-existing differences, not the appearance of an entirely unrelated configuration. This means that every form inherits from a previous form — not as an inheritance in the biotic sense, but as an irremovable material constraint: the configuration that precipitates a specific regime depends entirely on the material conditions that precede it. The hydrogen that exists in the universe today is the same hydrogen that was formed in primordial nucleosynthesis — not descendant, not generational transformation, but literally identical matter in conformation that persisted without altering its nuclear identity. Whitehead characterized the confusion between symbolic categories and real operations as the "fallacy of misplaced concreteness": it is the error of imposing on the real the structure of the symbol that designates it. The cut is misplaced concreteness par excellence: it treats an operation of legibility — the discretization of the continuum, the imposition of a threshold that divides classes — as if it were a constitutive property of the real, as if the real actually contained cuts, ontological discontinuities, abrupt borders. When cosmology says "the era of recombination began at z≈1089" or "the deconfinement transition occurred at T≈2×10¹² K", it is translating into a discrete symbolic regime — named epochs, transition temperatures — what in reality is a continuous gradient, a smooth variation where temperature changes in a non-discrete way and recombination occurs gradually, electron by electron, over a range of temperature and density. 
The cut is a convention of the legibility regime, not a description of what materially happened. Consequence: refusing the cut is also refusing the ontology of absolute metaphysical novelty. No nuclear configuration is created out of thin air. Every configuration is a transformation of a previous configuration. This does not eliminate novelty — it only eliminates novelty in the sense of ontological rupture. Nucleosynthesis produces helium nuclei that did not exist before: this is operational, relational novelty — the configuration has properties of binding and nuclear cohesion that isolated nucleons do not have; it changes the field of subsequent compatibilities by introducing increased mass, a different cross section, a new regime of interactions. It is new on the relational plane, not on the metaphysical plane of having been "brought into being from nothing." The stability of the helium nucleus is genuine novelty — it is a form that persists under constraints that previously could not be satisfied. However, the properties that reside in it derive entirely from pre-existing material operations, from pre-existing constraints, from pre-existing symmetries. Nothing comes from nothing; everything emerges from the reconfiguration of what was already operative.
2.3 Transduction: propagation without absolute beginning
The fundamental operation that characterises all these transformations is transduction — Simondon's concept that designates the process by which a transformation at one point of a material induces successive transformations at other points, without an external agent or pre-established plan. Transduction is the operation of the material on itself, propagation of reconfiguration through pre-existing differences. Simondon insisted that transduction works neither by deduction (applying a general principle) nor by induction (extracting general law from particular cases). It works by an operation in which the transformation of one region affects the adjacent region, which affects the next, in a propagation pattern that has no central guide or prior plan.
Mineral crystallization offers the clearest picture of this operation in the prebiotic regime. When a saturated salt solution begins to cool, molecules cluster around a nucleation point. Once the first crystal cell forms — and this can occur by chance, just by statistical encounter of molecules — it acts as a constraint on the approaching molecules. Molecules that meet the surface of the growing crystal experience an attractive force that aligns them with the structure already present. There is no form instruction from outside; the form that grows is itself an alignment operator. The crystal grows because growing, given the already established structure, is the reconfiguration of minimum resistance for new molecules that meet the growth surface. The crystalline structure persists as it grows because the new molecules encounter the same structural differences — the same distance between atomic planes, the same electromagnetic bonding pattern — as the previous molecules.
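The local rule — attach where structure already is — can be sketched as a toy aggregation model. This is purely illustrative, not a physical crystal model; the grid, seed and arrival count are arbitrary assumptions:

```python
import random

# Toy sketch of nucleation-driven growth (illustrative, not a physical
# crystal model): once a seed cell exists, each arriving "molecule"
# follows only a local rule -- attach to the growth surface. No global
# plan; the form that grows is itself the alignment operator.

random.seed(42)            # fixed seed for reproducibility
SIZE = 21
crystal = {(SIZE // 2, SIZE // 2)}   # a chance nucleation event

def neighbours(cell):
    x, y = cell
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

for _ in range(100):       # 100 molecules arrive, one at a time
    # growth surface: empty sites adjacent to the existing structure
    surface = [c for cell in crystal for c in neighbours(cell)
               if c not in crystal and 0 <= c[0] < SIZE and 0 <= c[1] < SIZE]
    crystal.add(random.choice(surface))   # newcomer aligns with structure

print(f"crystal grew from 1 to {len(crystal)} cells")
```

Every attachment both obeys the existing structure and extends the surface that constrains the next attachment: the form instructs its own growth.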
Nuclear reactions in the stellar interior propagate by thermal transduction. When, in a star, the core reaches a high enough temperature for the fusion of hydrogen into helium to begin, this fusion releases energy. The energy released heats the surrounding material. The heated material becomes more prone to fusion. Fusion in this material releases more energy, which heats even more material, inducing more fusion. The reaction propagates through the structure of the star not because there is a directing agent, but because the heat of fusion at one point induces fusion conditions at another point. Thermal transduction continues until the star's internal structure rebalances under the new energy configuration — until the fuel runs out and the constraints change.
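The propagation-without-director pattern can be sketched as a toy one-dimensional "fuse"; all parameters are illustrative assumptions, not stellar physics:

```python
# Toy one-dimensional "fuse" (illustrative parameters): a cell that burns
# releases heat to its neighbours; a neighbour ignites when its own
# temperature crosses the threshold. The front propagates with no
# directing agent -- each ignition creates the conditions for the next.

N = 50
temp = [0.0] * N
burnt = [False] * N
temp[0] = 2.0                  # a local hot spot ignites the first cell
IGNITION, HEAT = 1.0, 1.5      # threshold and heat released per cell

for step in range(N):
    ready = [i for i in range(N) if not burnt[i] and temp[i] >= IGNITION]
    for i in ready:
        burnt[i] = True
        for j in (i - 1, i + 1):
            if 0 <= j < N:
                temp[j] += HEAT    # heat propagates the ignition condition

print(f"cells burnt after front passage: {sum(burnt)} / {N}")
```

The front crosses the whole array although no rule anywhere mentions the array as a whole: each step is local, the propagation is the aggregate.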
Cosmological transduction continues beyond primordial nucleosynthesis. When the universe recombines and turns from plasma to neutral gas, transduction becomes gravitational. Density fluctuations that had been amplified during inflation now gain importance: slightly denser regions attract more matter because gravity is stronger where there is more mass. Attraction in a dense region induces even greater attraction in neighboring material because the attracted material adds to the mass it attracts. Gravitational transduction propagates through the structure of the universe without a master plan, without intelligence, solely because the gravitational constraint causes difference to grow where difference already exists. The large-scale structure of the contemporary universe — filaments of galaxies, immense voids — is the result of gravitational transduction that has continued for more than thirteen billion years. The cosmic web that contemporary cartography reveals was not designed by an agent nor is it the result of a plan inscribed in the initial conditions. It results from transduction: each slightly denser region attracted more matter, which made it denser, which attracted even more matter, propagating structural differentiation without a centre and without an end. Cosmic voids — regions with much lower than average density of matter — are not absences. These are effects of the same transductive operation: where the density was slightly lower than average, the matter was attracted to denser regions, progressively emptying the regions of lower density. Void and filament are complementary faces of the same gravitational transduction. No point in the process is the "beginning" of the cosmic web; there is continuous propagation of structural differentiation.
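The rich-get-richer dynamic of gravitational transduction can be sketched with a deliberately crude growth rule. This is a toy, not a cosmological simulation; the initial contrasts and growth coefficient are arbitrary assumptions:

```python
# Toy rich-get-richer rule (not a cosmological simulation): each cell's
# contrast with the mean grows in proportion to itself, so structure
# amplifies where difference already exists. Total mass is conserved.

density = [1.0 + d for d in (0.01, -0.01, 0.02, -0.02, 0.005, -0.005)]
mean = sum(density) / len(density)   # equals 1.0 and stays 1.0 throughout
GROWTH = 0.05

for step in range(40):
    density = [rho + GROWTH * (rho - mean) for rho in density]

contrasts = [rho / mean - 1 for rho in density]
print("final contrasts:", [f"{c:+.3f}" for c in contrasts])
# Overdense cells have become "filaments", underdense ones "voids":
# complementary faces of the same rule.
```

No cell is the "beginning" of the structure; the final contrasts are the initial differences amplified, with nothing added and nothing removed.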
Stellar nucleosynthesis extends transduction beyond the early universe. When a first-generation star fuses hydrogen into helium in its core, the energy released sustains the pressure that prevents gravitational collapse — the transductive circuit where the product of fusion powers the conditions that allow further fusion. When hydrogen runs out, helium fuses into carbon, carbon into oxygen, and so on until iron, where fusion no longer releases energy. Each step is a fold within the star: reorganisation of material compatibilities under altered constraints. The star does not "evolve" into iron — it bends successively as each fuel is used up and constraints are reconfigured. The result — supernova, dispersion of heavy elements into the interstellar medium — is transduction that sows the conditions for new stellar generations, new transductions, new material configurations.
Whitehead, in a very different regime of thought, touched on what Simondon called transduction when he theorized inheritance. For Whitehead, each "occasion of experience" inherits from the immediate past: not because there is an agent who transmits it, but because the past is already incorporated in the conditions of possibility of the present occasion. The past does not disappear; it becomes a constraint on what comes next. Whitehead was right to insist that the past does not disappear, that there is continuity between occasions, that each present inherits from each past. However, the vocabulary of "experience", "prehension", "sensation" — even if extended to non-conscious occasions — projects an interiority onto the material that is not necessary to explain inheritance. Inheritance is a constraint: a material difference that persists and limits — present material conformities constrain future reorganisations. "Inheritance" is continuity of material constraints, not transmission of experience. "Final causation" as the "subjective ideal" of the occasion reintroduces finality that the material reality does not contain. Simondon formulates this more rigorously: transduction is the propagation of reconfiguration across differences, without the need for any subject to experience, without the need for any occasion to feel.
The disagreement with Simondon deserves precise reinforcement here. Transduction is the closest philosophical antecedent to folding as structuring propagation, and the gain of Simondon's formulation is immense: it allows us to think of a transformation that propagates without a centre, without a plan, without an agent. However, Simondon speaks of "pre-individual charge" — the potential energy stored in the tensions of the pre-individual field, which "forces" individuation when a structural germ triggers it. This "charge" is an energetic metaphor that functions while remaining anchored in specific material processes — the tension in the quantum vacuum, the temperature gradient in plasma, the difference in density in a molecular cloud. When generalizing to "charged pre-individual field" without material anchoring, a generic entity is introduced that functions as a metaphysical substrate: a reserve of potential that pre-exists all individuation and that has its own structure. The position here is different: there is no generic reserve of potential. There are specific material conditions — temperature, density, composition, constraints — that in specific regimes allow specific reorganisations. The "excess" that operates is not charge accumulated in a field; it is a relational property of each material regime: the fact that, in any configuration, there are more material compatibilities than the present form can stabilise. Simondon's "charge" becomes, in this translation, a structural property of the real — not a prior substance, but an immanent condition of any regime.
2.4 The cut as retroactive projection
The cut is a category of the inscription regime, not an operation of the real material. If the fold is a continuous transformation of the material regime, the cut is the projection of the inscriptive regime onto this continuity — a projection that is not present at the moment the transformation occurs, but is imposed later, when the observational regime seeks to fix what was continuous difference into discrete phases.
When the description says "recombination began" at a specific point — about three hundred and eighty thousand years after the Big Bang — what exactly does this "beginning" mean? It does not mean that material reality has undergone an ontological cut. It means that electrons have crossed a temperature threshold where binding to nuclei has become energetically favorable — not instantly, but progressively. It means that the statistical population of free electrons has progressively fallen below the critical level. It means that radiation stopped being continuously scattered by collisions with electrons and was able to travel freely — not because there was a cutoff, but because the frequency of collisions decreased exponentially as the density fell. In the real material regime, recombination is continuous transduction: electrons approach nuclei, bind, release radiation, induce the binding of more electrons. There is no point at which recombination "starts"; there is a continuous spectrum of energies, continuous probabilities of encounter, continuous propagation of reconfiguration. The "recombination temperature" — about 3000 Kelvin — is an idealization of a continuous transition with finite width: the process spans a temperature range, and the nominal value marks the point at which the ionization fraction has crossed a statistically defined threshold, not a cut in the real.
However, when contemporary cosmology measures background radiation, it discretizes this continuity and institutes a cut. It speaks of the "recombination epoch" as a discrete phase that followed the "plasma epoch". Each designation is a retroactive cut: after observing that the universe was opaque before a certain date and transparent after, the inscriptional regime cuts the continuity into phases. The cuts are not arbitrary — they respond to the structure of reality; recombination is effectively the transformation in which transparency emerges. However, the discretization into "phase before" and "phase after" is an imposition of legibility, not a property of the material regime.
When the inflationary field decayed into thermal radiation — when the accelerated exponential expansion decreased to normal expansion — the description also instituted a cut: it speaks of "the end of inflation" as if there were a zero point before which there was inflation and after which there was none. In reality, there is a continuous transition where expansion parameters continually change, where the field continually changes its state, where energies are continually redistributed. The ontological cost of this cut is that it establishes categories of "before" and "after" that do not exist in the material reality as properties of the material itself. The real material contains only continuous transformation, constraints that redefine themselves, differences that reorganise themselves. The "phases" are artifacts of the interpretative operation.
The retroactivity mobilized here is the same that has already been demonstrated as a general property of the institution of origin: it is not the conformities that are retroactive — it is the cut that organises them as "origin" that is retroactive. The objective is not to refound this operation, but rather to apply it with maximum accuracy to the case of the "beginning": the "beginning" is not a property of the real — it is a section that the description projects retroactively onto present material conformities. The H/He/Li ratio is material (real) difference; dating and narrative are symbolic operations, belonging to the plane of the concrete, of theory. The temporal cut — before/after nucleosynthesis — is a symbolic operation, not an ontological border.
However, the cut, although poorly situated ontologically, is an indispensable tool. Readability does not operate on continuous gradients — it operates on discrete distinctions, classifications, partitions. For knowledge to work, for analysis to be possible, it is necessary to discretize the continuum, install thresholds that allow categorization and comparison. A continuous temperature gradient is not analyzable as such; it only becomes intelligible when the inscriptional regime installs cuts — "above 3000 Kelvin, plasma; below, neutral gas" — that transform imperceptible variation into recognizable and distinct phases. Cutting is a symbolic operation whose usefulness lies precisely in allowing the continuum to become legible. This is not a fault of the language — it is a condition of the language. Recognising that the cut is a symbolic artefact is not to deny its productivity; it is to distinguish clearly the plane on which the cut operates — the concrete plane, legibility, theory — from the plane that the cut intends to describe — the plane of the real, the material operation, continuity. Science cannot function without cuts — without naming transitions, without identifying phases, without establishing operational thresholds that allow measurement and prediction. Conceptual sophistication consists of simultaneously maintaining two assertions: the cut is necessary for intelligibility, an operation without which knowledge would be impossible; and the cut is not a property of the real — it is an operation that the inscription regime imposes on the real to make it knowable. The classic scientific-philosophical error is to confuse these two assertions, to take the operative utility of the cut as proof of its ontological reality. The fold persists in the real; the cut persists in the inscription regime. Both are true on their respective planes: the error lies in exchanging them.
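The double status of the cut, indispensable for legibility yet absent from the material gradient, can be made concrete in a few lines. The smooth profile below is an illustrative stand-in, not the Saha physics:

```python
import math

# Minimal sketch of the legibility point: the underlying variable (an
# ionization gradient along cooling) is continuous; the "cut" at 3000 K
# is an operation imposed to obtain discrete, comparable phases.
# The sigmoid profile is an illustrative stand-in (assumption).

def ionized(T):
    """Smooth, continuous stand-in for the ionization gradient."""
    return 1 / (1 + math.exp(-(T - 3000) / 150))

temperatures = range(2400, 3700, 200)
continuous = [ionized(T) for T in temperatures]        # the gradient
labels = ["plasma" if T >= 3000 else "neutral gas"     # the symbolic cut
          for T in temperatures]

for T, x, label in zip(temperatures, continuous, labels):
    print(f"{T} K: x = {x:.3f} -> classified as {label}")
# The classification is binary; the gradient it cuts never is.
```

Both outputs are "true": the continuous column describes the gradient, the label column makes it comparable and nameable. The error would be to read the labels back into the gradient.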
Axis 3 — Contingency and openness
3.1 Dissolution of determinism
Newton established that, given the positions and velocities of all particles at an instant, the equations of motion univocally determine the entire future history of the system. This is Laplacian determinism: an intellect that knew the complete state of matter could calculate the future until the end of time. This position was not mere speculation — it was a direct consequence of Newton's equations, which are time-reversible. In this logic, the universe is a perfect automaton.
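The time-reversibility of Newtonian dynamics can be made concrete with a toy integration. The sketch below — assuming a harmonic oscillator with unit mass and stiffness, and using the velocity Verlet scheme, which is itself time-reversible — integrates forward, flips the velocity, integrates again, and recovers the initial state to round-off error. The law carries no arrow of time.

```python
# Minimal sketch of Laplacian determinism and time-reversibility:
# integrate a harmonic oscillator (a = -x) forward with velocity Verlet,
# flip the velocity, integrate the same number of steps, and recover
# the initial state.

def verlet_step(x, v, dt, accel):
    a0 = accel(x)
    x_new = x + v * dt + 0.5 * a0 * dt * dt
    v_new = v + 0.5 * (a0 + accel(x_new)) * dt
    return x_new, v_new

def integrate(x, v, dt, n, accel):
    for _ in range(n):
        x, v = verlet_step(x, v, dt, accel)
    return x, v

accel = lambda x: -x  # Hooke's law, unit mass and stiffness
x0, v0 = 1.0, 0.0
xf, vf = integrate(x0, v0, 0.001, 5000, accel)   # forward in time
xb, vb = integrate(xf, -vf, 0.001, 5000, accel)  # reverse: flip velocity
print(abs(xb - x0), abs(vb + v0))  # both differences are near zero
```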
Quantum mechanics changes this picture — but in a way that resists simplification. The Heisenberg uncertainty relation states that the position and momentum of a particle cannot be simultaneously determined with arbitrary precision. This limit is not technical — it is constitutive of quantum formalism. However, what this limit implies for determinism depends entirely on the interpretation adopted, and interpretations differ radically.
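In standard notation, the relation Heisenberg established can be written compactly, for position x and momentum p, with ħ the reduced Planck constant:

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

The bound holds for every quantum state; it is a theorem of the formalism, not a statement about measurement technology — which is why its interpretive weight, as the text notes, is disputed.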
In the Copenhagen interpretation, formulated by Bohr and Heisenberg, the Schrödinger equation governs the evolution of the system in a deterministic way — but only as long as no measurements interfere. At the time of measurement, the wave function collapses to one of the possible states, and this collapse is genuinely random: the probability of each outcome is determined by the wave function, but which outcome occurs is indeterminate. Bohm proposed, in 1952, an alternative: particles follow deterministic trajectories guided by a non-local quantum potential — Heisenberg's uncertainty is preserved as an epistemic limit, but determinism is restored at the cost of accepting non-locality. Everett suggested in 1957 that there is no collapse at all — the universal wave function evolves deterministically, and each measurement causes branching in which all possible outcomes occur on separate branches. The Ghirardi-Rimini-Weber (GRW) and Continuous Spontaneous Localization (CSL) models propose that collapse is a real physical process, not an epistemological one, that occurs spontaneously at a rate dependent on the size of the system.
These interpretations differ radically in their ontology — they differ about what is real, about what happens in measurement, about whether determinism is eliminated or restored in another form. However, they all converge on a decisive point: none restores classical local determinism. In Copenhagen, the collapse is random. In Bohm, there is non-locality. In Everett, there is universal branching. In GRW/CSL, there are spontaneous events not deduced from antecedent conditions. The experimental refutation of Bell's inequalities — confirmed by Aspect, Dalibard, and Roger in 1982 and reinforced by subsequent experiments — eliminates the possibility that local hidden variables restore classical determinism. Local determinism — the idea that the future is univocally fixed by the past through only local interactions — is refuted, regardless of the interpretation adopted.
It is important to note that quantum randomness is, at the very least, a property of the inscriptional regime — what we can describe through the equations of quantum mechanics. It is not possible to state that randomness is a property of the real itself, independently of any legibility regime. However, it is also not possible to say that it is merely ignorance, an incompleteness that future discoveries will correct. The issue remains suspended — and remains suspended constitutively, without any possible refinement dissolving it.
The refutation of classical determinism is not yet enough to think of origin as opening. Lorenz discovered in 1963 that nonlinear systems can exhibit an extraordinary property: small differences in initial conditions amplify exponentially over time. In his model of atmospheric convection, a change of one part in a million in the initial temperature produced, after a few iterations, completely different predictions. This is deterministic chaos — the equations are deterministic, but the initial uncertainty grows exponentially. Lorenz identified strange attractors: structures in phase space that confine long-term motion in fractally complex ways. The system is restricted to a finite region of phase space — it does not diverge indefinitely — but its exact trajectory is unpredictable in principle, because the dynamics amplifies any initial uncertainty. The consequence is that, even if the equations are deterministic, the precision with which the initial conditions would have to be known in order to predict the future grows exponentially with the prediction horizon.
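Lorenz's amplification can be watched directly. The sketch below integrates his convection model twice with a fixed-step fourth-order Runge-Kutta scheme, starting from initial conditions that differ by one part in a million; the parameter values are Lorenz's classic choices, and the step size and integration length are illustrative assumptions.

```python
# Two trajectories of the Lorenz system, initially one millionth apart,
# separate by many orders of magnitude after a short integration.

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(f, s, dt):
    k1 = f(s)
    k2 = f(tuple(si + 0.5 * dt * ki for si, ki in zip(s, k1)))
    k3 = f(tuple(si + 0.5 * dt * ki for si, ki in zip(s, k2)))
    k4 = f(tuple(si + dt * ki for si, ki in zip(s, k3)))
    return tuple(si + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for si, a, b, c, d in zip(s, k1, k2, k3, k4))

def distance(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

s1 = (1.0, 1.0, 1.0)
s2 = (1.000001, 1.0, 1.0)  # one-millionth perturbation
d0 = distance(s1, s2)
for _ in range(4000):      # 40 time units at dt = 0.01
    s1 = rk4_step(lorenz, s1, 0.01)
    s2 = rk4_step(lorenz, s2, 0.01)
print(distance(s1, s2) / d0)  # amplification: many orders of magnitude
```

The equations are fully deterministic; the unpredictability lies entirely in the exponential growth of the initial uncertainty.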
Prigogine's contribution was to show that there is something deeper: irreversibility is not just a practical property — it is a constitutive property of time in systems far from equilibrium. Prigogine studied systems where energy circulates continuously through the structure, and where energy dissipation is not a secondary effect — it is the condition that keeps the system far from equilibrium. When conditions are appropriate, the system reaches a bifurcation point, where multiple future trajectories are possible. Which trajectory is followed depends on fluctuations — on infinitesimal variations that are real and not eliminated by any measurement precision. History decides the future: transitions that occurred at a previous bifurcation point constrain — without determining — future bifurcation points. Irreversibility is thus a constitutive property: the past and the future are not symmetrical, not because the laws prohibit it, but because the history of transitions, bifurcations, dissipations that have accumulated has produced perpetual asymmetry. The nascent universe was maximally removed from equilibrium — therefore, maximally irreversible, maximally governed by bifurcations where structured contingency operated without restriction.
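The simplest mathematical caricature of the bifurcation behaviour just described is the pitchfork: beyond the bifurcation point (mu > 0), the system dx/dt = mu·x − x³ has two stable states at ±√mu, and which one is reached is decided entirely by the sign of an arbitrarily small fluctuation around the unstable state x = 0. This is a standard textbook model, offered here as an illustration, not as Prigogine's own formalism; the step size and fluctuation magnitude are arbitrary assumptions.

```python
# Pitchfork bifurcation: the law admits two futures; the fluctuation decides.

def settle(x0, mu=1.0, dt=0.01, steps=5000):
    """Integrate dx/dt = mu*x - x**3 from x0 with forward Euler."""
    x = x0
    for _ in range(steps):
        x += dt * (mu * x - x ** 3)
    return x

eps = 1e-9  # an infinitesimal fluctuation around the unstable state x = 0
print(settle(+eps))  # settles near +1
print(settle(-eps))  # settles near -1: same law, opposite realisation
```

The equation defines the space of possibilities (two stable states); it contains no operator that selects between them — exactly the structure the argument calls structured contingency.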
There is a tension here that requires rigorous clarification. The fundamental equations of physics — Newton's equations, Maxwell's equations, the Schrödinger equation, the Dirac equation — are all reversible in time. If the sign of t is reversed, equally valid solutions are obtained. Fundamental physics does not distinguish between past and future; the law remains valid in both directions. However, macroscopic experience is radically irreversible: heat flows from hot to cold, never spontaneously from cold to hot; a confined gas expands when released, it never spontaneously recompresses; complex systems degrade, they never spontaneously reorganise into structure. Boltzmann tried to resolve this tension through statistics: irreversibility would be the effect of the gigantic number of particles — the probability that all the molecules of a gas spontaneously concentrate in one corner of the room is so infinitesimally small that, in practice, it never occurs. Irreversibility would thus be an appearance: real at the macroscopic level, illusory at the microscopic level.
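Boltzmann's statistical argument admits a one-line quantification. For N independent molecules, the probability that all of them sit in the left half of a box is (1/2)^N; the sketch below computes the base-10 logarithm of that probability, up to a mole of gas.

```python
# Boltzmann's argument in numbers: (1/2)**N is effectively zero for
# macroscopic N. We work with log10 to avoid underflow.

from math import log10

def log10_prob_all_left(n_molecules):
    """log10 of the probability that every molecule is in one half."""
    return n_molecules * log10(0.5)

for n in (10, 100, 6.022e23):  # up to a mole of gas
    print(n, log10_prob_all_left(n))
# For a mole, the probability is of the order of 10**(-1.8e23):
# irreversibility as overwhelming improbability, not as a prohibition
# written into the laws themselves.
```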
Prigogine refuses this solution. Irreversibility is not an illusion of scale, not an artifact of aggregation. It is a constitutive property of systems with many degrees of freedom in a non-equilibrium regime. In a system far from equilibrium, fluctuations are not minor disturbances on a stable background; they are operators of transformation that determine the specific path of evolution. Once a bifurcation is crossed — once the system takes one of several possible trajectories — the conditions that preceded it are no longer accessible. Not because the return is merely unlikely, but because the field of constraints has changed. The energy has been dissipated through degrees of freedom that cannot be reversed. Information about the previous state has been irreversibly dispersed.
This connects directly with the real/concrete/theory tripartition. Equations (theory) are time-reversible because they are abstractions that describe formal relationships between variables — and formal relationships contain no temporal directionality. The material real is irreversible because it contains excess, because energy is continually dissipated through multiple degrees of freedom, because bifurcations, once crossed, close previously open paths. Irreversibility is not a property of the equations — it is a property of the material operation. The arrow of time does not reside in theory; it resides in the real. When the early universe underwent each phase transition — the breaking of GUT symmetry, the confinement of quarks, nucleosynthesis, recombination — each was irreversible not because the equations dictated it, but because the energy released was dispersed through degrees of freedom that make return materially inaccessible. The universe could not spontaneously rewind through these transitions.
Every cosmological phase transition is an irrevocable bifurcation. When electroweak symmetry breaks, electromagnetism and the weak nuclear force become distinct forces — and this separation opens up a field of electronic configurations that was previously materially inaccessible. The opening is real: electronic orbitals presuppose the electromagnetic force as a distinct entity. As long as the symmetry holds, these configurations are impossible. When it breaks, they become actualisable. Each bifurcation deposits conformities that radically alter the space of future bifurcations — old modes of reorganisation become inaccessible, new modes open up. Irreversibility is material, not narrative.
Heidegger, in his thinking about origin, distinguished three concepts that the tradition confused: Ursprung (true origin — original opening), Anfang (beginning — the point where something begins for consciousness), and Ursache (cause — sufficient reason). True origin is not a temporal beginning — not a moment before which there was nothing and after which there was something. It is not an efficient cause — not an agent that forces a sequence of necessary effects. True origin is the opening of the field where being is organised, where multiplicities operate, where the future is not predetermined. And Heidegger introduces Ereignis — the event, the opening through which something emerges in a way that cannot be deduced from antecedents. The gain of this formulation is immense: it allows origin to be thought as opening, not as logical derivation. The cost, however, is that Heidegger remains on the hermeneutic horizon — the opening is an opening of meaning, of world, of understanding. Ereignis is an appropriative event, not material reorganisation. The translation that is needed is this: the opening is not hermeneutic, it is material. It is not an opening of meaning — it is an opening of possible configurations, of material states not deduced from antecedents, of irreversible bifurcations that reorganise the space of possibilities.
3.2 Constitutive contingency
Actual material constraints do not determine any specific realisation. They only define the geometry of the space of possibilities — what shape this space has, where its boundaries lie, what range of variation is allowed. Within this geometry, countless realisations are possible. This is the invariant structure of the material real — not a defect of knowledge.
The cosmic inflation that shaped the geometry of the universe was not determined by laws — it had a variable duration, free parameters that the theory describes without forcing. Quantum fluctuations in the inflationary field — inevitable statistical perturbations in the quantum regime — were amplified to cosmological scales during inflation. Exponentially amplified: a density fluctuation of the order of 10⁻⁵ on a subatomic scale became a density difference on a cosmological scale. Different fluctuations would have produced a different large-scale distribution of matter — a different pattern of galaxies, distinct clusters, a reconfigured cosmic web of filaments. The universe we observe — with galaxies arranged in one particular way — is a contingent realisation within a vast space of possibilities. It did not have to be like this. Another pattern of fluctuations would have produced an observationally distinct cosmos.
The matter-antimatter asymmetry offers an even more direct example of irreducible contingency. Quantum theory predicts that matter and antimatter should be produced in equal proportions. However, observations show that one of every billion baryons escaped annihilation; the rest disappeared in matter-antimatter collisions, becoming pure radiation. Without this asymmetry — contingent, not deduced from any fundamental law — there would be no stable matter in the universe. The entire atomic structure that allows molecular organisation depends on this contingent realisation. Sakharov showed that there are three necessary conditions for baryogenesis: violation of the baryonic number, violation of CP symmetry, and interactions outside of thermal equilibrium. These conditions exist in principle. However, why is there an excess of matter over antimatter? Why is the asymmetry as big as it is? Known physics offers no deterministic explanation. The asymmetry is contingent — it could be different, it could be zero, it could be reversed.
Leibniz formulated with exemplary clarity the principle of sufficient reason: nothing happens without sufficient reason — that is, without there being a reason why this is so and not otherwise. This principle governs the entirety of Leibnizian metaphysics. If God created this world and no other, there is reason enough — it is the best of all possible worlds. If a leaf falls at this instant, there is sufficient reason in the laws of mechanics and the conditions of the previous state which, taken together, determine that it could not be otherwise. Nothing is free; everything has a complete explanation that, in principle, allows the event to be reconstructed from the first causes.
Constitutive contingency is precisely the refusal of this principle. There is no sufficient Leibnizian reason why the matter-antimatter asymmetry is one in a billion and not two in a billion, or zero. The physics of weak interactions allows for asymmetry — the vacuum is metastable, it can occupy different states — but no physical law prescribes the specific value of the observed asymmetry. The duration of inflation conditions the geometry of the universe — the time of accelerated expansion conditions the final spatial curvature — but what specific duration occurred is not prescribed by any fundamental law. The distribution of quantum fluctuations in the inflationary field is governed by well-defined, non-chaotic probability distributions — but which particular fluctuation occurs in which particular spatial region does not have sufficient reason in Leibniz's terms.
This rejection of the principle of sufficient reason requires a distinction that is decisive for the position of the present argument: the difference between epistemic contingency and constitutive contingency. Epistemic contingency is an information deficit — the observer does not know all the variables, does not have sufficient measurements, and therefore the result seems indeterminate. If the information were completed, the indeterminacy would disappear. This is Laplace's position taken to the limit: if an intellect knew the position and velocity of every particle, it would calculate everything, and contingency would reveal itself to be an illusion of ignorance. Constitutive contingency is another thing: it is not about the incompleteness of knowledge — it is about the fact that the specific realisation is not determined by any set of variables, however complete. There are no hidden variables whose discovery would restore determination. Bell demonstrated in 1964 that no theory of local hidden variables can reproduce the correlations predicted by quantum mechanics — and the experiments of Aspect, Dalibard, and Roger in 1982 confirmed that the correlations violate Bell's inequalities. If there were local hidden variables that determined the results, these correlations could not exist. This does not prove that the real is indeterminate in itself — it proves that no local deterministic framework can describe it. Contingency does not reside in an ignorant observer; it resides in the structure of what is describable.
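The violation discussed above can be computed in a few lines. For the singlet state, quantum mechanics predicts the correlation E(a, b) = −cos(a − b) between measurements along analyser angles a and b; any local-hidden-variable model bounds the CHSH combination by |S| ≤ 2. The angle choices below are the standard ones for maximal violation; the function names are of course illustrative.

```python
# CHSH combination for singlet-state correlations: the quantum value at
# the optimal angles is 2*sqrt(2) ≈ 2.828, above the local bound of 2.

from math import cos, pi, sqrt

def E(a, b):
    """Singlet-state correlation for analyser angles a, b (radians)."""
    return -cos(a - b)

def chsh(a1, a2, b1, b2):
    """CHSH combination; local hidden variables bound its magnitude by 2."""
    return E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

# Standard angle choices giving maximal quantum violation
S = chsh(0.0, pi / 2, pi / 4, 3 * pi / 4)
print(abs(S))  # 2*sqrt(2): the classical (local) bound of 2 is violated
```

The computation does not, by itself, settle what is real — it shows only that no local deterministic assignment of outcomes can reproduce these correlations, which is exactly the scope claimed in the text.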
It is also important to clarify the ontological status of the physical laws that frame this contingency. Laws belong to theory — they are symbolic reorganisations of the concrete that describe regularities in material constraints. They are not prescriptions that reality "obeys" as if they were commands; they are descriptions that theory constructs by formalizing patterns in observable data. When one says "laws permit multiple realisations," what one is saying is that the theoretical description of material constraints is compatible with more than one configuration — not that there is a cosmic lawgiver who authorizes multiple options. The real contains constraints — properties that operate independently of any inscription. Theory formalises these constraints into equations. Constitutive contingency means that real constraints are not sufficient to force a single realisation among those they allow. The equation describes the space of possibilities; it does not contain an operator that selects which possibility is actualised. This distinction is operative: if laws were prescriptions of reality, contingency would be a defect — the imperfection of a code that should be univocal. But if laws are descriptions, contingency is a property — material constraints have the structure they have, and that structure does not fix a single outcome. The difference is between the imperfection of a code and the openness of the real.
This is not epistemic nihilism. It is not asserted that there is no reason. There are real constraints: the laws of physics determine what is impossible and what is possible. There is structure to possibility: probability distributions are not uniform — some configurations are vastly more likely than others. There is regularity: the constants of nature remain stable throughout the observable universe. Within these constraints, however, the specific realisation has no sufficient reason. Sufficient reason requires that there be, for each fact, a complete explanation that makes it impossible for it to be otherwise. Structured contingency is the opposite: there are constraints that make some facts impossible and others possible, but among the possible, the selection is without sufficient reason. It could have been otherwise — material constraints would not have prevented it.
The differential expansion of the universe offers a third example. The universe did not cool uniformly — the rate of cooling depended on what fraction of the energy was in the form of radiation versus matter, and on what the energy density of the vacuum was. These are parameters that quantum theory allows without prescribing. A universe that cooled more slowly would have remained too hot for helium nuclei to form. A universe that cooled much more quickly would have reached recombination before nucleosynthesis was complete. The chemical structure of the universe depends on this rate of cooling, which was contingent.
A strict differentiation from Prigogine is important here. Prigogine speaks of "dissipative structures" — structures kept far from equilibrium by the continuous dissipation of energy. There is a temptation to read this as if dissipation were the engine, as if the loss of energy were what drives the system to organise itself. The interpretation here is different: the engine is the excess — the quantity of energy that passes through the system. Dissipation is a consequence of excess, not a motor. The primordial universe had an excess of energy concentrated in an infinitesimal region. This excess cannot remain immobile — it reorganises itself because constraints force it to do so. Dissipation — the conversion of energy into less concentrated forms — is what results from this reorganisation. Irreversibility is therefore a consequence of excess: once energy has been dissipated, the system cannot return to a state with that concentration intact, because the dissipated energy has been released into additional degrees of freedom.
Structured contingency is this: the space of the possible is structured, delimited by real laws and constraints. Quantum mechanics defines probability distributions — it never forces a single outcome. Field dynamics define potentials — they never impose a single global minimum. However, within this rigorously real structure, the concrete actualization is free in the sense of not being logically determined. No law can indicate which point in the space of possibilities will be occupied. Once actualised — once that particular realisation occurs — history becomes irreversible. The space of subsequent possibilities changes. The contingencies that have been actualised radically constrain those yet to come; they establish conditions for all future material reorganisations. Each bifurcation that is actualised reduces the logical space of later bifurcations — not because it makes them impossible, but because it changes the constraints that govern which future bifurcations actually open.
3.3 Openness, causality and anti-teleology
If constitutive contingency is a property of the real, then origin does not determine destination. It opens the field where possibilities are actualised without any guarantee of which one will be realised. The origin is not an efficient cause that would necessarily determine a universe executing the plan encoded in it. It is an open field where material constraints define possibilities, where structured contingency operates perpetually, where concrete actualization is always contingent. Nothing in the past obliges the future.
"Opening" is not a figure of speech in this context. It is a precise designation of a material property: the existence of spaces of multiple possibilities that are not eliminated by logical restriction. When a cosmological bifurcation occurs — when the universe crosses a point of instability and opens onto multiple paths — this field of possibilities is not a metaphor. It is the real structure of the dynamical state space of the universe. Consider recombination: before it, the universe was an opaque plasma; each ion occupied a position in space, each electron had a particular velocity. After recombination, this space changes radically: neutral atoms occupy space where previously there were ions. Each atomic configuration — which electron orbits which nucleus — is a distinct realisation within the new space of possibilities. Before recombination, these configurations were physically inaccessible: thermal energies instantly ionized any atom that formed. Recombination does not "create" these possibilities out of nothing; it opens them, makes them materially accessible.
The universe does not tend towards anything. It does not advance towards complexity as if heading towards a destiny. The common narrative of "cosmic evolution" — from singularity to cold universe, from simple to complex — is a retroactive organisation, by theoretical narrative, of a sequence that in itself completely lacks intrinsic teleological direction. The conformities we observe today are indeed more complex in certain respects than those of the first moments. This order, however, is not progress in any sense that implies finality. It is a continuous actualisation of compatibilities that previous constraints did not allow. In a different regime of cosmic history — if the universe had cooled more slowly, or if the baryonic asymmetry had been smaller, or if inflation had had a different duration — chemical complexity would never have emerged. The universe would be nothing but infinitely diluted warm plasma. There is no cosmic arrow pointing to complexity as a goal. There are only material constraints that change, and in certain specific regimes these changes open possibilities for reorganisation that we call "more complex" — not because they are superior, but because they seem complicated to us. In other, counterfactual regimes, possibilities of simplification or of complete stagnation would have been just as materially achievable. The universe does not choose complexity — it actualizes contingencies within constraints that could have been different. Consider the counterfactual: if the cosmological constant were a thousand times higher than the observed value, the accelerated expansion would have diluted matter before any gravitational condensation could occur. There would be no galaxies, no stars, no heavy elements. The universe would be dispersed radiation in eternal expansion, without structure of any kind.
This is not a "worse" universe — it is a different universe, equally compatible with the laws of physics, equally legitimate as a realisation of material constraints. The evaluation "worse" is a projection of legibility, which values complexity because the possibility of inscription emerged from it. The material real does not rank its realisations. In another counterfactual, if the strong nuclear force were 2% weaker, deuterium would not be stable and primordial nucleosynthesis would not have occurred — the universe would consist only of hydrogen, with no helium and no possibility of subsequent stellar fusion. The narrative "from simple to complex" is an artifact of our position as late observers who retro-project progress onto a sequence which, from the side of the real, is just a contingent succession of folds.
The real contains material constraints — differences that persist, conformities that condition future reorganisations, relationships that operate independently of any regime of legibility. When hydrogen reacts with oxygen, there are material compatibilities that produce this reconfiguration: electronic states merge into an energetically preferred configuration. No description invents this compatibility; description finds it, formalises it in equations, translates it into language. Legible causality — the narrative "A caused B" — is a different operation: a symbolic construction that takes real material compatibilities, installs them in linearized sequences, projects temporal anteriority and logical necessity onto them. The "cause" is a retroactive name that the description attributes when it integrates a material conformity into an explanatory chain. It is not a property that the conformity intrinsically has; it is a function that it acquires within the narrative.
Hume demonstrated in the 18th century that causality is not observable — we observe constant conjunction, and the mind infers necessity. Here he partially aligns with the present position: causal necessity does not belong to the observed, it belongs to the observer. However, Hume goes too far: in refusing causal necessity, he also rejects material constraints. For Hume, the conjunction between A and B is contingent — tomorrow it could be different. This position denies the regularity of constraints. Gravitational force is not a "habit" of the mind; it is a material difference that operates independently of any description. Russell, in his 1913 essay "On the Notion of Cause", converged by a different route: the word "cause" does not appear in the equations of advanced physics, which work with differential relationships between variables, not with causal links. The gravitational attraction between two bodies is not caused by either of them; it is a symmetric relationship described by a function. Russell leans, however, towards eliminativism — causality would be a "relic of a bygone age", dispensable.
The position grounded here is distinct from both. From Hume: it agrees that causal necessity does not reside in events, but disagrees that material constraints are reducible to habitual conjunction — they really operate, with a structure that is not a mental habit. From Russell: it agrees that "cause" is not a fundamental category of the equations, but disagrees that it is dispensable — it is an operation of legibility indispensable for producing intelligibility, although it is not an operation of the material real. There are real constraints (material plane) and there is legible causality (symbolic plane); neither is reducible to the other, neither is eliminable in favour of the other. Cosmology operates permanently in these two registers: "gravity caused the collapse of the molecular cloud" describes a real material constraint through a legible causal narrative. Both are necessary — the first for anchoring in the real, the second for intelligibility. The philosophical error lies in confusing them: in taking the causal narrative as a transparent description of the real, or in denying, by eliminating causality, the reality of constraints.
3.4 The consequence of folding: space as unfolding
The primordial universe was maximally open: each instant of material differentiation, each phase transition, each bifurcation opened new possibilities while forever closing others. The universe did not evolve from a determined, fully specified state in which all possibilities already existed in latency. It evolved through successive openings — fields of possibilities that expanded as material constraints changed. And it can never revert to those previous states. Once the electroweak symmetry broke, the symmetric state became forever inaccessible. But in breaking, it opened a new field of material compatibilities — electronic structures that depend on electromagnetism as a distinct force, all of them now possible.
If folding is an operation without a zero point, without linear determination, and if contingency is constitutive, then space cannot be a container given in advance — an inert stage where processes occur indifferently. Space is a relational unfolding of material reorganisations — a configuration that reorganises itself, not a stage where reorganisations take place. Geometry is not fixed form; it is material conformity that differentiates itself. When inflation occurs, it does not expand the universe into a pre-existing space — it creates space itself through the continual reorganisation of material relationships. Distance is a primary relationship — not a geometric dimension imposed on material points. The fold implies that there is no "inside" and "outside" of the universe — there are material differences that organise themselves relationally.
The excess that has become established as a permanent condition of reality applies to space: there are more possible configurations of spatial relationships than are actualised, and the actualization is contingent. The space that emerged was not inevitable — it could have unfolded with different geometry, different dimensionality, different curvature. The fact that there is a three-dimensional universe with approximately flat geometry is contingent: no prior principle forces spatiality to take the form it did. The fold opens up space — but in non-predetermined ways. Gravity, in this framework, is not a force that acts in a given space — it is constitutive relational cohesion: the way in which material differences are organised in proximity and distance. When a molecular cloud collapses under gravity, it is not matter that moves within space — it is space itself that reconfigures itself because material relationships reorganise. The distinction between space and content dissolves: space is a relationship between contents, not a container where contents reside.
And if dark matter and dark energy operate as material differences whose full legibility escapes the present inscriptional regime — if there are gravitational constraints without direct electromagnetic emission, if there is an acceleration of expansion without an identifiable energy source — then the excess is not just a philosophical condition: it is an observational fact. The relational real exceeds what the current registration regime can capture. The universe contains more than can be read.
Closing
If beginning is symbolic cutting and folding is material operation, then origin designates the retroactive name of a continuous reorganisation whose field of realisation remained open. Nothingness is not a prior condition — it is a later fiction of thought. What operates is an excess of compatibilities over any present form, an excess that does not tend towards anything, that does not aspire to destiny, that reorganises because conditions no longer allow the previous form.
Folding is the name of this operation: an inflection where the previous regime persists but is transformed, where previously impossible compatibilities become realisable, where being reorganises itself without ceasing to be. None of these reorganisations is determined — at each bifurcation, multiple paths remain open, and the concrete actualization depends on fluctuations that no equation can anticipate. Contingency is not a defect of knowledge. It is the invariant structure of the real. Each bifurcation that is actualised irreversibly constrains all future reorganisations, without determining any: the past limits without obliging, reduces without prescribing, bequeaths without programming.
If all reorganisation is a fold and not a cut, if contingency is constitutive and opening is material, then space cannot be a container given in advance — an inert stage where processes take place indifferently. Space is relational unfolding. And like every material fold, the fold of space is also open: there are more possible configurations than are actualised, and actualization is contingent. It remains to be thought how this unfolding operates — how material difference is organised in distance, in geometry, in cohesion. If space is a relationship and not a container, if gravity is cohesion and not force on a stage, if contingency penetrates even geometry — then the questions formulated so far about temporality and matter must now be reformulated in terms of relationship. The fold established here is not only temporal — it is spatial. Space reorganises itself because it is matter in relationship, and matter in relationship is matter that folds.