
Introduction: The Search for the Primitive

Dissolution of Substance: Trajectories to Quantum Information (1925–1957)

In the three decades between 1925 and 1957, the ontological foundation of physics underwent a total disintegration. The resulting conceptual vacuum remains unfilled. For nearly three centuries prior, the fundamental constituent of reality in physics had been conceived as a substance located in space and time, possessing determinate properties independent of observation. A particle was a particle; it had a position ($x$) and a momentum ($p$), and these variables traced a smooth, continuous trajectory through the cosmos. The role of the physicist was merely to uncover these pre-existing values, to read the book of nature as it had been written. This period chronicles the death of that classical “It” and its replacement by an ontology of pure information, probability, and correlation. This transition was not merely a modification of equations; it was a philosophical upheaval that forced humanity to abandon the comfort of visualizable reality. The narrative follows the iconoclastic dismantling of the electron orbit by Werner Heisenberg and Niels Bohr; the desperate, brilliant attempt by Erwin Schrödinger to restore continuity, and his subsequent realization of the deeper horror of entanglement; the thermodynamic bridging of knowledge and entropy by Leo Szilard; and the radicalization of the formalism by Hugh Everett III, who dissolved the observer into the equations themselves. By the late 1950s, the “It” had not disappeared, but it had transmuted. It was no longer a rock; it was a bit. It was a measurement record, a correlation, a relative state in a high-dimensional Hilbert space. This section details that historic transmutation, tracing the intellectual lineage from the hay fever-ridden shores of Helgoland to the “participatory universe” of John Archibald Wheeler, whose geometrodynamics would soon confront the full weight of this quantum crisis.

The Iconoclasts: Heisenberg, Matrix Mechanics, and the End of the Orbit

The first casualty of the quantum revolution was the concept of the planetary orbit. By the early 1920s, the “Old Quantum Theory,” a patchwork of classical mechanics and ad-hoc quantization rules developed by Niels Bohr and Arnold Sommerfeld, was collapsing under its own inconsistencies. While it could describe the hydrogen spectrum, it failed miserably for helium and could not account for the anomalous Zeeman effect. More alarmingly, it presumed that electrons moved in defined elliptical orbits, yet these orbits were physically unobservable.

In the summer of 1925, fleeing a severe bout of hay fever, the 23-year-old Werner Heisenberg retreated to the stark, treeless island of Helgoland in the North Sea. Isolated and feverish, he made a decision that would sever the link between physics and visual intuition. He decided to discard the unobservable. In classical kinematics, the motion of a particle is described by a function $x(t)$, a continuous line in space. Heisenberg realized that in the atomic domain, we never observe $x(t)$. We observe only the frequencies and intensities of the light emitted during transitions between energy levels. He reasoned that if the electron’s orbit cannot be observed, it should not be part of the theory. This was a radical positivistic move: the theory should contain only quantities that are, in principle, measurable.

In a letter to Wolfgang Pauli dated July 9, 1925, Heisenberg wrote of his “pitiful efforts” to kill off the concept of orbits entirely. He replaced the classical Fourier series, which described the continuous motion of a planet or a vibrating string, with a new calculus. In classical theory, a periodic motion is decomposed into frequencies that are integer multiples of a fundamental frequency. Heisenberg found that in the atom, the frequencies were not harmonics of a single tone but differences between energy terms, in line with the Rydberg-Ritz combination principle.

Heisenberg’s “Umdeutung” (reinterpretation) paper of 1925 proposed a mechanics based solely on these transition quantities. Instead of a single number representing position, he arranged quantities in square arrays, though he did not yet know the term “matrix,” where the element $X_{nm}$ represented the transition amplitude between state $n$ and state $m$. When he multiplied these arrays to calculate physical quantities like energy, he discovered a shocking property: the order of multiplication mattered. In classical arithmetic, $3 \times 4$ equals $4 \times 3$. In Heisenberg’s new mechanics, the position array $X$ and the momentum array $P$ did not commute: $XP - PX \neq 0$.

Upon returning to Göttingen, Heisenberg handed his paper to his mentor Max Born. Born, recognizing the mathematics from his student days, realized Heisenberg had reinvented matrix algebra. Together with Pascual Jordan, they formalized the theory, famously deriving the canonical commutation relation: $XP - PX = \frac{ih}{2\pi}$. This mathematical non-commutativity was the tombstone of the classical trajectory. If $X$ and $P$ do not commute, they cannot simultaneously possess precise numerical values. The “It” could no longer be a point moving along a line, because “position” and “momentum” were no longer simultaneously definable attributes of reality. They were operators acting on a state, not properties of the state itself.
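To make the non-commutativity concrete, here is a minimal numerical sketch (illustrative only, not from the original text): position and momentum matrices built from harmonic-oscillator ladder operators, with the truncation dimension N = 8 an arbitrary choice, reproduce $XP - PX = i\hbar$ (here $\hbar = 1$) everywhere except the truncation corner.

```python
import numpy as np

# Illustrative sketch: finite "position" and "momentum" arrays in the
# harmonic-oscillator basis (hbar = m = omega = 1). Their commutator is
# i on the diagonal, apart from an artifact in the last truncated entry.
N = 8                                         # truncation dimension (assumed)
a = np.diag(np.sqrt(np.arange(1, N)), k=1)    # annihilation operator
adag = a.conj().T                             # creation operator

X = (a + adag) / np.sqrt(2)                   # position array
P = 1j * (adag - a) / np.sqrt(2)              # momentum array

commutator = X @ P - P @ X                    # XP - PX
print(np.round(commutator[:4, :4], 12))       # ~ i * Identity away from the edge
```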

The reception was mixed. The theory was incredibly successful at predicting spectra, but it was, as Schrödinger later described, “of repelling abstractness.” It offered no picture of what the electron was doing. It reduced the atom to a spreadsheet of transition probabilities, a black box that took inputs and gave outputs but contained no internal machinery. This was the first step toward “It from Bit”: the dissolution of the object into a table of data.

Two years later, Heisenberg cemented the physical meaning of his non-commutative algebra with the Uncertainty Principle. In a 1927 paper, he analyzed the operational limits of measurement, such as using a gamma-ray microscope to locate an electron. He argued that the very act of observation, bouncing a photon off a particle, disturbs the particle. However, Heisenberg’s interpretation evolved. Initially, he viewed the uncertainty as a result of a mechanical disturbance, a clumsy observer bumping into the delicate furniture of the quantum world. But as the Copenhagen interpretation matured, the view shifted. Uncertainty was not a result of imperfect measurement; it was a fundamental property of the “It” itself. An electron simply does not possess a simultaneous position and momentum. The “It” was no longer a solid object; it was a “tendency” to exist, what Heisenberg later referred to, borrowing from Aristotle, as potentia, something standing midway between the idea of an event and the actual event.

The Wave and the Web: Erwin Schrödinger (1926–1935)

If Heisenberg was the executioner of the classical trajectory, Erwin Schrödinger was the counter-revolutionary who inadvertently deepened the crisis. In 1926, disgusted by the “transcendental algebra” of the Göttingen school, Schrödinger sought to restore visualizability and continuity to physics.

Drawing on Louis de Broglie’s 1924 hypothesis of matter waves, Schrödinger formulated the wave equation ($H\psi = E\psi$). Unlike Heisenberg’s discrete matrices, Schrödinger’s $\psi$ was a continuous field evolving smoothly in time. For a brief moment, it seemed the classical “It” was saved; particles were simply wave packets, localized lumps of field density moving through space. The physics community embraced Wave Mechanics with relief. It used the familiar tools of partial differential equations, the “crowning glory of traditional physics.” It felt like classical electromagnetism. However, this hope was a mirage. Schrödinger soon realized that the wave function for two particles did not exist in 3-dimensional physical space, but in a 6-dimensional configuration space. For $N$ particles, the wave function lived in $3N$ dimensions. This was not a physical wave in the ether; it was a wave of information in an abstract mathematical manifold.

Furthermore, wave packets inevitably spread over time. A particle localized as a “hump” would eventually dissipate across the universe. Schrödinger’s attempt to interpret $\psi$ as a physical charge density collapsed when it became clear that the wave packet did not stay together. The “It” could not be a wave of matter.

The interpretation that would seal the fate of the “It” came from Max Born in 1926. Analyzing the scattering of particles using Schrödinger’s formalism, Born proposed that the square of the wave amplitude ($|\psi|^2$) did not represent a physical density of charge, but a probability density. In a famous paper on collision theory, Born added a footnote that marked the moment physics abandoned the prediction of individual events. “The motion of particles follows probability laws,” Born asserted, but the probability itself propagates according to strict causality. This created a bifurcation in the nature of the “It” that persists to this day:

  • The Wavefunction ($\psi$): Deterministic, continuous, strictly causal, but unobservable and abstract.
  • The Measurement Result: Discrete, real, but probabilistic and acausal.

Schrödinger was horrified. He had hoped to eliminate the quantum jumps; instead, his equation became the vehicle for formalizing them. The “It” was no longer a substance; it was a betting slip.

The Copenhagen Synthesis: Bohr, Complementarity, and the Cut

While Heisenberg provided the mathematics of uncertainty and Born the statistical interpretation, Niels Bohr provided the philosophy that made the destruction of realism palatable. Operating from his institute in Copenhagen, Bohr dismantled the classical separation between the observer and the observed, effectively redefining what it means to be an “It.”

In classical physics, a measurement is a passive gaze; the “It” exists “out there,” and the observer merely records its pre-existing properties. Bohr argued that in the quantum realm, the interaction between the measuring instrument and the atomic object is finite, governed by the quantum of action ($h$), and uncontrollable. This interaction creates an “indivisible whole” which Bohr termed a phenomenon. One cannot speak of the electron’s behavior independent of the measuring device. The “electron” is an abstraction; the reality is the “electron-plus-Geiger-counter-clicking” event. In Bohr’s view, “No elementary phenomenon is a phenomenon until it is a registered (observed) phenomenon,” a phrase later popularized by Wheeler.

Unveiled at the Como Conference in 1927, Bohr’s doctrine of Complementarity asserted that the “It” has no intrinsic properties in isolation. An electron is not a wave; nor is it a particle. It behaves as a wave in the context of a diffraction grating and as a particle in the context of a collision experiment. These descriptions are mutually exclusive but jointly necessary for a complete description of experience. Bohr drew analogies to psychology, noting the difficulty of separating the subject from the object in introspection. Just as we cannot observe our own anger without altering it, we cannot observe an atom without participating in its definition.

Crucially, Bohr insisted that the “It” of the measuring device must be described in classical language. We must communicate our results unambiguously using heavy, macroscopic concepts (pointers, scales, clocks). This created a conceptual boundary, the Heisenberg Cut, between the quantum system (probabilistic, indefinable, described by $\psi$) and the classical observer (deterministic, communicable, described by Newton/Maxwell). This was Bohr’s pragmatic solution to the crisis: the “It” had died in the microcosm, but it was resurrected as a necessary fiction in the macrocosm to allow scientists to speak to one another. The price was a fractured worldview where the laws of physics changed depending on the size of the object.

The Clash of Completeness: The Bohr-Einstein Debates

The death of the classical “It” did not happen without a ferocious defense. Albert Einstein served as the attorney for an objective, independent reality, believing that “God does not play dice.” The debates between Einstein and Bohr are legendary not just for their intellectual height but for how they refined the definition of information in physics.

At the Fifth Solvay Conference in 1927, Einstein proposed thought experiments designed to prove that quantum mechanics was inconsistent, that one could measure position and momentum simultaneously better than Heisenberg allowed. He suggested a single-slit experiment where one measures the recoil of the screen to determine the momentum of the particle passing through. Bohr refuted this by applying the uncertainty principle to the screen itself: if you know the screen’s momentum precisely (to measure recoil), its position becomes uncertain, washing out the interference pattern.

In 1930, at the Sixth Solvay Conference, Einstein brought a more formidable weapon: the Photon Box. He imagined a box filled with radiation, with a shutter controlled by a clock. The shutter opens for a brief time $\Delta t$, releasing a single photon. By weighing the box before and after (measuring mass change $\Delta m$), one could determine the energy of the photon ($E = mc^2$) to arbitrary precision. Thus, one would know both the precise time of emission ($\Delta t$) and the precise energy ($\Delta E$), violating the Heisenberg uncertainty relation $\Delta E \,\Delta t \geq h$. Bohr was reportedly shocked, wandering the conference looking “like a somnambulist.” But after a sleepless night, he returned with a counter-stroke using Einstein’s own General Relativity. Bohr argued that the weighing of the box requires it to move in a gravitational field (e.g., on a spring scale). The uncertainty in the box’s position (necessary for the weighing) induces an uncertainty in the rate of the clock due to gravitational time dilation. The calculation perfectly recovered the uncertainty principle. Einstein was defeated on consistency, but he would not yield on ontology.

In 1935, Einstein, Boris Podolsky, and Nathan Rosen (EPR) changed the angle of attack. They published “Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?”, a paper that remains one of the most cited in the history of physics. They established a “Criterion of Reality”: “If, without in any way disturbing a system, we can predict with certainty... the value of a physical quantity, then there exists an element of physical reality corresponding to this physical quantity.”

EPR imagined two particles (A and B) that interact and then fly apart. Due to conservation laws, their positions and momenta are perfectly correlated ($x_A - x_B = x_0$, $p_A + p_B = 0$). If we measure the position of A, we instantly know the position of B. If we measure the momentum of A, we instantly know the momentum of B. Since A and B are spacelike separated, our choice of measurement on A cannot physically disturb B (assuming Locality). Therefore, B must have had both a definite position and a definite momentum all along. Since quantum mechanics says B cannot have both, the theory is incomplete.

Bohr’s response, published under the same title, was a “bolt from the blue.” He essentially rejected the separation of A and B. He argued that “the whole arrangement,” the source, the particles, and the distant detectors, constitutes a single, unanalyzable phenomenon. There is no “It” at location B independent of the setting at location A. Bohr redefined “physical reality” to include the context of the measurement. This was the capitulation of local realism. The “It” was now non-local, spread across the entire experimental context. The “cut” between observer and observed now extended across light-years.

The Paradox of Entanglement: Schrödinger’s Cat and the Holism of Information

Schrödinger, observing the EPR debate from Oxford, was inspired to identify the singular feature of quantum mechanics that defied classical ontology. In a 1935 paper, he coined the term Entanglement (German: Verschränkung). Schrödinger realized that when two systems interact and then separate, they can no longer be described by independent wave functions ($\psi_A$ and $\psi_B$). They possess only a single, joint wave function ($\psi_{AB}$). The “It” (the individual particle) ceases to exist mathematically; only the “System” exists.

Schrödinger wrote: “Maximal knowledge of a total system does not necessarily include total knowledge of all its parts, not even when these are completely separated... and do not influence each other at present.” Information is stored not in the particles, but in the correlations between them. He also introduced the concept of steering: by measuring particle A, the experimenter can “steer” particle B into a specific state (eigenstate of position or momentum) without touching it. This anticipation of quantum teleportation highlighted that information in the quantum world is non-local and shared.

To demonstrate the absurdity of the prevailing “blurred” reality accepted by the Copenhagenists, Schrödinger devised his famous Cat thought experiment. He imagined a macroscopic system (a cat) entangled with a microscopic one (a radioactive atom). According to the formalism, if the atom is in a superposition of “decayed” and “not decayed,” and the decay triggers a mechanism to kill the cat, then the cat must be in a superposition of “dead” and “alive” prior to observation. Schrödinger intended this as a reductio ad absurdum. He believed the “It” of a cat must be either dead or alive, regardless of observation. He wanted to show that the “smearing” of reality (superposition) shouldn’t apply to everyday objects. Ironically, history inverted his intent. We now understand that the cat is in a superposition (until decoherence sets in). Schrödinger inadvertently laid the groundwork for the “Many Worlds” interpretation and modern decoherence theory. He showed that the “smearing” of reality could not be confined to the atom; it infected the observer’s world as well.

The Thermodynamic Link: Leo Szilard and the Birth of the Bit (1929)

While Bohr and Einstein debated metaphysics, a Hungarian physicist, Leo Szilard, was quietly forging the physical link between the abstract “bit” and the concrete “atom.” His work provided the “missing link” explaining why observation is an active physical process.

In 1929, Szilard published “On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings.” He addressed the paradox of Maxwell’s Demon, a hypothetical being who controls a door between two gas chambers, sorting fast molecules from slow ones to create a temperature difference, thereby violating the Second Law of Thermodynamics. Szilard realized that for the Demon to sort the molecules, it must first measure them. It must acquire information about their position and velocity. Szilard analyzed a simplified “one-molecule engine.” He showed that the Demon must:

  1. Measure the particle (Left or Right).
  2. Store this result in a memory.
  3. Actuate a piston based on this memory to extract work.

Szilard postulated that the act of measurement (or the subsequent erasure of that memory to reset the cycle) carries an entropy cost. He derived that the acquisition of one bit of information (distinguishing between two possibilities) corresponds to an entropy increase of $\Delta S = k_B \ln 2$. This was a monumental realization, anticipating Claude Shannon’s Information Theory by two decades. It established that information is physical. The “It” (entropy/energy) and the “Bit” (information/knowledge) were convertible currencies. Szilard’s engine showed that one could not talk about the “It” of the gas without accounting for the “Bit” in the observer’s memory process. This resolved the paradox: the entropy decrease in the gas is compensated by the entropy increase in the Demon’s memory process. This work lay dormant for decades but eventually led to Landauer’s Principle (1961), which confirmed that the erasure of information is the thermodynamic step that generates heat. In the context of the 1920s revolution, Szilard provided the mechanism for Bohr’s “uncontrollable interaction”: the observer is not a ghost; the observer is a thermodynamic engine entangled with the system.
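As a small numerical aside (illustrative, not a calculation from Szilard’s paper), the exchange rate between the “Bit” and the “It” can be made concrete: one bit costs $k_B \ln 2$ of entropy, and erasing it at an assumed ambient temperature of 300 K dissipates at least $k_B T \ln 2$ of heat.

```python
import math

# Illustrative sketch of the Szilard/Landauer cost of a single bit.
k_B = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                     # assumed ambient temperature, K

delta_S = k_B * math.log(2)   # entropy per bit: k_B ln 2
delta_Q = T * delta_S         # minimum heat dissipated when the bit is erased

print(f"Entropy per bit: {delta_S:.3e} J/K")
print(f"Heat at 300 K:   {delta_Q:.3e} J")   # ~ 2.9e-21 J
```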

The Universal Machine: Hugh Everett III and the Relative State (1957)

By the 1950s, the “It” was in a fragile state: maintained as a probability wave by Schrödinger, fragmented into complementary contexts by Bohr, and tied to entropy by Szilard. Yet, the “Measurement Problem” remained: how does the probability wave ($\psi$) collapse into a single “It” (a specific record) upon observation? The standard Von Neumann formulation relied on an ad-hoc “Process 1” (collapse) that defied the Schrödinger equation (“Process 2”).

In 1957, Hugh Everett III, a graduate student of John Wheeler at Princeton, proposed a solution that required abandoning the last vestige of the classical “It”: the uniqueness of history. He took the Von Neumann formulation seriously but removed the collapse. He asked: What if the Schrödinger equation applies to everything, including the observer? He modeled the observer not as a metaphysical external agent (as Bohr effectively did by placing them on the classical side of the cut), but as a physical system, a mechanical automaton with a memory. He analyzed the interaction between a quantum system $S$ and an observer $O$.

Everett showed that if the observer interacts with a superposed system, the observer themselves enters a superposition. If the system is in state $\alpha\,|UP\rangle + \beta\,|DOWN\rangle$, the observer evolves into the state $\alpha\,|UP\rangle\,|\text{“I saw UP”}\rangle + \beta\,|DOWN\rangle\,|\text{“I saw DOWN”}\rangle$. There is no collapse. There is no single “It” that emerges. Instead, the reality is the correlation between the system and the memory. Everett called this the “Relative State” formulation. Relative to the memory state “I saw UP,” the electron is UP. Relative to “I saw DOWN,” it is DOWN. Both branches exist simultaneously in the universal wavefunction.
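A minimal two-qubit sketch (a standard toy model, not Everett’s own notation; the amplitudes are arbitrary) shows how a purely unitary “premeasurement” produces exactly this relative-state structure: the memory is a second two-level system and the interaction is a CNOT gate.

```python
import numpy as np

# Toy premeasurement: a CNOT copies the system's outcome into a memory qubit,
# with no collapse anywhere -- only a correlated joint state remains.
alpha, beta = np.sqrt(0.3), np.sqrt(0.7)     # arbitrary amplitudes (assumption)

UP, DOWN = np.array([1.0, 0.0]), np.array([0.0, 1.0])
system = alpha * UP + beta * DOWN            # alpha|UP> + beta|DOWN>
memory = UP                                  # memory starts in its "ready" state

psi_in = np.kron(system, memory)             # uncorrelated joint state

# Interaction unitary: flip the memory qubit iff the system is DOWN.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

psi_out = CNOT @ psi_in
# Basis order: |UP,saw-UP>, |UP,saw-DOWN>, |DOWN,saw-UP>, |DOWN,saw-DOWN>
print(np.round(psi_out, 3))   # [alpha, 0, 0, beta]: the two branches of Everett's relative state
```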

Everett was driven by the “Wigner’s Friend” paradox: if a friend observes a system, the friend sees a result. But to Wigner, standing outside the room, the friend is in a superposition until Wigner opens the door. Who is right? Everett answered: Both. The “It” is relative to the observer. While Bryce DeWitt later popularized this as the “Many Worlds Interpretation,” Everett’s original conception was closer to a pure information theory. He proved that a “typical” observer (defined by the measure of the Hilbert space coefficients) would record a sequence of results satisfying standard quantum statistics (the Born rule). Everett’s work represents the ultimate triumph of Information over Substance: there is no single, solid world; there is only the universal wave function, a purely information-theoretic entity, a catalog of all possible correlations.

This quantum revolution, with its dissolution of the independent “It” into informational potentia, measurement phenomena, entangled correlations, thermodynamic bits, and relative states, provided the philosophical and mathematical engine for Wheeler’s later “It from Bit” paradigm. The classical field’s continuity, already challenged by Einstein’s geometry, now faced an even more profound assault from the quantum domain, rendering the transition to geometrodynamics’ collapse not abrupt but inexorable, a culmination of the crisis that began with Heisenberg’s matrices and Bohr’s indivisible phenomena.

Synopsis: The Quantum Dissolution of Substance 1925–1957

Heisenberg’s matrices, Born’s probability rule, Bohr’s complementarity, Schrödinger’s entanglement, Szilard’s thermodynamic bit, and Everett’s relative states collectively annihilated the classical “It”: there remains only correlation, information, and observer-relative facts: no independent substance, no absolute trajectory.

The Collapse of Geometrodynamics and the Rise of Information (1950–1980)

If Einstein established the stage, it was John Archibald Wheeler who attempted to dismantle it to find what lay beneath. Wheeler, a physicist of immense imagination who had worked with both Niels Bohr on nuclear fission and Albert Einstein on unification, spent the mid-20th century obsessed with a radical unification program known as Geometrodynamics. His intellectual journey from “Everything is Geometry” to “Everything is Information” represents the pivot point of modern physics.

The Failure of “Everything is Geometry”

Wheeler’s ambition in the 1950s and 60s was to eliminate “matter” entirely. He proposed that particles like electrons and protons were not foreign objects placed on the stage of spacetime, but were rather intense, localized knots of curvature in spacetime itself, structures he termed “geons” (gravitational electromagnetic entities). In this monistic view, there was no “It” separate from the geometry; there was only empty curved space. A charge was not a particle but a “wormhole” mouth trapping lines of force.

However, this dream of a pure geometric ontology collapsed under the weight of quantum reality. Wheeler realized that at the Planck scale ($10^{-33}$ cm), the smooth manifold of Einstein must break down into a “quantum foam,” where topology fluctuates violently, creating and destroying microscopic wormholes. Furthermore, spin-1/2 particles (fermions) proved mathematically impossible to construct purely from standard 4D geometry without introducing external structures. The “It” refused to be reduced to pure geometry.

This failure drove Wheeler toward a profound philosophical pivot. If geometry was not the bottom, what was? His interactions with his student Jacob Bekenstein regarding the thermodynamics of black holes provided the spark for a new ontology that would eventually be called “It from Bit.”

The Teacup and the Black Hole: Exploring the Entropy of the Void

In the early 1970s, Wheeler challenged his PhD student Jacob Bekenstein with a thought experiment that would ultimately destroy the concept of a classical continuum. Wheeler, alluding to the Second Law of Thermodynamics, joked about committing a “crime” by mixing hot tea with cold tea, thereby increasing the entropy of the universe without doing work. He noted that if he threw the teacup into a black hole, the entropy would seemingly vanish from the observable universe, violating the Second Law. The black hole, according to classical General Relativity, was a featureless pit; it had “no hair” (a phrase popularized by Wheeler, which his wife Janette purportedly noted showed his “naughty side”). If it had no internal features, it could have no entropy.

Bekenstein’s solution was radical: the black hole itself must possess entropy. Crucially, he proposed that this entropy was proportional not to the black hole’s volume (as one would expect for a container of gas), but to the area of its event horizon. This was derived from the realization that the area of a black hole event horizon can never decrease, mirroring the behavior of entropy.

This was the first crack in the geometric facade that would lead to the Holographic Principle. It implied that the “amount of reality” (entropy/information) a region of space could hold was bounded by its 2D surface, not its 3D volume. The “It” (the matter inside the hole) was encoded on the “Bit” (the surface area). Wheeler championed Bekenstein’s result, despite initial skepticism from Stephen Hawking (who later confirmed it via Hawking Radiation), and it led Wheeler to realize that thermodynamics and information were deeper than geometry.
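For scale, a short back-of-the-envelope computation (illustrative, not taken from the original text) applies the Bekenstein-Hawking formula $S = k_B A / (4\ell_P^2)$ to a solar-mass black hole: a horizon roughly 3 km in radius already encodes on the order of $10^{77}$ bits.

```python
import numpy as np

# Illustrative sketch: Bekenstein-Hawking entropy of a one-solar-mass black hole.
G, c, hbar, k_B = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23   # SI constants
M_sun = 1.989e30                                             # kg

r_s = 2 * G * M_sun / c**2          # Schwarzschild radius, ~2.95 km
A = 4 * np.pi * r_s**2              # horizon area, m^2
l_P2 = hbar * G / c**3              # Planck length squared, m^2

S = k_B * A / (4 * l_P2)            # entropy, J/K
bits = A / (4 * l_P2) / np.log(2)   # equivalent information capacity of the horizon

print(f"Horizon area: {A:.3e} m^2")
print(f"Entropy:      {S:.3e} J/K")
print(f"Bits:         {bits:.3e}")  # ~1e77 bits
```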

“It from Bit”: The Participatory Universe (1989)

By 1989, Wheeler had fully transitioned from a “geometry-first” perspective to an “information-first” perspective. In his seminal essay “Information, Physics, Quantum: The Search for Links,” presented at the 3rd International Symposium on Foundations of Quantum Mechanics in Tokyo, he coined the aphorism “It from Bit.”

Wheeler’s thesis was a mandate for radical reconstruction. He argued that the physical world is not a pre-existing machine, but a construct built from binary choices. He drew on the philosophy of Niels Bohr, whom he considered the greatest thinker since Einstein, and the mechanism of quantum measurement to argue that: “Every it, every particle, every field of force, even the spacetime continuum itself, derives its function, its meaning, its very existence entirely, even if in some contexts indirectly, from the apparatus-elicited answers to yes-or-no questions, binary choices, bits.”

Wheeler illustrated this with a modified version of the “Game of 20 Questions.” In the classical version, an object exists (e.g., a cat), and the player asks questions to identify it. This corresponds to classical physics: reality exists independently, and we measure it. In Wheeler’s “quantum” version, there is no object chosen beforehand. The players (nature) only decide that the answers must be consistent with previous answers. If the first answer is “animal,” the second cannot be “mineral,” but the specific animal is not determined until the final question is asked. The “object” (reality) emerges only at the end of the questioning process.

This view, which he termed the Participatory Universe, posits that the observer is not a passive spectator but a co-creator of reality. The continuum of spacetime is not fundamental; it is a secondary illusion synthesized from the aggregation of billions of binary quantum events. This marked the transition from the Absolute Stage (Newton) and the Dynamic Stage (Einstein) to the Emergent Stage (Wheeler).

Wheeler pushed this logic to its extreme with the concept of “Law without Law.” He argued that just as species evolve in biology, physical laws themselves might evolve from a chaotic, lawless beginning. In the “Big Bang,” there was no geometry, no time, and no laws, only the potential for information processing. The laws of physics, in this view, are merely the “frozen habits” of the universe, stabilized over eons of quantum questioning. This radical anti-reductionism set the stage for the modern informational turn in quantum gravity.

Causal Set Theory: The Discretization of History

The most direct heir to the idea that the continuum is an illusion is Causal Set Theory (CST), championed by Raphael Sorkin and Fay Dowker. Dowker, a Professor of Theoretical Physics at Imperial College London, possesses a direct lineage to the architects of spacetime thermodynamics; she completed her PhD under Stephen Hawking in 1990. Her work represents a mathematization of Wheeler’s intuition that the “deepest bottom” is discrete.

The Rejection of the Continuum

Dowker and Sorkin argue that if one takes the “It from Bit” seriously, one must abandon the notion of continuous space at the Planck scale. General Relativity predicts its own demise through singularities, points where the curvature becomes infinite and the laws of physics break down. To Dowker, singularities are not errors but signals: they indicate that the continuum assumption is merely an approximation, much like fluid mechanics is an approximation of discrete atoms. Just as water appears smooth but is composed of discrete molecules, spacetime appears smooth but is composed of discrete “atoms” of causality.

The Core Thesis of CST:

  • Discreteness: Spacetime is not infinitely divisible. It is made of discrete elements or “events.”
  • Causality: The fundamental relation binding these atoms is “causal order” (Before → After).

In this framework, the universe is not a geometry but a partially ordered set (poset). The “It” is the causal link itself. A spacetime geometry is recovered only when one “sprinkles” these causal points into a manifold via a Poisson process, much like a Pointillist painting reveals an image only from a distance. This “sprinkling” is critical because it solves a major problem in discrete gravity: the violation of Lorentz invariance. A regular grid (like a chessboard) violates relativity because it has preferred directions. A random sprinkling, however, preserves Lorentz symmetry statistically, allowing the smooth manifold of Einstein to emerge from the discrete dust of causal sets.
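The sprinkling procedure is simple enough to sketch directly. The following toy example (an illustration under stated assumptions, not code from the causal set literature) scatters a Poisson-distributed number of events into a unit square of 1+1-dimensional Minkowski space and records the relation $x \prec y$ whenever $y$ lies inside the future light cone of $x$.

```python
import numpy as np

# Toy Poisson sprinkling of a causal set into a unit square of 1+1 Minkowski
# spacetime. Poisson-random placement is what preserves Lorentz invariance
# statistically: there are no preferred lattice directions.
rng = np.random.default_rng(0)
N = rng.poisson(200)                       # assumed density: ~200 events per unit volume
pts = rng.uniform(0.0, 1.0, size=(N, 2))   # columns: (t, x)

def precedes(a, b):
    """True if event b lies in the future light cone of event a."""
    dt, dx = b[0] - a[0], b[1] - a[1]
    return dt > 0 and dt * dt > dx * dx

# Causal matrix: R[i, j] = 1 if event i causally precedes event j.
R = np.zeros((N, N), dtype=int)
for i in range(N):
    for j in range(N):
        if i != j and precedes(pts[i], pts[j]):
            R[i, j] = 1

print(f"{N} events, {R.sum()} relations in the partial order")
```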

The “Birthing” of Time

Dowker utilizes the Hasse Diagram to visualize this, a graph where nodes are events and directed edges represent causal influence. The “Bit” here is the existence (1) or non-existence (0) of a causal link between two events. As Dowker notes, “The causal structure is the substance of the theory.”

This approach radically reinterprets the passage of time. In the Einsteinian “Block Universe,” all of spacetime (past, present, future) exists simultaneously; the “now” is a subjective illusion. In Causal Set Theory, the universe is a “growing set.” New atoms of spacetime are born sequentially, adhering to specific probabilistic rules known as Classical Sequential Growth (CSG) dynamics.

This model reintroduces a genuine “becoming” into physics. The universe is not a static block but a process. Dowker argues that this provides an “objective physical correlate of our perception of time passing.” The “now” is the active edge of the causal set where new events are being birthed. Consciousness, in Dowker’s view, is the “internal view” of this objective birth process. What we experience as the flow of time is the accretion of new causal atoms onto the existing history of the universe. Here, the “bit” is the birth of a new event, a digital tick of the cosmic clock that expands the universe one atom at a time.

Relational Quantum Mechanics & LQG: The Disappearance of Time

While Dowker seeks to discretize the spacetime container to save the “flow” of time, Carlo Rovelli, a founder of Loop Quantum Gravity (LQG), seeks to dissolve the “container” entirely and argues that time itself is the illusion. Rovelli’s work is a synthesis of Wheeler’s insistence on the observer and the non-perturbative quantization of gravity.

Relational Quantum Mechanics (RQM)

Rovelli introduced Relational Quantum Mechanics in 1994, derived from the realization that quantum mechanics acts much like special relativity. In relativity, velocity is not a property of an object; an object has velocity only relative to an observer. Rovelli extends this to all physical states: an electron does not have a “position” or a “spin” in the abstract. It has a state only relative to a specific physical system interacting with it.

“The universe is not just simply the position of all its Democritean atoms. It is also the net of information systems have about other systems.” In this view, there is no “View from Nowhere” or “God’s Eye View” of the universe. There are only localized, relative descriptions.

The Relational “Bit”:

  • Bit: An interaction between two systems (a “measurement”).
  • It: The resulting correlation established between them.

This framework resolves the paradoxes of quantum mechanics (like Schrödinger’s Cat) by acknowledging that for one observer (inside the box), the cat is definite, while for another (outside), it remains entangled. There is no contradiction because there is no single, absolute “state of the universe.”

Loop Quantum Gravity and Spin Networks

When applied to gravity, this relational perspective leads to Loop Quantum Gravity. In LQG, the continuous metric of Einstein is replaced by Spin Networks, graphs of adjacency where nodes represent chunks of volume and links represent surfaces of area.

Crucially, these areas and volumes are quantized. Just as an electron can only have specific energy levels, space itself can only exist in discrete “packets” of volume ($10^{-99}$ cm³). These networks do not exist in space; they are space. A “spin foam” describes the evolution of these networks.
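For concreteness, the standard LQG area spectrum can be quoted (stated here as background rather than derived in the text), with $\gamma$ the Immirzi parameter and the sum running over the spin-network links $j_i$ that puncture the surface:

$$A = 8\pi\gamma\,\ell_P^{2}\sum_{i}\sqrt{j_i\,(j_i+1)}, \qquad j_i \in \{\tfrac{1}{2}, 1, \tfrac{3}{2}, \dots\}$$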

The Thermal Time Hypothesis

Perhaps the most radical consequence of Rovelli’s work is the Thermal Time Hypothesis. In the fundamental equations of LQG (the Wheeler-DeWitt equation), the variable $t$ (time) disappears entirely. The theory describes how physical variables change with respect to one another (e.g., how the position of a pendulum changes with respect to the position of a clock hand), but there is no external “time” governing the whole.

If time is not fundamental, why do we experience it? Rovelli argues that time is a macroscopic, statistical phenomenon, akin to “heat.” Just as “temperature” is not a property of a single molecule but an average of billions, “time” emerges only when we statistically average over the microscopic informational states we cannot track. Time is the expression of our ignorance. In a universe of perfect information, there would be no time, only a frozen network of relations. This aligns perfectly with the “It from Bit” ethos: the macroscopic world of “It” (time, heat, flow) is a blurred approximation of the microscopic “Bits” (relations).

Holography and “It from Qubit”: The Unification

The third and perhaps most dominant modern path fuses Wheeler’s information theory with High Energy Physics and String Theory. This movement, often unified under the slogan “It from Qubit,” posits that spacetime is a hologram. This field is led by figures like Juan Maldacena at the Institute for Advanced Study, Leonard Susskind at Stanford, and the researchers of the Simons Collaboration on “It from Qubit.”

The AdS/CFT Correspondence

In 1997, Juan Maldacena made a discovery that shook the foundations of physics: the AdS/CFT correspondence. He showed that a theory of quantum gravity (String Theory) in a saddle-shaped, negatively curved bulk spacetime (Anti-de Sitter space, or AdS) is mathematically equivalent to a quantum field theory (a Conformal Field Theory, or CFT) living on its lower-dimensional boundary.

This was the mathematical realization of the Holographic Principle hinted at by Bekenstein’s black hole entropy. It implies that everything happening inside the universe (gravity, stars, black holes) is a “hologram” projected from the interactions of particles on the boundary of the universe. The “It” (the 3D bulk) is generated by the “Bit” (the 2D boundary data).

Entanglement Builds Geometry

While Wheeler spoke of classical “bits” (Yes/No), Maldacena and his colleagues realized that the glue holding spacetime together is Quantum Entanglement. The slogan was explicitly updated from “It from Bit” to “It from Qubit” to reflect this quantum nature.

The connection between entanglement and geometry was solidified by the Ryu-Takayanagi formula, which relates the entanglement entropy of a region on the boundary to the area of a minimal surface dipping into the bulk spacetime. This suggests that the “area” of space is literally a measure of the entanglement between quantum fields.
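Stated for reference in units where $\hbar = c = 1$, the Ryu-Takayanagi prescription reads

$$S(A) = \frac{\mathrm{Area}(\gamma_A)}{4 G_N},$$

where $A$ is a region of the boundary and $\gamma_A$ is the minimal bulk surface anchored to its edge.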

ER=EPR: The Wormhole Connection

The most startling insight in this domain is the ER=EPR conjecture, proposed by Maldacena and Leonard Susskind in 2013. This conjecture links two concepts proposed by Einstein in 1935 which were thought to be unrelated:

  • ER: The Einstein-Rosen bridge (a wormhole connecting two regions of spacetime).
  • EPR: The Einstein-Podolsky-Rosen pair (two particles connected by quantum entanglement).

Maldacena and Susskind proposed that these are the same thing. Entanglement is a wormhole. A single pair of entangled particles is connected by a Planck-scale wormhole. If you entangle two black holes, they are connected by a large, geometric wormhole.

This resolves the Black Hole Firewall Paradox. The paradox arises because, if black hole evaporation preserves information (unitary quantum mechanics) and the event horizon is smooth (General Relativity), a late Hawking photon must be entangled both with the earlier radiation and with its partner mode behind the horizon, violating the monogamy of entanglement. ER=EPR resolves this by identifying the “inside” of the black hole with the entanglement carried by the radiation “outside.” The geometry of the interior is built out of the entanglement with the exterior.

This view suggests that the smooth connectivity of spacetime is an emergent property arising from the entanglement of quantum bits. If one were to break the entanglement (the “qubits”) between two regions of space, the space between them would literally pinch off and separate. Spacetime is not the stage; it is the result of the information processing of the boundary qubits.

Analysis: The Nature of the “Bit”

To fully grasp the magnitude of this revolution, one must compare how these distinct frameworks define the fundamental informational unit, the “Bit,” that builds the “It.” The divergence in their mathematical approaches disguises a striking convergence in their ontological conclusions.

┌──────────────────────────────────────────────────────────────────────┐
│                     WHAT IS THE "BIT" OF REALITY?                    │
└──────────────────────────────────────────────────────────────────────┘

1. THE CAUSAL BIT (Sorkin/Dowker)
The Universe is a growing order.
BIT = A directed link (A causes B).
[ A ] ───> [ B ]

2. THE RELATIONAL BIT (Rovelli)
The Universe is a correlation.
BIT = Interaction between Systems.
[System 1] <~~~ (Correlation) ~~~> [System 2]

3. THE HOLOGRAPHIC BIT (Maldacena/Susskind)
The Universe is a projection.
BIT = Entanglement on the boundary (Qubit).
Boundary: 01101 ---> Bulk Geometry: (Spacetime)

4. THE PARTICIPATORY BIT (Wheeler)
The Universe is a question.
BIT = The Answer (Yes/No).
Observer (?) ───> [Nature] ───> "Yes"

The Shift from “Law” to “Code”

A subtle but profound trend visible across these theories is the shift from physics as “Law” (binding differential equations acting on a continuum) to physics as “Code” (algorithmic rules acting on discrete data).

In the Newtonian and Einsteinian paradigms, the universe was governed by differential equations. These equations assume a continuum; you can zoom in infinitely and the laws still hold. But in the “It from Bit” and “It from Qubit” paradigms, the laws are akin to cellular automata or logical gates.
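A minimal illustration of the shift from “Law” to “Code” (purely pedagogical; the specific rule is an arbitrary choice, not one endorsed by any of these theories): a one-dimensional cellular automaton whose entire dynamics is a lookup table of eight yes/no answers rather than a differential equation.

```python
import numpy as np

# "Physics as code": each cell's next state follows from a finite rule table
# (Wolfram's Rule 110, chosen arbitrarily), applied in discrete update "ticks".
RULE = 110
table = [(RULE >> i) & 1 for i in range(8)]        # the rule as 8 yes/no answers

def step(cells: np.ndarray) -> np.ndarray:
    left = np.roll(cells, 1)                       # periodic boundary
    right = np.roll(cells, -1)
    idx = 4 * left + 2 * cells + right             # 3-cell neighbourhood -> 0..7
    return np.array([table[i] for i in idx], dtype=int)

cells = np.zeros(64, dtype=int)
cells[32] = 1                                      # a single "bit" as the seed
for _ in range(20):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```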

Hilbert’s Dream Revisited: In 1900, Hilbert wished to axiomatize physics. While his specific continuum-based axioms were superseded, the spirit of his program has returned with a vengeance. Causal Set Theory and “It from Qubit” essentially attempt to find the “machine code” of the universe. The Bekenstein–Hawking bound ($S = A/4$ in Planck units) acts as a constraint on the memory capacity of the universe, much like a hard drive limit. This implies the universe has a finite computational capacity.

╔══════════════════════════════════════════════════════════════════════╗
║               SYSTEM UPDATE: ONTOLOGICAL KERNEL CHANGE                ║
╠══════════════════╦════════════════════════╦══════════════════════════╣
║ PARAMETER        ║ VERSION 1.0 (Physics)  ║ VERSION 2.0 (Info)       ║
╠══════════════════╬════════════════════════╬══════════════════════════╣
║ PRIMITIVE UNIT   ║ Point Mass / Particle  ║ Qubit / Causal Link      ║
║                  ║ (Hard "Stuff")         ║ (Pure Logic)             ║
╠══════════════════╬════════════════════════╬══════════════════════════╣
║ OPERATING SYSTEM ║ Continuum (R⁴)         ║ Discrete Graph (G)       ║
║                  ║ (Smooth Manifold)      ║ (Network/Lattice)        ║
╠══════════════════╬════════════════════════╬══════════════════════════╣
║ DYNAMICS         ║ Differential Equations ║ Algorithms / Rules       ║
║                  ║ (∂x/∂t)                ║ (If A then B)            ║
╠══════════════════╬════════════════════════╬══════════════════════════╣
║ TIME             ║ External Dimension t   ║ Internal Update Step     ║
║                  ║ (The "Block")          ║ (The "Tick")             ║
╠══════════════════╬════════════════════════╬══════════════════════════╣
║ PERSPECTIVE      ║ "View from Nowhere"    ║ Relational / Internal    ║
║                  ║ (God's Eye)            ║ (The User/Observer)      ║
╚══════════════════╩════════════════════════╩══════════════════════════╝

The End of the “View from Nowhere”

Classical physics assumed an objective state of the world that existed independent of observation, a “God’s eye view.” Wheeler, Rovelli, and Dowker all dismantle this.

In Causal Sets, the “growth” of the universe happens, but there is no external time parameter to track it. The process is internal.

In Relational QM, there is no “state of the universe,” only states relative to specific subsystems.

In Holography, the description of the universe depends on where you place the boundary.

The “Bit” is always perspectival. Information is not a thing that exists in the void; it is a measure of correlation between two entities. The dematerialization of the “It” brings with it the realization that reality is fundamentally relational.

Synopsis: The Informational Turn 1957–2025

Wheeler’s “It from Bit”, Bekenstein–Hawking horizon entropy, causal set theory (Sorkin–Dowker), relational quantum mechanics (Rovelli), and holography (Maldacena–Susskind) converge on a single conclusion: geometry, time, and matter are emergent from a more primitive informational substrate composed of relational quanta (causal links, entanglement, qubits).

The Eternal Recurrence of the “It” and the Second Revolution

The quest for the fundamental constituent of reality has followed a cyclical pattern across millennia. Two archetypal models repeatedly contend: the discrete, which posits indivisible units moving through a void (Democritus’s atoms, Kaṇāda’s paramāṇu, Ashʿarite time-atoms, Newton’s corpuscles), and the continuous, which envisions an unbroken plenum of connection and resonance (Anaximander’s apeiron, Stoic pneuma, Chinese qi, Descartes’s vortices). Newton appeared to crown the discrete view with his hard, inert particles set against absolute emptiness. Yet history proved otherwise. Concepts once marginal, Kaṇāda’s unseen forces (adṛṣṭa), Chinese resonance (gǎnyìng), and Mohist relational time, re-emerged as electromagnetic fields, quantum entanglement, and relativistic spacetime.

By 1905 the classical “It” had already dissolved. Newton’s solid mass gave way to Leibniz’s perceiving monads, Maupertuis’s teleological Action became Hamilton’s abstract variational principle, heat and motion fused into Boltzmann’s statistical ensembles, and action-at-a-distance yielded to Faraday-Maxwell fields. The ether crisis exposed the final contradiction: a mechanical universe could no longer rest on an absolute, rigid stage. Matter was no longer a thing but a ripple, a probability, a curvature.

Einstein and Minkowski fused space and time into a dynamic continuum, making the stage itself an actor. For half a century geometry seemed ultimate. Yet Wheeler’s mid-century program revealed that even curved spacetime collapses at Planck scales into quantum foam. The true revolution, the second after relativity, was the recognition that geometry is emergent. Contemporary approaches converge on this insight:

  • Causal Set Theory (Dowker) replaces the continuum with a discrete partial order of events, which the spacetime continuum merely approximates.
  • Loop Quantum Gravity (Rovelli) derives geometry from relational spin networks; time dissolves into thermal perspective.
  • Holographic duality (Maldacena) projects bulk spacetime from entanglement on a lower-dimensional boundary.

In each case the fundamental entities are not material points or geometric manifolds but causal links, relational quanta, and qubits. The universe is not a machine governed by prior laws. It is a participatory information-processing system. As Wheeler declared, physical reality arises from the answers to yes/no questions posed by observers embedded within the system itself.

The “It” has returned to its ancient discrete roots, yet transformed. The atom is now the bit, the causal connection, the entangled correlation. The void is not empty. It is pregnant with potential observations. What began with Thales’ water and Democritus’s atoms has culminated in the realization that the world is made neither of stuff nor of seamless fabric, but of information, the ultimate, self-referential substrate from which both continuity and discreteness emerge.

Starting from the premise that information is a fundamental constituent of reality, the first and most crucial question is: What is the simplest possible “bit” of reality and the simplest process of “participancy” from which a universe could emerge? We conclude that a single point is structurally sterile, lacking the relational potential for evolution. A single qubit is pure potential, a description of what could be, not what is. Its measurement outcome in a given basis is random, incapable of predicting anything beyond its own statistics. For a measurement to be meaningful, a relationship must already exist.

The Logical Operations of Reality

The historical trajectory from substance to field to information leaves us with a universe composed of bits, qubits, or causal sets. Yet, a fundamental problem persists: data without a processor is static. A "bit" is merely a state; it does not explain its own evolution or its own persistence.

To move from a description of states to a theory of dynamics, we must look to the logical operators that govern the relationship between pieces of information. If reality is fundamentally informational, its behaviors must derive from the two primary relationships available to any logical system: distinction and equivalence.

  • Inequality ($\neq$) is the Engine of Time. For a universe to be dynamic, the current state must be distinguishable from the next. The condition $A \neq B$ establishes a gradient, a difference in information potential. This inequality is the fundamental requirement for any transition to occur. It differentiates cause from effect and provides the "imperative" for the system to update. Without inequality, there is no sequence, only a static singularity.

  • Equality ($=$) is the Architecture of Space. For a universe to contain objects, it must possess stability. The condition $A = B$ establishes a state of equilibrium where the drive for transition ceases or cycles. This equality is the fundamental requirement for structure to emerge. It creates a "solution" to the informational flux, allowing a pattern to persist against the flow of change. Without equality, there is no durability, only fleeting noise.

These two conditions, the logical drive to differentiate and the constraint to balance, provide the minimal framework required to construct a universe that both flows and endures. As we move to formal construction, we shall see that the drive of Inequality physically manifests as the Causal Link, while the constraint of Equality stabilizes as the Closed Cycle.

From Potential to Prediction

Building Geometry from Causality

A prediction is a statement of correlation. It is the ability to measure a property here and, based on that outcome, infer a property over there. This requires a system of at least two parts whose states are correlated. The minimal structure that contains such relational information is not a point or a qubit, but a causal connection.

We therefore posit that the most primitive element of reality is the directed edge, or causal link, denoted $A \to B$. This is not a statement about objects $A$ and $B$. Instead, it describes the pure, directed relation of causal influence itself: the indivisible, pre-geometric atom of temporal order, “before implies after.”

While vertices (points, events) and edges (connections, relations) may be the simplest conceptual pieces of information, they are pre-geometric. Therefore, we propose a novel axiom: Relational cycles (loops) are the fundamental quanta of geometric information. This line of reasoning leads us to propose a foundation for the theory of Quantum Braid Dynamics, stated in two parts:

  1. The Primitive of Causality: The fundamental entity of the universe is the directed causal link, denoted $A \to B$. This is the irreducible atom of causal order.

  2. The Primitive of Geometry: The simplest, stable structure that can be built from these links, and the fundamental quantum of geometric information, is the closed 3-cycle, $A \to B \to C \to A$. This self-referential loop provides the first stable standard against which metric intervals can be quantified and structure can be measured.
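As a toy representation of these two primitives (a hedged sketch with hypothetical event labels, not a fragment of the formal theory), a "universe" can be stored as nothing but a set of directed links, from which the closed 3-cycles, the posited quanta of geometry, are read off:

```python
from itertools import permutations

# A toy causal "universe": only directed links exist. The events A..E and the
# particular links are arbitrary labels for illustration.
links = {("A", "B"), ("B", "C"), ("C", "A"),      # one closed 3-cycle
         ("C", "D"), ("D", "E")}                  # an open causal chain

def three_cycles(links):
    """Return the vertex sets of every closed 3-cycle x -> y -> z -> x."""
    vertices = {v for edge in links for v in edge}
    found = set()
    for x, y, z in permutations(vertices, 3):
        if {(x, y), (y, z), (z, x)} <= links:
            found.add(frozenset((x, y, z)))       # record the cycle once, up to rotation
    return found

print(three_cycles(links))                        # {frozenset({'A', 'B', 'C'})}
```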

From matter to motion, we now stand at the threshold where philosophical speculation must yield to formal construction. The task ahead is to translate these conceptual primitives into a precise deductive system capable of generating dynamics, geometry, and ultimately cosmology using only the minimal assumptions required for a self-consistent universe to build itself from relational information alone.