The Founding of an Event-Ontology: Verlinde's Emergent Gravity and Whitehead's Actual Entities
by
Jesse Sterling Bettinger
A Dissertation submitted to the Faculty of Claremont Graduate University in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Graduate Faculty of Religion and Economics
Claremont, California 2015
Approved by: ______
© Copyright by Jesse S. Bettinger 2015 All Rights Reserved
Abstract of the Dissertation
The Founding of an Event-Ontology: Verlinde's Emergent Gravity and Whitehead's Actual Entities
by
Jesse Sterling Bettinger
Claremont Graduate University: 2015
Whitehead’s 1929 categoreal framework of actual entities (AE’s) is hypothesized to provide an accurate foundation from which a revised theory of gravity can arise, compatible with Verlinde’s 2010 emergent gravity (EG) model, in which gravity is not a fundamental force but the result of an entropic force. By the end of this study we should be in a position to claim that the EG effect can in fact be seen as an integral sub-sequence of the AE process. To substantiate this claim, this study elaborates the conceptual architecture driving Verlinde’s emergent gravity hypothesis in concert with the corresponding structural dynamics of Whitehead’s philosophical/scientific logic of actual entities. This proceeds to the extent that both are shown to integrate mutually under the event-based covering logic of a generative process underwriting experience and physical ontology. In comparing the components of both frameworks across the epistemic modalities of pure philosophy, string theory, and cosmology/relativity physics, this study utilizes a geomodal convention as a pre-linguistic, neutral observation language—an intermediary between the two theories—wherein a visual event-logic is progressively enunciated in concert with the specific details of both models, leading to a cross-pollinated language of concepts shown to mutually inform each other. The geomodal framework is implemented in this study as an exegetical modeling convention. From this study we attempt to construct a set of narratives for string theory and the AE’s on the basis of an event logic and process ontology. Combining these two fields brings to light novel connections between the sciences and the humanities, as well as offering a method for realizing a new, narrative logic in string theory and the philosophy of mind.
"On ne voit bien qu'avec le cœur. L'essentiel est invisible pour les yeux." (One sees clearly only with the heart. What is essential is invisible to the eye.) Antoine de Saint-Exupéry
Acknowledgements
Heartfelt, special thanks where it is due. Gratitude to my advisors, Phil Clayton, Vatche Sahakian, and Paul Zak, for their willingness to take on the project; to Tim Eastman for providing useful feedback and intellectual motivation; to Edris Stuebner for positive encouragement and moral support over the years; to the Athenas, to the Stars, and to the Lions: you were the inspiration every step of the way; to the Negritto family for their generosity and for setting the tone for the year; to the Frazier family for their gracious friendship, talks, and example; to Scott Bracken and Jeremy Ognall for the phenomenal rugby experience and for taking us to the beach. Experiencing that was life-changing and provided the determination and wherewithal to see this project through to the end. You have no idea how much my time with you all has meant, but it has meant the world.
Table of Contents
PART I – Introductory Materials
1. Introduction – Prospectus + Methodology
2. Physico-Conceptual Foundations of Emergent Gravity

PART II – Outline of Models
3. Verlinde’s Emergent Gravity
4. Whitehead’s Actual Occasions

PART III – Comparative + Geomodal
5. Origination
6. Creativity + Synthesis

PART IV – Review
7. Discussion – Einstein and Whiteheadian Gravity
8. Conclusion – Review and Denouement
1. Introduction ------ 1
   1.1. Methodology ------ 3
   1.2. Conceptual and Phenomenal Placement ------ 5
   1.3. Scale ------ 7
   1.4. Ontology ------ 8
   1.5. Organization of Chapters ------ 9
2. Physico-Conceptual Foundations of Emergent Gravity ------ 13
   2.1. Emergence of XT ------ 13
   2.2. General Relativity ------ 15
      2.2.1. Curvature
      2.2.2. Expansion + Cosmological Constant
   2.3. Quantum Theory ------ 17
      2.3.1. Quantum Mechanics
      2.3.2. Quantum Field Theory
      2.3.3. Standard Model
      2.3.4. Gauge Theory
      2.3.5. Quantum Chromodynamics
      2.3.6. Yang-Mills Theory
   2.4. Dark Energy/Dark Matter ------ 21
      2.4.1. Accelerated Expansion
   2.5. Vacuum Energy of QM ------ 22
      2.5.1. Virtual Particles
      2.5.2. Vacuum as Plenum
   2.6. Quantum Gravity ------ 22
   2.7. Quantization v. Non-Quantization ------ 24
   2.8. UV/IR Mixing ------ 25
      2.8.1. Planck Scale
   2.9. Black Holes ------ 26
      2.9.1. Four Laws
      2.9.2. Statistical Mechanics
   2.10. Geometrical Entropy ------ 29
   2.11. Hawking Radiation ------ 31
   2.12. Black Hole Information Paradox; Entropy as Information ------ 32
      2.12.1. It from Bit → geometric entropy of Planck horizon
   2.13. Holographic Principle ------ 33
   2.14. String Theory ------ 35
      2.14.1. Basic History
      2.14.2. Open Strings
      2.14.3. N = 4 Super Yang-Mills Theory
      2.14.4. Closed Strings
         2.14.4.1. Closed strings as phonons
         2.14.4.2. Closed strings as coupling constants
         2.14.4.3. Quantum coupling constants as dynamical
      2.14.5. D-Branes
         2.14.5.1. Solitons
      2.14.6. Open/Closed String Correspondence
      2.14.7. Gauge/Gravity Duality → AdS/CFT Correspondence
   2.15. Emergence of Gravity ------ 48
   2.16. Summary ------ 50
3. Verlinde’s Emergent Gravity ------ 51
   3.1. Introduction ------ 51
   3.2. Non-Quantizational Approaches to QG ------ 51
   3.3. Sakharov’s Induced Gravity ------ 52
   3.4. Jacobson’s Gravitational Thermodynamics ------ 54
   3.5. Distinguishing Verlinde from Predecessors ------ 57
   3.6. Verlinde’s Entropic Gravity ------ 59
      3.6.1. Universality of Gravity
      3.6.2. Emergence of Space-time and Gravity
      3.6.3. Information
      3.6.4. Holographic Principle
      3.6.5. Entropic Force
      3.6.6. Polymers and Black Hole Thermodynamics
      3.6.7. Information and Storage on Holographic Screens
      3.6.8. Derivation of Newton’s Laws
      3.6.9. Emergence of Space
      3.6.10. Coarse Graining
   3.7. String Theoretic Approach ------ 67
      3.7.1. Open-Closed String Correspondence and AdS/CFT
      3.7.2. Matrix Theory
      3.7.3. Adiabatic Reaction Force
      3.7.4. Hidden Phase Space
      3.7.5. Inertia and Gravity as Adiabatic Reaction Forces
   3.8. The End of Gravity as a Fundamental Force ------ 70
   3.9. Summary ------ 71
4. The Actual Entities ------ 72
   4.1. Philosophy of Organism ------ 72
   4.2. Experiential Metaphysics and Speculative Philosophy ------ 74
   4.3. From Substance to Event Ontology ------ 75
   4.4. Uniquely Suited to Mathematical Physics ------ 79
   4.5. Actual Entities ------ 80
      4.5.1. Prehension
      4.5.2. Simple Physical Feelings
      4.5.3. Subjective Forms
      4.5.4. Initial/Subjective Aim and Decision
      4.5.5. Concrescence
      4.5.6. Satisfaction
      4.5.7. Unity and Determinateness
5. Origination, Emergence, Reenactment ------ 98
   5.1. Geomodal Construct ------ 100
      5.1.1. Geometry and Physics: Two Metrics, Not One
      5.1.2. Minkowski’s Lightcone
      5.1.3. Hypersurface of the Present and Manifold
      5.1.4. Ontological Immediacy v. Sensory-Conscious Present
   5.2. Whitehead and Verlinde Signatures in an Event-Ontology ------ 106
      5.2.1. Sea of Strands ------ 107
         5.2.1.1. Strands in Geomodal method
         5.2.1.2. Strands in Chew
         5.2.1.3. Quantum fluctuations and Casimir effect
         5.2.1.4. Strands as pre-XT microscopic data
         5.2.1.5. Link to Tachyonic String Theory (26d Bosonic)
         5.2.1.6. Strands as pure potentialities
      5.2.2. Snapshot + Photograph ------ 110
         5.2.2.1. Measurement Problem
         5.2.2.2. Snapshot (as Mechanism)
         5.2.2.3. Similar to Sen’s 1st example
         5.2.2.4. Snapshot of Frozen Strands
         5.2.2.5. Dn-brane of Open Strings
         5.2.2.6. Multiplicity of Initial Data
         5.2.2.7. Emptiness and Dependent Arising
      5.2.3. Holographic Dual of Snapshot ------ 121
         5.2.3.1. Initial Data → Objective Data ‘Reenactment’
         5.2.3.2. Open-Closed String Correspondence
      5.2.4. Phonon ------ 125
         5.2.4.1. Phonon is emergent from snapshot elements
         5.2.4.2. Similar to graviton radiating off a D0-brane
         5.2.4.3. Instead of graviton, phonons qua closed strings
         5.2.4.4. Phonon as a revival of the “objective datum”
   5.3. Dictionary and Summary ------ 129
6. Selection, Creativity, Synthesis ------ 133
   6.1. Phonon as Coupling for Prehension during Renormalization ------ 135
      6.1.1. Coarse-graining : Foliation = Prehension : Concrescence
   6.2. Prehension and Coarse-Graining ------ 139
      6.2.1. Prehension/Coarse-Graining as a Selection Process
      6.2.2. (-) Prehension & integration-out of open strings in a matrix
      6.2.3. (+) Prehension & open-string acquisition of expectation value
   6.3. Concrescence and Foliation ------ 145
      6.3.1. Phases of Concrescence
      6.3.2. Concrescence of feelings → emergent dimension of space
      6.3.3. Foliation of “feelings” → genetic phases
   6.4. Verlinde’s Matrix Theory ------ 148
   6.5. Satisfaction = Gravitational Self-Energy ------ 150
      6.5.1. Max. of Coarse-Graining → Newton’s Potential Φ
   6.6. Summary ------ 155
7. Discussion ------ 158
   7.1. The Principle of Relativity ------ 158
   7.2. Philosophical Distinctions Between Einstein and Whitehead ------ 160
      7.2.1. Experience
      7.2.2. Two Metrics, Not One
         7.2.2.1. The First Metric
         7.2.2.2. The Second Metric
      7.2.3. Space and Time
      7.2.4. Uniformity
      7.2.5. Measurement
      7.2.6. Simultaneity
   7.3. Comparing Whitehead and Einstein to Verlinde’s EG ------ 178
   7.4. Summary ------ 181
8. Conclusion ------ 182
   8.1. Science in an Emergent Paradigm ------ 183
   8.2. Précis ------ 187
      8.2.1. Prehension and Concrescence
      8.2.2. Satisfaction and Maximization of Coarse Graining
   8.3. Denouement ------ 197
      8.3.1. String Theory Epical Narrative
      8.3.2. AE’s Epical Narrative
      8.3.3. Two Aspects of One Process
      8.3.4. Closing
Chapter 1 – Preface/Methodology
I look for the hour when that supreme Beauty, which ravished the souls of those eastern men, and chiefly of those Hebrews, that through their lips spoke oracles to all time, shall speak in the West also. The Hebrew and Greek Scriptures contain immortal sentences that have been bread of life to millions. But they have no epical integrity; are fragmentary; are not shown in their order to the intellect. I look for the new Teacher, that shall follow so far those shining laws, that he shall see them come full circle; shall see their rounding complete grace; shall see the world to be the mirror of the soul; shall see the identity of the law of gravitation with purity of heart; and shall show that the Ought, that Duty, is one thing with Science, with Beauty, and with Joy.
- Ralph Waldo Emerson, closing statement of "Divinity School Address" (1838) to graduating class
This study is pursued in the philosophy of physics and aims to serve, in part, as a contribution to the postulates underlying physics through an exegesis of Verlinde’s (2010) emergent gravity (EG) model in the context of Whitehead’s (1929) process dynamics and his descriptive account of the actual entities (AE’s). The goal is to provide the foundations of contemporary physics with a philosophical foothold and narrative within a process and event logic. While Verlinde’s model is not the only emergent-gravity proposal, the way he constructs his case sets it most in line with Whitehead’s development of the AE’s. This study defends a joint (physical and philosophical) ontology beginning from processes and events. In this framework, not only gravity and string theory (Sakharov, 1967; Jacobson, 1995; Verlinde, 2010, 2011; Padmanabhan, 2012; Frampton, Kephart, 2005) but even space and time (Dijkgraaf, 2012; Seiberg, 2009) are emergently derived from processes and events (Whitehead, 1922).
As an interdisciplinary study in the truest sense, this effort is meant to bring together two highly technical disciplines—one written for experts in mathematical physics, the other for experts in Whiteheadian studies—in such a way as to honor the concomitance brought out progressively over the course of this report. To these ends, the neologisms of one subject area might not be immediately accessible to an expert in the other field; however, as we will encounter in the comparative chapters, the concepts and terms used by one field are made readily available by pairing them with a corresponding term from the other field. As such, we create a “dictionary” between the two fields. This allows an expert in physics and string theory to recognize familiar concepts within the different set of neologisms of Whitehead’s philosophy. Providing these dictionaries also guards against the claim from either side that it has been excluded from the ideas of the other; instead we provide an on-the-spot translation that lets both sides participate with each other.
In physics, the quest to understand the microscopic structure of space-time represents the driving method of scholars attempting to merge quantum theory with gravitation (Chivukula, 2010). Historically speaking, however, quantum mechanics, as a theory of the exceptionally small, appears incompatible with general relativity, as a theory of the exceedingly large. Such a tension drives Mäkelä to state that “instead of attempting to understand the microstructure of matter, we should…attempt to understand the microstructure of spacetime itself” (Mäkelä, 2010). Out of this apparent antagonism arises the sub-discipline of quantum gravity and the subsequent approaches that have emerged over the last half-century to reconcile gravity with the other known laws.
Still today, conceptual perspectives in physics continue to differ with regard to the understanding of space and time: while some scholars hold space and time to be fundamental and discrete (see, e.g., Polchinski, 1998; Rovelli, 2004; Gao, 2011), others like Verlinde take them to be emergent properties (see, e.g., Witten, 2004; Seiberg, 2006; Padmanabhan, 2012). It is suggested here that these differences can be attributed to artifacts of the distinction between substance and process metaphysics. They can likewise be seen as representative of two basic approaches to spacetime and gravity: the quantization approach to quantum gravity, and the non-quantizational, induced or emergent approaches.
In accordance with the general theme of this study, we direct our attention to the non-quantizational (emergent) approaches and look at two of the most substantial contributions within semi-classical methods: Sakharov’s induced gravity and Jacobson’s gravitational thermodynamics, both of which serve to set the mathematical and conceptual stage for Verlinde’s 2009 insight that gravity may not be a fundamental force but a macroscopic phenomenon emerging as the result of thermodynamic principles applied to phase-changes of information in dynamic mass-distributions (Chivukula, 2010).
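To anchor the discussion, the core steps of Verlinde’s entropic derivation of Newton’s laws (sketched here schematically, following his 2010 paper) run as follows:

```latex
\begin{align}
  F\,\Delta x &= T\,\Delta S
    && \text{(entropic force near a holographic screen)} \\
  \Delta S &= 2\pi k_B \,\frac{mc}{\hbar}\,\Delta x
    && \text{(entropy change for a displaced mass $m$)} \\
  k_B T &= \frac{\hbar a}{2\pi c}
    && \text{(Unruh temperature of an accelerated screen)} \\
  \Rightarrow\quad F &= ma
    && \text{(Newton's second law)}
\end{align}
```

For a spherical screen of radius $R$ storing $N = Ac^3/(G\hbar)$ bits on its area $A = 4\pi R^2$, with equipartition $E = \tfrac{1}{2} N k_B T$ and $E = Mc^2$, the same reasoning yields Newton’s law of gravitation, $F = GMm/R^2$. Nothing in the derivation depends on the microscopic details of the underlying theory—which is precisely the universality claim at issue.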
To advance this logic, a geomodal method will be employed in chapters five and six to describe a kinematic and dynamical event-sequence, serving in addition as an exegetical standard used to compare Verlinde’s EG to Whitehead’s AE’s. We often think of the term “geometry” as referring to shapes and their properties. The geomodal method adds another layer by including spaces-within-shapes (or attributes of a natural symbol) as ontologically and conceptually significant. Here, a ‘physical symbol’ arises from Minkowski’s 4d lightcone of space-time, a central tenet both of Einstein’s Special Relativity and of Whitehead’s first metric. The purpose of the geomodal method is to situate an event-ontology at the seat of experiential/material dynamics in the form of a basic, pictorial logic that is easy to comprehend.
In this study, Whitehead’s 1929 categoreal framework of actual entities is hypothesized to provide a coherent foundation from which Verlinde’s revised theory of gravitation can arise, compatible with his 2010 emergent gravity model—not as a fundamental force, but as the result of an entropic force qua thermodynamics and string theory. If this can be established, this study aims to show how the EG effect could be interpreted as an integral sub-sequence within the description of Whitehead’s AE process. From this we will propose that an event, rather than a substance, serves as the foundational unit in physics. This signifies a shift in both the philosophical and physical paradigms from material to events and process—out of which material values emerge: a both/and (see Eastman, 2009).
At first blush there might seem little reason for trying to link two such seemingly disparate fields and concepts. After all, the AE’s were developed out of a discontent with substance metaphysics, in response to which Whitehead attempted to describe a science predicated on, and amenable to, experience. Verlinde’s model, on the other hand, developed as an insight into a non-fundamental description of gravity predicated on the physics of gravitational thermodynamics, statistical mechanics, and the holographic principle in the context of black holes and string theory. In another sense, however, the potential to recognize Verlinde’s gravitational theory in light of Whitehead’s actual entities does not come as a complete surprise. In fact, there are a few substantive reasons why a theory of emergent gravity should find comport in Whitehead’s categoreal program. We consider these now.
1.1 – In Methodology: speculative philosophy and speculative physics
Both Verlinde’s and Whitehead’s conceptual frameworks can be recognized as speculative ventures into speculative physics/cosmology and speculative philosophy, respectively. Bradley defines speculative philosophy succinctly as “a theory of the conditions of the actualization of the empirical world” (2007). We use this definition for our study. For Whitehead, speculative philosophy qua metaphysics is “the science which seeks to discover the general ideas which are indispensably relevant to the analysis of everything that happens” (RM 84). As he states: “my arguments must be based upon considerations of the utmost generality untouched by the peculiar features of any particular natural science” (PRel 14). During Whitehead’s time this was likely the safe bet, given the state of physics, though today we have uncovered new models that can in fact make contact with the natural sciences from the location of Whitehead’s AE’s, as will be identified and highlighted in this study.
On Verlinde’s side, of all the physical theories about nature, only two branches are independent of the concrete details of the system being considered: thermodynamics and relativity (see Liu, 2010). 1 This sets them at the scale and order of universal theories, describing what Whitehead would agree are some of the most general features of nature. As Liu explains, “thermodynamics and relativity are two theories about the universal principles every physical system must obey and are hence referred to as principle theories” (Liu, 2010). As such, we have from the start a stable basis for comparing the universality and generality of Verlinde’s and Whitehead’s programs. Verlinde further describes the universality of gravity in another (2010) passage:
Of all forces of nature, gravity is clearly the most universal: gravity influences and is influenced by everything that carries an energy, and is intimately connected with the structure of space-time. The universal nature of gravity is also demonstrated by the fact that its basic equations closely resemble the laws of thermodynamics and hydrodynamics.
Aptly, both Whitehead’s and Verlinde’s programmes 2 can be shown to derive from Aristotelian “first principles” qua the most general concepts and universal phenomena.
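The resemblance Verlinde points to is concrete: the first law of black hole mechanics maps term by term onto the first law of thermodynamics once Hawking’s temperature and the Bekenstein-Hawking entropy are identified (a standard correspondence, sketched here for reference):

```latex
\begin{align}
  dE &= T\,dS
      && \text{(first law of thermodynamics)} \\
  dM\,c^2 &= \frac{\kappa c^2}{8\pi G}\,dA
      && \text{(first law of black hole mechanics)} \\
  T_H &= \frac{\hbar\kappa}{2\pi k_B c},
  \qquad
  S_{BH} = \frac{k_B c^3 A}{4 G \hbar}
      && \text{(Hawking temperature, Bekenstein--Hawking entropy)}
\end{align}
```

Here $\kappa$ is the surface gravity at the horizon and $A$ the horizon area; substituting $T_H$ and $S_{BH}$ into $T\,dS$ reproduces the black hole first law exactly, which is the structural analogy that motivates treating gravity thermodynamically.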
1 Renormalization Group theories in QFT are known to yield multiple levels that could also function partially independent of micro-system details; this possibility will be addressed in chapter six.
2 The Lakatosian research programme (1978) provides a framework within which research can be conducted on the basis of 'first principles.' As such it also resembles Kuhn's notion of a paradigm (1962).
Both are methodologically approached at the level of a speculative venture and as general ideas. Speculative philosophy as conceived by Whitehead represents “the endeavor to frame a coherent, logical, necessary system of general ideas in terms of which every element of our experience can be interpreted” (see Sherburne, 1966). As he explains (PR 6):
The first requisite is to proceed by the method of generalization so that certainly there is some application; and the test of some success is application beyond the immediate origin. In other words, some synoptic vision has been gained. In this description of philosophic method, the term ‘philosophic generalization’ has meant ‘the utilization of specific notions, applying to a restricted group of facts, for the divination of the generic notions which apply to all facts.’
Turning to the basis of Whitehead’s approach as a ‘speculative metaphysics,’ linked by Ramal (2003) to Aristotle’s ‘first philosophy,’ we can also recover the ingrained motivation for framing reality from the first principles of ‘being qua being’ (Aristotle). As Ramal explains: “the first philosopher looks for the first principles that render reality intelligible by means of descriptive generalizations” (2003). Whitehead is also known to have predicated his philosophical method on the pursuit of “imaginative rationalization” (PR 7), or what he also calls “descriptive generalization” (PR 15). For Verlinde things are also suitably generalized; as he describes, “starting from first principles and general assumptions, using only space independent concepts like energy, entropy and temperature,” his paper shows how Newton’s law of gravitation “appears naturally and practically unavoidably” (2010). Ramal links the ‘first principles’ to being-as-such (Aristotle); he explains:
Since the “essential attributes” of being as such are the first principles, first philosophy differs from mathematics and the other sciences in that it seeks to study the most universal first principles, not simply the general principles or causes of a particular aspect of reality (Ramal, 2003).
Where Verlinde indicates a method predicated on a description of the general principles underwriting the emergence of gravity as a ‘universal feature,’ intimately linked to the structure of space-time and ‘influenced by everything that carries energy,’ so too do the topics of space and time have chief import in the description of phenomena underwriting reality as expressed in Whitehead’s Principle of Relativity (1922) and Process and Reality (1929), in constructing the categoreal scheme of the AE’s. As will be shown in chapters seven and eight, Whitehead’s model defines space and time as abstractions from AE’s.
While linked to a firm physical basis, Verlinde’s model is still patently speculative and communicated almost exclusively at the level of first principles and general ideas. As he states: “I use a lot of ideas from string theory but…I feel one should try to extract the essence from it and start from certain principles…I think the principles will be more important” (2011). In another case he develops his motivation further: “I’m more interested in finding out how nature works” (Verlinde, 2011). What few equations are found in his 2010 paper, he admits in an interview, were just for making a point to the reader and were not even essential to conveying the idea of the hypothesis (see 2011). This might sound like a reason not to pay much attention to the model from a physicist’s perspective, but from our philosophical standpoint, Verlinde’s model—plus concepts in similar ones: Berenstein, 2006; Li and Wang, 2012—provides the kind of framework most useful for assessing compatibility with Whitehead’s actual entities.
1.2 – In Conceptual and Phenomenal Placement: the AE’s are housed within a principle of relativity that stands alongside Einstein’s theory in its most important predictions. It will be shown in the discussion how the conceptual differences between Whitehead’s and Einstein’s theories set the former closer in alignment with Verlinde.
In 1922 Whitehead wrote The Principle of Relativity with Applications to Physical Science (PRel) with the aim of reformulating Einstein’s theory of gravity in such a way that “gravity would no longer be identified with the allegedly variably curved space-time, but with a physical interaction (Whitehead’s gravitational impetus) that can be defined against the uniform background of Minkowski’s space-time ” (Desmet, 2010). As Whitehead explains:
The present work is an exposition of an alternative rendering of the theory of relativity. It takes its rise from that “awakening from dogmatic slumber”—to use Kant’s phrase—which we owe to Einstein and Minkowski. But it is not an attempt to expound either Einstein’s earlier or his later theory. The metrical formulae finally arrived at are those of the earlier theory, but the meanings ascribed to the algebraic symbols are entirely different. (PRel, v)
In fact, the two theories can be considered largely equivalent in many important respects. For example, Fowler constructed an interpretation of Whitehead’s theory qualifying it as an alternate, mathematically equivalent presentation of GR (see Fowler, 1974). As Bain elaborates, Whitehead’s theory “makes the same predictions as general relativity with respect to the perihelion advance, the deflection of light rays and the gravitational red-shift; indeed, Eddington (1924) has shown that it is equivalent to the Schwarzschild solution of Einstein’s field equations for the one-body problem” (Bain, 1998).
For Whitehead the geometric structure of nature grows out of the relations among actual entities (Fowler, 1974). Unlike Einstein, Whitehead was after a theory predicated on experience, broadly interpreted. Desmet explains how in the preface to Whitehead’s Principles of Natural Knowledge (1919) he stresses that “ the modern theory of relativity, because of its union of space and time, has opened the possibility of a new answer to the question of how the space of physical geometry can be conceived as the logical outcome of generalizations from experience ” (quoted in: Desmet, 2007). As a result of this, Whitehead’s theory “ holds a different paradigm from Einstein's—elegant and simple in mathematical formulation and with its own philosophical background. It has been called a thorn in Einstein's side because it agrees with Einstein in its prediction for all the classical tests ” (Tanaka, 1987). This guides us towards the realization that the real issues between Einstein and Whitehead are not physical but philosophical (see Desmet, 2010). As Fowler expresses: “ No empirical test can decide the issue of the adequacy of Whitehead's basic theory of relativity. This issue must be settled on other grounds ” (Fowler, 1974). To these ends we seek a philosophical and contextual assessment.
For Whitehead, the principle of relativity was paramount to his speculative metaphysics: “The doctrine of relativity affects every branch of natural science, not excluding the biological sciences” (PRel 3). We see Verlinde’s model also make indirect contact with the biological through his concurrent development of the polymer example throughout the 2010 paper. This provides a level of contact not witnessed in Einstein’s more mentalistic framework, as can be seen on account of the fact that, as Fowler explains:
The key foundational principles of Einstein’s theory -- the constancy of the velocity of light and the equivalence principle -- are postulates which are the free creations of the mind and not open to immediate experience. (Fowler, 1974)
Einstein goes even farther, fully in keeping with a neo-Kantian perspective, saying that "time and space are modes in which we think, not conditions in which we live." By contrast, Whitehead appeals to the “immediate experience of simultaneity and the contemporary world as the foundation of relativity” (Fowler, 1974). This is, for Whitehead, a predication of space and time as abstractions from events as relations; as such they are also shown to be emergent and creative. As he explains:
The whole investigation is based on the principle that the scientific concepts of space and time are the first outcome of the simplest generalizations from experience, and that they are not to be looked-for at the tail end of a welter of differential equations. (Whitehead; PNK vi)
Whitehead's theory of relativity is so closely connected with the processual nature of his speculative metaphysics that we cannot attempt to understand it without paying due attention to his philosophy. As Bain reiterates, “The ontological relationship between the two must be fleshed out in the context of Whitehead’s philosophy of nature” (Bain, 1998). Thus we can be led to the view, as Fowler draws it, that Whitehead’s theory of gravitation offers a framework based within a comprehensive philosophy of nature, whereas Einstein’s model bears little resemblance to anything approaching experience (see Fowler, 1974).
As described at the beginning, in distinction from Einstein, the formula Whitehead adopts for the gravitational field involves both the flat metric of Minkowski space-time and a dynamic metric dependent on the presence of source masses. In order to find a mathematical expression for the law of gravitation, Whitehead introduces a second metric, dJ², to represent the gravitational field of a particle and describe the way a particle “pervades” its future (PRel 74). This is specifically developed with the AE’s in mind. As Whitehead explains, the “individual peculiarities of actual occasions” represent the properties of the physical contingent world (dJ²), while the "background of systematic geometry" represents the metric of uniform background space-time, dG² (PRel 58).
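Schematically (a simplified sketch rather than Whitehead’s own notation in PRel), the two-tensor construction can be pictured as a physical line element built from the uniform background plus the contingent gravitational term:

```latex
\begin{align}
  dS^2 &= dG^2 + dJ^2
     && \text{(physical metric: background plus impetus)} \\
  dG^2 &= c^2\,dt^2 - dx^2 - dy^2 - dz^2
     && \text{(uniform Minkowski background)}
\end{align}
```

where dJ² is the contingent “gravitational impetus” determined by the distribution of source masses. The contrast with Einstein is that dG² remains uniform and knowable in advance of any measurement, while all physical contingency is carried by dJ².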
Whitehead describes the physical field in his Principle of Relativity as expressing “the unessential uniformities regulating the contingency of appearance” (PRel 8). Similarly, to complete the quote from above, in Process and Reality he describes the physical field as the "interweaving of the individual peculiarities of actual occasions on the background of systematic geometry" (PR 507). The geometry is systematized to the extent that all values are valued in a mode of being; that is, from derived hypersurfaces of a manifold. When we advance to Verlinde’s approach, the move to introduce the AE’s into what Verlinde refers to as the inessential microscopic information has a very proper and defensible justification: to the extent that Whitehead defines the physical field as expressing the “unessential uniformities regulating the contingency of appearance,” Verlinde’s model locates “inessential microscopic data” predicating the emergent gravity effect. As he explains: “The universality of gravity suggests that its emergence should be understood from general principles that are independent of the specific details of the underlying microscopic theory” (2010).
With each distinction raised in this chapter we aim to show how Whitehead’s model draws closer in semblance to Verlinde’s approach than to Einstein’s. Whitehead’s two-tensor construction is one of the key distinctions; perhaps equally important, however, is how Whitehead programmed the AE’s directly into the microscopic details of his theory of gravity. The major claim (to be developed in chapter seven) is that maintaining the notion of gravity through a real, physical interpretation, as Whitehead does, ultimately brings us closer to Verlinde’s development of gravity as arising through an entropic force than to Einstein’s notion of gravity as solely the result of geodesics and geometry. 3
1.3 – Scale: Both AE’s and EG describe dynamics at the smallest level of phenomena as well as in the largest, cosmological contexts. In physics this is referred to as UV/IR mixing.
Our motivation for linking Whitehead’s AE’s to Verlinde’s EG draws from dynamics encountered on the order of string theory. String theory combines quantum mechanics and general relativity into one framework in a rather elegant way. That said, string theory is also subject to revision owing both to a lack of experimental tests and to theoretical loose ends. As Verlinde explains: “String theory has many correct elements, but I think we need to rethink the starting point. We have all kinds of elements but we don’t really know how they hang together. We have to find this new starting point” (Verlinde, 2011). Out of this Verlinde hopes to change the view of string theory from a given to an emergent process. I propose that Whitehead’s cosmology of “actual entities” offers a potential platform for this venture and that an event-logic represents the desired starting basis.
If we are to take Whitehead seriously in acknowledging the categoreal scheme of actual entities as describing the most fundamental values and dynamics in the “experience of subjects”—apart from and behind which “there is nothing, nothing, bare nothingness” (PR 167), and that “there is no going behind actual entities to find anything more real” (PR 23)—then this implies that the pursuit of fundamental dynamics must also be located at the smallest distance-scale of nature: the Planck scale. It is here that we should expect to encounter, at least in part, the type of dynamics that can be correlated with what Whitehead had in mind for the AE’s.

3 A program of completing the geometrization of general relativity was attempted in the 1960s by Wheeler and colleagues. This “geometrodynamics” represents the bid to describe space-time and associated phenomena wholly in terms of geometry. More recently, Isham and Butterfield (1999) developed a quantum version to evaluate work toward a quantum theory of gravity.
‘t Hooft surmises that when we reach the Planck scale we should encounter a new type of dynamics he refers to as pre-quantum (1999). As Verlinde explains, the microscopic theory is without space or the laws of Newton. The hypothesis tendered in this study suggests that the precursors of AE’s, ‘continuous potentialities’ (PR 102), are what dwell, at least in part, at the Planck scale and pre-quantum level, like a sea of bosonic strings. In addition, within physics we encounter phenomena at the Planck distance-scale in string theory, UV/IR mixing, QFT/tachyons, and the holographic principle plus the AdS/CFT correspondence. 4 These prove rich in their descriptions of dynamics that can be read into the AE program with ease through an event-ontology. As Whitehead explains:
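For reference, the Planck scale invoked here follows from combining the characteristic constants of quantum theory (ħ), relativity (c), and gravitation (G):

```latex
\ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6 \times 10^{-35}\ \mathrm{m},
\qquad
t_P = \frac{\ell_P}{c} \approx 5.4 \times 10^{-44}\ \mathrm{s}
```

It is only at these scales that all three constants matter at once, which is why the Planck regime is the natural meeting ground for the quantum-gravitational phenomena just listed.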
The actual entities are the final real things of which the world is made. There is no going behind actual entities to find anything more real. […] The final facts are, all alike, actual entities; and these actual entities are drops of experience, complex and interdependent. (PR 18)
By assuming string theory as the actual physical basis for linking up with the AE’s—given Whitehead’s description of speculative metaphysics and Verlinde’s development of string theory, both predicated on dynamics taking place at the smallest distance-scale of spacetime—we should expect phenomena described by string theory to bear some resemblance to Whitehead’s description of AE’s. In fact, this is precisely what occurs: a vivid overlap between the concepts underwriting both descriptions, out of which string theory will be shown to provide a physical basis for the AE’s. These connections are spelled out in precise detail and an epical ordering in chapters five through seven, in a way that can be read against other sequential interpretations of the AE’s such as those in Cobb (2008) and Ford (1974).
1.4 – Ontology: AE’s and EG both describe emergent phenomena
Like entropic gravity, the AE’s are duly predicated in the light of emergent phenomena: creative satisfactions qua final concrescences that emerge from the combinatorial (positive and negative) prehensive dynamics of a feeling-tone amidst a collection of multiplicities of objective data stemming from a set of initial data. As Cobb explains:
For the most part the occasion and all its prehensions express the causal efficacy of past occasions. The prehensions are better understood as expressing their causal efficacy in the constitution of the new, emergent occasion, which only comes into being as these prehensions integrate in it. (Cobb, WRB, 2008; p.35)
On Verlinde’s behalf, “Newton's law of gravitation is shown to arise naturally and unavoidably in a theory in which space is emergent through a holographic scenario” (2010). In fact, he considers gravity, space-time, and strings all to be emergent phenomena. Seen in this light, Whitehead and Verlinde are both shown to seek the general principles underwriting emergent phenomena: the AE’s in Whitehead’s categoreal framework predicated on a two-tensor approach to space-time and relativity, and space-time/gravity in Verlinde’s approach. From a wide view, this represents a general nod on the conceptual level to the paradigm of process/event logic over that of classical substance metaphysics.

4 We review these concepts in chapters two and three.
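The core of Verlinde's (2010) argument can be summarized in a few lines. A test mass m approaching a holographic screen changes the screen's entropy; combining this with the entropic-force relation, the holographic counting of bits on the screen, and equipartition of the screen's energy recovers Newton's law:

```latex
% Entropy change near the screen, and the entropic-force relation
\Delta S = 2\pi k_B \frac{mc}{\hbar}\,\Delta x, \qquad F\,\Delta x = T\,\Delta S
% Holographic bit count on a spherical screen of radius R,
% and equipartition of the enclosed energy E = Mc^2
N = \frac{A c^3}{G\hbar} = \frac{4\pi R^2 c^3}{G\hbar}, \qquad
E = \tfrac{1}{2}\,N k_B T = M c^2
% Eliminating T and N yields Newton's law
\Rightarrow\quad F = \frac{G M m}{R^2}
```

Gravity thus appears not as a fundamental interaction but as a statistical tendency toward maximal entropy on the screen.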
Given these four qualifications, it is no surprise that a tertiary framework predicated on an event-ontology and experientialism can be developed to demonstrate the same basic dynamics at play in both programmes. What is a surprise, however, is that while Whitehead’s alternative theory might not have resolved “the true identity of gravity” (Emerson, 1838), his Principle of Relativity—predicated on the categoreal framework of actual occasions—might still have brought crucial light to the matter as a harbinger of what would eventually be outlined by Verlinde, in concert with basic ideas pursued first by Sakharov, Bekenstein, and Jacobson and later by Padmanabhan, Liu, Lee, and others. Put simply, Whitehead’s framework could provide the philosophical groundwork for emergent approaches in physics and cosmology.
This study aims to show that Whitehead and Verlinde’s frameworks are not only co-relevant but closely coinciding. Even if Verlinde’s model requires further adjustments from within physics, and even if entropy were not the ultimate basis of emergent gravity but instead something like conformal matter in a world crystal (as in Danielewski, 2007; and Kleinert, 1987), the general concepts underwriting his approach should still hold even if some of the details ultimately develop differently. To substantiate this claim, this study elaborates the conceptual architecture driving Verlinde’s emergent gravity in concert with the corresponding structural dynamics of Whitehead’s philosophical and scientific logic comprising actual entities. This proceeds to the extent that both are shown to mutually integrate under the covering-logic of a generative process and event-cycle underwriting experience and the physical universe. In comparing the components of both frameworks across the epistemic modalities of pure philosophy and cosmology/relativity physics, this study utilizes a pictorial modeling convention as a tertiary, “neutral observation language”—like an augur between the two theories—wherein an event-logic is progressively enunciated in concert with the specific details of both models, leading to a cross-pollinized, mutually-informing language.
1.5 – Organization of Chapters
In order to set the stage for a comparative analysis of the details underwriting emergent gravity and actual entities, a chapter is initially spent introducing the physico-conceptual foundations in their historical and conceptual contexts. This serves as a logical narrative for emergent gravity, beginning with an introduction to the historical branching of quantum gravity approaches—predicated on the desire to unify quantum mechanics with relativity—and with regard to the decision whether to quantize gravity or not. This study focuses on non-quantization approaches. From here we select for detailed analysis two
definitive works characterizing semi-classical methods within the non-quantization approach and setting the stage for Verlinde’s ultimate paper: Sakharov’s (1967) induced gravity and Jacobson’s (1995) gravitational thermodynamics.
Predicated on the early triumphs of Sakharov and Jacobson’s non-quantization approaches to quantum gravity—and taken in tandem with foundational breakthroughs in black hole thermodynamics beginning with Bekenstein, Bardeen, Carter, and Hawking (1973)—a step-by-step conceptual outline of the ‘emergent gravity’ hypothesis is framed within Verlinde’s 2010 paper in chapter three. With Verlinde’s proposal the notion of emergent gravity receives a solid conceptual foundation using minimal equations to grasp the idea in an accessible way; thus, it is presented as a general theory on the order of a speculative proposal. While it is generally understood that Verlinde’s model is not exacting in all mathematical details and further work remains, the generality of Verlinde’s approach provides an ideally suited perspective for comparing essential features of that physics-based approach with Whitehead’s speculative model.
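The foundational results of black hole thermodynamics invoked here are the Bekenstein–Hawking entropy, proportional to the horizon area A rather than to the enclosed volume, and the Hawking temperature of a black hole of mass M:

```latex
S_{BH} = \frac{k_B c^3 A}{4 G \hbar},
\qquad
T_H = \frac{\hbar c^3}{8\pi G M k_B}
```

The area-scaling of entropy is the seed of the holographic principle on which Verlinde's proposal builds.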
The “actual entities” chapter (four) frames the horizon in which AE’s are located: an experience-based process paradigm and philosophy of organism within a speculative philosophy and process (event) ontology. The paradigmatic update from substance- to event-metaphysics implicit in this move is also described (see Eastman, 2009). A basic framework for the elements discussed in the comparative chapters is given by way of an introduction to Whitehead’s “categories of the ultimate” and “categories of existence,” leading to the “corrected categories of existence” (see PR). The corrected version becomes important in the comparative chapters for grasping the sense in which Whitehead ultimately comes to terms with the non-fundamental (emergent) nature of ‘multiplicities’ in a way that proves to overlap nicely with Verlinde’s recognition of string theory as emergent.
After introducing both AE’s and EG sufficiently to acquaint the reader with the general ideas underwriting each program, the next two chapters divide Verlinde and Whitehead’s programs into three phases composed of different elements shown to interrelate in progressively correlated dynamics through a conceptual exegesis of events.
For Whitehead, chapter five covers the span from the basic setting to the holographic dual (or reenactment) of a ‘multiplicity’ of ‘initial data’ into ‘objective data.’ For Verlinde, chapter five draws from the nature of UV/IR mixing, D-branes of open strings, and the open/closed string correspondence.
After introducing the geomodal model, chapter five aims to correlate four connections between Verlinde and Whitehead’s models, beginning with (1) Whitehead’s “eternal objects” in flux with Verlinde’s ‘microscopic information’ qua ‘pre-event strands’ as discussed by Chew (2004). This initial environment will be shown to yield a clarification of the ‘measurement problem’ in physics, not as a direct collapse of the wave-function, but as a sampling process of the local vacuum through the hypersurface of a manifold.
This gives rise to (2) a “multiplicity” of “initial data” qua “open strings” on a “D-brane” and leads to the formation of (3) a “primary datum” correlating with a “closed string,” or a “phonon.” In (4), recognition of the “snapshot” as holographic leads to a model of dual-projection. This is shown to resolve what this study calls the “data/datum dilemma,” representing the historical tension in Whitehead’s efforts to clarify whether ‘concrescence’ begins with a ‘primary datum’ or a multiplicity of ‘data.’
The most salient points of chapter five shine light on the emergent nature of strings and on how Whitehead’s correction of his “categories of existence” can be shown to reflect this same line of thinking. Here, Whitehead removes two of the original seven categories, ‘multiplicities’ and ‘objects,’ for reasons based on the recognition that they are not in fact fundamental categories but instead emergent values formed within the process itself.
In another instance, it is remarkable that the last-minute substitution in the PR drafts of an “initial multiplicity” for the “primary datum” (in terms of how concrescence begins) is the single most significant edit between the Gifford draft and the final PR draft; furthermore, the primary datum is itself the single most-cited example in the Gifford draft (Ford, 1984). This demonstrates a certain struggle for Whitehead in ultimately and categorically replacing the most-used concept of the earlier drafts (the original datum) with an initial multiplicity—unable to reconcile how to maintain the process and arrive at objective data if an initial multiplicity leads to a primary datum (see Ford). We account for this through Whitehead’s anticipation of aspects of the AdS/CFT correspondence, which was not developed until the late 1990s. In the capacity of Whitehead’s model, the AdS/CFT correspondence proves essential for explaining how the initial multiplicity of data can be “reenacted” into “objective data” ready for “prehension.”
Chapter six picks up with the “primary datum,” or ‘phonon,’ ready to “prehend” the holographic dual of the snapshot-sample (of the wave-function) qua “objective data” of the “initial data,” in Whitehead. Compared to a “closed string,” the phonon acts as a “coupling constant” during the phase of renormalization (see Verlinde’s 2012 lecture). This is linked to Whitehead’s description of “prehension” (+/−) and “concrescence” as correlated with the renormalization procedures of coarse-graining and foliation in Verlinde’s approach. Here, “negative prehension” is shown to correlate with Verlinde’s ‘exclusional’ property of coarse-graining and his use of the book-keeping device of Newton’s potential, ɸ; as such it can be given a role not originally realized (or specified) in Whitehead’s works.
In order of cyclicity: prehension and the phases of concrescence qua renormalization à la coarse-graining and foliation → maximization of coarse-graining → Whitehead’s satisfaction qua gravitational self-energy and Verlinde’s emergent gravity effect qua the accumulative reaction-force of the negative prehensions. The ‘maximum’ of coarse-graining is like a polymer thermalized onto the horizon (see Bekenstein, 1973; Verlinde, 2010). The contact of the polymer with the horizon is comparable with the ultimate outcome of the satisfaction as the “final, real things of which the world is made up” (PR 23).
Following the correlational exercise of the two comparative chapters, the discussion chapter (seven) looks at the major philosophical and conceptual distinctions between Whitehead and Einstein’s theories of relativity, with the intent of framing a case for how Whitehead’s version can be recognized as more aptly suited to understanding the conceptual background of Verlinde’s emergent gravity hypothesis than are standard interpretations of Einstein’s approach.
Broadly, Whitehead offers an accompanying philosophical framework for his theory, whereas Einstein’s renders more of a purely mental/visual construct than one that actually ties into our lived experience of the world and its phenomenology. On a mathematical level, Whitehead’s model is also shown to resemble Verlinde’s in a few specific places; for instance, in the case of two versus one tensors: Verlinde, Whitehead, and Minkowski all selected two tensors, but Einstein combined them into just one. In other instances (space and time, non-locality, uniformity, light and measurement, and simultaneity, including what Whitehead refers to as “presentational immediacy”), additional examples of conceptual differences between the two are developed.
The final, concluding chapter (eight) begins with a cumulative review of the narrative and details acquired throughout each chapter, and builds up to a portrait of two theories interlaced with the same fundamental dynamics from two different fields of description. We should be able to recognize Whitehead’s actual entities—an early exemplar of a microscopic theory complete with a philosophical worldview predicated on experience, broadly interpreted—as smoothly accounting for gravitation as an emergent, large-scale process arising out of microscopic dynamics. From this, Verlinde’s account gains a general philosophy and worldview predicated on experience to explain the origins of gravity; meanwhile, the AE program gains an overall unity of purpose and a selective clarification of logic, plus a posthumous completion of the saga for Whitehead via a unification of diverse topics through the integration of a revised gravitational theory into a new understanding of actual entities.
If Verlinde is to overturn the logic of the last three hundred years in supposing that gravity is not a fundamental force, then he will need some philosophical leverage. By the end of this study we should be able to show how Verlinde’s descriptive account of gravity as an emergent phenomenon can effectively represent an integral sub-sequence of the generative process of actual entities. Actual entities are recognized as the corresponding philosophy underwriting Verlinde’s emergent gravity. Equipped with this process philosophy of organism and event ontology, Verlinde will have an adequate conceptual architectonic and worldview in which to naturally house his emergent gravity proposal. Both Whitehead and Verlinde stand to gain from this synthesis of frameworks.
Chapter 2 – Physico-Conceptual Foundations of Emergent Gravity
The purpose of this chapter is to provide a thorough narrative tracing the set of principles, phenomena, mathematical theories, and observational discoveries in modern physics leading up to string theory and emergent gravity, with roots dating back to the early 20th century. Within this we’ll also encounter the guiding notion that space-time is emergent at the smallest scale. We’ll weave this narrative through the contexts of general relativity, quantum theory, the vacuum, and the Planck scale, plus the integral drive to fuse these theories together to obtain a quantum theory of gravity. In addition, we’ll explore the UV/IR-mixing connection that predicates quantum gravity theories in the form of the black-hole horizon environment at the Planck scale. This will lead us to geometrical entropy, statistical mechanics, and Hawking radiation; the black hole information paradox; Wheeler’s it-from-bit; and right up to the holographic principle. From here we transition into string theory via a brief historical development before encountering emergent bootstrapping and S-matrix approaches; open strings; closed strings; N = 4 SYM; phonons; D-branes; and solitons. This leads to a discussion of the open/closed string correspondence in light of gauge/gravity duality and the AdS/CFT correspondence. Finally, we can transition directly into Verlinde’s approach on the basis that it serves as the “most-radical consequence of the AdS/CFT” (Dijkgraaf, 2012).
By the end of this chapter we aim to have defined the concepts and progression of ideas in physics and cosmology necessary for a lead-in to the approaches and phenomena pertinent to Verlinde’s emergent model of gravity. The other major topics to be addressed are Sakharov’s induced gravity (1967), Jacobson’s gravitational thermodynamics (1995), and the holographic renormalization procedure, all to be covered in the next chapter.
There is, however, one necessary disclaimer to append before continuing. The existence of black holes remains a theoretical construct rather than a matter of direct observational evidence. This study adopts a prudent stance with regard to the ontological validity of black holes, leveraging instead the 1/0 logic underwriting them in the capacity of pure dynamics. As such we’ll consider the mathematical possibility of these phenomena as a sign of their archetypal significance for matters to be contextualized under a different model.
We begin this larger background narrative by reconsidering the notion of space-time at the microscopic scale as a fundamental tensor, following instead routes of suggestion that motivate it as an emergent phenomenon; thus, not only will we suppose that gravity is emergent, we’ll also consider that space-time, too, emerges at the smallest scale.
2.1 – SPACETIME AS EMERGENT
In order to construct a maximally adequate comprehension of fundamental physics at the highest and lowest scales, we need to join our knowledge of the macroscopic order (general relativity) to a development of the microscopic dynamics underwriting thermodynamical and quantum-mechanical properties. These prove most effective in near-horizon black hole environs in the context of spacetime and string theory, as well as in the large-N gauge sector of quantum field theories. What remains to be established is a structural/contextual basis (or framework) for quantum gravity and all related phenomena, though the above examples provide a solid center for the venture.
In a philosophical sense, Aristotle’s ontology deals with macroscopic phenomena and consequently a substance metaphysics, whereas Whitehead’s process ontology and characteristic development of the AE’s appears precisely suited to motivate a microscopic description of space-time and value dynamics; therefore, we should look to find support in the endeavor to frame a foundational comprehension of quantum gravity within the general, process-relational philosophy of Whitehead, and in particular his rendering of AE’s as a primitive process underwriting fundamental physical and experiential modes operating at the seat of—our connection to—nature and cosmology (1922).
The conceptual phylogenesis of modern physics leads to perhaps no more astounding a class of hypotheses than those underwriting the microscopic nature of spacetime and, by extension, quantum gravity. This mixing of the macroscopic with the microscopic scales is notably encountered in the contexts of UV/IR mixing, black holes, non-commutative geometry, and N = 4 super Yang–Mills theory, as well as in the holographic principle, Hawking radiation, string theory, and the AdS/CFT (gauge/gravity) correspondence.
Indeed, the quest to understand the microscopic structure of space-time “represents the driving force in attempting to merge quantum theory with gravitation” (Chivukula, 2010). Philosophical expectations differ, however, with regard to the understanding of space and time in the first place. Some hold space and time to be fundamental (see, e.g., Polchinski, 1998; Rovelli, 2004) whereas others take them to be emergent abstractions (see, e.g., Dijkgraaf, 2012; Verlinde, 2010; Berenstein, 2006; Seiberg, 2006). These differences can ultimately be attributed to artifacts of the divide between substance and process metaphysics, as discussed in the final chapter of this study. In addition, they can be taken as representative of two methodological approaches to space-time and gravity: (i) the quantization approach; and (ii) non-quantizational, induced and emergent approaches.
The guiding supposition and central motif of this study considers that space-time—as well as gravity and perhaps even QM (all matter)—will prove to be emergent phenomena at the smallest scales. Instead of fundamental synthetic-a-priori values, they are derived “from a more-intrinsic dimensional occasion and dynamical framework,” as Clara Moskowitz frames the matter in a recent issue of Scientific American:
We often picture space and time as fundamental backdrops to the universe. But what if they are not fundamental, and built instead of smaller ingredients that exist on a deeper layer of reality that we cannot sense? If that were the case, space-time’s properties would “emerge” from the underlying physics of its constituents, just as water’s properties emerge from the particles that comprise it. (2014) 5
The notion that the fundamental structure of spacetime might be something other than a continuum has been around for many decades (Palma and Patil, 2009), and many scholars are actively pursuing the construct (for instance, see: Seiberg, 2006; Yang, 2009; Liberati, 2006; Hu, 2009; Markstrom, 2010; Dreyer, et al., 2006, 2009; El-Showk and Papadodimas, 2012). One indication in quantum theory recognizes the Riemannian smooth manifold as an inaccurate depiction of XT, which according to quantum mechanics at the smallest scale resembles more a bubbling chamber of virtual particles and vacuum quantum fluctuations (see Gross, 2014; Dijkgraaf, 2012). This means that space-time at the smallest scales isn’t smooth but resembles an ocean of activity. 6 Dijkgraaf explains:
XT gets replaced at small distances by something more involved via large-N gauge theory as described in the AdS/CFT correspondence, whereby all physics is equivalent to a theory living only on the boundary of the black hole. (Dijkgraaf, 2012)
This suggests that XT doesn’t represent the fundamental basis for our arguments but should instead emerge in the process. “Many philosophers of science and mathematical physicists alike are turning to this paradigm as the next big movement after relativity theory” (Wüthrich, 2006). Whitehead’s AE framework should prove prescient to this approach, as we will explore in the upcoming chapters. First we begin by looking at two of the main pillars of science—Einstein’s general relativity and quantum mechanics—before building up a narrative constellating other significant advancements leading to Verlinde’s model.
2.2 – GENERAL RELATIVITY
Originally space was considered by the Greeks to be a rigid, absolute container, like a big stage where natural phenomena play out their existence. Time, on the other hand, according to Newton, was a big clock that would tick and set the stage directions (Dijkgraaf, 2012). Then Einstein came along and said that instead there is a spacetime continuum that unifies the two, and that space as a stage isn’t rigid but flexible, able to curve and take shape on the basis of energy and mass, or gravitation. Specifically, Einstein’s general relativity is a geometrical theory of gravity and space-time based on the curvature of space as a collection of physical events determined by the distribution of matter and energy present (Verlinde, 2010; Mäkelä, 2010).
5 “Water is made of discrete, individual molecules, which interact with each other according to the laws of quantum mechanics, but liquid water appears continuous and flowing and transparent and refracting. These are all ‘emergent’ properties that cannot be found in the individual molecules, even though they ultimately derive from the properties of those molecules” (Jacobson, 1995). 6 In another example, the noncommutative method of quantum gravity provides a third approach, resolving that at the microscopic level it is improper to think about points in space below the Planck scale; instead spacetime itself diminishes into a fuzzy, pixelated region wherein values can be indexed as Planck areas (see, e.g., Ambjørn, 2002). Non-commutation roughly means that “although the average values of the fields vanish in a quantum vacuum, their variances do not” (Evans and Kielich, 1994).
The curvature interaction between matter and space-time is defined by the system of partial differential equations underwriting Einstein’s field equations. These describe the relationship between the geometry of a four-dimensional, pseudo-Riemannian manifold representing smooth space-time and the energy–momentum present in that same region (Wald, 1984; Weinberg, 1972). Put simply, Einstein said we can associate gravity with space-time by the way it warps.
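For concreteness, these field equations relate the Einstein tensor, built from the curvature of the metric, to the stress–energy tensor of matter:

```latex
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},
\qquad
G_{\mu\nu} = R_{\mu\nu} - \tfrac{1}{2}\, R\, g_{\mu\nu}
```

The left-hand side is pure geometry; the right-hand side is matter and energy, which captures the sense in which matter and curvature determine one another.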
Gravity in terms of this geometry of space-time is based on the local equivalence between gravitation and inertia, or the local cancellation of the gravitational field by local inertial frames: the equivalence principle. 7 As Yang explains, “The equivalence principle guarantees that it is ‘always’ possible at any spacetime point of interest to find a coordinate system such that effects of gravity will disappear over a differential region in the neighborhood of that point” (2009). Paraphrasing Wheeler, the Einstein equations say that matter tells space-time how to curve and space-time tells matter how to move (1990); therefore, the space-time metric is not a fixed stage but part of the equations, and as such is dynamical and can be affected by the matter content of the universe. “In this sense, gravitation may be considered as a manifestation of the curvature of spacetime” (Peltola, 2007).
The paths of objects moving in space are determined by the geometry of spacetime, and objects in free-fall move along geodesics, i.e., the straightest-possible routes between spacetime points. Space and time therefore have physical properties according to Einstein, and this curvature describes the motion of particles under the influence of gravity. As such, gravity corresponds to changes in the properties of space and time, which in turn change the geodesic routes that objects will naturally follow.
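These geodesic paths are solutions of the geodesic equation, in which the Christoffel symbols Γ (built from derivatives of the metric) encode the curvature's influence on free-fall motion:

```latex
\frac{d^2 x^{\mu}}{d\tau^2}
+ \Gamma^{\mu}_{\alpha\beta}\,
\frac{dx^{\alpha}}{d\tau}\,\frac{dx^{\beta}}{d\tau} = 0
```

In flat space-time the Γ terms vanish in inertial coordinates and the equation reduces to straight-line motion, recovering Newton's first law.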
The relevant physical content of Einstein’s theory isn’t the metric, however, but the diffeomorphisms of the metric; namely, shifts around points and the mapping of points from one manifold to another (see, e.g., Mason and Newman, 1989; Chamseddine, 2001). This is important for our present study because by definition a diffeomorphism is understood to map a sequence; this means that the order of events remains the same. This is taken as the principle of background independence, which states that only events and their relations are physical (Markopoulou-Kalamara, 2010). As we will see in chapter four, this principle also motivates the Whiteheadian process framework, whereby only AE’s and their relations are physical and XT is taken as an abstraction (see PRel, 1922).
For Einstein this means that all of physics is geometry “and thereby space and time became no longer just the stage, but an active player in the game [...] In some sense, however, quantum theory will probably be victorious over the underlying ideas of geometry that were so dear to Einstein” (Dijkgraaf, 2012). Notable others have pushed back against the notion that matter distributions impact the geometry of spacetime, and they have done so since the beginning. Not only Minkowski, Eddington, and Silberstein, but also Whitehead relates similar sentiments: “It is inherent in my theory to maintain the old division between physics and geometry. Physics is the science of the contingent relations of nature and geometry expresses its uniform relatedness” (PRel 10). We will come back to explore Einstein’s framework of general relativity in relation to Whitehead’s version in the discussion chapter (seven) of this study.

7 Einstein once recalled that the equivalence principle was the happiest thought of his life.
2.3 – QUANTUM THEORY
In order to understand the very beginning of the universe, we have to understand the laws of the small elementary particles of quantum theory to describe the structures that we find there. (Dijkgraaf, 2012)
Einstein struggled with comprehending the rudiments of quantum theory; however, to find any kind of solution to the grand picture of the universe we have to study quantum mechanics, and in some sense string theory. Einstein wasn’t alone in this, however, and “it was not obvious for a long time that mathematics is actually the appropriate approach to understanding the structure of the universe and how it’s working” (Dijkgraaf, 2012).
Quantum models can be subdivided into two main categories: those based on quantum mechanics and those on quantum field theory. In quantum mechanics the classical measurable quantities of position and momentum are replaced by abstract operators acting on the state vectors of the system, which live in an abstract vector space called a Hilbert space. An operator is an object that transforms state vectors into one another. The state vector, in turn, contains all available information about the system (see, e.g., Srednicki, et al., 1984; Peltola, 2007). As Pessa explains in an article found in Vitiello, Pribram, and Globus’ (2004) book:
Quantum mechanics deals with systems constituted by a finite and fixed number of particles contained within a finite and fixed volume. The physical quantities characterizing them, however, cannot be all measured simultaneously with arbitrary precision. A first consequence of such an uncertainty is that a complete characterization of a particle dynamical state with unlimited precision is impossible. One is then forced to introduce the concept of representation of the state of the system being considered [...] consisting in selecting a subset of the dynamical variables describing the state of the system such that all variables belonging to the subset can be measured simultaneously with arbitrary precision. In a sense, every representation can offer only a partial description of system’s dynamics. However, an important theorem proven by Von Neumann (1955) asserts that in QM all possible representations are reciprocally equivalent, meaning that they give rise to the same values of probabilities of occurrence of results of all possible measures relative to the physical system under consideration, independently from the particular representation chosen. (2004)
Quantum mechanics can also be applied to fields (see Jaffe and Witten, 2000, for review); as MacKinnon prescribes, “the ongoing process of explaining composites in terms of progressively more elementary units should not terminate in particles, but in some more fundamental entities postulated by quantum field theory” (2007). As ‘t Hooft explains:
According to the laws of quantum mechanics, the energy in a field consists of energy packets, and these energy packets are in fact the particles associated to the field. Quantum mechanics gives extremely precise prescriptions on how these particles interact, once the field equations are known and given in the form of a Lagrangian. The theory is then called
quantum field theory, and it explains not only how forces are transmitted by the exchange of particles, but it also states that multiple exchanges should occur. (‘t Hooft, 2008)
This sets the stage for a dynamical model. Quantum field theory was first proposed in 1927 by Paul Dirac as a general framework for the description of the physics of relativistic quantum systems and elementary particles. It’s “the synthesis of quantum mechanics with special relativity, supplemented by the principle of locality in space and time, and by the spectral condition in energy and momentum” (Halvorson, 2006). Quantum field theory stands, in both its theoretical methods and its experimental verifications, as the correct approach to describing particle interactions at high energies up to the grand unified scale, where unification of the known forces is predicted. 8 As Baumgartl explains:
Based on this [quantum field] theory the Standard Model of particle physics has been developed, which has been successful in unifying the known forces and particles in a consistent mathematical framework. It provides a scheme where all observed particles can be gathered, classified according to mass, charge, spin, etc. (2007)
In the 1960s it was commonly argued that “elementary particle physics is like a black box, something you cannot open; something comes in, something comes out, and you can study the correlation between the two.” As Dijkgraaf continues, “not only could this box be opened, it turned out inside it was in fact quite a small formula” (2012).
This describes the Standard Model of particle physics as the central achievement of QM and QFT under the guise of quantum chromodynamics and gauge theory. These equations describe natural geometrical objects: “a handful of particles and how they interact—in essence, all physics, all the matter, all forces, and all radiation” (Dijkgraaf, 2012).
Most theories in standard particle physics, including the Standard Model, are formulated in terms of relativistic quantum field theories—such as QED and QCD—incorporating Einstein’s special theory of relativity with quantum theory (see e.g., Bain 2011). These give rise to gauge theories. A brief summary overview is in order. As ‘t Hooft summarizes:
In the Yang-Mills theory of QCD we are told the quantum field theories that have proven most important in describing elementary particle physics are gauge theories, and that the classical example of a gauge theory is the theory of electromagnetism. (2008)
A gauge theory is a quantum field theory whose Lagrangian—the function of the fields of the system whose spacetime integral defines the action—is invariant under certain transformations, meaning that the state will change by at most an overall phase (‘t Hooft, 2008). These transformations, called local gauge transformations, form a Lie group which is referred to as the symmetry group or the gauge group (scale-group) of the theory. 9
8 “Within QFT—as opposed to QM—there is the possibility of having nonequivalent representations of the same physical system” (Haag, 1961; Hepp, 1972); one consequence of this is that only QFT, which allows for different phases of the system itself, can deal with phase transitions, that is, with global structural changes of the system. Such a circumstance entails that the framework of QFT is actually the only one possible if we attempt to model intrinsic emergence (see Itzykson & Zuber, 1986; Umezawa, 1993).
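As an illustration, the abelian gauge theory of electromagnetism (QED) makes this invariance concrete; the following is standard textbook material rather than drawn from the authors cited above:

```latex
% QED Lagrangian for a Dirac fermion of charge e, with covariant derivative D
\mathcal{L} = \bar{\psi}\left(i\gamma^{\mu}D_{\mu} - m\right)\psi
            - \tfrac{1}{4}F_{\mu\nu}F^{\mu\nu},
\qquad D_{\mu} = \partial_{\mu} + ieA_{\mu}

% Local gauge transformation with an arbitrary function \alpha(x)
\psi(x) \to e^{i\alpha(x)}\,\psi(x), \qquad
A_{\mu}(x) \to A_{\mu}(x) - \tfrac{1}{e}\,\partial_{\mu}\alpha(x)

% The covariant derivative transforms covariantly,
% D_{\mu}\psi \to e^{i\alpha(x)}D_{\mu}\psi, so \mathcal{L} is unchanged:
% the physical state changes by at most an overall phase.
```

The gauge field A transforms in exactly the way needed to cancel the derivative of the phase, which is the sense in which the “small distance structure is very precisely prescribed by the requirement of gauge-invariance.”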
In many early theories, the multiple exchanges of particles in quantum fields gave rise to difficulties: their effects seemed unbounded, or infinite; in a gauge theory, however, “the small distance structure is very precisely prescribed by the requirement of gauge-invariance and one can combine the infinite effects of the multiple exchanges with redefinitions of masses and charges of the particles involved. This procedure is called renormalization” (‘t Hooft, 2008), and we’ll come back to it in the next chapter and in chapter six.
Gauge theories come in two essential varieties: abelian and non-abelian. ‘Abelian’ is the mathematical term corresponding to ‘commutation’ in physics. When the transformations always commute the theory is referred to as abelian, as in electromagnetism. This means that applying transformation X followed by Y is equivalent to applying Y followed by X: the order can be exchanged, in other words. However, matrix-valued transformations need not always commute; under these representations the gauge theory is non-abelian, indicating that the symmetry group is non-commutative, or non-abelian. In these theories X followed by Y does not equal Y followed by X, and the order of operations matters. A related property is ‘chiral symmetry breaking’, where ‘chiral’ refers to the handedness of the fermions in a gauge theory and says that the left- and right-handed components aren’t symmetric copies.
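The abelian/non-abelian distinction can be made explicit in the field-strength tensor; the expressions below are standard textbook formulas, added here for illustration:

```latex
% Abelian case (electromagnetism): the field strength is linear in the field
F_{\mu\nu} = \partial_{\mu}A_{\nu} - \partial_{\nu}A_{\mu}

% Non-abelian case (Yang-Mills): an extra term appears because the group
% generators do not commute, [T^{a}, T^{b}] = i f^{abc} T^{c}
F_{\mu\nu}^{a} = \partial_{\mu}A_{\nu}^{a} - \partial_{\nu}A_{\mu}^{a}
               + g\, f^{abc}\, A_{\mu}^{b} A_{\nu}^{c}

% When the structure constants f^{abc} vanish, the abelian case is recovered;
% the quadratic term is what makes gluons interact with each other.
```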
The non-abelian version of four-dimensional quantum gauge theory is Yang-Mills theory, which in its electroweak sector accounts for the unified electromagnetic and weak forces (see Jaffe and Witten, 2000). In order for Yang-Mills theory to also describe the strong force, the theory of quantum chromodynamics is required.
Quantum chromodynamics is the study of the Yang–Mills theory of the strong force and color-charged fermions, the quarks; this, by extension, describes a non-abelian gauge theory consisting of a color field mediated by a set of force-carrying (exchanging) particles, the gluons. However, there are also peculiarities about QCD that make it much more nuanced than classical non-abelian gauge theories (see Jaffe and Witten, 2000). In order for QCD to describe the strong force it must demonstrate the following three properties: a mass gap, quark confinement, and chiral symmetry breaking.
The mass gap “is necessary to explain why the nuclear force is strong but short-ranged;” confinement “is needed to explain why we never see bare quarks;” and chiral symmetry breaking “is needed to account for the current algebra theory of soft pions developed in the 1960’s”
9 Topologically, gauge theory studies connections on a principal bundle; these connections are the gauge fields. The connections correspond to physical fields, e.g. the electromagnetic field.
(Jaffe and Witten, 2000). In addition, confinement stands in relation to de-confinement, witnessed in a fourth property, asymptotic freedom. There is no precise phase-transition line separating these two properties; confinement dominates at low-energy scales, but as energy increases, asymptotic freedom prevails. Roughly, this means that quarks behave as nearly free, weakly interacting particles when packed very close to each other. “Due to asymptotic freedom, QCD can approach to the conformal limit in UV regions” and transform into a conformal field theory for which the holographic principle is then applicable (Xie, et al., 2007). ‘t Hooft sums it up nicely in the following passage:
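The scale dependence behind asymptotic freedom can be sketched with the standard one-loop expression for the strong coupling (textbook material, supplied here for illustration rather than taken from the sources above):

```latex
% One-loop running of the strong coupling with n_f quark flavors
\alpha_{s}(Q^{2}) \;=\;
\frac{12\pi}{\left(33 - 2n_{f}\right)\,
\ln\!\left(Q^{2}/\Lambda_{\mathrm{QCD}}^{2}\right)}

% As the energy scale Q^2 grows, \alpha_s shrinks toward zero: quarks at very
% short distances interact only weakly (asymptotic freedom). As Q^2 approaches
% \Lambda_QCD^2 the coupling grows large and perturbation theory fails:
% the confinement regime.
```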
Suppose now that we take the SU(2) x U(1) Yang-Mills system, together with the Higgs field, to describe electromagnetism and the weak force, and add to this the SU(3) Yang-Mills theory for the strong force, and we include all known elementary matter fields, being the quarks and the leptons […] Then we obtain what is called the Standard Model. It is one great gauge theory that literally represents all our present understanding of the subatomic particles and their interactions. (‘t Hooft, 2008)
We have long been aware that, in spite of its successes, the Standard Model cannot be exactly right. As ‘t Hooft explains:
The Standard Model is not perfect from a mathematical point of view. At extremely high energies (energies much higher than what can be attained today in the particle accelerators), the theory becomes unnatural. In practice, this means that we do not believe anymore that everything will happen exactly as prescribed in the theory; new phenomena are to be expected. (2008)
Even with the recent confirmation of the Higgs boson detection, there are still mysteries to be explained within the construct of the Standard Model. For one: is the Higgs a fundamental boson, or is it, like the graviton, also an emergent value? Crucially, we still cannot account for a full theory of gravitation. As Kuhlmann explains:
Most quantum field theories are not asymptotically free, which means that they cannot be extended to arbitrarily small distance scales. We could easily cure the Standard Model, but this would not improve our understanding because we know that at those extremely tiny distance scales where the problems become relevant, a force appears that we cannot yet describe unambiguously: the gravitational force. It would have to be understood first. The gravitational force acting between two subatomic particles is tremendously weak. As long as we disregard that, the theory [of quantum fields] is perfect. (2014)
Thus, “quantum mechanics can’t be the whole story,” as ‘t Hooft asserts, and “there must be something underneath quantum mechanics…some more basic system” (2002, 2011). Two key platforms we encounter in formulating an answer are quantum gravity and string theory. To approach these topics, Renate Loll (2010) asks the key question: “what comprises the empty space between the fundamental particles?” By the end of this study we’ll be poised to suggest how dark matter could provide an explanation and lead to a systematic process.
2.4 – DARK MATTER/ENERGY
Physics is at a loss and they try to figure out whether this [model] fits in a grand pattern. If you start to rearrange the pieces of the puzzle, for instance, you see that you can rearrange them in more symmetric patterns which seem to suggest that this is just part of a bigger story, there are bigger symmetries here that we cannot see in nature but that perhaps are behind physical phenomena. (Dijkgraaf, 2012)
Nature has given us a few clues suggesting that our present model can’t be the whole story with perhaps the most famous one coming from cosmology re: the existence of dark matter (Dijkgraaf, 2012). This stems from the fact that solutions to QFT’s are known to result in a huge amount of energy in the quantum vacuum (Gross, 2011). In addition:
If you look at the way in which gravity is acting on the stars in a galaxy, then astronomers have discovered that, in order to count it in terms of matter, there is a huge cloud of matter which is dark, invisible, and not made out of the particles that we know, surrounding each galaxy, and by indirect measurements, you can actually determine the structure of this dark matter distribution – roughly six times more of that dark matter than there is original matter. Cosmologists look at the structure of the universe and see that these galaxies are not uniformly distributed in the universe. They are clumped together in a large-scale structure, these kinds of strands that fly basically through space, and by studying the dynamics of matter and dark matter, actually get a very clear model that seemed to fit very well the observed structure of the universe. So we know there are lots and lots of more matter around that we cannot encode at this moment in our physical models. (Dijkgraaf, 2012)
This, combined with cosmological observations of anisotropic (unequal) inter-galactic mass distributions, led to the recognition that this vast amount of energy and matter, ninety-six percent of our universe, is unaccounted for by present models. This indicates that we’re in the wrong paradigm. One proposal carried by this dissertation is that the microscopic degrees of freedom in the universe qua AE’s represent some sense of dark energy/matter.
One modification we must make, for instance, involves Einstein’s cosmological constant describing a static background/vacuum-energy density: modern physics tells us this simply isn’t the case, and XT at the microscopic scale is teeming with fluctuations and virtual particles that arise and disappear all the time in the quantum vacuum—which, as a result, is better characterized as more of a plenum. 10 This leads to an accelerated expansion of the universe and is very different from a static background. As Dijkgraaf explains:
In fact, the universe is not only expanding, it’s expanding in an accelerating way, growing increasingly fast, and with a force pushing the universe apart: namely, dark energy. This (dark energy) represents the extra parameter and physical phenomenon [qua cosmological constant] “ to hold back the universe. ” (2012)
10 “In 1922, scientists discovered that application of Einstein's field equations to cosmology resulted in an expansion of the universe. Einstein, believing in a static universe (and therefore thinking his equations were in error), added a cosmological constant to the field equations, which allowed for static solutions. In 1929 Edwin Hubble discovered a redshift from distant stars, implying they were moving with respect to the Earth. The universe, it seemed, was expanding. Einstein removed the cosmological constant from his equations, calling it the biggest blunder of his career” (Dijkgraaf, 2012).
As he notes, the cosmological constant is still there but “working just the opposite way than Einstein figured. It is not slowing down the expansion, it is actually adding to the expansion” (Dijkgraaf, 2012). Verlinde regards dark energy as “the energy that fuels the system responsible for inertia” (2011). We will develop this more in chapters five and six.
2.5 – VACUUM and VIRTUAL PARTICLES
In order to access this domain of dark matter we must travel to the vacuum scale, where the quantum vacuum permeates the whole universe and, in this respect, can be identified with the space-time of general and special relativity. This leads us to consider the vacuum as fundamental, with quanta and waves of energy taken as excitations of this more fundamental vacuum.
Empty space, according to quantum theory is this boiling pot of particles and anti- particles appearing and disappearing. It has really a physical material that you can study, and if you take a chunk of this quantum space-time, because of all this phenomena, there is energy in it, and this energy, according to quantum theory, that is this dark energy – that is the phenomenon that cosmologists measure. (Dijkgraaf, 2012)
In the vacuum a particle can split into two particles for a very brief time before they combine again to form another particle. These intermediate particles are termed virtual particles, and they can only be seen indirectly. In a slightly different context, Casimir predicted a measurable force arising from these vacuum fluctuations between two conducting plates. This effect is possible because:
Basically, there is a rule of quantum mechanics that anything is allowed as long as you do it fast enough before it can detect it. And in fact [virtual particles] are measured in particle accelerators, and there is the ultimate result that, for a brief moment of time two particles can appear out of nothing and then combine again. Or, if you wish to think like Wheeler, it is a single particle that goes up in time and down in time and keeps on going round and round and round. (Dijkgraaf 2012)
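The “fast enough” rule of thumb can be quantified with the energy-time uncertainty relation; the following is a standard heuristic estimate, not taken from Dijkgraaf’s lecture:

```latex
% Energy-time uncertainty relation
\Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2}

% A virtual pair "borrowing" energy \Delta E \approx 2mc^2 can persist for at most
\Delta t \;\lesssim\; \frac{\hbar}{2\,\Delta E} \;=\; \frac{\hbar}{4mc^{2}}

% For an electron-positron pair (m_e c^2 \approx 0.511 MeV) this gives
% \Delta t \sim 10^{-22} s: far too brief to detect the pair directly,
% which is why virtual particles are only seen indirectly.
```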
These quantum fluctuations will prove to be what allow Hawking to eventually form his model for thermal radiation from black holes and to posit that black holes aren’t purely black. As we find, studying space and time itself in the context of the vacuum is really studying quantum gravity: how space and time behave in the microscopic quantum realm. Perhaps no better staging ground presently exists to examine this phenomenon than in the near-horizon environment of black holes, as we’ll see in a few moments.
2.6 – QUANTUM GRAVITY
Quantum theory deals with the very small: atoms, subatomic particles and the forces between them. General relativity deals with the very large: stars, galaxies and gravity—the driving force of the cosmos as a whole. The dilemma is that on the microscopic scale, Einstein’s theory fails to comply with the quantum rules that govern the behavior of the elementary particles, while on the macroscopic scale black holes are threatening the very foundations of quantum mechanics. Something big has to give. This augurs a new scientific revolution. (Duff, 2013)
There is a long-standing methodological tradition in physics whereby combining two approaches yields an advancement. In this way classical mechanics was considered alongside Maxwell’s electromagnetism before being triangulated by Einstein’s special relativity. This, taken in the context of classical (Newtonian) gravitation, was then described by Einstein’s general relativity. The next advance led to the rise of quantum theory alongside general relativity, but this has proven to present an obstacle to further progress along this road. Such a field, as a resolution to this combined procedure, is considered to develop from within a theory of quantum gravity. As Culetu points out, “To have a complete theory of quantum gravity we must clarify whether the gravitational interaction is fundamental” (Culetu, 2010). In this study we principally opt for the recognition of gravity, following Verlinde, precisely not as a fundamental force but as an emergent one.
In a conceptual manner, the two branches of thermodynamics and relativity serve to denote that nature has two principle theories instead of one. This could have strong implications for a unified field theory. As Liu explains, “in many cases these two principle theories seem to be mysteriously connected to each other” (2010). In physics this is understood in the context of gravitation and quantum theory. The puzzle of two refers to the fact that there are two systems of physical laws in the world, comprising thermodynamics and relativity, as revealed by the discrepancy between the theory of gravity and quantum dynamics. Modern physics shows us that thermodynamics and relativity come together under the notion of black hole thermodynamics and Hawking radiation.
More specifically, quantum gravity is a programme (Lakatos, 1970) of research actively in search of a structural and conceptual foundation within the unification-endeavor of general relativity and quantum field theory. In this framework, general covariance relates to relativity and gauge symmetry to quantum theory (see e.g., Norton, 1993). That such an endeavor necessarily entails a metaphysical framework is a basic postulate of this study. Quantum gravity, emergent gravity: it all comes back to an exploration of the fundamental nature of spacetime, as Markopoulou-Kalamara explains: “Quantum effects of the gravitational field become important when we reach the fundamental limits of space and time measurements” (2009). The scale of quantum gravity is taken as the Planck length, 10⁻³⁵ meters. Many approaches to quantum gravity consider that space and time do not actually exist at this most fundamental level and that “together with quarks and leptons, perhaps they emerge from the deeper physics that does not rely on, or even permit, their existence [...] This makes QG a fertile ground for the metaphysician” (Wüthrich, 2006). 11
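The Planck scale is fixed by combining the fundamental constants of relativity, quantum theory, and gravitation; these are standard dimensional-analysis results, supplied here for reference:

```latex
% Planck length, time, and energy from \hbar, G, and c
\ell_{P} = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\ \mathrm{m}

t_{P} = \frac{\ell_{P}}{c} \approx 5.4 \times 10^{-44}\ \mathrm{s}

E_{P} = \sqrt{\frac{\hbar c^{5}}{G}} \approx 1.2 \times 10^{19}\ \mathrm{GeV}

% Near \ell_P, quantum fluctuations of the metric itself are expected to
% become comparable to the background geometry, which is why classical
% spacetime concepts may fail at this scale.
```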
In general relativity the gravitational field is represented by the metric of space-time and gravity is identical to properties of a dynamical geometry; therefore, “general relativity is not just a theory of gravity […] it’s also a theory of spacetime itself” (Butterfield and Isham, 2001). As Hedrich explains, “the quantum dynamics of the gravitational field would correspond to a dynamical quantum space-time: a dynamical quantum geometry” (2008). This signifies that a quantization of the gravitational field would correspond to a quantization of the metric of space-time. As a result, certain XT ideas of classical relativity, like topological spaces, continuum manifolds, and XT geometry, may not prove applicable in quantum gravity.
11 For example, Krause (2013) supposes that “the universe arises out of nothing (creation ex nihilo) because of quantum gravity. Out of nothing, nothing arises — is valid if we assume quantum gravity.”
A theory of quantum gravity should instead lead to a description of a dynamical quantum spacetime (see for example Butterfield and Isham, 2001). Some have even recently suggested that spacetime can be modeled like a superfluid (see e.g., Jacobson, 1999; Jacobson and Volovik, 1998; Fedichev and Fischer, 2003; Wilczek, 2013; and Mottola, et al., 2014). In addition we would like to gain further insight into what happens in black holes. We consider the general feasibility of quantum gravity on the basis that:
…if you look at the various forces of nature, the three forces that are there in the Standard Model, and you compare them to gravity, you see that when the energy scale goes up and up and up, that gravity gets the same strength as the other forces. So there should be a moment, even before this very brief split of a second in which inflation starts, where space-time itself becomes a quantum phenomenon, so not only are the particles themselves allowed to do anything they want, space-time itself is allowed to do this, and therefore it stops. (Dijkgraaf, 2012)
As Chivukula explains, “The search for a quantum theory of gravity has been one of the most fundamental problems in physics for the past fifty years because such a theory is necessary to understand the Universe at its earliest moments ” (2010).
2.7 – QUANTIZATION v. NON-QUANTIZATION
Does the need to find a quantum theory of gravity imply that the gravitational field must be quantized? Physicists working in quantum gravity routinely assume an affirmative answer, often without being aware of the metaphysical commitments that tend to underlie this assumption. (Wüthrich, 2006)
To quantize or not to quantize…that is the question. The major disagreement consists in determining whether such a theory may be obtained by quantizing general relativity, or by considering it as a low-energy effective theory whose metric and connections form the collective, hydrodynamic variables of some unknown microscopic theory (see Jacobson and Volovik, 1998; Volovik, 2001). This study affirms the latter view.
Framing the matter: “There are many attempts to quantize gravity—string theory and loop quantum gravity are alternative approaches that can both claim to have gone a good leg forward,” explains Liberati, “but maybe you don’t need to quantize gravity; you need [instead] to quantize this fundamental object that makes space-time” (qtd. in Moskowitz, 2014). In fact, there’s also data to suggest this; strictly speaking, as Grygiel notes, gravity doesn’t need to be quantized: “it isn’t demanded by hard, experimental data” (2007). Non-quantization models for theories attempting to reconcile general relativity with quantum theory lead to emergent approaches where gravity is taken as an emergent phenomenon, like thermodynamics and hydrodynamics, instead of being treated as a fundamental force. Here, “the fundamental role of gravity is replaced by thermodynamical interpretations leading to similar or equivalent results without knowing the underlying microscopic details” (Banerjee, 2010). Since gravity isn’t fundamental, it therefore doesn’t have to be quantized (Wüthrich, 2006).
As we will see in the next chapter, Sakharov’s induced gravity program (1967) provides an effective, semi-classical approach to gravity that doesn’t require a quantization either. Similarly, Jacobson’s gravitational thermodynamics conceives of gravity as emergent from the energy flux of unobservable degrees of freedom. Together these two accounts serve to bolster the feasibility that a final theory of gravity may not involve quantization. As Deutsch (1984) predicted:
In the important matter of formalism we still know of no other way of constructing quantum theories than ‘quantization,’ a set of semi-explicit ad hoc rules for making a silk purse (a quantum theory) out of a sow’s ear (the associated classical theory). [...] I believe that quantization will have to go before further progress is made at the foundations of physics. [...] To base the theory of quantum fields on that of classical fields is like basing chemistry on phlogiston or general relativity on Minkowski space-time: it can be done, up to a point, but it is a mistake; not only because the procedure is ill defined and the resulting theory of doubtful consistency, but because the world isn’t really like that. No classical fields exist in nature.
Indeed, major shifts in viewpoints must be expected in order to make it possible to understand a theory of quantum gravity as part of a greater framework which also incorporates all other known particles and forces (Baumgartl, 2007). One of these is the intriguing case of UV/IR mixing.
2.8 – UV/IR MIXING
Remarkably, black holes provide an environment where small-scale and large-scale values are both relevant. This is known as UV/IR mixing. Ultraviolet refers to the small scale and infrared to the large scale; UV and IR correspond with quantum theory and relativity, respectively. This means that the very large-scale structures in the universe described by general relativity are coming together and mixing with the very smallest structures and regions at the quantum level of space-time: the physics of very high energies is affecting that of extremely low ones.
In theoretical physics it is generally feasible to organize physical phenomena according to the energy scale or distance scale. The theory of renormalization group procedures—as we’ll explore in the next chapter—is based on the paradigm where the short-distance, ultraviolet physics does not directly affect qualitative features of the long-distance, infrared physics, and vice versa (see e.g., Minwalla, Raamsdonk and Seiberg, 1999). While this separation of scales holds in quantum field theory, in noncommutative field theory and quantum gravity (especially string theory) these interrelations between UV and IR physics start to emerge (Minwalla, Raamsdonk, and Seiberg, 1999; also Matusis, Susskind, and Toumbas, 2000). As Verlinde explains, “gravity is a macroscopic force that dominates in IR but it also knows about microscopic states and has the information about the UV; therefore, there must be [an underlying] principle at work” (2011).
On the UV side, the quantum effects of gravity are understood to become significant at the scale of the Planck length (Peltola, 2007). The Planck scale is the vacuum scale and describes phenomena at the vacuum level. In his inaugural paper on quantum theory, Max Planck (1899) realized that if quantum mechanics were to be legitimate, then the ultimate consequence would be that there is a smallest size in physics: a smallest size to space and time. “Physics is telling us that space itself should have this property if you put it on a gigantic microscope: there is no longer space; there are just little bits and pixels” (Dijkgraaf, 2012). They are quantum bits roughly the size of the Planck length.
Sonego describes how “our present model of spacetime as a pseudo-Riemannian differentiable manifold can be considered accurate and authentic from cosmological scales down to particle physics scales” (Sonego, et al., 2015); however, as also explained in an earlier paper, “quantum fluctuations emerging at the microscopic level are expected to shatter the classical structure of space and time at smallest scales” (1995). This leads us to recognize the need for a new physics description at the Planck scale.
Where do we see these [Planck scale and UV/IR mixing] phenomena? Is this relevant physics? I think the amazing thing is that there is a [cosmological] laboratory where you can test these ideas… and it is black holes. So black holes are something that were in science fiction books twenty years ago, but now are part of the standard description of our universe. (Dijkgraaf, 2012)
UV/IR mixing is a very special property, and only a select few models prove able to describe such scenarios, most notable among these being a holographic description of black holes near the horizon in terms of Dn-branes in string theory, and four/five-dimensional super-Yang-Mills at large N (the 1/N expansion) in gauge theory. Together these mutual descriptions yield the gauge-string duality, a close associate of the gauge-gravity, open-closed string, AdS/CFT, and AdS/QCD correspondences, all of which will be discussed further.
2.9 – BLACK HOLES
“Black holes are where God divided by zero.” Steven Wright
Wheeler is credited with coining the term ‘black hole.’ Most generally it refers to any ‘body’ whose escape velocity is greater than the speed of light. In fact, the existence of such regions was proposed for the first time by Michell and Laplace in the late eighteenth century on the basis of Newton’s theory of gravitation. Classical general relativity predicts the existence of black holes using the collapse of a star model. Black holes can be ultra-massive or microscopic. The first black hole solution to Einstein’s field equation was found by Schwarzschild in 1916. The Schwarzschild radius marks the event horizon of the black hole, demarcating the regional distinction where outside the radius the escape velocity is less than the speed of light and inside the escape velocity is greater than the speed of light. The singularity is separated from the outside world by this event horizon. Any mass that lives within its Schwarzschild radius becomes a black hole and a dimensionless point. 12
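For a non-rotating, uncharged mass M, the Schwarzschild radius follows from the condition that the escape velocity reaches the speed of light; the formula and the representative values below are standard results, added here for illustration:

```latex
% Schwarzschild radius of a mass M
r_{s} = \frac{2GM}{c^{2}}

% Representative values:
%   Sun   (M \approx 2 \times 10^{30}\ \mathrm{kg}): r_s \approx 3\ \mathrm{km}
%   Earth (M \approx 6 \times 10^{24}\ \mathrm{kg}): r_s \approx 9\ \mathrm{mm}

% Any mass compressed inside its own r_s forms an event horizon.
```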
Black holes are characterized as having an extremely large mass in an extremely small volume, and are defined by three variables: mass, charge, and angular momentum. In addition, they follow four rules correlating with the laws of thermodynamics. As Bekenstein explains, “there is strong evidence that the laws of black hole mechanics are a subset of the laws of thermodynamics, and that the black hole area is proportional to its entropy” (Bekenstein, 1973; see also Bekenstein 1974). The four laws of black hole mechanics were originally derived from the classical Einstein equation and developed by Bardeen, Carter, and Hawking in a 1973 paper. As Jacobson explains, the discovery of quantum Hawking radiation “made it clear that the analogy is in fact a statement of identity” (1995).
The zeroth law maintains that the horizon of a stationary black hole has constant surface gravity. It is analogous to the zeroth law of thermodynamics, which states that the temperature is constant throughout a body in thermal equilibrium, and so suggests that surface gravity is analogous to temperature: constant temperature in an equilibrated system corresponds to constant surface gravity over the horizon of a stationary black hole (see Bekenstein, 2008; Horowitz and Teukolsky, 1998).
The first law states that a change in mass is tied to changes in horizon area, angular momentum, and electric charge. This parallels the first law of thermodynamics, the statement of energy conservation: the energy of a system at temperature T changes when work is done on it (see e.g., Horowitz, 2012).
The second law states that the horizon area is a non-decreasing function of time, correlating with the fact that in thermodynamics entropy does not decrease either, and assuming that observed matter has positive energy density: the weak-energy condition (Bekenstein, 2008). This is the statement of Hawking's area theorem, describing how a change in entropy in an isolated system will be greater than or equal to zero for a spontaneous process, and suggesting a link between entropy and the area of a black hole horizon (2008). This leads to the scenario where "if you have two black holes that merge together, the area of their horizons, of the new black hole, is larger than the sum of the two original ones: a synergetic effect" (Dijkgraaf, 2012). Unless we allow that black holes have entropy we cannot maintain the second law of thermodynamics, for if black holes carried no entropy it would be possible to violate the second law by throwing mass into one. Paraphrasing Hawking's assessment, the increase in the entropy of the black hole more than compensates for the decrease in the entropy carried by the object swallowed (Hawking, 1988; inter alia).
Finally, the third law states that, just as thermodynamics reveals the impossibility of ever reaching absolute zero, it is impossible to form a black hole with vanishing surface gravity, such that κ = 0 (Bardeen, Carter, and
12 If Earth had a radius of 8mm (like a marble), retaining all the mass, it would become a black hole.
Hawking, 1973). Stating that κ cannot go to zero essentially means that the entropy of a system at absolute zero is a well-defined constant, given that any system at zero temperature would necessarily exist in its lowest-energy (vacuum) state (ibid).
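The four laws just described can be collected compactly. The following is a standard textbook presentation rather than the author's own notation, with κ the surface gravity, A the horizon area, Ω_H the angular velocity, and Φ_H the electrostatic potential of the horizon, in units with G = c = 1:

```latex
\begin{aligned}
&\text{Zeroth law:} && \kappa \ \text{is constant over the horizon of a stationary black hole}\\
&\text{First law:}  && dM \;=\; \frac{\kappa}{8\pi}\,dA \;+\; \Omega_H\,dJ \;+\; \Phi_H\,dQ\\
&\text{Second law:} && dA \;\geq\; 0\\
&\text{Third law:}  && \kappa \to 0 \ \text{cannot be achieved by any physical process}
\end{aligned}
```

The dictionary is then κ ↔ temperature, A ↔ entropy, M ↔ energy, which is precisely the analogy the text develops.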
According to these laws, black holes have a real temperature and entropy, but to know this we have to count the microscopic states of the black hole, which requires a quantum theory of gravity (see Horowitz, 1998). Classical models of black holes are shown to solve Einstein's field equations with the appropriate Schwarzschild metric for boundary conditions (see Nordstrom, 1918); however, "for an adequate description of the interior of black holes and the very early universe" we still need to include low-energy quantum-mechanical effects of near-horizon behavior (Schutz, 2003). Out of this region also arises the semi-classical Hawking radiation that serves to produce an identity between the classical laws governing black holes and the laws of thermodynamics. As Dijkgraaf explains, "an amazing thing [is that] if you do the computation, you find the temperature of this thermal radiation is in fact given by the surface gravity of the black hole" (2012). This means that in addition to entropy we also obtain energy; that is, "we have a temperature and it looks like there really is something like thermodynamics going on in black hole physics" (ibid).
Crucial to all these descriptions is the integral role of quantum statistical mechanics. Thermodynamics is important because it represents an approximate description of the behavior of large groups of particles, made possible by the fact that the particles obey statistical mechanics, the investigation of statistical effects in systems consisting of a large number of particles. Soloviev makes this plain:
The history of physics shows that all thermodynamical laws later have been derived from a more fundamental theory — statistical mechanics. It was shown that the concept of temperature could be explained as the average kinetic energy of a micro-particle, the concept of entropy — as logarithm of the number of states corresponding to the same macroscopic thermodynamical macro-state. (2005)
Thermodynamics describes the properties of macroscopic systems by means of the basic macroscopic quantities of pressure and temperature. In the first half of the 19th century the laws of thermodynamics were known only as phenomenological rules confirmed by experiments (see e.g., Gyftopoulos, 2005); however, through the visionary works of Boltzmann and Gibbs, “ the thermodynamical properties of macroscopic systems became viewed as statistical averages over their microscopic degrees of freedom ” (Peltola, 2007; see also Chakrabarti and De, 2000). This procedure involves identifying a collection of microscopic states with one macroscopic state specifying the system’s energy, entropy, and temperature (plus local near-horizon quantum fluctuations). Based on this identification, we can predict how some quantities will change when we vary others under certain constraints (Kelly, 1996-2002). This is also very similar to how we consider phonons in an event-ontology.
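This identification of many microstates with a single macrostate can be sketched numerically. The toy model below is our own construction (with Boltzmann's constant set to 1): it treats N two-state elements and counts how many microstates realize each macrostate, so that entropy is the logarithm of that count.

```python
from math import comb, log

def entropy(N, n):
    """Boltzmann entropy S = ln(Omega), where Omega is the number of
    microstates realizing the macrostate 'n of N elements up' (k_B = 1)."""
    return log(comb(N, n))

N = 100
# The evenly mixed macrostate is realized by vastly more microstates,
# so it carries the maximal entropy:
assert entropy(N, 50) > entropy(N, 25) > entropy(N, 5)
print(f"Omega(100, 50) = {float(comb(100, 50)):.3e}, S = {entropy(100, 50):.2f}")
```

Varying one constraint (here n) while holding others fixed is exactly the move that lets thermodynamics predict how macroscopic quantities change together.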
Indeed, in quantum statistics, the entropy of a system in a given macrostate is the natural logarithm of the number of microstates corresponding to that macrostate, whereas in classical statistics entropy is the natural logarithm of the phase-space volume corresponding to the given macrostate (see e.g., Peltola, 2007; Dieks, 2013). As Willie explains, "much as the study of the statistical mechanics of black-body radiation led to the advent of the theory of quantum mechanics, the effort to understand the statistical mechanics of black holes has had a deep impact upon the understanding of quantum gravity, leading to the formulation of the holographic principle" (2012). Camenzind builds on this: "Since black holes have a non-zero temperature, the classical laws of black holes are simply the laws of thermodynamics applied to black holes" (2007); therefore, many posit that "there must be some more fundamental description of the classical laws governing black holes in terms of statistical mechanics" (ibid). Within the context of our present study we align with the following passage from Majhi:
In order to provide a statistical interpretation of gravity, first give the equipartition law of energy and show that this leads to the identification of entropy with the action for gravity. The immediate consequence of it is that the Einstein equations, obtained by a variational principle involving the action, can be equivalently obtained by an extremization of the entropy. This implies gravity can be thought of as an emergent phenomenon. (2012)
Emergence is studied in statistical mechanics qua the Ising model and magnetization. Instead of a magnet we have space and gravity, and the question is: what underlies gravity; what is the Ising analog underlying gravity and spacetime? One of the limitations of quantum statistical mechanics, however, is that it cannot treat living matter; only the formalism of quantum field theory can practically handle this (Jibu and Yasue, 1997). Classical statistical mechanics, in its practical applications, is still closely tied to human knowledge: a sudden change in our knowledge causes, in classical statistical mechanics, a sudden change in the mathematical/physical representation of our knowledge (see also Stapp; Von Neumann, 1955).
2.10 – GEOMETRICAL ENTROPY
Black hole entropy is a concept with geometric roots but many physical consequences. It ties together notions from gravitation, thermodynamics, and quantum theory, and is thus regarded as a window into the as yet mostly hidden world of quantum gravity. (Bekenstein, 2008)
The probabilistic description of gravity can be traced back to research on black-hole thermodynamics spearheaded by Bekenstein and Hawking in the mid-1970s, when they introduced the concept of geometrical entropy as a gravitational analogue proportional to the area of the horizon. Put simply, "Bekenstein concluded that the black hole entropy is directly proportional to the area of the event horizon" (Marolf, 2009). Starting from the theorems provided by Hawking on black-hole thermodynamics, Bekenstein conjectured that black holes represent maximum-entropy objects whose entropy is proportional to the area of the event horizon divided by the Planck area. 13 Specifically:
13 He considered a sphere of radius R where the entropy in a relativistic gas increases as the energy increases. The only limit is gravitational; when there is too much energy the gas collapses into a black hole.
Black-hole entropy should only depend on the observable properties of the black hole: mass, electric charge and angular momentum. It turns out that these three parameters enter only in the same combination as that which represents the surface area of the black hole. One way to understand why is to recall the "area theorem" (Hawking 1971; Misner, Thorne, and Wheeler 1973): the event horizon area of a black hole cannot decrease; it increases in most transformations of the black hole. This increasing behavior is reminiscent of thermodynamic entropy of closed systems. (qtd. in Bekenstein, 2008)
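The proportionality Bekenstein conjectured, with the factor of one quarter later fixed by Hawking's calculation, is conventionally written as follows (a standard formula, restoring all constants; ℓ_P is the Planck length):

```latex
S_{BH} \;=\; \frac{k_B\, c^{3} A}{4\, G\hbar} \;=\; \frac{k_B\, A}{4\,\ell_P^{2}},
\qquad \ell_P \;=\; \sqrt{\frac{\hbar G}{c^{3}}}
```

Because ℓ_P² is so small, even a modest horizon area corresponds to an enormous entropy, which is why black holes are maximum-entropy objects.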
Possibly the most important consequence of black-hole entropy dwells on its statistical interpretation within a quantum gravity framework. After Hawking derived the feasibility of black-hole evaporation via the thermal temperature of blackbody radiation, considerable efforts followed to find a statistical interpretation for the proportionality of black-hole entropy and its horizon area (see e.g., Zhang and Zhao, 2005; Jiang, Wu, and Cai, 2006; and Chen and Wang, 2011).
These studies suggest a profound connection between gravity and thermodynamics, as well as representing a precursor to the holographic principle. "The fact that black hole entropy is also the maximal entropy that can be obtained by the Bekenstein bound [as the bound reaches equality] was the main observation that led to the holographic principle" (see Bousso, 2002). This led to the recognition of black-hole thermodynamics as the primary method for attempts to reconcile the laws of thermodynamics with the existence of event horizons. 14
In a quantum description of black holes, since a black hole has a well-defined entropy, we expect that the hole also has a well-defined (exponentially large) number of microstates corresponding to its macrostate (see Krasnov et al, 1998). A macroscopic hole has an enormous number of quantum-mechanical degrees of freedom compared to the three classical ones predicted by "no-hair" theorems (see e.g., Bhattacharya, 2007).
The existence of these microstates raises many intriguing questions. Do these microstates correspond to the quantum states of the collapsing matter inside the black hole, or are these degrees of freedom connected with the quantized matter fields on a background geometry? Or could it be possible that the notion of black-hole entropy stems from the microscopic structure of spacetime itself? (Peltola, 2007)
Bekenstein used this to put an upper boundary on the entropy in a region of space that is proportional to the area of the region, as opposed to the volume. 14 Until 1995 no one was able to make a controlled calculation of black hole entropy based on statistical mechanics, which associates entropy with a large number of microstates. That changed when Strominger and Vafa calculated the proper Bekenstein-Hawking entropy of a supersymmetric black hole in string theory using methods based on D-branes and string duality (see Mohaupt, 2000).
2.11 – HAWKING RADIATION
Many scholars have attempted to describe the interaction of quantum matter with gravity by quantizing the matter on a fixed, classical gravitational background (see Weinstein and Rickles, 2011). That is, they have tried quantizing the matter, but not the gravity. This will work only if the gravity is weak, as in semi-classical methods; therefore, it should work outside a large black hole, but not near the singularity. 15 To these ends, Hawking provides such an approach, realizing that if you introduce an entropy you can also acquire a temperature.
Using the thermodynamic relationship between energy, temperature, and entropy, he published calculations in 1975 confirming Bekenstein's conjecture that black holes should have a well-defined entropy (1973), and showed how the characterization of black holes as thermodynamical objects with a non-zero temperature signifies that black holes aren't completely black but should actually emit a dim, thermal radiation with the spectrum of a black body (see e.g., 't Hooft, 1985). This result can be attributed to quantum-mechanical effects located in the immediate surroundings of the event horizon even when there is no in-falling matter associated with the black hole (Hawking, 1988). Thereafter this was formalized into the Bekenstein-Hawking entropy describing "the amount of entropy that must be assigned to a black hole in order for it to comply with the laws of thermodynamics as they are interpreted by observers external to that black hole" (Bekenstein, 2008). This is particularly true for the first and second laws, listed earlier.
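For an uncharged, non-rotating hole the temperature in question is fixed by the mass alone. A minimal numerical sketch, assuming standard SI values for the constants and the textbook formula T_H = ħc³/(8πGMk_B):

```python
import math

# Assumed standard SI values (approximate):
hbar = 1.0546e-34   # reduced Planck constant, J s
c    = 2.998e8      # speed of light, m/s
G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
k_B  = 1.381e-23    # Boltzmann constant, J/K

def hawking_temperature(mass_kg):
    """Textbook Hawking temperature of a Schwarzschild black hole."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

m_sun = 1.989e30  # kg
print(f"T_H (solar mass): {hawking_temperature(m_sun):.2e} K")
# Note: T_H falls as 1/M, so larger holes are colder.
```

The inverse dependence on mass is the striking feature: a solar-mass hole radiates at only tens of nanokelvin, far below the cosmic microwave background, which is why this radiation has never been observed from an astrophysical black hole.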
Specifically, Hawking proposed a heuristic scenario where the spontaneous pair-production of virtual particles near an event horizon provides a mechanism for radiation to occur. As Peltola explains, "In normal conditions, a virtual particle-antiparticle pair annihilates itself very rapidly after its emergence. In the vicinity of the event horizon, however, it is possible that the member of the pair with negative energy is swallowed by the black hole before the annihilation, and the other with positive energy is free to escape from the hole" (2007). Dijkgraaf adds, "The particle inside would be pulled by the gravitational force to the singularity while the other particle is now liberated and can escape to infinity" (2012).
The existence of such radiation implies that black holes, like any other macroscopic objects, have thermodynamical properties—including entropy (see e.g., Becker, Becker, and Schwartz, 2006). Semi-classical calculations indicate that indeed they do, with the surface gravity playing the part of temperature in Planck's law (see Wald, 1975 and 2001). As Baumgartl explains, “ semi-classical phenomena like Hawking radiation show that black holes must be treated as quantum objects ” (2005). This provides the only known phenomenon so far that contains an interplay between quantum theory and general relativity. Peltola finds delight in this, explaining:
15 A recent line of promising inquiry posits that there are no naked singularities. The singularity is protected by a region of space, like an apex, such that the singularity isn't ever reached. A corollary line of thinking is that quantum effects would also make it impossible to ever fully reach the singularity (see e.g., Allen, 2011).
Maybe the most intriguing aspect of black hole radiation is that it contains elements from quantum theory, thermodynamics and general relativity. Thus, one may say that in black hole radiation all the three foundational theories of physics meet for the first time. It is natural to expect that similar radiation processes would take place in the vicinity of other spacetime horizons as well. (Peltola, 2007)
In fact, a model black hole capturing sound instead of light has recently been shown to emit quantum particles considered the analog of Hawking radiation (see Steinhauer, 2009). "The effect may be the first time that a lab-based black hole analog has created Hawking particles in the same way expected from real black holes" (Grossman, 2014). The Hawking effect comes from quantum noise at the horizon, explains Unruh, one of the first to propose fluid-based black holes. The horizon creates pairs of phonons, where one escapes while the other is trapped inside (Steinhauer, 2009). By creating a quantum-mechanical fluid, the experimenters were able to mimic the physics proposed at a black hole's event horizon on a much smaller scale. In 2009, Steinhauer et al. first developed a model black hole using the collective modes of a Bose-Einstein condensate, whose atoms behave as a single quantum state. They now report that it has produced just the kind of Hawking radiation expected in real black holes: "This tells us that the idea of Hawking actually works: A black hole should really produce Hawking radiation" (2009).
In the formal (cosmological) class of black holes, the fact that Hawking radiation is a semi-classical result means that matter fields are assumed to follow the laws of quantum physics while space-time is treated as a non-dynamical background. Paraphrasing Rickles (2008) and Bärenz (2012): in a more realistic situation we should expect the gravitational field to have quantum effects as well. This means that in addition to thermodynamics, black holes should also obey quantum mechanics, and therefore space should be recognized as dynamical and emergent at the smallest scale.
2.12 – BLACK HOLE INFORMATION PARADOX
Hawking's use of quantized fields derives from using QM to show how information was disappearing (Dijkgraaf, 2012); however, this prompted a scenario where the calculation conflicted with a basic principle of quantum mechanics: namely, that physical systems evolve in time according to the Schrödinger equation, referred to as the unitarity of time evolution (see Maldacena, 2005; Zwiebach 2009). This contradiction forms the basis of the black hole information paradox, which states roughly that any information travelling into a black hole must disappear altogether; this, however, undermines another basic principle, namely, that information can never be created nor destroyed (for a review, see Susskind, 2008). This gave physicists a crisis to sort out. Susskind, 't Hooft, and H. Verlinde approach the problem by proposing that when information is dropped into a black hole it isn't actually lost; on this view, the horizon is recognized as a holographic representation of the immediate surrounding space-time. Given that the amount of information in the bulk region of a black hole is equal to the amount of information on its surface area, we can quantify the information we do not know by measuring its area.
In fact, it was Wheeler who began the tradition of thinking about information as the basis for understanding all of geometrical theoretical physics (1973). Bekenstein summarized this trend by suggesting that scientists may "regard the physical world as made of information, with energy and matter as incidentals" (2003). He concluded that "thermodynamic entropy and Shannon entropy are conceptually equivalent: the number of arrangements that are counted by Boltzmann entropy reflects the amount of Shannon information one would need to implement any particular arrangement of matter and energy" (Bekenstein, 2007). The only salient distinction between the thermodynamical entropy of physics and Shannon's entropy of information is in the units of measure; "the former is expressed in units of energy divided by temperature, the latter in essentially dimensionless 'bits' of information, and so the difference is merely a matter of convention" (Bekenstein, 2003).
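Bekenstein's point that the two entropies differ only by units can be made concrete. In the sketch below (our own construction, not taken from the cited texts), both quantities count the same logarithm of the number of arrangements; the conversion factor is k_B ln 2 joules per kelvin per bit:

```python
import math

k_B = 1.381e-23  # Boltzmann constant, J/K (assumed standard value)

def shannon_bits(omega):
    """Shannon entropy of omega equally likely arrangements, in bits."""
    return math.log2(omega)

def thermo_entropy(omega):
    """Boltzmann/thermodynamic entropy of the same arrangements, in J/K."""
    return k_B * math.log(omega)

omega = 2**10  # 1024 equally likely microstates
bits = shannon_bits(omega)          # 10.0 bits
s = thermo_entropy(omega)           # the same quantity in J/K
# Dividing out k_B ln 2 recovers the bit count: same content, different units.
print(bits, s / (k_B * math.log(2)))
```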
Wheeler came up with a particularly clever slogan about this, heralding the adage "It from Bit," implying that 'it' is the universe and 'bit' refers to the simplest unit of information measurement. As Dijkgraaf simplifies, "'Bit' is the entropy that is there; entropy is information" (2012). Building off Boltzmann, in 1948 Shannon suggested that the sum quantity of bits is related to the totality of degrees of freedom of matter. For a given energy in a given volume there is an upper limit to the density of information (the Bekenstein bound) about the locations of all the particles composing the matter in that volume; furthermore, this suggests that "matter itself cannot be subdivided infinitely many times and there must be an ultimate level of fundamental particles" (Meijer, 2012).
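The upper limit on information density referred to here is the Bekenstein bound, conventionally written (in its standard form, for a system of total energy E enclosed in a sphere of radius R):

```latex
S \;\leq\; \frac{2\pi\, k_B\, R\, E}{\hbar c}
```

A black hole of the same radius saturates the inequality, which is one way of stating that black holes are maximum-entropy objects.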
Taking the notion of information as a starting point, Verlinde combines this with the holographic principle to predicate his approach to emergent gravity, as we’ll see shortly.
2.13 – HOLOGRAPHIC PRINCIPLE
The holographic principle and its realization in string theory have shed critical light on the mysteries of black holes and information loss as suggested by Hawking's work, and are believed to provide a resolution of the black-hole information paradox (Maldacena, 2005; Susskind, 2008). Specifically, it is considered that the informational content of all objects that have fallen into the hole can be recovered in surface fluctuations of the event horizon (see Davies, 2004). In 2004, Hawking conceded that black holes do not violate quantum mechanics and suggested a mechanism through which they might preserve information (Hawking, 2005; Susskind, 2008).
The holographic principle demonstrates how subdivisions of matter stop at the level of information, whose fundamental constituents are represented by bits of information like 1's or 0's; as Damasco explains, "The main idea behind the holographic principle is that information is what drives physical phenomena" (2012). The insight that first inspired the holographic method emerged almost two decades earlier, however, when 't Hooft wrote a paper on quantum gravity, revisiting Hawking's original work on black-hole thermodynamics, and concluding that the total number of degrees of freedom in a region of spacetime surrounding a black hole is proportional to the surface area (or radius squared) of the horizon, and not the volume (radius cubed) as would be expected (see 't Hooft, 1993). This idea was further clarified by Susskind (1995), who argued that the oscillation of the horizon of a black hole is a complete description of both the in-falling and outgoing matter, because the world-sheet theory of string theory was just such a holographic description. While short strings have zero entropy, he could identify long, highly-excited string states with ordinary black holes. This was a deep advance because it revealed that strings have a classical interpretation in terms of black holes. In this sense, the information of a black hole and anything outside it can be encoded by putting this information on the surface of its horizon. "It's a rather radical idea because it tells you that there is information in the underlying layer of understanding all of quantum geometrical physics" (Dijkgraaf, 2012).
The basic problem with this idea lies in determining the context in which such a holographic screen encoding information arises in the first place. One possible answer is given by Einstein's equivalence principle applied in an experiential context: as an observer expends energy and enters an accelerated frame of reference, an event horizon arises that acts as a holographic screen and encodes information (see e.g., Kowall, 2015).
As we will see, gravity is modeled by Verlinde as an entropic form of holographic information (see e.g., Gao, 2012). This means in some sense that “the space-time geometry we observe could be a holographic illusion defined by the spatial and temporal relationships between animated images projected from a holographic screen to the central point of view of an observer ” (Hu and Wu, 2014). In the sense of the principle of equivalence, “the observer is nothing more than consciousness that arises at an accelerated point of view while all images of things in the observer’s world arise on its holographic screen ” (ibid).
Our universe and everything we know always seems to eventually lead us to the conclusion that we live in a holographic reality. From cosmology to quantum physics, scientists today are truly having a troublesome time trying to explain the nature of our reality. How can we draw the extreme conclusion that our world is only an illusion and what does that mean when we want to know our place in the cosmos (Yang, 2006)?
Can the holographic principle be tested? It can be tested theoretically, and the most successful approach so far is yielded by string theory. String theory also provides a formalism combining quantum mechanics and gravity, and Hawking's black hole information paradox is resolved when quantum gravity is described in string-theoretic terms. String theory has a curious history, however: it is essentially a theory that almost never was, having been discarded as a theory of hadrons before being revived by a few key results and reformulated within the context of a quantum theory of gravity.
2.14 – STRING THEORY
Quantum mechanics brought an unexpected fuzziness into physics because of quantum uncertainty, the Heisenberg uncertainty principle. String theory does so again because a point particle is replaced by a string, which is more spread out. (Witten, 1995)
One of the more promising lines of reasoning in the modern era of mathematical physics has come from the development of string theory. It is important to distinguish this from other theories, however; as Theisen et al. explain, "String theory is not, in contrast to general relativity and quantum field theory, a theory in the strict sense" (2007). String theory originally arose out of an attempt to describe the properties of the strong-force interaction through the construction of a dual-resonance model to compute S-matrix scattering results for the strong force and mesons as an emergent foundation for physical law (see, e.g., Cooper and West, 1988; Gross, 2005; McGarrie, 2011; Rickles, 2014). This model was later recognized to correspond to the quantization of a relativistic string (see e.g., Salisbury, 1984). Heisenberg introduced the S-matrix 16 as a way of constructing a theory that does not rely on the local notions of space and time, proposed to break down at the nuclear scale, instead keeping track solely of the particles and their collisions (see e.g., Shapiro, 2007; Di Vecchia et al., 2012).
In quantum field theory the intermediate steps are the fluctuations of fields, or equivalently, the fluctuations of virtual particles; in this context, there are no local quantities at all. In this sense, the S-matrix theory was a proposal for replacing local quantum field theory as the basic principle of elementary particle physics. (Shapiro, 2007)
This allowed space-time to be taken as an emergent abstraction and the S-matrix as the quantity that describes how a superposition of incoming particles turns into outgoing ones. As a result this program was influential in the 1960s as a conceivable substitute for quantum field theory, which was dogged at the time by divergences at strong coupling (see Schulz, 1993). Exapted into string theory, it has been suggested that S-matrix theory still offers the best approach to the problem of quantum gravity (see e.g., Frautschi, 1963; Giulini, Kiefer, and Lämmerzahl, 2003). Here, S-matrix theory is related to the holographic principle and the AdS/CFT correspondence by a flat-space limit, where the analog of the S-matrix relations in AdS space are the boundary conformal theory (Giddings, 1999). Specifically, Polchinski and Susskind proposed an expression for the S-matrix in flat space-time (without gravity) in terms of the large-N limit of the gauge theory living on the boundary of the AdS space (see Aref'eva, et al., 2013).
Exact theories of quantum gravity should be formulated in terms of gauge invariant observables associated to the boundary of spacetime. In that spacetime, the only such observable is the S-Matrix, so a theory of quantum gravity in flat space will be a theory that computes scattering amplitudes holographically. Since AdS/CFT provides a non-perturbative description of AdS theories via a dual CFT, one can obtain the bulk S-Matrix
16 Geoffrey Chew made the bootstrapping approach famous in America.
from a flat-space limit of AdS. This defines a holographic theory for flat space using a sequence of CFTs with increasing central charge. (Fitzpatrick and Kaplan, 2012)
S-matrix theory was all but abandoned in the 1970s as QCD and renormalization arose to solve these challenges within the framework of field theory, finding greater success and corroboration with experimental results in accelerators. But then 1973 happened: this was a big year for string theory, when two sets of researchers contemporaneously contextualized the massless spin-two anomaly as a graviton. First, Tamiaki Yoneya discovered that all the known string theories included a massless spin-two particle that obeyed the correct Ward identities 17 to be considered a graviton (see also Veneziano, 1986). Around the same time, Scherk and Schwarz derived a similar result, leading them to adduce that string theory is actually a theory of quantum gravity, not hadrons (see, e.g., Blumenhagen, Lüst, and Theisen, 2013). This led to string theory's re-examination, as it became clear that the properties making it incongruous as a theory of nuclear physics were in fact optimal for a quantum theory of gravity. Green, Scherk, and Schwarz also realized that at low energies "this stringy graviton interacts according to the covariance laws of general relativity" 18 (Theisen et al., 2013). With this insight, string theory became a formal candidate for quantum gravity.
As it turned out, five different string theories were eventually developed, whose multiple realizability remained a mystery until Ed Witten recognized in the mid-1990s that each of these theories could be obtained as a different limit of a non-perturbative 10+1 dimensional theory unifying all five as different elements of the same underlying theory, which he named M-theory (see for example Schwarz, 1999). "M-theory tells us that string theories are really about strings and the higher-dimensional objects of D-branes, like solitons in a sense" (Lerche, 1997). 19 As Di Vecchia et al. (2005) explain, "it turned out all five consistent string theories in ten dimensions unify gravity in one way or another with gauge theories." M-theory is often used as a term referring to a non-perturbative completion of string theory (see Theisen, et al., 2007).
The five fermionic (super)string theories are: Type I, Type IIA, Type IIB, and the two heterotic theories, HO and HE. All string theories contain closed strings; Type I involves both open and closed strings. In addition there is a bosonic string theory, without fermions, in 26 dimensions. We focus primarily on the Type I, IIA, and IIB string theories throughout this study.
Generally, QFT can’t handle gravity because of all the infinities that emerge from singularities in Feynman diagrams (see e.g. Szabo, 2011). Non-abelian gauge theories make it possible to overcome many of these difficulties. More generally, string theory unifies this level of microphysics with general relativity. String theory develops to
17 The Ward identity is a relation among correlation functions that follows from the gauge symmetries of the theory and remains valid after renormalization. 18 Covariance here refers to general covariance: the form-invariance of physical laws under arbitrary coordinate transformations. 19 Alpha prime and h-bar together lead to M-theory (see e.g., Tong, 2009); the best results so far are based on an open-string language (see e.g., Antoniadis, Dudas, and Sagnotti, 1999). provide "a consistent quantum theory, free from ultraviolet divergences, which necessarily requires gravitation for its overall consistency" (Szabo, 2011). Sahakian's narrative spells it out:
String theory may be viewed as a framework for exploring new exotic ideas on the frontier of theoretical physics. At its heart, the subject aims at describing a consistent theory of quantum gravity, in addition to being a short length scale completion of the Standard Model of particle physics. The subject’s most prevalent successes to date are twofold: convincing evidence that the theory resolves various long standing puzzles arising in black hole physics; and phenomenological realizations of models that appear to mimic the world we see at low energies. While the theory itself as a whole may still evolve beyond its current form, several of the new concepts that it has developed are expected to survive at the foundation of a future formulation of the laws of physics. (2012)
The basic postulate of string theory is modest: elementary objects are extended and behave according to the laws of relativistic quantum mechanics; the notion of an elementary particle is generalized to a one-dimensional object: a little bit of string. Since string theory is a relativistic quantum theory that includes gravity, it must also involve the corresponding three fundamental constants, namely the speed of light c, the reduced Planck constant h-bar, and the Newtonian gravitational constant G. These three constants combine into a single constant with dimensions of length, so the characteristic length scale of strings works out to be the Planck length (see Szabo, 2011).
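The dimensional combination just described can be checked numerically. The following sketch is not from the dissertation; the constant values are hand-entered CODATA figures, and the combination of c, h-bar, and G into the Planck length and Planck mass is the standard dimensional-analysis construction:

```python
# Minimal sketch: combining the three fundamental constants c, h-bar, and G
# into the unique length scale and mass scale they determine.
# Constant values are 2018 CODATA figures, entered by hand.
import math

c = 2.99792458e8        # speed of light, m/s (exact by definition)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newtonian gravitational constant, m^3 kg^-1 s^-2

# The unique length built from the three constants (the Planck length):
planck_length = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
# The corresponding mass scale (the Planck mass):
planck_mass = math.sqrt(hbar * c / G)        # ~2.2e-8 kg
```

Running this gives a Planck length of roughly 1.6 × 10^-35 m and a Planck mass of roughly 2.2 × 10^-8 kg, the scales invoked in the surrounding discussion.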
This means that the fundamental mass-scale (or tension) of a string is related to the characteristic mass-scale of gravity: the Planck mass. Strings vibrating at the Planck scale are thought to be an essential ingredient in the production of all fundamental particle constituents through the nature of their vibrations, each mode giving rise to a particular particle type. In this sense QFT, too, is described naturally in part by the nature of strings; as Dawid (2007) explains, “the dynamics of our observed world is at the most-fundamental level explained by a purely geometrical theory of strings in space-time. All interactions, nuclear interaction as well as gravity, can be extracted from the dynamics of those strings.” Even more succinctly, Blumenhagen, Lüst, and Theisen describe how:
All particles, matter and interactions have a common origin: they are excitations of the string. There are open and closed strings. The massless spin-two particle appears in the spectrum of the closed string. Since any open string theory with local interactions, which consist of splitting and joining of strings, automatically contains closed strings, gravity is unavoidable in string theory. (2013)
In addition, all properties attributable to point-like particles are explained in string theory in terms of oscillation modes formulated as topological properties of the string (see Dawid, 2007); specifically, the Polyakov action (1981) describes the worldsheet of a string whose manifold allows it to be embedded in space-time (see e.g., Font and Theisen, 2003, 2005; Weigand, 2011). To describe oscillating strings, the Polyakov action must be supplemented by the Liouville action, which captures the fluctuations (see e.g., Compere, 2008; Tong, 2009). Ivancevic draws out the point:
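As a point of reference (not drawn from the dissertation itself), the Polyakov action for a string propagating in flat space-time can be written in its standard textbook form, where $X^{\mu}(\sigma,\tau)$ embeds the worldsheet into space-time and $h_{ab}$ is an auxiliary worldsheet metric:

```latex
S_{P} \;=\; -\frac{T}{2}\int d^{2}\sigma\,\sqrt{-h}\,h^{ab}\,
\partial_{a}X^{\mu}\,\partial_{b}X^{\nu}\,\eta_{\mu\nu},
\qquad T \;=\; \frac{1}{2\pi\alpha'},
```

with $T$ the string tension and $\alpha'$ the slope parameter (the same alpha prime referenced in footnote 19).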
Liouville's theorem is a key theorem in classical statistical and Hamiltonian mechanics. It asserts that the phase-space distribution function is constant along the trajectories of the system — that is, the density of system points in the vicinity of a given system point travelling through phase-space is constant with time. (Ivancevic, 2002)
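For clarity, Liouville's theorem as quoted can be stated compactly: for a phase-space density $\rho(q,p,t)$ evolving under a Hamiltonian $H$,

```latex
\frac{d\rho}{dt} \;=\; \frac{\partial\rho}{\partial t} + \{\rho, H\} \;=\; 0,
```

where $\{\cdot,\cdot\}$ is the Poisson bracket. It is worth hedging here that this theorem of Hamiltonian mechanics is distinct from the Liouville action of two-dimensional gravity that supplements the Polyakov action; the two share a name but not a definition.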
Famously, superstring theory requires 9+1 dimensions to hold; M-theory describes 10+1. The extra dimensions, beyond the usual 3+1, are treated as compactified internal dimensions. Four-dimensional spacetime appears as the usual flat Minkowski space, whereas the internal dimensions constitute manifolds with possibly complicated geometries; these compactified extra dimensions can be pictured as higher-dimensional oscillating spaces. The internal structure of string theory gives reasons to believe that once it has found a fully consistent formulation it might be a final theory (see Dawid, 2003).
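The compactification picture just described is conventionally written as a product ansatz (a standard schematic, not a formula from the text):

```latex
M_{10} \;\cong\; \mathbb{R}^{1,3} \times K_{6},
```

where $\mathbb{R}^{1,3}$ is four-dimensional Minkowski space and $K_{6}$ is a six-dimensional compact internal manifold (in many models a Calabi-Yau space) whose geometry determines the effective four-dimensional physics.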
In string theory there are three basic objects: closed strings, open strings, and D-branes. The two varieties of strings, open and closed, each lead to states with characteristic properties. The reggeon and pomeron sectors are early versions of open and closed strings: the 'pomeron sector' is now recognized as the closed-string sector, while the 'reggeon sector' represents an open-string theory (see e.g., Lublinsky et al., 2014). More recently, open strings in the AdS/CFT description are linked with N = 4 SYM, while closed strings are associated with a quasiparticle description. In addition, D-branes can be considered soliton-like. We’ll identify each of these scenarios briefly, beginning with open strings.
Open Strings
Open strings are defined in bosonic and type I (fermionic) string theories. An open string has two end-points and is the equivalent of a line interval. Open strings describe scalar and vector bosons in the massless sector and are perturbative (see Verlinde, 2011). In addition, open strings can merge into closed loops of energy. As Dijkgraaf remarks, “before we calculated the theory we couldn't know this; it just turns out that we struck lucky” (2012). It is as if we can make something that isn't in the theory in the first place. This leads to the conclusion that we not only need to accept open strings as basic ingredients, but also closed loops of energy: closed strings.
Even though open and closed strings have much in common, there are also notable distinctions between them; for example, as Verlinde describes: “gravity is in the closed string sector and is therefore something else than what the open strings are” (2011). In fact, while not all string theories describe open strings, they each must contain closed strings given that “interactions between open strings can always result in closed strings” (Schwartz, 2000). Open strings prove to bear more resemblance to CFTs, while closed strings represent the classical sector of gravity “and yet also exist in QFT as coupling constants” (Vecchia, et al.,
2005; Verlinde, 2011). These facts prove vital for linking the accounts of Whitehead’s AE’s with Verlinde’s EG in an event-logic and process context.
It turns out that a quantum-mechanical theory of open strings can also be formulated and shown to automatically incorporate a number of excitations that look like particles (see e.g., Rudolph, 1998; Ashtekar, 2005), one of those being a vector particle that is exactly massless. The ability to describe this means that a “quantum-mechanical formulation of a theory of open strings automatically incorporates one of the key ingredients of the Standard Model” (Polchinski, 1998).
Depending on which string theory you’re working with, open strings come in various dimensionalities (Peeters and Zamaklar, 2007). Verlinde says there must be something else behind open strings, however, since we need a D-brane background for them to propagate on in spacetime; therefore, open strings cannot be considered the starting point: “they already have their own inertia via states with a certain mass, so there’s something underlying them” (Verlinde, 2011). The event-logic constructed in chapter five will provide a methodological nod to this claim, with the hope and goal of shedding more insight into the ontological nature of open strings and D-branes.
In terms of a gravitational scenario in matrix theory, open strings can be taken as oscillators with a certain spectrum of masses and an amplitude between two D-branes (Verlinde, 2011). The frequencies of these oscillators depend on the mutual positioning of the D-branes; anharmonic frequencies with coupling constants in QFT are integrated out (ibid). This integrating-out of the anharmonic, off-diagonal (open-string) degrees of freedom between two D-branes is attributed to the induction of gravity. This is a way of viewing gravity in an open-string channel where, as Verlinde describes, “fast variables influenced by slow variables create a reaction force and this is precisely what open strings are doing” (2011). This means that the set of open strings creates fast variables operating in a Higgs space that, when integrated out of influence with slow variables of Coulomb space, leads to gravity, and whose remaining parts contribute to the gravitational self-energy (see Verlinde, 2011). Crucially, this is compared to the phases of prehension and concrescence in Whitehead’s AE process leading to the satisfaction, or culmination, of the basic AE generative cycle (PR), as we’ll see in chapters four and six.
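The picture of open strings as oscillators stretched between D-branes can be illustrated with a standard textbook relation not taken from the dissertation: in natural units, a string stretched between two branes separated by a distance d carries a mass of roughly d/(2πα'). The brane positions and the value of alpha prime below are purely illustrative toy numbers:

```python
# Toy sketch (assumption: the standard stretched-string mass formula
# m = |x_i - x_j| / (2 * pi * alpha'), natural units c = hbar = 1).
# All numerical values here are hypothetical, chosen for illustration.
import math
import itertools

alpha_prime = 1.0                   # string slope parameter (toy value)
brane_positions = [0.0, 2.0, 5.0]   # three parallel D-branes on a line

def stretched_mass(x_i, x_j, alpha_p=alpha_prime):
    """Mass of an open string stretched between branes at x_i and x_j."""
    return abs(x_i - x_j) / (2.0 * math.pi * alpha_p)

# Spectrum of stretched-string masses for every pair of branes:
masses = {(i, j): stretched_mass(brane_positions[i], brane_positions[j])
          for i, j in itertools.combinations(range(len(brane_positions)), 2)}
# Coincident branes (separation -> 0) give massless strings; separated
# branes give massive ones, echoing the Higgs/Coulomb-space language above.
```

The spectrum shifts as the branes move, which is the sense in which the oscillator frequencies "depend on the mutual positioning of the D-branes."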