
The Founding of an Event-Ontology: Verlinde's Emergent Gravity and Whitehead's Actual Entities

by

Jesse Sterling Bettinger

A Dissertation submitted to the Faculty of Claremont Graduate University in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Graduate Faculty of Religion and Economics

Claremont, California 2015

Approved by: ______

© Copyright by Jesse S. Bettinger 2015 All Rights Reserved

Abstract of the Dissertation

The Founding of an Event-Ontology: Verlinde's Emergent Gravity and Whitehead's Actual Entities

by

Jesse Sterling Bettinger

Claremont Graduate University: 2015

Whitehead's 1929 categoreal framework of actual entities (AE's) is hypothesized to provide an accurate foundation for a revised theory of gravity compatible with Verlinde's 2010 emergent gravity (EG) model, in which gravity arises not as a fundamental force but as the result of an entropic force. By the end of this study we should be in a position to claim that the EG effect can in fact be seen as an integral sub-sequence of the AE process. To substantiate this claim, this study elaborates the conceptual architecture driving Verlinde's emergent gravity hypothesis in concert with the corresponding structural dynamics of Whitehead's philosophical/scientific logic comprising actual entities. This proceeds to the extent that both are shown to mutually integrate under the event-based covering logic of a generative process underwriting experience and physical ontology. In comparing the components of both frameworks across the epistemic modalities of pure philosophy and cosmology/relativity physics, this study utilizes a geomodal convention as a pre-linguistic, neutral observation language—like an augur between the two theories—wherein a visual event-logic is progressively enunciated in concert with the specific details of both models, leading to a cross-pollinized language of concepts shown to mutually inform each other. The geomodal framework will be implemented in this study as an exegetical modeling convention. From this study we will attempt to construct a set of narratives for string theory and AE's on the basis of an event logic and process ontology. Combining these two fields brings to light novel connections between the sciences and humanities as well as offering a method for realizing a new, narrative logic in string theory and philosophy of mind.

"On ne voit bien qu'avec le cœur. L'essentiel est invisible pour les yeux." (One sees clearly only with the heart. What is essential is invisible to the eye.) A. Saint-Exupery

Acknowledgements

Heartfelt, special thanks where it is due. Gratitude to my advisors, Phil Clayton, Vatche

Sahakian, and Paul Zak for their willingness to take on the project; to Tim Eastman for providing useful feedback and intellectual motivation; to Edris Stuebner for positive encouragement and moral support over the years; to the Athenas, to the Stars, and to the

Lions: you were the inspiration every step of the way; to the Negritto family for their generosity and for setting the tone for the year; to the Frazier family for their gracious friendship, talks, and example; to Scott Bracken and Jeremy Ognall for the phenomenal rugby experience and for taking us to the beach. Experiencing that was life-changing and provided the determination and wherewithal to see this project through to the end. You have no idea how much my time with you all has meant, but it has meant the world.


Table of Contents

PART I – Introductory Materials
1. Introduction – Prospectus + Methodology
2. Physico-Conceptual Foundations of Emergent Gravity

PART II – Outline of Models
3. Verlinde's Emergent Gravity
4. Whitehead's Actual Occasions

PART III – Comparative + Geomodal
5. Origination
6. Creativity + Synthesis

PART IV – Review
7. Discussion – Einstein and Whiteheadian Gravity
8. Conclusion – Review and Denouement

1. Introduction ------ 1
   1.1 Methodology ------ 3
   1.2 Conceptual and Phenomenal Placement ------ 5
   1.3 Scale ------ 7
   1.4 Ontology ------ 8
   1.5 Organization of Chapters ------ 9

2. Physico-Conceptual Foundations of Emergent Gravity ------ 13
   2.1 Emergence of XT ------ 13
   2.2 General Relativity ------ 15
       2.2.1
       2.2.2 Expansion + Cosmological Constant
   2.3 Quantum Theory ------ 17
       2.3.1 Quantum Mechanics
       2.3.2 Quantum Field Theory
       2.3.3 Standard Model
       2.3.4
       2.3.5
       2.3.6 Yang-Mills Theory
   2.4 Dark Energy/Dark Matter ------ 21
       2.4.1 Accelerated Expansion
   2.5 Vacuum Energy of QM ------ 22
       2.5.1 Virtual Particles
       2.5.2 Vacuum as Plenum
   2.6 ------ 22
   2.7 Quantization v. Non-Quantization ------ 24
   2.8 UV/IR Mixing ------ 25
       2.8.1 Planck Scale
   2.9 Black Holes ------ 26
       2.9.1 Four Laws
       2.9.2 Statistical Mechanics


   2.10 Geometrical Entropy ------ 29
   2.11 Hawking Radiation ------ 31
   2.12 Information Paradox; Entropy as Information ------ 32
       2.12.1 It from Bit → geometric entropy of Planck horizon
   2.13 Holographic Principle ------ 33
   2.14 String Theory ------ 35
       2.14.1 Basic History
       2.14.2 Open Strings
       2.14.3 N = 4 Super Yang-Mills Theory
       2.14.4 Closed Strings
           2.14.4.1 Closed strings as phonons
           2.14.4.2 Closed strings as coupling constants
           2.14.4.3 Quantum coupling constants as dynamical
       2.14.5 D-Branes
           2.14.5.1 Solitons
       2.14.6 Open/Closed String Correspondence
       2.14.7 Gauge/Gravity Duality → AdS/CFT Correspondence
   2.15 Emergence of Gravity ------ 48
   2.16 Summary ------ 50

3. Verlinde's Emergent Gravity ------ 51
   3.1 Introduction ------ 51
   3.2 Non-Quantizational Approaches to QG ------ 51
   3.3 Sakharov's Induced Gravity ------ 52
   3.4 Jacobson's Gravitational Thermodynamics ------ 54
   3.5 Distinguishing Verlinde from Predecessors ------ 57
   3.6 Verlinde's Entropic Gravity ------ 59
       3.6.1 Universality of Gravity
       3.6.2 Emergence of Space-time and Gravity
       3.6.3 Information
       3.6.4 Holographic Principle
       3.6.5 Entropic Force
       3.6.6 Polymers and Black Hole Thermodynamics
       3.6.7 Information and Storage on Holographic Screens
       3.6.8 Derivation of Newton's Laws
       3.6.9 Emergence of Space
       3.6.10 Coarse Graining
   3.7 String Theoretic Approach ------ 67
       3.7.1 Open-Closed String Correspondence and AdS/CFT
       3.7.2 Matrix Theory
       3.7.3 Adiabatic Reaction Force
       3.7.4 Hidden Phase Space
       3.7.5 Inertia and Gravity as Adiabatic Reaction Forces
   3.8 The End of Gravity as a Fundamental Force ------ 70
   3.9 Summary ------ 71


4. The Actual Entities ------ 72
   4.1 Philosophy of Organism ------ 72
   4.2 Experiential Metaphysics and Speculative Philosophy ------ 74
   4.3 From Substance to Event Ontology ------ 75
   4.4 Uniquely Suited to Mathematical Physics ------ 79
   4.5 Actual Entities ------ 80
       4.5.1 Prehension
       4.5.2 Simple Physical Feelings
       4.5.3 Subjective Forms
       4.5.4 Initial/Subjective Aim and Decision
       4.5.5 Concrescence
       4.5.6 Satisfaction
       4.5.7 Unity and Determinateness

5. Origination, Emergence, Reenactment ------ 98
   5.1 Geomodal Construct ------ 100
       5.1.1 Geometry and Physics: Two Metrics, Not One
       5.1.2 Minkowski's Lightcone
       5.1.3 Hypersurface of the Present and Manifold
       5.1.4 Ontological Immediacy v. Sensory-Conscious Present
   5.2 Whitehead and Verlinde Signatures in an Event-Ontology ------ 106
       5.2.1 Sea of Strands ------ 107
           5.2.1.1 Strands in Geomodal method
           5.2.1.2 Strands in Chew
           5.2.1.3 Quantum fluctuations and Casimir effect
           5.2.1.4 Strands as pre-XT microscopic data
           5.2.1.5 Link to Tachyonic String Theory (26d Bosonic)
           5.2.1.6 Strands as pure potentialities
       5.2.2 Snapshot + Photograph ------ 110
           5.2.2.1 Measurement Problem
           5.2.2.2 Snapshot (as Mechanism)
           5.2.2.3 Similar to Sen's 1st example
           5.2.2.4 Snapshot of Frozen Strands
           5.2.2.5 Dn-brane of Open Strings
           5.2.2.6 Multiplicity of Initial Data
           5.2.2.7 Emptiness and Dependent Arising
       5.2.3 Holographic Dual of Snapshot ------ 121
           5.2.3.1 Initial Data → Objective Data 'Reenactment'
           5.2.3.2 Open-Closed String Correspondence
       5.2.4 Phonon ------ 125
           5.2.4.1 Phonon is emergent from snapshot elements
           5.2.4.2 Similar to graviton radiating off a D0-brane
           5.2.4.3 Instead of graviton, phonons qua closed strings
           5.2.4.4 Phonon as a revival of the "objective datum"
   5.3 Dictionary and Summary ------ 129


6. Selection, Creativity, Synthesis ------ 133
   6.1 Phonon as Coupling for Prehension during Renormalization ------ 135
       6.1.1 Coarse-graining: Foliation = Prehension: Concrescence
   6.2 Prehension and Coarse-Graining ------ 139
       6.2.1 Prehension/Coarse-Graining as a Selection Process
       6.2.2 (-) Prehension & integration-out of open strings in a matrix
       6.2.3 (+) Prehension & open-string acquisition of expectation value
   6.3 Concrescence and Foliation ------ 145
       6.3.1 Phases of Concrescence
       6.3.2 Concrescence of feelings → emergent dimension of space
       6.3.3 Foliation of "feelings" → genetic phases
   6.4 Verlinde's Matrix Theory ------ 148
   6.5 Satisfaction = Gravitational Self-Energy ------ 150
       6.5.1 Max. of Coarse-Graining → Newton's Potential Φ
   6.6 Summary ------ 155

7. Discussion ------ 158
   7.1 The Principle of Relativity ------ 158
   7.2 Philosophical Distinctions Between Einstein and Whitehead ------ 160
       7.2.1 Experience
       7.2.2 Two Metrics, Not One
           7.2.2.1 The First Metric
           7.2.2.2 The Second Metric
       7.2.3 Space and Time
       7.2.4 Uniformity
       7.2.5 Measurement
       7.2.6 Simultaneity
   7.3 Comparing Whitehead and Einstein to Verlinde's EG ------ 178
   7.4 Summary ------ 181

8. Conclusion ------ 182
   8.1 Science in an Emergent Paradigm ------ 183
   8.2 Précis ------ 187
       8.2.1 Prehension and Concrescence
       8.2.2 Satisfaction and Maximization of Coarse Graining
   8.3 Denouement ------ 197
       8.3.1 String Theory Epical Narrative
       8.3.2 AE's Epical Narrative
       8.3.3 Two Aspects of One Process
       8.3.4 Closing


Chapter 1 – Preface/Methodology

I look for the hour when that supreme Beauty, which ravished the souls of those eastern men, and chiefly of those Hebrews, that through their lips spoke oracles to all time, shall speak in the West also. The Hebrew and Greek Scriptures contain immortal sentences that have been bread of life to millions. But they have no epical integrity; are fragmentary; are not shown in their order to the intellect. I look for the new Teacher, that shall follow so far those shining laws, that he shall see them come full circle; shall see their rounding complete grace; shall see the world to be the mirror of the soul; shall see the identity of the law of gravitation with purity of heart; and shall show that the Ought, that Duty, is one thing with Science, with Beauty, and with Joy.

- Ralph Waldo Emerson, closing statement of "Divinity School Address" (1838) to graduating class

This study is pursued in the philosophy of physics and aims to serve in part as a contribution to the postulates underlying physics through an exegesis of Verlinde’s (2010) emergent gravity (EG) model in the context of Whitehead’s (1929) process dynamics and descriptive account of the actual entities (AE’s). The goal is to provide the foundations of contemporary physics with a philosophical foothold and narrative within a process and event logic. While Verlinde’s model is not the only one, the way he constructs his case sets it most in line with Whitehead’s development of AE’s. This defends a joint (physical and philosophical) ontology beginning from processes and events. In this framework, not only gravity and string theory (Sakharov, 1967; Jacobson, 1995; Verlinde, 2010, 2011; Padmanabhan, 2012; Frampton, Kephart, 2005) but even space and time (Dijkgraaf, 2012; Seiberg, 2009; etc.) are emergently-derived from processes and events (Whitehead, 1922).

As an interdisciplinary study in the truest sense, this effort is meant to bring together two highly technical disciplines—one written for experts in mathematical physics, the other for experts in Whiteheadian studies—and to do so in a way that lets their concomitance come progressively to bear over the course of this report. To these ends, the neologisms of one subject-area might not be immediately accessible to an expert in the other field; however, as we'll encounter in the comparative chapters, the concepts and terms used by one field are made readily available by pairing them with a term found in the other field. As such we create a "dictionary" between both fields. This allows an expert in physics and string theory to recognize the concepts they understand within a different set of neologisms in Whitehead's philosophy. Providing these dictionaries also ensures against the claim from either side that we have excluded them from the ideas of the other; instead we provide an on-the-spot translation that lets both sides participate with each other.

In physics, the quest to understand the microscopic structure of space-time represents the driving method of scholars attempting to merge quantum theory with gravitation (Chivukula, 2010); historically speaking, however, quantum mechanics, as a theory of the exceptionally small, appears incompatible with general relativity, as a theory of the exceedingly large. Such a tension drives Mäkelä to state that "instead of attempting to understand the microstructure of matter, we should…attempt to understand the microstructure of space-time itself" (Mäkelä, 2010). Out of this apparent antagonism arises the sub-discipline of quantum gravity and the subsequent approaches that have emerged over the last half-century to try and reconcile gravity with the other known laws.

Still today, conceptual perspectives in physics continue to differ with regard to the understanding of space and time: while some scholars hold to space and time as fundamental and discrete (see, e.g., Polchinski, 1998; Rovelli, 2004; Gao, 2011), others like Verlinde take them to be more emergent properties (see, e.g., Witten, 2004; Seiberg, 2006; Padmanabhan, 2012). It is suggested here that these differences can be attributed to artifacts of distinction between substance- and process metaphysics. As well, they are seen as representative of two basic approaches to spacetime and gravity: the quantization approach to quantum gravity, and non-quantizational, induced or emergent approaches.

In accordance with the general theme of this study we lend our attention to non- quantizational (emergent) approaches and look at two of the most substantial contributions within semi-classical methods: Sakharov’s induced gravity and Jacobson’s gravitational thermodynamics, both of which serve in setting the mathematical and conceptual stage for Verlinde’s 2009 insight that gravity may not be a fundamental force but a macroscopic phenomenon emerging as the result of thermodynamic principles applied to phase-changes of information in dynamic mass-distributions (Chivukula, 2010).
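
For orientation, the thermodynamic reasoning behind that insight can be sketched in a few standard relations. The sketch below is a reader's aid in conventional notation (m a test mass, M a source mass, a acceleration, A and R the area and radius of a spherical holographic screen); it compresses the derivation in Verlinde (2010) rather than reproducing it verbatim.

    F \,\Delta x = T \,\Delta S                          % entropic force exerted across a holographic screen
    \Delta S = 2\pi k_B \,\frac{m c}{\hbar}\,\Delta x    % Bekenstein-style entropy change as m nears the screen
    k_B T = \frac{\hbar a}{2\pi c}                       % Unruh temperature; the three lines combine to give F = m a
    N = \frac{A c^{3}}{G \hbar}, \quad E = \tfrac{1}{2} N k_B T = M c^{2} \;\Rightarrow\; F = \frac{G M m}{R^{2}}

Taking the screen to be a sphere of area A = 4πR² enclosing M, the equipartition step in the last line returns Newton's law of gravitation; this is the sense in which gravity "emerges" from thermodynamics applied to information stored on screens.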

To advance this logic, a geomodal method will be employed in chapters five and six to describe a kinematic and dynamical event-sequence, serving in addition as an exegetical standard used to compare Verlinde's EG to Whitehead's AE's. We often think of the term "geometry" as referring to shapes and their properties. The geomodal method adds another layer by including spaces-within-shapes (or attributes of a natural symbol) as ontologically and conceptually significant. Here, a 'physical symbol' arises from Minkowski's 4d lightcone of space-time, a central tenet of both Einstein's special relativity and Whitehead's first metric. The purpose of the geomodal method is to situate an event-ontology at the seat of experiential/material dynamics in the form of a basic pictorial logic that is easy to comprehend.

In this study, Whitehead's 1929 categoreal framework of actual entities is hypothesized to provide a coherent foundation upon which a revised theory of gravitation can arise, compatible with Verlinde's 2010 emergent gravity model—not as a fundamental force, but as the result of an entropic force qua thermodynamics and string theory. If this can be established, this study aims to show how the EG effect could be interpreted as an integral sub-sequence within the description of Whitehead's AE process. From this we will propose that an event, rather than a substance, serves as the foundational unit in physics. This signifies a shift in both philosophical and physical paradigms from material to events and process—out of which material values emerge: a both/and (see Eastman, 2009).

At first blush there might seem little reason for trying to link two such seemingly disparate fields and concepts together. After all, the AE's were developed out of a discontent with substance metaphysics, in response to which Whitehead levies an attempt to describe science predicated on, and amenable to, experience. Verlinde's model, on the other hand, developed as an insight into the non-fundamental description of gravity predicated on the physics of gravitational thermodynamics, statistical mechanics, and the holographic principle in the context of black holes and string theory. In another sense, however, the potential to recognize Verlinde's gravitational theory in light of Whitehead's actual entities does not come as a complete surprise. In fact, there are a few substantive reasons why a theory of emergent gravity should find comport in Whitehead's categoreal program. We consider these now.

1.1 – In Methodology : speculative philosophy and speculative physics

Both Verlinde and Whitehead’s conceptual frameworks can be recognized as speculative ventures into speculative physics/cosmology and speculative philosophy, respectively. Bradley defines speculative philosophy succinctly as, “ a theory of the conditions of the actualization of the empirical world ” (2007). We use this definition for our study. For Whitehead, speculative philosophy qua metaphysics is “ the science which seeks to discover the general ideas which are indispensably relevant to the analysis of everything that happens ” (RM 84). As he states: “my arguments must be based upon considerations of the utmost generality untouched by the peculiar features of any particular natural science ” (PRel 14). During Whitehead’s time this was likely the safe bet, given the state of physics, though today we have uncovered new models that can in fact make contact with the natural sciences from the location of Whitehead’s AE’s, as will be identified and highlighted in this study.

On Verlinde's side, of all the physical theories about nature only two branches are independent of the concrete details of the system being considered: thermodynamics and relativity (see Liu, 2010). 1 This sets them onto the scale and order of universal theories and of what Whitehead would agree are among the most general features of nature. As Liu explains, "thermodynamics and relativity are two theories about the universal principles every physical system must obey and are hence referred to as principle theories" (Liu, 2010). As such we have, from the start, a stable basis for comparing the universality and generality of both Verlinde and Whitehead's programs. Verlinde further describes the universality of gravity in another (2010) passage:

Of all forces of nature, gravity is clearly the most universal: gravity influences and is influenced by everything that carries an energy, and is intimately connected with the structure of space-time. The universal nature of gravity is also demonstrated by the fact that its basic equations closely resemble the laws of thermodynamics and hydrodynamics.

Aptly, both Whitehead and Verlinde’s programmes 2 can be shown to derive on the basis of Aristotelian “ first principles” qua the most general concepts and universal phenomena.

1 Renormalization Group theories in QFT are known to yield multiple levels that could also function partially independent of micro-system details; this possibility will be addressed in chapter six.
2 The Lakatosian research programme (1978) provides a framework within which research can be conducted on the basis of 'first principles.' As such it also resembles Kuhn's notion of a paradigm (1962).

Both are methodologically approached from the level of a speculative venture and as general ideas. Speculative philosophy as conceived by Whitehead represents "the endeavor to frame a coherent, logical, necessary system of general ideas in terms of which every element of our experience can be interpreted" (see Sherburne, 1966). As he explains in (PR 6):

The first requisite is to proceed by the method of generalization so that certainly there is some application; and the test of some success is application beyond the immediate origin. In other words, some synoptic vision has been gained. In this description of philosophic method, the term ‘philosophic generalization’ has meant ‘the utilization of specific notions, applying to a restricted group of facts, for the divination of the generic notions which apply to all facts.’

Turning to the basis of Whitehead's approach as a 'speculative metaphysics' linked to Aristotle's 'first philosophy' by Ramal (2003), we can also recover the ingrained motivation for framing reality from the first principles of 'being qua being' (Aristotle). As Ramal explains: "the first philosopher looks for the first principles that render reality intelligible by means of descriptive generalizations" (2003). Whitehead is also known to have predicated his philosophical method on the pursuit of "imaginative rationalization" (PR 7), or what he also calls a "descriptive generalization" (PR 15). For Verlinde things are also suitably generalized; as he describes, "starting from first principles and general assumptions, using only space independent concepts like energy, entropy and temperature," his paper shows how Newton's law of gravitation "appears naturally and practically unavoidably" (2010). Ramal links the 'first principles' to being-as-such (Aristotle); he explains:

Since the “essential attributes” of being as such are the first principles, first philosophy differs from mathematics and the other sciences in that it seeks to study the most universal first principles, not simply the general principles or causes of a particular aspect of reality (Ramal, 2003).

Where Verlinde indicates a method predicated on a description of the general principles underwriting the emergence of gravity as a ‘universal feature’ intimately linked to the structure of space-time, and ‘influenced by everything that carries energy,’ so too the topics of space and time are also shown to have chief import in the description of phenomena underwriting reality as expressed in Whitehead’s Principle of Relativity (1922), and Process and Reality (1929), in constructing the categoreal scheme of the AE’s. As will be shown in chapters seven and eight, Whitehead’s model defines space and time as abstractions from AE’s.

While linked to a firm physical basis, Verlinde's model is still patently speculative and communicated almost exclusively at the level of first principles and general ideas. As he states: "I use a lot of ideas from string theory but…I feel one should try to extract the essence from it and start from certain principles…I think the principles will be more important" (2011). In another case he develops his motivation further: "I'm more interested in finding out how nature works" (Verlinde, 2011). What few equations are found in his 2010 paper, he admits in an interview, were just for making a point to the reader and weren't even essential to conveying the idea of the hypothesis (see 2011). This might sound like a reason not to pay much attention to the model from a physicist's perspective, but from our philosophical standpoint, Verlinde's model—along with concepts in similar ones (Berenstein, 2006; Li and Wang, 2012)—provides the kind of framework most useful for assessing compatibility with Whitehead's actual entities.

1.2 – In Conceptual and Phenomenal Placement : the AE's are housed within a principle of relativity whose most important predictions stand alongside those of Einstein's theory. It will be shown in the discussion how the conceptual differences between Whitehead's and Einstein's theories set the former closer in alignment with Verlinde.

In 1922 Whitehead wrote The Principle of Relativity with Applications to Physical Science (PRel) with the aim of reformulating Einstein’s theory of gravity in such a way that “gravity would no longer be identified with the allegedly variably curved space-time, but with a physical interaction (Whitehead’s gravitational impetus) that can be defined against the uniform background of Minkowski’s space-time ” (Desmet, 2010). As Whitehead explains:

The present work is an exposition of an alternative rendering of the theory of relativity. It takes its rise from that “awakening from dogmatic slumber”—to use Kant’s phrase—which we owe to Einstein and Minkowski. But it is not an attempt to expound either Einstein’s earlier or his later theory. The metrical formulae finally arrived at are those of the earlier theory, but the meanings ascribed to the algebraic symbols are entirely different. (PRel, v)

In fact, the two theories can be considered largely equivalent in many important respects. For example, Fowler is known to have constructed an interpretation of Whitehead's theory qualifying it as an alternate, mathematically equivalent presentation of GR (see Fowler, 1974). As Bain elaborates, Whitehead's theory "makes the same predictions as general relativity with respect to the perihelion advance, the deflection of light rays and the gravitational red-shift; indeed, Eddington (1924) has shown that it is equivalent to the Schwarzschild solution of Einstein's field equations for the one-body problem" (Bain, 1998).
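
For reference, the one-body geometry at issue in these classical tests is the Schwarzschild solution, whose standard exterior line element (quoted here as a reader's aid, in conventional notation) is

    ds^{2} = -\left(1 - \frac{2GM}{r c^{2}}\right) c^{2} dt^{2} + \left(1 - \frac{2GM}{r c^{2}}\right)^{-1} dr^{2} + r^{2}\left(d\theta^{2} + \sin^{2}\theta \, d\varphi^{2}\right)

Eddington's result cited above amounts to the statement that Whitehead's two-metric theory reproduces this same exterior geometry for a single static mass, which is why the perihelion advance, light deflection, and red-shift observations cannot by themselves decide between the two theories.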

For Whitehead the geometric structure of nature grows out of the relations among actual entities (Fowler, 1974). Unlike Einstein, Whitehead was after a theory predicated on experience, broadly interpreted. Desmet explains how in the preface to Whitehead’s Principles of Natural Knowledge (1919) he stresses that “ the modern theory of relativity, because of its union of space and time, has opened the possibility of a new answer to the question of how the space of physical geometry can be conceived as the logical outcome of generalizations from experience ” (quoted in: Desmet, 2007). As a result of this, Whitehead’s theory “ holds a different paradigm from Einstein's—elegant and simple in mathematical formulation and with its own philosophical background. It has been called a thorn in Einstein's side because it agrees with Einstein in its prediction for all the classical tests ” (Tanaka, 1987). This guides us towards the realization that the real issues between Einstein and Whitehead are not physical but philosophical (see Desmet, 2010). As Fowler expresses: “ No empirical test can decide the issue of the adequacy of Whitehead's basic theory of relativity. This issue must be settled on other grounds ” (Fowler, 1974). To these ends we seek a philosophical and contextual assessment.


For Whitehead, the principle of relativity was paramount to his speculative metaphysics: "The doctrine of relativity affects every branch of natural science, not excluding the biological sciences" (PRel 3). We see Verlinde's model also make indirect contact with the biological through his concurrent development of the polymer example throughout the 2010 paper. This provides a level of contact not witnessed in Einstein's more-mentalistic framework—a contrast that can be seen, as Fowler explains, in the fact that:

The key foundational principles of Einstein’s theory -- the constancy of the velocity of light and the equivalence principle -- are postulates which are the free creations of the mind and not open to immediate experience. (Fowler, 1974)

Einstein goes even farther, fully in keeping with a neo-Kantian perspective, saying that "time and space are modes in which we think, not conditions in which we live." By contrast, Whitehead uses his appeal to the "immediate experience of simultaneity and the contemporary world as the foundation of relativity" (Fowler, 1974). This is, for Whitehead, a predication of space and time as abstractions from events as relations, which as such are also shown to be emergent and creative. As he explains:

The whole investigation is based on the principle that the scientific concepts of space and time are the first outcome of the simplest generalizations from experience, and that they are not to be looked-for at the tail end of a welter of differential equations. (Whitehead; PNK vi)

Whitehead's theory of relativity is so closely connected with the processual nature of his speculative metaphysics that we cannot attempt to understand it without paying due attention to his philosophy. As Bain reiterates, "The ontological relationship between the two must be fleshed out in the context of Whitehead's philosophy of nature" (Bain, 1998). Thus we can be led to the view, as Fowler draws it, that Whitehead's theory of gravitation offers a framework based within a comprehensive philosophy of nature, whereas Einstein's model offers little that approaches the likes of experience (see Fowler, 1974).

As described at the beginning, in distinction from Einstein, the formula Whitehead adopts for the gravitational field involves both the flat metric of space-time and a dynamic metric dependent on the presence of source masses. In order to find a mathematical expression for the law of gravitation, Whitehead introduces the second metric, dJ², to represent the gravitational field of a particle and describe the way a particle "pervades" its future (PRel 74). This is specifically developed with the AE's in mind. As Whitehead explains, the "individual peculiarities of actual occasions" represent the properties of the physical contingent world (dJ²), while the "background of systematic geometry" represents the metric of uniform background space-time, dG² (PRel 58).
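
As a reader's aid, the uniform background can be written out explicitly: dG² is the line element of flat Minkowski space-time, in one common sign convention

    dG^{2} = c^{2} dt^{2} - dx^{2} - dy^{2} - dz^{2}

The contingent element dJ² is not written out here; roughly, it is assembled from contributions of the source masses evaluated along the retarded light cones of this same flat background, so that the gravitational field is defined against, without ever deforming, the uniform geometry dG².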

Whitehead describes the physical field in his Principle of Relativity as expressing "the unessential uniformities regulating the contingency of appearance" (PRel 8). Similarly, to complete the quote from above, in Process and Reality he describes the physical field as the "interweaving of the individual peculiarities of actual occasions on the background of systematic geometry" (PR 507). The geometry is systematized to the extent that all values are valued in a mode of being; that is, from derived hypersurfaces of a manifold. When we advance to Verlinde's approach, the move to introduce AE's into what Verlinde refers to as the inessential microscopic information has a proper and defensible justification: to the extent that Whitehead defines the physical field as expressing the "unessential uniformities regulating the contingency of appearance," Verlinde's model locates "inessential microscopic data" predicating the emergent gravity effect. As he explains: "The universality of gravity suggests that its emergence should be understood from general principles that are independent of the specific details of the underlying microscopic theory" (2010).

With each distinction raised in this chapter we aim to realize how Whitehead's model draws closer in semblance to Verlinde's approach than to Einstein's. Whitehead's two-tensor construction is one of the key distinctions; perhaps equally important, however, we also realize how Whitehead programmed the AE's directly into the microscopic details of his theory of gravity. The major claim (to be developed in chapter seven) is that maintaining the notion of gravity through a real, physical interpretation, as Whitehead does, ultimately brings us closer to Verlinde's development of gravity as arising through an entropic force than to Einstein's notion of gravity as solely the result of geodesics and geometry. 3

1.3 – Scale : Both AE's and EG describe dynamics at the smallest level of phenomena as well as in the largest, cosmological contexts. In physics we refer to this as UV/IR mixing.

Our motivation for linking Whitehead's AE's to Verlinde's EG draws from dynamics encountered on the order of string theory. String theory combines quantum mechanics and general relativity into one framework in a rather elegant way. That said, string theory is also subject to revision due both to a lack of experimental tests and to theoretical loose ends. As Verlinde explains: "String theory has many correct elements, but I think we need to rethink the starting point. We have all kinds of elements but we don't really know how they hang together. We have to find this new starting point" (Verlinde, 2011). Out of this Verlinde hopes to change the view of string theory from a given to an emergent process. I propose that Whitehead's cosmology of "actual entities" offers a potential platform for this venture and that an event-logic represents the desired starting basis.

If we are to take Whitehead seriously in acknowledging the categoreal scheme of actual entities as describing the most-fundamental values and dynamics in the "experience of subjects"—apart from and behind which "there is nothing, nothing, bare nothingness" (PR 167), and that "there is no going behind actual entities to find anything more real" (PR 23)—then this implies that the pursuit of fundamental dynamics must also be located at the smallest distance-scale of nature: the Planck scale. It would be here where we should expect to encounter, at least in part, the type of dynamics able to be correlated with what Whitehead had in mind for the AE's.

3 A program of completing the geometrization of general relativity was attempted in the 1960's by Wheeler and colleagues. This "geometrodynamics" represents the bid to describe space-time and associated phenomena wholly in terms of geometry. More recently, Isham and Butterfield (1999) also develop a quantum version to evaluate work toward a quantum theory of gravity.
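
For concreteness, the Planck scale invoked here is the unique length (and associated time) that can be built from ħ, G, and c; the standard values, noted as a reader's aid, are

    \ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\ \mathrm{m}, \qquad t_P = \frac{\ell_P}{c} \approx 5.4 \times 10^{-44}\ \mathrm{s}

It is at roughly this scale that the smooth-manifold picture of space-time is expected to give way to whatever microscopic dynamics underlie it.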

‘t Hooft surmises that when we get to the Planck scale we should encounter a new type of dynamics he refers to as pre-quantum (1999). As Verlinde explains, the microscopic theory is without space or the laws of Newton. The hypothesis tendered in this study suggests that the precursors of AE's—'continuous potentialities' (PR 102)—are what dwell, at least in part, at the Planck scale and pre-quantum level, like a sea of bosonic strings. In addition, within physics we encounter phenomena at the Planck distance-scale in string theory, UV/IR mixing, QFT/tachyons, and the holographic principle plus the AdS/CFT correspondence. 4 These prove rich in their descriptions of dynamics that can be read into the AE program with ease through an event-ontology. As Whitehead explains:

The actual entities—are the final real things of which the world is made. There is no going behind actual entities to find anything more real. […] The final facts are, all alike, actual entities, and these actual entities are drops of experience, complex and interdependent. (PR 18)

By assuming string theory as the actual physical basis for linking up with the AE's—given Whitehead's description of speculative metaphysics and Verlinde's development of string theory, both predicated on dynamics taking place at the smallest distance scale of spacetime—we should expect to find phenomena described by string theory to also bear some resemblance to Whitehead's description of AE's. In fact, this is precisely what occurs: a vivid overlap between the concepts underwriting both descriptions, out of which we will show string theory to provide a physical basis for the AE's. These are spelled out in precise detail and in an epical ordering in chapters five through seven that can be read into other sequential interpretations of the AE's such as Cobb (2008) and Ford (1974).

1.4 – In Ontology : AE’s and EG both describe emergent phenomena

Like entropic gravity, the AE’s are duly predicated in the light of emergent phenomena: creative satisfactions qua final concrescences that emerge from the combinatorial, (positive and negative) prehensive dynamics of a feeling tone amidst a collection of multiplicities of objective data stemming from a set of initial data. As Cobb explains:

For the most part the occasion and all its prehensions express the causal efficacy of past occasions. The prehensions are better understood as expressing their causal efficacy in the constitution of the new, emergent occasion, which only comes into being as these prehensions integrate in it. (Cobb, WRB, 2008; p.35)

On Verlinde's behalf, "Newton's law of gravitation is shown to arise naturally and unavoidably in a theory in which space is emergent through a holographic scenario" (2010). In fact, he considers gravity, space-time, and strings all as emergent phenomena. Seen in this light, Whitehead and Verlinde are both shown to seek the general principles underwriting emergent phenomena: the AE's in Whitehead's categoreal framework predicated on a two-tensor approach to space-time and relativity—and space-time/gravity, in Verlinde's approach. From a wide view, this represents a general nod on the conceptual level to the paradigm of process/event logic over that of the classical substance metaphysics.

4 We review these concepts in chapters two and three.

Given these four qualifications, it is no surprise that a tertiary framework predicated on an event-ontology and experientialism can be developed to demonstrate the same basic dynamics at play in both programmes. What is a surprise, however, is that while Whitehead's alternative theory might not have resolved "the true identity of gravity" (Emerson, 1838), his Principle of Relativity—predicated on the categoreal framework of actual occasions—might still have brought crucial light to the matter as a harbinger of what would eventually be outlined by Verlinde in concert with basic ideas pursued first by Jacobson, Bekenstein, and Sakharov and later by Padmanabhan, Liu, Lee, and others. Put simply, Whitehead's framework could provide the philosophical groundwork for emergent approaches in physics and cosmology.

This study aims to show that not only are Whitehead and Verlinde's frameworks co-relevant, they are also closely coinciding. Even if Verlinde's model requires further adjustments from within physics, and even if entropy were not the ultimate basis of emergent gravity but instead something like conformal matter in a world crystal (as in Danielewski, 2007; and Kleinert, 1987), the general concepts underwriting his approach should still be shown to hold even if some of the details ultimately develop differently. To substantiate this claim, this study elaborates the conceptual architecture driving Verlinde's emergent gravity in concert with the corresponding structural dynamics of Whitehead's philosophical and scientific logic comprising actual entities. This proceeds to the extent that both are shown to mutually integrate under the covering-logic of a generative process and event-cycle underwriting experience and the physical universe. In comparing the components of both frameworks across the epistemic modalities of pure philosophy and cosmology/relativity physics, this study utilizes a pictorial modeling convention as a tertiary, "neutral observation language"—like an augur between the two theories—wherein an event-logic is progressively enunciated in concert with the specific details of both models, leading to a cross-pollinized, mutually-informing language.

1.5 – Organization of Chapters

In order to set the stage for a comparative analysis of the details underwriting emergent gravity and actual entities, a chapter is initially spent introducing the physico-conceptual foundations in their historical and conceptual contexts. This serves as a logical narrative for emergent gravity, beginning with an introduction to the historical branching of quantum gravity approaches—predicated on the desire to unify quantum mechanics with relativity—and with regard to the decision whether to quantize gravity or not. This study focuses on non-quantization approaches. From here we select for detailed analysis two definitive works characterizing semi-classical methods within the non-quantization approach and setting the stage for Verlinde's ultimate paper: Sakharov's (1967) induced gravity and Jacobson's (1995) gravitational thermodynamics.

Predicated on the early triumphs of Sakharov and Jacobson’s non-quantization approaches to quantum gravity—and taken in tangent with foundational breakthroughs in black hole thermodynamics beginning with Bekenstein, Bardeen, Carter, and Hawking (1973)—a step-by-step conceptual outline of the ‘emergent gravity’ hypothesis is framed within Verlinde’s 2010 paper, in chapter three. With Verlinde’s proposal the notion of emergent gravity receives a solid conceptual foundation using minimal equations to grasp the idea in an accessible way; thus, it is presented as a general theory on the order of a speculative proposal. While it is generally understood that Verlinde’s model is not exacting in all mathematical details and further work remains, the generality of Verlinde’s approach provides an ideally-suited perspective for comparing essential features of that physics-based approach with Whitehead’s speculative model.
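
Two results from that black hole thermodynamics literature recur throughout the chapters that follow and are worth stating once in their standard form (a reader's aid, not specific to Verlinde's paper): the Bekenstein–Hawking entropy, which scales with horizon area rather than enclosed volume, and the Hawking temperature of a Schwarzschild black hole of mass M,

    S_{BH} = \frac{k_B\, c^{3} A}{4 G \hbar}, \qquad T_H = \frac{\hbar c^{3}}{8\pi G M k_B}

The area-scaling of S_{BH} is what later motivates the holographic principle and, through it, Verlinde's counting of information in bits on holographic screens.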

The "actual entities" chapter (four) fills in the horizon in which the AE's are located: an experience-based process paradigm and philosophy of organism within a speculative philosophy and process (event) ontology. The paradigmatic update from substance- to event metaphysics implicit in this move is also described (see Eastman, 2009). A basic framework for the elements discussed in the comparative chapters is given by way of an introduction to Whitehead's "categories of the ultimate" and "categories of existence," leading to the "corrected categories of existence" (see PR). The corrected version becomes important in the comparative chapters for grasping the sense in which Whitehead ultimately comes to terms with the non-fundamental (emergent) nature of 'multiplicities' in a way that proves to overlap nicely with Verlinde's recognition of string theory as emergent.

After introducing both AE’s and EG sufficiently to acquaint the reader with general ideas underwriting each program, the next two chapters divide Verlinde and Whitehead’s programs into three phases comprised of different elements shown to interrelate to each other in progressively correlated dynamics through a conceptual exegesis of events.

For Whitehead, chapter five covers from the basic setting to the holographic dual (or reenactment) of a ‘multiplicity’ of ‘initial’ data into ‘objective data.’ For Verlinde, chapter five draws from the nature of UV/IR mixing, D-Branes of open strings, and the open/closed string correspondence.

After introducing the geomodal model, chapter five aims to correlate four connections between Verlinde and Whitehead's models: (1) Whitehead's "eternal objects" in flux with Verlinde's 'microscopic information' qua 'pre-event strands' as discussed by Chew (2004). This initial environment will be shown to yield a clarification of the 'measurement problem' in physics, not as a direct collapse of the wave-function, but as a sampling process of the local vacuum through the hypersurface of a manifold.


This gives rise to (2) a “multiplicity” of “initial data” qua “open strings” on a “D-Brane” and leads to the formation of (3) a “primary datum” correlating with a “closed string,” or a “phonon.” In (4), recognition of the “snapshot” as holographic leads to a model of dual- projection. This is shown to resolve what this study calls the “data/datum dilemma” representing the historical tension in Whitehead to clarify whether ‘concrescence’ begins with a ‘primary datum’ or multiplicity of ‘data.’

The most salient points of chapter five shine light on the emergent nature of strings and on how Whitehead's correction of his "categories of existence" can be shown to reflect this same line of thinking. Here, Whitehead removes two of the original seven categories, 'multiplicities' and 'objects,' for reasons based on the recognition that they are not in fact fundamental categories but instead emergent values formed within the process itself.

In another instance, it is remarkable how the last-minute substitution into the PR drafts of the "primary datum" for an "initial multiplicity" (in terms of how concrescence begins) is the single most significant edit between the Gifford and ultimate PR drafts; furthermore, the primary datum is in itself the single most-cited example in the Gifford draft (Ford, 1984). This demonstrates that there was a certain struggle involved for Whitehead in ultimately and categorically replacing the most-used concept of the earlier drafts (the original datum) with an initial multiplicity—unable to reconcile how to maintain the process and arrive at objective data if the initial multiplicity leads to a primary datum (see Ford). We account for this through Whitehead's anticipation of aspects of the AdS/CFT correspondence, which was not developed until the late 1990s. In the capacity of Whitehead's model, the AdS/CFT correspondence proves essential for explaining how the initial multiplicity of data can be "reenacted" into "objective data" ready for "prehension."

Chapter six picks up with the "primary datum," or 'phonon,' ready to "prehend" the holographic dual of the snapshot-sample (of the wave-function) qua the "objective data" of the "initial data" in Whitehead. Compared to a "closed string," the phonon acts as a "coupling constant" during the phase of renormalization (see Verlinde's 2012 lecture). This is linked to Whitehead's description of "prehension" (+/-) and "concrescence" as correlated with the renormalization procedures of coarse graining and foliation in Verlinde's approach. Here, "negative prehension" is shown to correlate with Verlinde's 'exclusional' property of coarse-graining and use of the book-keeping device of Newton's potential, Φ; as such it can be given a role not originally realized (or specified) in Whitehead's works.

In order of cyclicity: prehension and the phases of concrescence qua renormalization à la coarse-graining and foliation → maximization of coarse-graining → Whitehead's satisfaction qua gravitational self-energy and Verlinde's emergent gravity effect qua the accumulative reaction-force of the negative prehensions. The 'maximum' of coarse-graining is like a polymer thermalized onto the horizon (see Bekenstein, 1973; Verlinde, 2010). The contact of the polymer with the horizon is comparable with the ultimate outcome of the satisfaction as the "final, real things of which the world is made up" (PR 23).
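
Since Newton's potential recurs here as the bookkeeping device, its elementary form is worth having in view (a reader's aid; the coarse-graining interpretation is Verlinde's, as summarized above):

    \Phi(r) = -\frac{G M}{r}

In Verlinde (2010) the equipotential surfaces of Φ serve as the holographic screens, and the dimensionless combination Φ/2c² keeps track of how much of the microscopic information has been coarse-grained away in passing from one screen to the next.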


Following the correlational exercise in the two comparative chapters, the discussion chapter (seven) looks at the major philosophical and conceptual distinctions between Whitehead and Einstein’s theories of relativity with the intent of framing a case for how Whitehead’s version can be recognized as more-aptly suited to understanding the conceptual background of Verlinde’s emergent gravity hypothesis than are standard interpretations of Einstein’s approach.

Broadly, Whitehead offers an accompanying philosophical framework for his theory, whereas Einstein renders more of a purely mental/visual construct than one that actually ties into our lived experience of the world and phenomenology. On a mathematical level, Whitehead's model is also shown to resemble Verlinde's in a few specific places; for instance, in the case of two versus one tensors: Verlinde, Whitehead, and Minkowski all selected two tensors, but Einstein combined them into just one. Other instances—space and time, non-locality, uniformity, light and measurement, and simultaneity, including what Whitehead refers to as "presentational immediacy"—are all developed to provide additional examples of conceptual differences between the two.

The final, conclusion chapter (eight) begins with a cumulative review of the narrative and details acquired throughout each chapter, and builds up to a portrait of two theories interlaced with the same fundamental dynamics from two different fields of description. We should be able to recognize Whitehead's actual entities—an early exemplar of a microscopic theory complete with a philosophical worldview predicated on experience, broadly interpreted—as smoothly accounting for gravitation as an emergent, large-scale process arising out of the microscopic dynamics. From this, Verlinde's account gains a general philosophy and worldview predicated on experience to explain the origins of gravity; meanwhile, the AE program gains an overall unity of purpose and a selective clarification of logic, plus a posthumous completion of the saga for Whitehead via a unification of diverse topics through the integration and blending of a revised gravitational theory into a new understanding of actual entities.

If Verlinde is to overturn the logic of the last three-hundred years in supposing that gravity is not a fundamental force then he will need some philosophical leverage. By the end of this study we should be able to show how Verlinde’s descriptive account of gravity as an emergent phenomenon can effectively represent an integral sub-sequence of the generative process of actual entities. Actual entities are recognized as the corresponding philosophy underwriting Verlinde’s emergent gravity. Equipped with this process philosophy of organism and event ontology, Verlinde will then have an adequate conceptual architectonic and worldview in which to naturally house his emergent gravity proposal. Both Whitehead and Verlinde stand to gain from this synthesis of frameworks.


Chapter 2 – Physico-Conceptual Foundations of Emergent Gravity

The purpose of this chapter is to provide a thorough narrative tracing a set of principles, phenomena, mathematical theories, and observational discoveries in modern physics leading up to string theory and emergent gravity, with roots dating back to the early 20th century. Within this we'll also encounter the guiding notion that space-time is emergent at the smallest scale. We'll weave this narrative into the context of general relativity; quantum theory; the vacuum; and the Planck scale; plus the integral motion to fuse these two theories together to obtain a quantum theory of gravity. In addition, we'll explore the UV/IR mixing connection that predicates quantum gravity theories in the form of the black-hole horizon environment at the Planck scale. This will lead us to geometrical entropy, statistical mechanics, and Hawking radiation; the black hole information paradox; Wheeler's it-from-bit; and right on up to the holographic principle. From here we transition into string theory via a brief historical development before encountering: emergent bootstrapping and S-Matrix approaches; open strings; closed strings; N = 4 SYM; phonons; D-branes; and solitons. This leads to a discussion of the open/closed string correspondence in light of gauge/gravity duality and the AdS/CFT correspondence. Finally, we can transition directly into Verlinde's approach on the basis that it serves as the "most-radical consequence of the AdS/CFT" (Dijkgraaf, 2012).

By the end of this chapter we aim to have all but defined the concepts and progression of ideas in physics and cosmology necessary for a lead-in to the approaches and phenomena pertinent to Verlinde's emergent model of gravity. The other major topics to be addressed are Sakharov's induced gravity (1967), Jacobson's gravitational thermodynamics (1995), and the holographic renormalization procedure, all to be covered in the next chapter.

There is, however, one necessary disclaimer we must append before continuing. The existence of black holes remains a theoretical construct rather than a matter of direct observational evidence. This study adopts a prudent stance with regard to the ontological validity of black holes, leveraging instead the 1/0 logic underwriting them in the capacity of pure dynamics. As such we'll consider the mathematical possibility of these phenomena as a sign of their archetypal significance for matters to be contextualized under a different model.

We begin this larger background narrative by reconsidering the notion that space-time at the microscopic scale is a fundamental tensor, following instead routes of suggestion that motivate it as an emergent phenomenon; thus, not only will we suppose that gravity is emergent, we'll also consider that space-time, too, emerges at the smallest scale.

2.1 – SPACETIME AS EMERGENT

In order to construct a maximally adequate comprehension of fundamental physics at the highest and lowest scales, we need to subscribe our knowledge of the macroscopic order (general relativity) to a development of the microscopic dynamics underwriting thermodynamical and quantum-mechanical properties. These prove most effective in near-horizon black hole environs in the context of spacetime and string theory, as well as in the large-N gauge sector of quantum field theories. What remains to be established is a structural/contextual basis (or framework) for quantum gravity and all related phenomena, though the above examples provide a solid center for the venture.

In a philosophical sense, Aristotle’s ontology deals with macroscopic phenomena and consequently a substance metaphysics, whereas Whitehead’s process ontology and characteristic development of the AE’s appears precisely-suited to motivate a microscopic description of space-time and value dynamics; therefore, we should look to find comport in the endeavor to frame a foundational comprehension of quantum gravity within the general, process-relational philosophy of Whitehead, and in particular, his rendering of AE’s as a primitive process underwriting fundamental physical and experiential modes operating at the seat of—our connection to—nature and cosmology (1922).

The conceptual phylogenesis of modern physics leads to perhaps no more astounding a class of hypotheses than those underwriting the microscopic nature of spacetime, and by extension, quantum gravity. This mixing of the macroscopic with the microscopic scales is notably encountered in the context of UV/IR mixing, black holes, non-commutative geometry, and N = 4 super Yang-Mills theory, as well as in the holographic principle, Hawking radiation, string theory, and the AdS/CFT (gauge-gravity) correspondence.

Indeed, the quest to understand the microscopic structure of space-time “ represents the driving force in attempting to merge quantum theory with gravitation ” (Chivukula, 2010). Philosophical expectations differ, however, with regard to the understanding of space and time in the first place. Some hold to space and time as fundamental (see, e.g., Polchinski, 1998; Rovelli, 2004) whereas others take them to be emergent abstractions (see, e.g., Dijkgraaf, 2012; Verlinde, 2010; Berenstein, 2006; Seiberg, 2006). These differences can be ultimately attributed to artifacts between substance and process metaphysics—as discussed in the final chapter of this study. In addition, they can be taken as representative of two methodological approaches to space-time and gravity: (i) the quantization approach; and (ii) non-quantizational, induced and emergent approaches.

The guiding supposition and central motif of this study considers that space-time—as well as gravity and perhaps even QM (all matter)—will prove to be emergent phenomena at the smallest scales. Instead of fundamental, synthetic a priori values, they are derived "from a more-intrinsic dimensional occasion and dynamical framework," as Clara Moskowitz frames the matter in a recent piece in Scientific American:

We often picture space and time as fundamental backdrops to the universe. But what if they are not fundamental, and built instead of smaller ingredients that exist on a deeper layer of reality that we cannot sense? If that were the case, space-time's properties would "emerge" from the underlying physics of its constituents, just as water's properties emerge from the particles that comprise it. (2014) 5

The notion that the fundamental structure of spacetime might be something other than a continuum has been around for many decades (Palma and Patil, 2009), and many scholars are actively pursuing the construct (for instance, see: Seiberg, 2006; Yang, 2009; Liberati, 2006; Hu, 2009; Markstrom, 2010; Dreyer, et al 2006, 2009; El-Showk and Papadodimas, 2012). One indication in quantum theory recognizes the smooth Riemannian manifold as an inaccurate depiction of XT at the smallest scale, where according to quantum mechanics it resembles more of a bubbling chamber of virtual particles and vacuum quantum fluctuations (see Gross, 2014; Dijkgraaf, 2012). This means that space-time at the smallest scales isn't smooth anymore but resembles more an ocean of activity. 6 Dijkgraaf explains:

XT gets replaced at small distances by something more involved via large-N gauge theory as described in the AdS/CFT correspondence, whereby all physics is equivalent to a theory only living on the boundary of the black hole. (Dijkgraaf, 2012)

This suggests that XT doesn’t represent the fundamental basis for our arguments but should instead emerge in the process. “Many philosophers of science and mathematical physicists alike are turning to this paradigm as the next big movement after relativity theory” (Wüthrich, 2006). Whitehead’s AE framework should prove prescient to this approach, as we will explore in the upcoming chapters. First we begin by looking at two of the main pillars of science: Einstein’s general relativity and quantum mechanics—before building up a narrative constellating other significant advancements leading to Verlinde’s model.

2.2 – GENERAL RELATIVITY

Originally space was considered by the Greeks to be a rigid, absolute container, like a big stage where natural phenomena play out their existence. Time, on the other hand, was for Newton a great clock that ticks along and sets the stage directions (Dijkgraaf, 2012). Then Einstein came along and said that instead there is a spacetime continuum unifying the two, and that space as a stage isn’t rigid but flexible, able to curve and deform on the basis of energy and mass, or gravitation. Specifically, Einstein’s general relativity is a geometrical theory of gravity and space-time based on the curvature of space as a collection of physical events determined by the distribution of matter and energy present (Verlinde, 2010; Mäkelä, 2010).

5 “Water is made of discrete, individual molecules, which interact with each other according to the laws of quantum mechanics, but liquid water appears continuous and flowing and transparent and refracting. These are all ‘emergent’ properties that cannot be found in the individual molecules, even though they ultimately derive from the properties of those molecules” (Jacobson, 1995). 6 In another example, the noncommutative approach to quantum gravity holds that at the microscopic level it is improper to think about points in space below the Planck scale; instead spacetime itself dissolves into a fuzzy, pixelated region wherein values can be indexed as Planck areas (see e.g., Ambjørn, 2002). Non-commutation roughly means that “although the average values of the fields vanish in a quantum vacuum, their variances do not” (Evans and Kielich, 1994).

The curvature interaction between matter and space-time is defined by the system of partial differential equations underwriting Einstein’s field equations. These describe the relationship between the geometry of a four-dimensional, pseudo-Riemannian manifold representing smooth space-time and the energy–momentum present in that same region (Wald, 1984; Weinberg, 1972). Put simply, Einstein said we can associate gravity with space-time by the way it warps.
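For reference, the field equations referred to here take their standard form,

\[
G_{\mu\nu} \;\equiv\; R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} \;=\; \frac{8\pi G}{c^{4}}\,T_{\mu\nu},
\]

where the Einstein tensor G_{\mu\nu} encodes the curvature of the pseudo-Riemannian metric g_{\mu\nu} and T_{\mu\nu} is the energy–momentum tensor of the matter and energy present; a term \Lambda g_{\mu\nu} can be added on the left-hand side to include the cosmological constant discussed in section 2.4 below.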

Gravity in terms of this geometry of space-time is based on the local equivalence between gravitation and inertia, or the local cancellation of the gravitational field by local inertial frames: the equivalence principle. 7 As Yang explains, “ The equivalence principle guarantees that it is ‘always’ possible at any spacetime point of interest to find a coordinate system such that effects of gravity will disappear over a differential region in the neighborhood of that point ” (2009). Paraphrasing Wheeler, the Einstein equations say that matter tells space-time how to curve and space-time tells matter how to move (1990); therefore, the space-time metric is not a fixed stage but part of the equations and as such is dynamical and can be affected by the matter content of the universe. “In this sense, gravitation may be considered as a manifestation of the curvature of spacetime ” (Peltola, 2007).

The paths of objects moving in space are determined by the geometry of spacetime, and objects in free fall move along geodesics: the straightest-possible routes between spacetime points. Space and time, therefore, have physical properties according to Einstein, and this curvature describes the motion of particles under the influence of gravity. As such, gravity corresponds to changes in the properties of space and time, which in turn change the geodesic routes that objects naturally follow.
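The geodesic motion described here can be written compactly. In the standard formulation a freely falling particle obeys

\[
\frac{d^{2}x^{\mu}}{d\tau^{2}} \;+\; \Gamma^{\mu}_{\;\alpha\beta}\,\frac{dx^{\alpha}}{d\tau}\,\frac{dx^{\beta}}{d\tau} \;=\; 0,
\]

where \tau is the particle’s proper time and the Christoffel symbols \Gamma^{\mu}_{\;\alpha\beta} are built from the metric, so that the straightest-possible paths are fixed entirely by the geometry.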

The relevant physical content of Einstein’s theory isn’t the metric itself, however, but the diffeomorphisms of the metric; namely, shifts around points and the mapping of points from one manifold to another (see e.g., Mason and Newman, 1989; Chamseddine, 2001). This is important for our present study because by definition a diffeomorphism is understood to map a sequence; the order of events remains the same. This is taken as the principle of background independence, which states that only events and their relations are physical (Markopoulou-Kalamara, 2010). As we will see in chapter four, this principle also motivates the Whiteheadian process framework whereby only AE’s and their relations are physical, and XT is taken as an abstraction (see PRel, 1922).

For Einstein this means that all of physics is geometry “and thereby space and time became no longer just the stage, but an active player in the game [...] In some sense, however, quantum theory will probably be victorious over the underlying ideas of geometry that were so dear to Einstein” (Dijkgraaf, 2012). Notable others have also pushed back against the notion that matter distributions impact the geometry of spacetime, and have done so since the beginning. Not only Minkowski, Eddington, and Silberstein, but also Whitehead relates similar sentiments: “It is inherent in my theory to maintain the old division between physics and geometry. Physics is the science of the contingent relations of nature and geometry expresses its uniform relatedness” (PRel 10). We will come back to explore Einstein’s framework of general relativity in relation to Whitehead’s version in the discussion chapter (seven) of this study.

7 Einstein once recalled that the equivalence principle was the happiest thought of his life.

2.3 – QUANTUM THEORY

In order to understand the very beginning of the universe, we have to understand the laws of the small elementary particles of quantum theory to describe the structures that we find there. (Dijkgraaf, 2012)

Einstein struggled with comprehending the rudiments of quantum theory; however, to find any kind of solution to the grand picture of the universe we have to study quantum mechanics, and in some sense string theory. Einstein wasn’t alone in this however, and “ it was not obvious for a long time that mathematics is actually the appropriate approach to understanding the structure of the universe and how it’s working ” (Dijkgraaf, 2012).

Quantum models can be subdivided into two main categories: those based on quantum mechanics and those based on quantum field theory. In quantum mechanics the classical measurable quantities of position and momentum are replaced by abstract operators acting on the state vectors of the system, which live in an abstract state space called a Hilbert space. An operator is an object that transforms one state vector into another. The state vector, in turn, contains all available information about the system (see e.g., Srednicki, et al, 1984; Peltola, 2007). As Pessa explains in an article found in Vitiello, Pribram, and Globus’ (2004) book:

Quantum mechanics deals with systems constituted by a finite and fixed number of particles contained within a finite and fixed volume. The physical quantities characterizing them, however, cannot be all measured simultaneously with arbitrary precision. A first consequence of such an uncertainty is that a complete characterization of a particle dynamical state with unlimited precision is impossible. One is then forced to introduce the concept of representation of the state of the system being considered [...] consisting in selecting a subset of the dynamical variables describing the state of the system such that all variables belonging to the subset can be measured simultaneously with arbitrary precision. In a sense, every representation can offer only a partial description of system’s dynamics. However, an important theorem proven by Von Neumann (1955) asserts that in QM all possible representations are reciprocally equivalent, meaning that they give rise to the same values of probabilities of occurrence of results of all possible measures relative to the physical system under consideration, independently from the particular representation chosen. (2004)

Quantum mechanics can also be applied to fields (see Jaffe and Witten, 2000, for review); as MacKinnon prescribes, “ the ongoing process of explaining composites in terms of progressively more elementary units should not terminate in particles, but in some more fundamental entities postulated by quantum field theory ” (2007). As ‘t Hooft explains:

According to the laws of quantum mechanics, the energy in a field consists of energy packets, and these energy packets are in fact the particles associated to the field. Quantum mechanics gives extremely precise prescriptions on how these particles interact, once the field equations are known and given in the form of a Lagrangian. The theory is then called quantum field theory, and it explains not only how forces are transmitted by the exchange of particles, but it also states that multiple exchanges should occur. (‘t Hooft, 2008)

This sets the stage for a dynamical model. Quantum field theory was first proposed in 1927 by Paul Dirac as a general framework for the description of the physics of relativistic quantum systems and elementary particles. It’s “the synthesis of quantum mechanics with special relativity, supplemented by the principle of locality in space and time, and by the spectral condition in energy and momentum” (Halvorson, 2006). Quantum field theory stands, in both its theoretical methods and experimental verifications, as the correct approach for describing particle interactions at high energies up to the grand unified scale where unification of the known forces is predicted. 8 As Baumgartl explains:

Based on this [quantum field] theory the Standard Model of particle physics has been developed, which has been successful in unifying the known forces and particles in a consistent mathematical framework. It provides a scheme where all observed particles can be gathered, classified according to mass, charge, spin, etc. (2007)

In the 1960s it was commonly argued that “elementary particle physics is like a black box, something you cannot open; something comes in, something comes out, and you can study the correlation between the two.” As Dijkgraaf continues, “not only could this box be opened, it turned out inside it was in fact quite a small formula” (2012).

This describes the Standard Model of particle physics as the central achievement of QM and QFT under the guise of quantum chromodynamics and gauge theory. These equations describe natural geometrical objects: “ a handful of particles and how they interact—in essence, all physics, all the matter, all forces, and all radiation ” (Dijkgraaf, 2012).

Most theories in standard particle physics, including the Standard Model, are formulated in terms of relativistic quantum field theories—such as QED and QCD—incorporating Einstein’s special theory of relativity with quantum theory (see e.g., Bain 2011). These give rise to gauge theories. A brief summary overview is in order. As ‘t Hooft summarizes:

In the Yang-Mills theory of QCD we are told the quantum field theories that have proven most important in describing elementary particle physics are gauge theories, and that the classical example of a gauge theory is the theory of electromagnetism. (2008)

A gauge theory is a quantum field theory in which the Lagrangian—the function of the system’s fields whose integral over time gives the action—is invariant under certain transformations, meaning that the state will not change up to multiplication by a phase (‘t Hooft, 2008). These transformations, called local gauge transformations, form a Lie group which is referred to as the symmetry group or the gauge group (scale group) of the theory. 9

8 “Within QFT—as opposed to QM—there is the possibility of having nonequivalent representations of the same physical system” (Haag, 1961; Hepp, 1972); one consequence of this is that only QFT, which allows for different phases of the system itself, can deal with phase transitions, that is, with global structural changes of the system. Such a circumstance entails that the framework of QFT is actually the only one possible if we attempt to model intrinsic emergence (see Itzykson & Zuber, 1986; Umezawa, 1993).

9 Topologically, gauge theory studies connections on principal bundles; these connections are the gauge fields, and they correspond to physical fields, e.g. the electromagnetic field.

In many early theories, the multiple exchanges of particles in quantum fields gave rise to difficulties: their effects seemed unbounded, or infinite. In a gauge theory, however, “the small distance structure is very precisely prescribed by the requirement of gauge-invariance and one can combine the infinite effects of the multiple exchanges with redefinitions of masses and charges of the particles involved. This procedure is called renormalization” (‘t Hooft, 2008), and we will return to it in the next chapter and in chapter six.

Gauge theories come in two essential varieties, abelian and non-abelian. ‘Abelian’ is the mathematical term corresponding to ‘commutation’ in physics. When the transformations always commute the theory is called abelian, as in electromagnetism: transforming X to Y is equivalent to transforming Y to X; you can play them in reverse, in other words. Matrix-valued transformations, however, need not commute; when they do not, the gauge theory is non-abelian and its symmetry group is non-commutative. In these theories X to Y does not equal Y to X, and as such irreversible processes are described. A related property goes under the heading of ‘chiral symmetry breaking,’ where ‘chiral’ refers to the handedness of a gauge group and indicates that the left- and right-handed components are not symmetric copies.
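As a minimal numerical illustration of the distinction just drawn (an assumed example, not taken from the sources cited here), the generators of SU(2), the simplest non-abelian group, fail to commute, whereas the phase factors of abelian U(1) electromagnetism always do:

    import numpy as np

    # Pauli matrices: generators of SU(2), the simplest non-abelian gauge group
    sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
    sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)

    # The commutator [X, Y] = XY - YX measures the failure to commute
    commutator = sigma_x @ sigma_y - sigma_y @ sigma_x
    print(np.allclose(commutator, 0))   # False: order matters (non-abelian)

    # Abelian case: two U(1) phase factors always commute
    phase_a, phase_b = np.exp(1j * 0.3), np.exp(1j * 1.1)
    print(np.isclose(phase_a * phase_b, phase_b * phase_a))   # True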

The non-abelian version of four-dimensional quantum gauge theory is Yang-Mills theory, which accounts for the electromagnetic and weak forces, including their unification in the electroweak force (see Jaffe and Witten, 2000). In order for Yang-Mills theory to also describe the strong force, the theory of quantum chromodynamics is required.

Quantum chromodynamics is the Yang-Mills theory of the strong force and of color-charged fermions, the quarks; by extension, it is a non-abelian gauge theory consisting of a color field mediated by a set of force-carrying (exchanging) particles, the gluons. However, there are also peculiarities about QCD that make it much more nuanced than classical, non-abelian gauge theories (see Jaffe and Witten, 2000). In order for QCD to describe the strong force it must demonstrate the following three properties: mass gap, quark confinement, and chiral symmetry breaking.

The mass gap “is necessary to explain why the nuclear force is strong but short-ranged;” confinement “is needed to explain why we never see bare quarks;” and chiral symmetry breaking “is needed to account for the current algebra theory of soft pions developed in the 1960’s” (Jaffe and Witten, 2000). In addition, confinement stands in relation to de-confinement, witnessed in a fourth property, asymptotic freedom. There is no precise phase-transition line separating these two properties; confinement dominates at low-energy scales, but as the energy increases, asymptotic freedom prevails. Roughly, this means that quarks behave as almost free, interacting only weakly, when packed very close to each other. “Due to asymptotic freedom, QCD can approach to the conformal limit in UV regions” and transform into a conformal field theory for which the holographic principle is then applicable (Xie, et al., 2007). ‘t Hooft sums it up nicely in the following passage:

Suppose now that we take the SU(2) x U(1) Yang-Mills system, together with the Higgs field, to describe electromagnetism and the weak force, and add to this the SU(3) Yang-Mills theory for the strong force, and we include all known elementary matter fields, being the quarks and the leptons […] Then we obtain what is called the Standard Model. It is one great gauge theory that literally represents all our present understanding of the subatomic particles and their interactions. (‘t Hooft, 2008)

We have long been aware of the fact that, in spite of its successes, the Standard Model cannot be exactly right. As ‘t Hooft explains:

The Standard Model is not perfect from a mathematical point of view. At extremely high energies (energies much higher than what can be attained today in the particle accelerators), the theory becomes unnatural. In practice, this means that we do not believe anymore that everything will happen exactly as prescribed in the theory; new phenomena are to be expected. (2008)

Even with the recent detection of the Higgs boson, there are still mysteries to be explained within the construct of the Standard Model. For one, is the Higgs a fundamental boson or is it, like the graviton, also an emergent value? Crucially, we still cannot account for a full theory of gravitation. As Kuhlmann explains:

Most quantum field theories are not asymptotically free, which means that they cannot be extended to arbitrarily small distance scales. We could easily cure the Standard Model, but this would not improve our understanding because we know that at those extremely tiny distance scales where the problems become relevant, a force appears that we cannot yet describe unambiguously: the gravitational force. It would have to be understood first. The gravitational force acting between two subatomic particles is tremendously weak. As long as we disregard that, the theory [of quantum fields] is perfect. (2014)

Thus, “quantum mechanics can’t be the whole story,” as ‘t Hooft asserts, and “there must be something underneath quantum mechanics…some more basic system” (2002, 2011). Two key platforms we encounter in formulating an answer are quantum gravity and string theory. To approach these topics, Renate Loll (2010) asks the key question: “what comprises the empty space between the fundamental particles?” By the end of this study we will be poised to suggest how dark matter could provide an explanation grounded in a systematic process.


2.4 – DARK MATTER/ENERGY

Physics is at a loss and they try to figure out whether this [model] fits in a grand pattern. If you start to rearrange the pieces of the puzzle, for instance, you see that you can rearrange them in more symmetric patterns which seem to suggest that this is just part of a bigger story, there are bigger symmetries here that we cannot see in nature but that perhaps are behind physical phenomena. (Dijkgraaf, 2012)

Nature has given us a few clues suggesting that our present model can’t be the whole story, with perhaps the most famous one coming from cosmology regarding the existence of dark matter (Dijkgraaf, 2012). A related clue stems from the fact that solutions to QFTs are known to result in a huge amount of energy in the quantum vacuum (Gross, 2011). In addition:

If you look at the way in which gravity is acting on the stars in a galaxy, then astronomers have discovered that, in order to count it in terms of matter, there is a huge cloud of matter which is dark, invisible, and not made out of the particles that we know, surrounding each galaxy, and by indirect measurements, you can actually determine the structure of this dark matter distribution – roughly six times more of that dark matter than there is original matter. Cosmologists look at the structure of the universe and see that these galaxies are not uniformly distributed in the universe. They are clumped together in a large-scale structure, these kinds of strands that fly basically through space, and by studying the dynamics of matter and dark matter, actually get a very clear model that seemed to fit very well the observed structure of the universe. So we know there are lots and lots of more matter around that we cannot encode at this moment in our physical models. (Dijkgraaf, 2012)

This, combined with cosmological observations of anisotropic (unequal) inter-galactic mass distributions, led to the recognition that this vast amount of energy and matter, roughly ninety-six percent of our universe, is unaccounted for by present models, which indicates that we are working in the wrong paradigm. One proposal carried by this dissertation is that the microscopic degrees of freedom in the universe qua AE’s represent some sense of dark energy/matter.

One modification we must make, for instance, involves Einstein’s cosmological constant describing a static background/vacuum-energy density: modern physics tells us this simply isn’t the case, and XT at the microscopic scale is teeming with fluctuations and virtual particles that arise and disappear all the time in the quantum vacuum—which, as a result, is better characterized as a plenum. 10 This leads to an accelerated expansion of the universe and is very different from a static background. As Dijkgraaf explains:

In fact, the universe is not only expanding, it’s expanding in an accelerating way, growing increasingly fast, and with a force pushing the universe apart: namely, dark energy. This (dark energy) represents the extra parameter and physical phenomenon [qua cosmological constant] “ to hold back the universe. ” (2012)

10 “In 1922, scientists discovered that application of Einstein's field equations to cosmology resulted in an expansion of the universe. Einstein, believing in a static universe (and therefore thinking his equations were in error), added a cosmological constant to the field equations, which allowed for static solutions. In 1929 Edwin Hubble discovered a redshift from distant stars, implying they were moving with respect to the Earth. The universe, it seemed, was expanding. Einstein removed the cosmological constant from his equations, calling it the biggest blunder of his career” (Dijkgraaf, 2012).

As he notes, the cosmological constant is still there but “ working just the opposite way than Einstein figured. It is not slowing down the expansion, it is actually adding to the expansion ” (Dijkgraaf, 2012). Verlinde regards dark energy as “ the energy that fuels the system responsible for inertia ” (2011). We will develop this more in chapters five and six.

2.5 – VACUUM and VIRTUAL PARTICLES

In order to access this domain of dark matter we must travel to the vacuum scale, where the quantum vacuum permeates the whole universe and, in this respect, can be identified with the space-time of general and special relativity. This leads us to consider the vacuum as fundamental, with quanta and waves of energy taken as excitations of this more fundamental vacuum.

Empty space, according to quantum theory is this boiling pot of particles and anti- particles appearing and disappearing. It has really a physical material that you can study, and if you take a chunk of this quantum space-time, because of all this phenomena, there is energy in it, and this energy, according to quantum theory, that is this dark energy – that is the phenomenon that cosmologists measure. (Dijkgraaf, 2012)

In the vacuum a particle can split into two particles for a very brief time before they combine again to form another particle. These intermediate particles are termed virtual particles and can only be seen indirectly. In a slightly different context, the effect Casimir predicted between two conducting plates provides a measurable manifestation of these fluctuations. This effect is possible because:

Basically, there is a rule of quantum mechanics that anything is allowed as long as you do it fast enough before it can detect it. And in fact [virtual particles] are measured in particle accelerators, and there is the ultimate result that, for a brief moment of time two particles can appear out of nothing and then combine again. Or, if you wish to think like Wheeler, it is a single particle that goes up in time and down in time and keeps on going round and round and round. (Dijkgraaf 2012)

These quantum fluctuations will prove to be what allow Hawking to eventually form his model for thermal radiation from black holes and to posit that black holes aren’t purely black. As we find, studying space and time itself in the context of the vacuum is really studying quantum gravity: how space and time behave in the microscopic quantum realm. Perhaps no better staging ground presently exists to examine this phenomenon than in the near-horizon environment of black holes, as we’ll see in a few moments.

2.6 – QUANTUM GRAVITY

Quantum theory deals with the very small: atoms, subatomic particles and the forces between them. General relativity deals with the very large: stars, galaxies and gravity—the driving force of the cosmos as a whole. The dilemma is that on the microscopic scale, Einstein’s theory fails to comply with the quantum rules that govern the behavior of the elementary particles, while on the macroscopic scale black holes are threatening the very foundations of quantum mechanics. Something big has to give. This augurs a new scientific revolution. (Duff, 2013)

There is a long-standing methodological tradition in physics whereby combining two approaches yields an advancement. Classical mechanics was considered alongside Maxwell’s electromagnetism before being triangulated by Einstein’s special relativity; special relativity taken in the context of classical (Newtonian) gravitation was then described by Einstein’s general relativity. The next advance places quantum theory alongside general relativity, but this pairing has proven to present an obstacle to further progress along this road. The resolution of this combined procedure is expected to develop from within a theory of quantum gravity. As Culetu points out, “To have a complete theory of quantum gravity we must clarify whether the gravitational interaction is fundamental” (Culetu, 2010). In this study we opt to recognize gravity precisely not as a fundamental force but, following Verlinde, as an emergent one.

In a conceptual manner, the two branches of thermodynamics and relativity serve to denote that nature has two principle theories instead of one. This could have strong implications for a unified field theory. As Liu explains, “In many cases these two principle theories seem to be mysteriously connected to each other” (2010). In physics this is understood in the context of gravitation and quantum theory: the puzzle of two refers to the fact that there are two systems of physical laws in the world, comprising thermodynamics and relativity, as revealed by the discrepancy between the theory of gravity and quantum dynamics. Modern physics shows us that thermodynamics and relativity come together under the notion of black hole thermodynamics and Hawking radiation.

More specifically, quantum gravity is a programme (Lakatos, 1970) of research actively in search of a structural and conceptual foundation within the unification-endeavor of general relativity and quantum field theory. In this framework, general covariance relates to relativity and gauge symmetry to quantum theory (see e.g., Norton, 1993). That such an endeavor necessarily entails a metaphysical framework is a basic postulate of this study. Quantum gravity, emergent gravity, it all comes back to an exploration of the fundamental nature of spacetime, as Markopoulou-Kalamara explains: “Quantum effects of the gravitational field become important when we reach the fundamental limits of space and time measurements” (2009). The scale of quantum gravity is taken as the Planck length, 10^-35 meters. Many approaches to quantum gravity consider that space and time do not actually exist at this most fundamental level and that “together with quarks and leptons, perhaps they emerge from the deeper physics that does not rely on, or even permit, their existence [...] This makes QG a fertile ground for the metaphysician” (Wüthrich, 2006). 11

In general relativity the gravitational field is represented by the metric of space-time and gravity is identical to properties of a dynamical geometry; therefore, "general relativity is not just a theory of gravity […] it’s also a theory of spacetime itself” (Butterfield and Isham, 2001). As Hedrich explains, “the quantum dynamics of the gravitational field would correspond to a dynamical quantum space-time: a dynamical quantum geometry” (2008). This signifies that a quantization of the gravitational field would correspond to a quantization of the metric of space-time. As a result, certain XT ideas of classical relativity like topological spaces, continuum manifolds, and XT geometry may not prove applicable in quantum gravity.

11 For example, Krause (2013) supposes that “the universe arises out of nothing (creation ex nihilo) because of quantum gravity. Out of nothing, nothing arises — is valid if we assume quantum gravity.”

A theory of quantum gravity should instead lead to a description of a dynamical quantum spacetime (see for example Butterfield and Isham, 2001). Some have even recently suggested that spacetime can be modeled like a superfluid (see e.g., Jacobson, 1999; Jacobson and Volovik, 1998; Fedichev and Fischer, 2003; Wilczek, 2013; and Mottola, et al., 2014). In addition we would like to gain further insight into what happens in black holes. We consider the general feasibility of quantum gravity on the basis that:

…if you look at the various forces of nature, the three forces that are there in the Standard Model, and you compare them to gravity, you see that when the energy scale goes up and up and up, that gravity gets the same strength as the other forces. So there should be a moment, even before this very brief split of a second in which inflation starts, where space-time itself becomes a quantum phenomenon, so not only are the particles themselves allowed to do anything they want, space-time itself is allowed to do this, and therefore it stops. (Dijkgraaf, 2012)

As Chivukula explains, “The search for a quantum theory of gravity has been one of the most fundamental problems in physics for the past fifty years because such a theory is necessary to understand the Universe at its earliest moments ” (2010).

2.7 – QUANTIZATION v. NON-QUANTIZATION

Does the need to find a quantum theory of gravity imply that the gravitational field must be quantized? Physicists working in quantum gravity routinely assume an affirmative answer, often without being aware of the metaphysical commitments that tend to underlie this assumption. (Wüthrich, 2006)

To quantize or not to quantize… that is the question. The major disagreement consists in determining whether such a theory may be obtained by quantizing general relativity, or by considering it as a low-energy effective theory whose metric and connections form the collective, hydrodynamic variables of some unknown microscopic theory (see Jacobson and Volovik, 1998; Volovik, 2001). This study affirms the latter view.

Framing the matter: “There are many attempts to quantize gravity—string theory and loop quantum gravity are alternative approaches that can both claim to have gone a good leg forward,” explains Liberati, “but maybe you don’t need to quantize gravity; you need [instead] to quantize this fundamental object that makes space-time” (qtd. in Moskowitz, 2014). Nothing in the data demands otherwise; strictly speaking, as Grygiel notes, gravity doesn’t need to be quantized: “it isn’t demanded by hard, experimental data” (2007). Non-quantization models for theories attempting to reconcile general relativity with quantum theory lead to emergent approaches where gravity is taken as an emergent phenomenon like thermodynamics and hydrodynamics, instead of being treated as a fundamental force. Here, “the fundamental role of gravity is replaced by thermodynamical interpretations leading to similar or equivalent results without knowing the underlying microscopic details” (Banerjee, 2010). If gravity isn’t fundamental, it therefore doesn’t have to be quantized (Wüthrich, 2006).

As we will see in the next chapter, Sakharov’s induced gravity program (1967) provides an effective, semi-classical approach to gravity that doesn’t require a quantization either. Similarly, Jacobson’s gravitational thermodynamics conceives of gravity as emergent from the energy flux of unobservable degrees of freedom. Together these two accounts serve to bolster the feasibility that a final theory of gravity may not involve quantization. As Deutsch (1984) predicted:

In the important matter of formalism we still know of no other way of constructing quantum theories than ‘quantization,’ a set of semi-explicit ad hoc rules for making a silk purse (a quantum theory) out of a sow’s ear (the associated classical theory). [...] I believe that quantization will have to go before further progress is made at the foundations of physics. [...] To base the theory of quantum fields on that of classical fields is like basing chemistry on phlogiston or general relativity on Minkowski space-time: it can be done, up to a point, but it is a mistake; not only because the procedure is ill defined and the resulting theory of doubtful consistency, but because the world isn’t really like that. No classical fields exist in nature.

Indeed, major shifts in viewpoints must be expected in order to make it possible to understand a theory of quantum gravity as part of a greater framework which also incorporates all other known particles and forces (Baumgartl, 2007). One of these is the intriguing case of UV/IR mixing.

2.8 – UV/IR MIXING

Remarkably, black holes provide an environment where small-scale and large-scale values are both relevant. This is known as UV/IR mixing. Ultraviolet refers to the small scale and infrared to the large scale; UV and IR correspond with quantum theory and relativity, respectively. This means that the very large-scale structures in the universe described by general relativity are coming together and mixing with the very smallest structures and regions at the quantum level of space-time: the physics of very high energies affects that of extremely low ones.

In physics it is generally feasible to organize physical phenomena according to energy scale or distance scale. The theory of renormalization group procedures—as we’ll explore in the next chapter—is based on the paradigm in which the short-distance, ultraviolet physics does not directly affect qualitative features of the long-distance, infrared physics, and vice versa (see e.g., Minwalla, Raamsdonk and Seiberg, 1999). While this separation of scales holds in quantum field theory, in noncommutative field theory and quantum gravity (especially string theory) interrelations between UV and IR physics start to emerge (Minwalla, Raamsdonk, Seiberg, 1999; also, Matusis, Susskind, and Toumbas, 2000). As Verlinde explains, “gravity is a macroscopic force that dominates in IR but it also knows about microscopic states and has the information about the UV; therefore, there must be [an underlying] principle at work” (2011).


On the UV side, the quantum effects of gravity are understood to become significant at the scale of the Planck length (Peltola, 2007). The Planck scale is the vacuum scale and describes phenomena at the vacuum level. In his inaugural work on quantum theory, Max Planck (1899) realized that if the quantum hypothesis were legitimate then an ultimate consequence would be that there is a smallest size in physics, a smallest size to space and time. “Physics is telling us that space itself should have this property if you put it on a gigantic microscope: there is no longer space; there are just little bits and pixels” (Dijkgraaf, 2012). They are quantum bits roughly the size of the Planck length.
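A back-of-the-envelope calculation (an illustrative sketch, not drawn from the sources above) shows where the figure of roughly 10^-35 meters comes from: the Planck length is built solely from the constants ħ, G, and c.

    # Planck length l_P = sqrt(hbar * G / c^3)
    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    G = 6.67430e-11          # Newton's constant, m^3 kg^-1 s^-2
    c = 2.99792458e8         # speed of light, m/s

    l_planck = (hbar * G / c**3) ** 0.5
    print(f"Planck length ~ {l_planck:.2e} m")   # ~1.6e-35 m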

Sonego describes how “our present model of spacetime as a pseudo-Riemannian differentiable manifold can be considered accurate and authentic from cosmological scales down to particle physics scales” (Sonego et al., 2015); however, as also explained in an earlier paper, “quantum fluctuations emerging at the microscopic level are expected to shatter the classical structure of space and time at smallest scales” (1995). This leads us to recognize the need for a new physics description at the Planck scale.

Where do we see these [Planck scale and UV/IR mixing] phenomena? Is this relevant physics? I think the amazing thing is that there is a [cosmological] laboratory where you can test these ideas… and it is black holes. So black holes are something that were in science fiction books twenty years ago, but now are part of the standard description of our universe. (Dijkgraaf, 2012)

UV/IR mixing is a very special property, and only a select few models are able to describe such scenarios, most notable among these being a holographic description of black holes near the horizon in terms of Dn-branes in string theory, and four- and five-dimensional super-Yang-Mills (SYM) at large N (the 1/N expansion) in gauge theory. Together these mutual descriptions yield the gauge-string duality, a close associate of the gauge-gravity, open-closed string, AdS/CFT, and AdS/QCD correspondences, all of which will be discussed further.

2.9 – BLACK HOLES

“Black holes are where God divided by zero.” Steven Wright

Wheeler is credited with coining the term ‘black hole.’ Most generally it refers to any ‘body’ from which the escape velocity is greater than the speed of light. In fact, the existence of such regions was proposed for the first time by Michell and Laplace in the late eighteenth century on the basis of Newton’s theory of gravitation. Classical general relativity predicts the existence of black holes using the collapse of a star model. Black holes can be ultra-massive or microscopic. The first black hole solution to Einstein’s field equation was found by Schwarzschild in 1916. The Schwarzschild radius marks the event horizon of the black hole and demarcates the regional distinction where outside the radius the escape velocity is less than the speed of light and inside the escape velocity is greater than the speed of light. The singularity is separated from the outside world by this horizon. Any mass that lives within its Schwarzschild radius becomes a black hole and collapses toward a dimensionless point. 12
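To make the marble-sized figure cited in footnote 12 concrete, a small numerical sketch (illustrative only) evaluates the Schwarzschild radius r_s = 2GM/c^2 for the Earth’s mass:

    # Schwarzschild radius r_s = 2*G*M/c^2
    G = 6.67430e-11        # Newton's constant, m^3 kg^-1 s^-2
    c = 2.99792458e8       # speed of light, m/s
    M_earth = 5.972e24     # mass of the Earth, kg

    r_s = 2 * G * M_earth / c**2
    print(f"r_s for Earth ~ {r_s * 1000:.1f} mm")   # roughly 9 mm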

Black holes are characterized as having an extremely large mass in an extremely small volume, and are defined by three variables: mass, charge, and angular momentum. In addition, they follow four laws correlating with those of thermodynamics. As Bekenstein explains, “there is strong evidence that the laws of black hole mechanics are a subset of the laws of thermodynamics, and that the black hole area is proportional to its entropy” (Bekenstein, 1973; see also Bekenstein 1974). The four laws of black hole mechanics were originally derived from the classical Einstein equation and developed by Bardeen, Carter, and Hawking in a 1973 paper. As Jacobson explains, the discovery of quantum Hawking radiation “made it clear that the analogy is in fact a statement of identity” (1995).

The zeroth law maintains that the horizon has constant surface gravity for a stationary black hole. It is analogous to the zeroth law of thermodynamics, which states that the temperature is constant throughout a body in thermal equilibrium, and it suggests that the surface gravity is analogous to temperature. In this sense, thermal equilibrium for a normal system is analogous to constant surface gravity over the horizon of a stationary black hole (see Bekenstein, 2008; Horowitz and Teukolsky, 1998).

The first law states that a change in mass is given by a weighted sum of changes in horizon area, angular momentum, and electric charge. This parallels the statement that the energy of a system at temperature T changes when work is done on it; the first law of thermodynamics is the statement of energy conservation (see e.g., Horowitz, 2012).
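In the form given by Bardeen, Carter, and Hawking, and in geometrized units (G = c = 1), this reads

\[
dM \;=\; \frac{\kappa}{8\pi}\,dA \;+\; \Omega_{H}\,dJ \;+\; \Phi_{H}\,dQ,
\]

where \kappa is the surface gravity, A the horizon area, \Omega_{H} the angular velocity of the horizon, and \Phi_{H} its electric potential; this is an exact analogue of dE = T\,dS plus work terms in ordinary thermodynamics.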

The second law states that the horizon area is a non-decreasing function of time—correlating to the fact that in thermo-physics entropy doesn’t decrease either—assuming that the energy density of observed matter is always positive: the weak-energy condition (Bekenstein, 2008). This is the statement of Hawking's area theorem, paralleling the rule that the change in entropy of an isolated system is greater than or equal to zero for a spontaneous process, and suggesting a link between entropy and the area of a black hole horizon (2008). This leads to the scenario where “if you have two black holes that merge together, the area of their horizons, of the new black hole, is larger than the sum of the two original ones: a synergetic effect” (Dijkgraaf, 2012). Unless we allow that black holes have entropy we cannot maintain the second law of thermodynamics, for if black holes carried no entropy it would be possible to violate the second law by throwing entropy-carrying matter into them. Paraphrasing Hawking’s assessment, the increase of the entropy of the black hole more than compensates for the decrease of the entropy carried by the object swallowed (Hawking, 1988; inter alia).
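A simple check (a worked example under the idealizing assumption that no mass-energy is radiated away in the merger) illustrates the “synergetic effect” for two Schwarzschild holes, whose horizon area scales as the square of the mass, A = 16\pi G^{2}M^{2}/c^{4}:

\[
A(M_{1}+M_{2}) \;\propto\; (M_{1}+M_{2})^{2} \;=\; M_{1}^{2}+M_{2}^{2}+2M_{1}M_{2} \;\geq\; M_{1}^{2}+M_{2}^{2} \;\propto\; A(M_{1})+A(M_{2}).
\]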

Finally, the third law states that, just as thermodynamics reveals the impossibility of ever reaching absolute zero, it is likewise impossible to form a black hole with vanishing surface gravity, κ = 0 (Bardeen, Carter, and Hawking, 1973). Stating that κ cannot go to zero essentially means that the entropy of a system at absolute zero is a well-defined constant, given that any system at zero temperature would necessarily exist in its lowest-energy (vacuum) state (ibid).

12 If Earth had a radius of 8mm (like a marble), retaining all of its mass, it would become a black hole.

According to these laws, black holes have a real temperature and entropy, but to know this we have to count the microscopic states of the black hole, which requires a quantum theory of gravity (see Horowitz, 1998). Classical models of black holes solve Einstein’s field equations with the appropriate Schwarzschild metric and boundary conditions (see Nordstrom, 1918); however, “for an adequate description of the interior of black holes and the very early universe” we still need to include low-energy quantum-mechanical effects of near-horizon behavior (Schutz, 2003). Out of this region also arises the semi-classical Hawking radiation that serves to produce an identity between the classical laws governing black holes and the laws of thermodynamics. As Dijkgraaf explains, “an amazing thing [is that] if you do the computation, you find the temperature of this thermal radiation is in fact given by the surface gravity of the black hole” (2012). This means that in addition to entropy we also obtain energy; that is, “we have a temperature and it looks like there really is something like thermodynamics going on in black hole physics” (ibid).

Crucial to all these descriptions is the integral role of quantum statistical mechanics. Thermodynamics is important because it represents an approximate description of the behavior of large groups of particles, made possible by the fact that the particles obey statistical mechanics—the study of statistical effects in systems consisting of a large number of particles. Soloviev makes this plain:

The history of physics shows that all thermodynamical laws later have been derived from a more fundamental theory — statistical mechanics. It was shown that the concept of temperature could be explained as the average kinetic energy of a micro-particle, the concept of entropy — as logarithm of the number of states corresponding to the same macroscopic thermodynamical macro-state. (2005)
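The identification Soloviev describes is usually written as Boltzmann's relation,

\[
S \;=\; k_{B}\,\ln \Omega,
\]

where \Omega counts the microstates compatible with a given macrostate and k_{B} is Boltzmann's constant; temperature then emerges as the average kinetic energy per degree of freedom.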

Thermodynamics describes the properties of macroscopic systems by means of the basic macroscopic quantities of pressure and temperature. In the first half of the 19th century the laws of thermodynamics were known only as phenomenological rules confirmed by experiments (see e.g., Gyftopoulos, 2005); however, through the visionary works of Boltzmann and Gibbs, “ the thermodynamical properties of macroscopic systems became viewed as statistical averages over their microscopic degrees of freedom ” (Peltola, 2007; see also Chakrabarti and De, 2000). This procedure involves identifying a collection of microscopic states with one macroscopic state specifying the system’s energy, entropy, and temperature (plus local near-horizon quantum fluctuations). Based on this identification, we can predict how some quantities will change when we vary others under certain constraints (Kelly, 1996-2002). This is also very similar to how we consider phonons in an event-ontology.


Indeed, in quantum statistics the entropy of a system in a given macrostate is the natural logarithm of the number of microstates corresponding to that macrostate, whereas in classical statistics entropy is the natural logarithm of the phase-space volume corresponding to the given macrostate (see e.g., Peltola, 2007; Dieks, 2013). As Willie explains, “much as the study of the statistical mechanics of black-body radiation led to the advent of the theory of quantum mechanics, the effort to understand the statistical mechanics of black holes has had a deep impact upon the understanding of quantum gravity, leading to the formulation of the holographic principle” (2012). Camenzind builds on this: “Since black holes have a non-zero temperature, the classical laws of black holes are simply the laws of thermodynamics applied to black holes” (2007); therefore, many posit that “there must be some more fundamental description of the classical laws governing black holes in terms of statistical mechanics” (ibid). Within the context of our present study we align with Majhi’s following passage:

In order to provide a statistical interpretation of gravity, first give the equipartition law of energy and show that this leads to the identification of entropy with the action for gravity. The immediate consequence of it is that the Einstein equations, obtained by a variational principle involving the action, can be equivalently obtained by an extremization of the entropy. This implies gravity can be thought of as an emergent phenomenon. (2012)

Emergence is studied in statistical mechanics qua Ising model and magnetization. Instead of a magnet we have space and gravity, and the question is: what underlies gravity, or what is the Ising analog underlying gravity and spacetime? One of the limitations of quantum statistical dynamics, however, is that it cannot treat living matter; only the formalism of quantum field theory can practically handle this (Jibu and Yasue, 1997). However, classical statistical mechanics, in its practical applications, is still closely tied to human knowledge: a sudden change in our knowledge causes, in classical statistical mechanics, a sudden change in the mathematical/physical representation of our knowledge (see also Stapp, Von Neumann, 1955).

2.10 – GEOMETRICAL ENTROPY

Black hole entropy is a concept with geometric roots but many physical consequences. It ties together notions from gravitation, thermodynamics, and quantum theory, and is thus regarded as a window into the as yet mostly hidden world of quantum gravity. (Bekenstein, 2008)

The probabilistic description of gravity can be traced back to research on black-hole thermodynamics initially spearheaded by Bekenstein and Hawking in the mid-1970s, when they introduced the concept of geometrical entropy as a gravitational version of entropy proportional to the area of the horizon. Put simply, “Bekenstein concluded that the black hole entropy is directly proportional to the area of the event horizon” (Marolf, 2009). Starting from the theorems provided by Hawking on black-hole thermodynamics, Bekenstein conjectured that black holes are maximum-entropy objects whose entropy is proportional to the area of the event horizon divided by the Planck area. 13 Specifically:

13 He considered a sphere of radius R in which the entropy of a relativistic gas increases as the energy increases. The only limit is gravitational; when there is too much energy the gas collapses into a black hole. Bekenstein used this to put an upper bound on the entropy in a region of space that is proportional to the area of the region, as opposed to the volume.

Black-hole entropy should only depend on the observable properties of the black hole: mass, electric charge and angular momentum. It turns out that these three parameters enter only in the same combination as that which represents the surface area of the black hole. One way to understand why is to recall the "area theorem" (Hawking 1971; Misner, Thorne, and Wheeler 1973): the event horizon area of a black hole cannot decrease; it increases in most transformations of the black hole. This increasing behavior is reminiscent of thermodynamic entropy of closed systems. (qtd. in Bekenstein, 2008)
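The proportionality Bekenstein describes was later fixed quantitatively by Hawking's calculation; in its standard form the Bekenstein–Hawking entropy reads

\[
S_{\mathrm{BH}} \;=\; \frac{k_{B}\,c^{3}}{4\,\hbar G}\,A \;=\; k_{B}\,\frac{A}{4\,\ell_{P}^{2}}, \qquad \ell_{P}=\sqrt{\frac{\hbar G}{c^{3}}},
\]

so that one quarter of the horizon area, measured in Planck units, counts the entropy.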

Possibly the most important consequence of black-hole entropy lies in its statistical interpretation within a quantum gravity framework. After Hawking derived the feasibility of black-hole evaporation using an interpretation of the thermal temperature of blackbody radiation, considerable efforts followed to find a statistical interpretation for the proportionality of black-hole entropy and its horizon area (see e.g., Zhang and Zhao, 2005; Jiang, Wu, and Cai, 2006; and Chen and Wang, 2011).

These studies suggest a profound connection between gravity and thermodynamics as well as representing a precursor to the holographic principle. “The fact that black hole entropy is also the maximal entropy that can be obtained by the Bekenstein bound [as it approaches equality] was the main observation that led to the holographic principle” (see Bousso, 2002). This led to the recognition of black-hole thermodynamics as the primary method for attempts to reconcile the laws of thermodynamics with the existence of event horizons. 14

In a quantum description of black holes, since a black hole has a well-defined entropy, we expect that the hole also has a well-defined number of microstates corresponding to its macrostate (see Krasnov et al, 1998). A macroscopic hole has an enormous number of QM degrees of freedom compared to the three classical ones predicted by “no-hair” theorems (see e.g., Bhattacharya, 2007).

The existence of these microstates raises many intriguing questions. Do these microstates correspond to the quantum states of the collapsing matter inside the black hole, or are these degrees of freedom connected with the quantized matter fields on a background geometry? Or could it be possible that the notion of black-hole entropy stems from the microscopic structure of spacetime itself? (Peltola, 2007)

14 Until 1995 no one was able to make a controlled calculation of black hole entropy based on statistical mechanics, which associates entropy with a large number of microstates. That changed when Strominger and Vafa calculated the proper Bekenstein-Hawking entropy of a supersymmetric black hole in string theory using methods based on D-branes and string duality (see Mohaupt, 2000).

2.11 – HAWKING RADIATION

Many scholars have attempted to describe the interaction of quantum matter with gravity by quantizing the matter on a fixed, classical gravitational background (see Weinstein and Rickles, 2011). That is, they have tried quantizing the matter, but not the gravity. This will work only if the gravity is weak, as in semi-classical methods; therefore, it should work outside a large black hole, but not near the singularity. 15 To these ends, Hawking provides such an approach, realizing that if you introduce an entropy you can also acquire a temperature.

Using the thermodynamic relationship between energy, temperature, and entropy, he published calculations in 1975 confirming Bekenstein's conjecture that black holes should have a well-defined entropy (1973), and showed how the characterization of black holes as thermodynamical objects with a non-zero temperature signifies that black holes aren’t completely black but should actually emit a dim, thermal radiation with the spectrum of a black body (see e.g., ‘t Hooft, 1985). This result can be attributed to quantum-mechanical effects located in the immediate surroundings of the event horizon even when there is no in-falling matter associated with the black hole (Hawking, 1988). Thereafter this was formalized into the Bekenstein-Hawking entropy describing “the amount of entropy that must be assigned to a black hole in order for it to comply with the laws of thermodynamics as they are interpreted by observers external to that black hole ” (Bekenstein, 2008 ). This is particularly true for the first and second laws, listed earlier.
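The temperature in question is fixed by the surface gravity \kappa; for a Schwarzschild hole of mass M the standard result is

\[
T_{H} \;=\; \frac{\hbar\,\kappa}{2\pi k_{B} c} \;=\; \frac{\hbar c^{3}}{8\pi G M k_{B}},
\]

which is minute for astrophysical black holes (roughly 6 x 10^-8 K for a solar-mass hole) but grows as the hole loses mass.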

Specifically, Hawking proposed a heuristic scenario where the spontaneous pair- production of virtual particles near an event-horizon provides a mechanism for radiation to occur. As Peltola explains, “ In normal conditions, a virtual particle-antiparticle pair annihilates itself very rapidly after its emergence. In the vicinity of the event horizon, however, it is possible that the member of the pair with negative energy is swallowed by the black hole before the annihilation, and the other with positive energy is free to escape from the hole ” (2007). Dijkgraaf adds, “The particle inside would be pulled by the gravitational force to the singularity while the other particle is now liberated and can escape to infinity ” (2012).

The existence of such radiation implies that black holes, like any other macroscopic objects, have thermodynamical properties—including entropy (see e.g., Becker, Becker, and Schwartz, 2006). Semi-classical calculations indicate that indeed they do, with the surface gravity playing the part of temperature in Planck's law (see Wald, 1975 and 2001). As Baumgartl explains, “ semi-classical phenomena like Hawking radiation show that black holes must be treated as quantum objects ” (2005). This provides the only known phenomenon so far that contains an interplay between quantum theory and general relativity. Peltola finds delight in this, explaining:

15 A recent line of promising inquiry posits that there are no naked singularities. The singularity is protected by a region of space, like an apex, such that the singularity isn’t ever reached. A corollary line of thinking is that quantum effects would also make it impossible to ever fully reach the singularity (see e.g., Allen, 2011).

Maybe the most intriguing aspect of black hole radiation is that it contains elements from quantum theory, thermodynamics and general relativity. Thus, one may say that in black hole radiation all the three foundational theories of physics meet for the first time. It is natural to expect that similar radiation processes would take place in the vicinity of other spacetime horizons as well. (Peltola, 2007)

In fact, just recently a model black hole capturing sound instead of light has been shown to emit quantum particles considered the analog of Hawking radiation (see Steinhauer, 2009). “The effect may be the first time that a lab-based black hole analog has created Hawking particles in the same way expected from real black holes” (Grossman, 2014). The Hawking effect comes from quantum noise at the horizon, explains Unruh, one of the first to propose fluid-based black holes; the horizon creates pairs of phonons where one escapes the horizon while the other is trapped inside (Steinhauer, 2009). By creating a quantum-mechanical fluid the experimenters were able to mimic, on a much smaller scale, the same physics proposed at a black hole's event horizon. In 2009, Steinhauer et al. first developed a model black hole using a Bose-Einstein condensate, whose collective mode behaves like a single quantum object, and they now report that it has produced just the kind of Hawking radiation expected in real black holes. "This tells us that the idea of Hawking actually works: A black hole should really produce Hawking radiation" (2009).

In the formal (cosmological) class of black holes, the fact that Hawking radiation is a semi-classical result means that matter fields are assumed to follow the laws of quantum physics while space-time is treated as a non-dynamical background. Paraphrasing Rickles (2008) and Bärenz (2012), in a more realistic situation we should expect the gravitational field to have quantum effects as well. This means that in addition to thermodynamics, black holes should also obey quantum mechanics, and therefore space should be recognized as dynamical and emergent at the smallest scale.

2.12 – BLACK HOLE INFORMATION PARADOX

Hawking’s calculation applied quantized fields to a black-hole background and showed how information appeared to disappear (Dijkgraaf, 2012); however, this result conflicted with a basic principle of quantum mechanics: namely, that physical systems evolve in time according to the Schrödinger equation—referred to as the unitarity of time evolution (see Maldacena, 2005; Zwiebach, 2009). This contradiction forms the basis of the black hole information paradox, which states roughly that any information travelling into a black hole must disappear altogether; this, however, undermines another basic principle, namely, that information can never be created nor destroyed (for a review, see Susskind, 2008). This gave physicists a crisis to sort out. Susskind, ‘t Hooft, and H. Verlinde approach the problem by proposing that when information is dropped into a black hole it isn’t actually lost; on this view, the horizon is recognized as a holographic representation of the immediate surrounding space-time. Given that the amount of information in the bulk region of a black hole is equal to the amount of information on its surface area, we can quantify the information we do not know by measuring its area.


In fact, it was Wheeler who began the tradition of thinking about information as the basis for understanding all of geometrical theoretical physics (1973). Bekenstein summarized this trend by suggesting that scientists may "regard the physical world as made of information, with energy and matter as incidentals" (2003). He concluded that "thermodynamic entropy and Shannon entropy are conceptually equivalent: the number of arrangements that are counted by Boltzmann entropy reflects the amount of Shannon information one would need to implement any particular arrangement of matter and energy” (Bekenstein, 2007). The only salient distinction between the thermodynamic entropy of physics and the Shannon entropy of information is in the units of measure; “the former is expressed in units of energy divided by temperature, the latter in essentially dimensionless 'bits' of information, and so the difference is merely a matter of convention” (Bekenstein, 2003).

Wheeler came up with a particularly clever slogan about this, heralding the adage “It from Bit,” implying that ‘it’ is the universe, and ‘bit’ refers to the simplest convention of information measurement. As Dijkgraaf simplifies, “‘Bit’ is the entropy that is there; entropy is information” (2012). Building off Boltzmann, in 1948 Shannon suggested that the sum quantity of bits is related to the totality of degrees of freedom of matter. For a given energy in a given volume there is an upper limit to the density of information (the Bekenstein bound) about the locations of all the particles composing the matter in that volume; furthermore, this suggests that “matter itself cannot be subdivided infinitely many times and there must be an ultimate level of fundamental particles” (Meijer, 2012).
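
The ‘matter of convention’ can be made explicit. One bit of Shannon information corresponds to a thermodynamic entropy of k_B ln 2 ≈ 9.6 × 10⁻²⁴ J/K, so the two entropies differ only by this conversion factor. The Bekenstein bound mentioned above is conventionally written (a standard statement, summarized here rather than taken from the cited sources) as

S ≤ 2π k_B R E / (ħc)

for a system of total energy E enclosed in a sphere of radius R; dividing by k_B ln 2 turns the bound into a maximum number of bits the region can hold.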

Taking the notion of information as a starting point, Verlinde combines this with the holographic principle to predicate his approach to emergent gravity, as we’ll see shortly.

2.13 – HOLOGRAPHIC PRINCIPLE

The holographic principle and its realization in string theory have shed critical light on the mysteries of black holes and information-loss as suggested by Hawking's work, and are believed to provide a resolution of the black-hole information paradox (Maldacena, 2005; Susskind, 2008). Specifically, it’s considered that the informational content of all objects that have fallen into the hole can be re-acquired in surface fluctuations of the event horizon (see Davies, 2004). In 2004, Hawking conceded that black holes do not violate quantum mechanics and suggested a mechanism through which they might preserve information (Hawking, 2005; Susskind, 2008).

The holographic principle suggests that subdivisions of matter stop at the level of information, whose fundamental constituents are represented by bits of information like 1’s or 0’s; as Damasco explains, “The main idea behind the holographic principle is that information is what drives physical phenomena” (2012). The insight that first inspired the holographic method emerged almost two decades earlier, however, when 't Hooft wrote a paper on quantum gravity—revisiting Hawking's original work on black-hole thermodynamics—and concluded that the total number of degrees of freedom in a region of spacetime surrounding a black hole is proportional to the surface area (or radius squared) of the horizon, and not cubed as would be expected (see ‘t Hooft, 1993). This idea was further clarified by Susskind in 1995, who argued that the oscillation of the horizon of a black hole is a complete description of both the in-falling and outgoing matter because the world-sheet theory of string theory was just such a holographic description. While short strings have zero entropy, he could identify long, highly-excited string states with ordinary black holes. This was a deep advance because it revealed that strings have a classical interpretation in terms of black holes. In this sense, the information of a black hole and anything outside it can be encoded by putting this information on the surface of its horizon. “It’s a rather radical idea because it tells you that there is information in the underlying layer of understanding all of quantum geometrical physics” (Dijkgraaf, 2012).
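
A quick numerical illustration of this area scaling (my own sketch using standard constants; it is not drawn from ‘t Hooft or Susskind directly) counts the holographic information capacity of a solar-mass black hole’s horizon, A / (4 l_P² ln 2) bits, in Python:

    import math

    G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34   # SI units
    M_sun = 1.989e30                              # kg

    l_P = math.sqrt(hbar * G / c**3)              # Planck length, ~1.6e-35 m
    r_s = 2 * G * M_sun / c**2                    # Schwarzschild radius, ~3 km
    A = 4 * math.pi * r_s**2                      # horizon area in m^2

    bits = A / (4 * l_P**2 * math.log(2))         # holographic capacity in bits
    print(f"Planck length : {l_P:.2e} m")
    print(f"Horizon area  : {A:.2e} m^2")
    print(f"Capacity      : {bits:.2e} bits")     # roughly 1.5e77 bits

The point of the exercise is only that the count grows with the radius squared, not cubed, exactly as ‘t Hooft’s argument requires.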

The basic problem with this idea lies in determining the context in which such a holographic screen that encodes information arises in the first place. One possible answer is given by Einstein’s equivalence principle applied in an experiential context: as an observer expends energy and enters into an accelerated frame of reference, an event horizon arises that acts as a holographic screen and encodes information (see e.g., Kowall, 2015).

As we will see, gravity is modeled by Verlinde as an entropic form of holographic information (see e.g., Gao, 2012). This means in some sense that “the space-time geometry we observe could be a holographic illusion defined by the spatial and temporal relationships between animated images projected from a holographic screen to the central point of view of an observer ” (Hu and Wu, 2014). In the sense of the principle of equivalence, “the observer is nothing more than consciousness that arises at an accelerated point of view while all images of things in the observer’s world arise on its holographic screen ” (ibid).

Our universe and everything we know always seems to eventually lead us to the conclusion that we live in a holographic reality. From cosmology to quantum physics, scientists today are truly having a troublesome time trying to explain the nature of our reality. How can we draw the extreme conclusion that our world is only an illusion and what does that mean when we want to know our place in the cosmos (Yang, 2006)?

Can the holographic principle be tested? It can be tested in a theoretical way, and the most successful approach so far comes from string theory. String theory also provides a formalism combining quantum mechanics and gravity, and Hawking’s black hole information paradox is resolved when quantum gravity is described in a string-theoretic way. String theory is funny, though, because essentially it’s a theory that almost never was, having already been thrown to the bin as a theory of hadrons; instead it was revived by a few key results and reformulated within the context of a quantum theory of gravity.


2.14 – STRING THEORY

Quantum mechanics brought an unexpected fuzziness into physics because of quantum uncertainty, the Heisenberg uncertainty principle. String theory does so again because a point particle is replaced by a string, which is more spread out. (Witten, 1995)

One of the more promising lines of reasoning in the modern era of mathematics has come from the development of string theory. It is important to distinguish this from other theories, however; as Theisen, et al., explain, “String theory is not, in contrast to general relativity and quantum field theory, a theory in the strict sense” (2007). String theory originally rose out of an attempt to describe the properties of the strong-force interaction through the construction of a dual-resonance model to compute S-matrix scattering results for the strong force and mesons as an emergent foundation for physical law (see, e.g., Cooper and West, 1988; Gross, 2005; McGarrie, 2011; Rickles, 2014). This model was later recognized to correspond to the quantization of a relativistic string (see e.g., Salisbury, 1984). Heisenberg introduced the method in which the S-matrix 16 serves as a way of constructing a theory that doesn’t rely on the local notions of space and time—proposed to break down at the nuclear scale—instead keeping track solely of the particles and their collisions (see e.g., Shapiro, 2007; Di Vecchia et al., 2012).

In quantum field theory the intermediate steps are the fluctuations of fields, or equivalently, the fluctuations of virtual particles; in this context, there are no local quantities at all. In this sense, the S-matrix theory was a proposal for replacing local quantum field theory as the basic principle of elementary particle physics. (Shapiro, 2007)

This allowed space-time to be taken as an emergent abstraction and the S-matrix as the quantity that describes how a superposition of incoming particles turns into outgoing ones. As a result this program was influential in the 1960’s as a conceivable substitute for quantum field theory, dogged at the time by divergences at strong coupling (see Schulz, 1993). Exapted into string theory, it has been suggested that S-matrix theory still offers the best approach to the problem of quantum gravity (see e.g., Frautschi, 1963; Giulini, Kiefer, and Lämmerzahl, 2003). Here, S-matrix theory is related to the holographic principle and AdS/CFT correspondence by a flat-space limit where the analog of the S-matrix relations in AdS space are the boundary conformal theory (Giddings, 1999). Specifically, Polchinski and Susskind proposed an expression for the S-matrix in flat space-time in terms of the large-N limit of the (non-gravitational) gauge theory living on the boundary of the AdS space (see Aref’eva, et al., 2013).

Exact theories of quantum gravity should be formulated in terms of gauge invariant observables associated to the boundary of spacetime. In that spacetime, the only such observable is the S-Matrix, so a theory of quantum gravity in flat space will be a theory that computes scattering amplitudes holographically. Since AdS/CFT provides a non-perturbative description of AdS theories via a dual CFT, one can obtain the bulk S-Matrix

16 Geoffrey Chew made the bootstrapping approach famous in America.

from a flat space limit of AdS. This defines a holographic theory for flat space using a sequence of CFTs with increasing central charge. (Fitzpatrick and Kaplan, 2012)

S-matrix theory was all but abandoned in the 1970’s as QCD and renormalization arose to solve these challenges within the framework of field theory, finding greater success and corroboration with experimental results in accelerators – but then 1973 happened; this was a big year for string theory: two sets of researchers contemporaneously identified the massless spin-two state as a graviton. First, Tamiaki Yoneya discovered that all the known string theories included a massless spin-two particle that obeyed the correct Ward identities 17 to be considered a graviton (see also Veneziano, 1986). Right around the same time, Scherk and Schwarz derived a similar result, leading them to adduce that string theory is actually a theory of quantum gravity, not hadrons (see, e.g., Blumenhagen, Lüst, and Theisen, 2013). This led to string theory’s re-examination as it became clear that the properties making string theory incongruous as a theory of nuclear physics were in fact optimal for a quantum theory of gravity. Green, Scherk, and Schwarz also realized that at low energies “this stringy graviton interacts according to the covariance laws of general relativity”18 (Theisen et al., 2013). With this insight, string theory became a formal candidate for quantum gravity.

As it turned out, five different string theories were eventually developed whose multiple realizability remained a mystery until Ed Witten recognized in the mid-1990s that each of these theories could be obtained as a different limit of a non-perturbative 10+1 dimensional theory unifying all five into different elements of the same underlying theory, which he named M-theory (see for example Schwarz, 1999). “M-theory tells us that string theories are really about strings and the higher dimensional objects of D-branes, like solitons in a sense” (Lerche, 1997). 19 As Vecchia et al. explain (2005), “it turned out all five consistent string theories in ten dimensions unify gravity in one way or another with gauge theories.” M-theory is often used as a term in place of referring to a non-perturbative completion of string theory (see Theisen, et al., 2007).

The five fermionic string theories are: type I, type IIA, type IIB, HO, and HE. All string theories contain closed strings; type I involves both open and closed strings. In addition there is a bosonic string theory without fermions in 26 dimensions. We focus primarily on the type I, IIA, and IIB string theories throughout this study.

17 The Ward identity is a relation among correlation functions that follows from the gauge symmetries of the theory and remains valid after renormalization. 18 Covariance here refers to general covariance: the requirement that physical laws keep the same form under arbitrary coordinate transformations. 19 Alpha prime and h-bar together lead to M-theory (see e.g., Tong, 2009); the best results so far are based on an open string language (see e.g., Antoniadis, Dudas, and Sagnotti, 1999).

Generally, QFT can’t handle gravity because of the infinities that emerge from singularities in Feynman diagrams (see e.g. Szabo, 2011). Non-abelian gauge theories make it possible to overcome many of these difficulties. More generally, string theory unifies this level of microphysics with general relativity. String theory develops to provide “a consistent quantum theory, free from ultraviolet divergences, which necessarily requires gravitation for its overall consistency” (Szabo, 2011). Sahakian’s narrative spells it out:

String theory may be viewed as a framework for exploring new exotic ideas on the frontier of theoretical physics. At its heart, the subject aims at describing a consistent theory of quantum gravity, in addition to being a short length scale completion of the Standard Model of particle physics. The subject’s most prevalent successes to date are twofold: convincing evidence that the theory resolves various long standing puzzles arising in black hole physics; and phenomenological realizations of models that appear to mimic the world we see at low energies. While the theory itself as a whole may still evolve beyond its current form, several of the new concepts that it has developed are expected to survive at the foundation of a future formulation of the laws of physics. (2012)

The basic postulate of string theory is modest: elementary objects are extended and behave according to the laws of relativistic quantum mechanics; the notion of an elementary particle is generalized to a one-dimensional object: a little bit of string. Since string theory is a relativistic quantum theory that includes gravity, it must also involve the corresponding three fundamental constants, namely the speed of light c, the reduced Planck constant h-bar, and the Newtonian gravitational constant G. These three constants combine into a constant with dimensions of length, and the characteristic length scale of strings may thereby be related to the Planck length (see Szabo, 2011).
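
Concretely, the only combination of c, ħ, and G with dimensions of length is (a standard dimensional-analysis step, not specific to Szabo’s presentation)

l_P = √(ħG/c³) ≈ 1.6 × 10⁻³⁵ m,

with the corresponding Planck mass M_P = √(ħc/G) ≈ 2.2 × 10⁻⁸ kg, or about 1.2 × 10¹⁹ GeV/c².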

This means that the fundamental mass-scale (or tension) of a string is related to the characteristic mass-scale of gravity: the Planck mass. Strings vibrating at the Planck scale are thought to be an essential ingredient in the production of all fundamental particle constituents through the nature of their vibrations—each giving rise to a particular type. In this sense QFT is also described naturally in part by the integral nature of strings; as Dawid (2007) explains, “the dynamics of our observed world is at the most-fundamental level explained by a purely geometrical theory of strings in space-time. All interactions, nuclear interaction as well as gravity, can be extracted from the dynamics of those strings .” Even more succinctly, Blumenhagen, Lüst, and Theisen describe how:

All particles, matter and interactions have a common origin: they are excitations of the string. There are open and closed strings. The massless spin-two particle appears in the spectrum of the closed string. Since any open string theory with local interactions, which consist of splitting and joining of strings, automatically contains closed strings, gravity is unavoidable in string theory. (2013)

In addition, all properties attributable to point-like particles are explained in string theory in terms of oscillation modes formulated as topological properties of the string (see Dawid, 2007); specifically, the Polyakov action (1981) describes the worldsheet of a string whose manifold allows it to be embedded in space-time (see e.g., Font and Theisen, 2003, 2005; Weigand, 2011). To describe oscillating strings, the Polyakov action must be supplemented by the Liouville action, which captures the fluctuations (see e.g., Compere, 2008; Tong, 2009). Ivancevic draws out the point:


Liouville's theorem is a key theorem in classical statistical and Hamiltonian mechanics. It asserts that the phase-space distribution function is constant along the trajectories of the system — that is, the density of system points in the vicinity of a given system point travelling through phase-space is constant with time. (Ivancevic, 2002)
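
In symbols (a standard formulation, added here only for reference), Liouville’s theorem states that the phase-space density ρ(q, p, t) obeys

dρ/dt = ∂ρ/∂t + {ρ, H} = 0,

where {·,·} is the Poisson bracket and H the Hamiltonian, so the flow generated by H preserves phase-space volume. It is worth noting that the Liouville action of string theory, mentioned above, is a distinct object that happens to carry the same name.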

Famously, string theory requires 9+1 dimensions to hold. M-theory describes 10+1. The extra dimensions—in addition to the usual 3+1—are treated as compactified internal dimensions. Four-dimensional spacetime appears as usual flat Minkowski space, whereas the internal dimensions constitute manifolds with possibly complicated geometries. The compactification of higher-dimensional spacetime in extra dimensions can be considered as hyper-dimensional oscillating spaces. The internal structure of string theory gives reasons to believe that once it has found a fully consistent formulation it might be a final theory (see Dawid, 2003).

In string theory there are three basic objects: closed strings, open strings, and D-branes. The two varieties of strings, open and closed, each lead to states with characteristic properties. The reggeon and pomeron sectors are early versions of open and closed strings: the 'pomeron sector' is now recognized as the closed string sector while the 'reggeon sector' represents an open string theory (see e.g., Lublinsky et al., 2014). More recently, open strings in the AdS/CFT description are linked with N = 4 SYM, while closed strings are related to a quasiparticle modeling. In addition, D-branes can be considered like solitons. We’ll identify each of these scenarios briefly, beginning with open strings.

Open Strings

Open strings are defined in bosonic and type I (fermionic) string theories. An open string has two end-points and is the equivalent of a line-interval. Open strings describe scalar and vector bosons in the massless sector and are perturbative (see Verlinde, 2011). In addition, open strings can merge into closed-loops of energy. As Dijkgraaf remarks, “before we calculated the theory we couldn't know this; it just turns out that we struck lucky ” (2012). It is as if we can make something that isn't in the theory in the first place. This leads to the conclusion that we not only need to accept open strings as basic ingredients, but also closed loops of energy: closed strings.

Even though open and closed strings have much in common there are also notable distinctions between them; for example, as Verlinde describes: “ gravity is in the closed string sector and is therefore something else than what the open strings are ” (2011). In fact, while not all string theories describe open strings, they each must contain closed strings given that “interactions between open strings can always result in closed strings ” (Schwartz, 2000). Open strings prove to bear more resemblance to CFT’s while closed strings represent the classical sector of gravity “ and yet also exist in QFT as coupling constants ” (Vecchia, et al.,

38

2005; Verlinde, 2011). These facts prove vital for linking the accounts of Whitehead’s AE’s with Verlinde’s EG in an event-logic and process context.

It turns out that a quantum-mechanical theory of open strings can also be formulated and shown to automatically incorporate a number of excitations that look like particles (see e.g., Rudolph, 1998; Ashtekar, 2005)—one of those being a vector particle that is exactly massless. The ability to describe this means that a “quantum-mechanical formulation of a theory of open strings automatically incorporates one of the key ingredients of the Standard Model ” (Polchinski, 1998).

Depending on which string theory you’re working with, open strings come in various dimensionalities (Peeters and Zamaklar, 2007). Verlinde says there must be something else behind open strings, however, since we need a D-Brane background for them to propagate on in spacetime; therefore, open strings cannot be considered the starting point: “ they already have their own inertia via states with a certain mass, so there’s something underlying them ” (Verlinde, 2011). The event-logic constructed in chapter five will provide a methodological nod to this claim with the hope and goal of shedding more insight into the ontological nature of open strings and D-branes.

In terms of a gravitational scenario in matrix theory, open strings can be taken as oscillators with a certain spectrum of masses and an amplitude between two D-branes (Verlinde, 2011). Frequencies of oscillators depend on the mutual positioning of the D-branes; anharmonic frequencies with coupling constants in QFT are integrated out (ibid). This dis-integration of the anharmonic, off-diagonal (open string) degrees of freedom between two D-branes is attributed to the induction of gravity. This is a way of viewing gravity in an open-string channel where, as Verlinde describes, “fast variables influenced by slow variables create a reaction force and this is precisely what open strings are doing” (2011). This means that the set of open strings creates fast variables operating in a Higgs space that, when integrated out of influence with slow variables of Coulomb space, leads to gravity, and whose remaining parts contribute to the gravitational self-energy (see Verlinde, 2011). Crucially, this is compared to the phases of prehension and concrescence in Whitehead’s AE process leading to the satisfaction, or culmination, of the basic AE generative cycle (PR), as we’ll see in chapters four and six.

N = 4 SYM (the 1/N expansion)

Supersymmetric Yang-Mills theory (SYM) is a large-N theory in four dimensions that (‘t Hooft realized in 1974) fits into the scheme of the large-N expansion of the field theory described by a string theory in five dimensions. “The fact that these two seemingly different theories can be related is due to the holographic nature of gravity whereby the total number of degrees of freedom in the gravity side is not an extensive quantity” (Berenstein, 2006).

Since its initial formulation, the large-N limit (1/N expansion) has provided crucial insight into the study of non-abelian gauge theories like quantum chromodynamics, referenced earlier. The 1/N expansion also plays a central role in the recently discovered connections between (non-abelian) gauge and string theories (via gauge-string duality: see e.g., Gopakumar, 2011; Casalderrey-Solana, 2011) promising new ways to analyze the non-perturbative domain of gauge theories (Scoccola et al., 2004). Specifically, the lines traced by the ends of open-string solutions ending on the boundary of AdS5 can be viewed as Wilson loops in N = 4 SYM theory (see e.g., Tseytlin, 2002; Ishizeki, 2008). After applying an inversion transformation, the open Wilson loops become closed Wilson loops. Berenstein develops the example where:

In the large-N limit QCD is described by some type of weakly coupled string theory: the mesons become open strings, and the glueballs are closed strings. There is some scale associated to the string dynamics (the effective string tension), and the string coupling constant is of order 1/N at that scale. (2006)
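
For orientation, the ‘t Hooft limit referred to here takes N → ∞ while holding the ‘t Hooft coupling λ = g²_YM N fixed; Feynman diagrams then organize themselves by topology, with each extra handle suppressed by 1/N², which is what makes the expansion resemble a string perturbation series with string coupling g_s ∼ 1/N. (This is the standard statement of the limit, summarized here rather than quoted from Berenstein.)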

Topics addressed by the recent developments in the 1/N expansion in QCD include: confinement, AdS/CFT correspondence, the string-QCD connection, topology in lattice QCD, and a variety of applications to mesons and baryons (Scoccola et al., 2004). The general combination of open strings on a D-brane and a large-N gauge theory sets up the image where the open strings are dual to the large-N gauge theory limit in QCD. As we’ll see, downstream this will motivate an AdS/QCD development.

Closed Strings

Closed strings are described in all string theories and represent a string with no end-points; they are topologically equivalent to a circle. As Chu and Ho explain: “a closed string can break itself and end on a D-brane. It can also break itself and interact with another closed string or open string” (1999). Most remarkably, theories of closed strings provide a consistent UV completion of gravity (Koch and Murugan, 2009), which means that they are mathematically consistent and require experimental data for falsification; to these ends, excitations among the closed strings can be identified as gravitons (Baumgartl, 2007). Thus, we appear to get these massless spin-2 particles for free … “and thankfully, for it is also exactly what is needed to formulate the theory of general relativity” (Dijkgraaf, 2012).

Closed strings classically describe what gets integrated-out in the quantum. This means that gravity knows about microscopic phase space. Integrating-out open strings integrates-in closed strings and thus they are the same in a sense (Verlinde, 2011). Coupling constants, as closed strings, have to become dynamical and the dynamics are based on the forces of the closed string (Verlinde, 2011). The closed string “knows about” phase space volume because it signifies the collective ingression of its original components in the snapshot and because it serves as coupling for the data therein.

Crucially, Verlinde maintains that the closed string (graviton) should not be treated like a fundamental particle, but rather like a phonon (quasiparticle) as in acoustics (2010). This means that there is nothing to quantize; instead, phonons appear in a quantum description a lot like spontaneous symmetry breaking does in the Standard Model narrative; interestingly, the basic quanta of light (photons) and sound (phonons) obey the same rules describing bosons (see Khurgin, 2010).
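
The shared “rules describing bosons” can be stated compactly: in thermal equilibrium both photon and phonon modes of frequency ω are occupied according to the Bose-Einstein distribution

n(ω) = 1 / (e^(ħω/k_B T) − 1),

a standard result quoted here only to make the parallel explicit.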

Why are phonons important in the quantum realm of the very small? “ I feel they're important because of the notorious weakness of gravity at that scale ” (Verlinde, 2010). Gravitons do not exist when gravity is emergent. Gravitons are like phonons. If, as Verlinde speculates, the phonon effect goes away with few particles, and if gravity is ruled by such effects, then the weakness of gravity is explained (2010). To make this case he draws an example of sound waves travelling in pistons:

Consider two pistons that close off a gas container at opposite ends. Note that the force on the pistons due to the pressure is also an example of an entropic force. We keep the pistons in place by an external force. When we gradually move one of the pistons inwards by increasing the force, the pressure will become larger. Therefore the other piston will also experience a larger force. We can also do this in an abrupt way. We then cause a sound wave to go from one piston to the other. The quantization of this sound wave leads to phonons. We know that phonons are quite useful concepts, which even themselves are often used to understand other emergent phenomena. Similarly, gravitons can be useful, and in that sense exist as effective "quasi" particles. But they do not exist as fundamental particles. (2010)
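
A textbook calculation (my own gloss, not Verlinde’s wording) makes the piston case quantitative: for an ideal gas of N particles at temperature T in a cylinder of cross-section A with the piston at position x, the entropy depends on the volume as S(x) = N k_B ln(A x) + const, so the entropic force on the piston is

F = T ∂S/∂x = N k_B T / x = P A,

which is just the familiar pressure force, recovered from an entropy gradient alone.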

Interestingly, some physicists consider phonons as a way of maintaining—not mitigating—quantization. As Wüthrich explains, “physicists routinely quantize collective degrees of freedom such as sound ” via phonons serving as the quanta of sound (2006). We should be led, however, to recognize that even if we say we are quantizing a phonon, it still arises as an emergent value in a collective-mode. This means that we’re quantizing a quasi-particle, not a fundamental one. The event-logic constructed in chapters five and six will not only link phonons into Whitehead’s schema, but also explain from a Whiteheadian perspective why it is so important that they be emergent quasiparticles as opposed to fundamental values. In fact, the metaphysical notion of free will and individualized experience encoding in the universe could be shown to both reside and stem from this distinction, relating consciousness in nature like emergent quasiparticles to physical phenomena.

In these later chapters we will also develop closed strings in the capacity of a dynamical coupling constant. A coupling constant refers to a dimensionless (pure) number that indicates the strength of the interaction. In string theory, as distinguished from QFT, the coupling is a dynamical variable, as opposed to a strict constant (see e.g., Fujii, 2003) .

Dirichlet Branes

“D-branes led string theory back to gauge theory. ” (Klebanov, 2006).

In the 1990’s, Polchinski discovered that the mathematics of strings requires higher-dimensional objects called D-branes, and identified these with the black-hole solutions of gravity. Branes are dynamical objects that can propagate through spacetime according to the rules of quantum mechanics; they have mass and other attributes such as charge (Moore, 2005; Aspinwall, 2009). “Thus with the introduction of D-branes it was discovered that the fundamental scale of the theory is the Planck scale, and not the string scale” (Martinec, 2013). D-branes are recognized as very massive objects in the spectrum of relativistic strings; as Frampton explains, “it quickly became clear that D-branes (and other p-branes), not just strings, formed the matter content of string theories” (1974). Polchinski’s analysis of D-branes led to the AdS/CFT correspondence and a microscopic understanding of the thermodynamic properties of black holes (Polchinski, 1998; see also Iizuka, 2002).

In flat spacetime (without gravity) the D-brane is regarded as a fixed hypersurface yielding the Dirichlet boundary conditions for open strings (see Moore, 2005) on which their endpoints are attached. One key trait of D(irichlet)-branes is their ability to exchange with the string. D-Branes come in different dimensions. For example, D-0 branes are point-like, unless you stack a bunch of them (Verlinde, 2011u; 2011j). D-branes are typically classified by their spatial dimension indicated by the number written after the D. A D0-brane is a single point, a D1-brane is a line (sometimes called a "D-string"), a D2-brane is a plane, and a D25-brane fills the highest-dimensional space considered in bosonic string theory (see e.g., Aspinwall, 2009). There are also instantonic D(-1)-branes, which are localized in both space and time.

The microscopic fluctuations of the D-brane are described by open strings. As an open string propagates through spacetime its endpoints are required to link onto a D-brane. “The open string low-energy degrees of freedom are described by the open-string effective action, which is a super-Yang-Mills theory on the D-brane ” (Peeters and Zamaklar, 2007) and the dynamics on the D-brane world-volume is a gauge theory (see also Szabo, 2011). For a non-commutative scenario see (Chu and Ho, 1999).

In addition to the scalar field describing fluctuations of the brane in the normal directions, topologically there is also a line-bundle with connections that become non-abelian gauge fields (Moore et al., 1998; Harvey and Moore, 1998; Arnold and Moore, 2005). 20 When N branes are stacked on top of each other new non-abelian degrees of freedom are needed to describe the novel dynamics. As Moore explains, “this fundamental phenomenon has ultimately led to many startling new insights into gauge theory” (2005).

In some regions the gauge theory of the D-branes is decoupled from gravity living in the bulk, meaning that open strings attached to D-branes don’t interact with closed strings. This is termed a decoupling limit. In these cases the D-branes have two alternative descriptions: from the point of view of closed strings the D-branes are gravitational sources, and thus we have a gravitational theory on spacetime with some background fields. From the point of view of open strings, the physics of the D-branes is described by the appropriate gauge theory. This suggests that the gravitational theory on spacetime is physically equivalent (or dual) to the gauge theory on the boundary of that spacetime and “the subspace filled by the D-branes is the boundary of this spacetime” (Giveon and Kutasov, 1999). In addition, D-branes are comparable in role to the solitons of string theory. This also sets up the basis for gauge-gravity duality and AdS/CFT, as we’ll see next.

20 These elements are used to describe general relativity.

Solitons

In the nineteen fifties, Fermi, Ulam, and Pasta discovered a remarkable recurrence property of nonlinear lattices at the Los Alamos laboratory. The word “soliton” (for solitary wave) was introduced in a 1965 paper by Zabusky and Kruskal entitled “Interaction of ‘Solitons’ in a Collision-less Plasma and the Recurrence of Initial States.” Conceptually, a solitary wave represents a localized ‘wave of translation’ arising from a balance between nonlinear and dispersive effects. As Scott explains, “a soliton is a solitary wave that behaves like a 'particle', in that it satisfies the following conditions: 1) it must maintain its shape when it moves at constant speed; and 2) when a soliton interacts with another soliton, it emerges from the 'collision' unchanged” (2005). This means that solitons maintain their topological integrity. Lomdahl reiterates the point in noting how “a remarkable quality of these solitary waves was that they could collide with each other and yet preserve their shapes and speeds after the collision” (1984).
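
The canonical mathematical example (standard, and not drawn from Scott or Lomdahl) is the Korteweg-de Vries equation

u_t + 6 u u_x + u_xxx = 0,

whose one-soliton solution u(x, t) = (c/2) sech²(½√c (x − c t − x₀)) travels at speed c without changing shape, with taller solitons moving faster; two such solutions pass through one another and emerge intact, which is precisely the particle-like behavior described above.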

Their definition as allowed endpoints for open strings, generalizes the notion of quarks on which the QCD string can terminate. In contrast to the quarks of QCD, D-branes are however intrinsic excitations of the fundamental theory: their existence is required for consistency, and their properties – mass, charges, dynamics – are unambiguously determined in terms of the Regge slope α′ and the asymptotic values of the dynamical moduli. They resemble in these respects conventional field-theory solitons, from which however they differ in important ways. D-particles, for instance, can probe distances much smaller than the size of the fundamental string quanta, at weak coupling. In any case, D-branes, fundamental strings and smooth solitons fill together the multiplets of the various (conjectured) dualities, which connect all string theories to each other. (Bacchas, 1999)

In the context of this study, solitons eventually find the most traction within the setting of quantum field theory and specifically, quantum brain dynamics (see e.g., Umezawa, 1967, 1979; Yasue and Jibu, 2011) described as “ nothing else but quantum electrodynamics of the electric-dipole field of dipolar solitons and water molecules ” (Globus, 2007) representing the first and second degrees of freedom in living matter, respectively. As Lomdahl explains, “because the mechanisms that give rise to soliton equations are so prevalent, the suggestion that solitons might arise in biology is not so surprising ” (Lomdahl, 1984). In quantum field theory , a soliton, as a coherent solitary-wave propagation, “ is considered as a localized degree of freedom maintaining and carrying energy without loss due to thermalization ” (Jibu and Yasue, 2011). They are also commonly referred to as Davydov-solitons. Specifically:

The first degree of freedom we are looking for in the fundamental system of living matter may be found as an internal degree of freedom of the background three-dimensional network structure of protein filaments free from thermalization (i.e., Maxwell’s demon). In 1979 such a degree of freedom was found by Davydov as a coherent dipolar solitary wave propagation along the one-dimensional chain of protein molecules such as the protein filament. (Davydov, 1979; Jibu and Yasue, 2011)


Energy stored in soliton form is “ kept free from thermalization and belongs to the fundamental system of living matter, though creation of soliton is triggered by an incoherent and disordered interaction with the metabolizing system ” (Jibu and Yasue, 2011). In other words, the creation and annihilation process of dipolar solitons serves as a gateway between metabolizing and fundamental systems. Namely, “ energy incoming from the metabolizing system of living matter through the ATP cyclic process to the fundamental system induces dipolar solitons localized in each protein filament ” (Jibu and Yasue, 2011). The dipolar soliton is a collective mode of many dipolar oscillations “ maintained by non-localized electrons trapped in the one-dimensional chain of protein molecules ” (ibid). “ It is a quantum-mechanical degree of freedom representing electric dipole moment localized in each background protein filament ” (ibid). Thus:

Living matter is essentially a quantum-mechanical many-body system described by two different degrees of freedom interacting with each other, that is, dipolar solitons localized in the background three-dimensional network structure of protein filaments and water dipole moments surrounding them. (Globus, Pribram, and Vitiello, 2011)

Taken in light of its similarity to a D-brane we may gather that these gateway solitons between fundamental and metabolizing systems might be recognized as small samples of information; in this sense, they are information-laden, and the open strings on D-branes might somehow equate to a corresponding sub-architecture describing the information-content comprising a soliton. We can possibly learn something more about D-branes from solitons as well; namely, we could choose to regard them as gateway (or transpositional) values between the underlying details of the microphysics in a potential mode vs. the actualizations of samples of these dynamical vacuum contents. Thus, D-branes of open strings could serve to motivate the role of ontological mediators between the potential and actual modes. We explore this more in upcoming chapters five and six.

As Sahakian explains, “ at its heart, string theory aims at describing a consistent theory of quantum gravity in addition to a short-length scale completion of the Standard Model of particle physics ” (2012). Verlinde says that string theory is based on general principles that have a real, structural capacity in the world - and is used to understand what gravity is. In string theory, “ while looking for the smallest constituents of matter we find that gravity emerges in the equations necessarily. You get it out of something where you didn’t put it in ” (Verlinde, 2011).

There are many indications suggesting that string theory is sufficiently rich to contain the answers to many puzzles such as the information paradox or the statistical interpretation of black hole entropy. (Horava, 2009)

String theory rests at the most fundamental level of theory-building and as a result presently still remains detached from all empirical testing. 21 As a classical theory, general relativity points to black holes but can’t tell you much about them substantively—string theory builds off of this in cooperation with the holographic principle to shed more light on the quantum components of black holes.

21 “Bayesian epistemology identifies theory confirmation directly with an increase of the probability that a theory is true or viable. Based on this we may say that the string theorist’s arguments actually do constitute a specific form of theory confirmation, which I call non-empirical theory confirmation.” (Dawid, 2007)

For this we also need to elaborate the notion of phase space. Keeping track of reaction forces induced by degrees of freedom integrated out gives rise to gravity, and degrees of freedom integrated out give rise to forces that are really there at low energies (see Tong, 2009). Integral to these ends we also introduce Boltzmann’s constant as a physical constant relating energy at the individual particle level with temperature. In Verlinde’s account, the only way to arrive at these is to keep track of the phase space and degrees of freedom (see Verlinde, 2011).

String theory is developed within the highly-successful predictive context of high-energy physics. String theorists point out that their theory is the only one that offers a concrete and promising idea for a consistent description of microphysics and general relativity. As Dijkgraaf adds, “it’s also the only theory that integrates into one overall theory our topical understanding of high energy physics based on gauge field theory and our understanding of cosmology based on general relativity” (2012). All told, string theory provides the most potential for handling the high-energy environment where both general relativity and quantum mechanics persist. Even beyond this, though, we would expect that string theory, as a theory of quantum gravity, should also be able to tell us something more about black holes. The most tractable version is a 5d black hole whose entropy is given by the Bekenstein-Hawking formula. Building off this toward the gravity sector, Dijkgraaf explains:

Within the horizon of the black hole, you have gravitation, and gravitation is described, in string theory, in terms of these so called closed strings, these little loops that are running around and forming the shape of space and time, the curvature of space. So we can replace these closed strings by Einstein’s description of space-time. (2012)

What happens to open strings when they are very close to the black hole? They fall halfway through. “Open strings are attached to the horizon, to the surface of the black hole, so they are tethered to the two endpoints that are fixed on the horizon and you can describe this system” (Dijkgraaf, 2012). In string theory, this neighborhood of the horizon is described by Yang-Mills theory. The geomodal method suggests a different scenario for deriving open strings attached to a D-brane, as we’ll see in chapter five.

So there is some theory of open strings known as matrices of strings, and an exact mathematical model that describes the physics of quantum black holes; and in fact, out of this the conclusion is that in some sense what we call the fundamental layer of physics, namely space and time, gets replaced at very-small distances by something more involved and—in the case of string theory—you have a very precise candidate for this more fundamental theory, which is the large-N gauge theory. (Dijkgraaf, 2012)

Matrices of strings relate to the snapshot-dual of open strings in the event-narrative. Since all string theories contain gravity, it seems impossible to use a string theory to describe strong interactions. “In fact they are described by QCD that does not contain gravity” (Vecchia, et al., 2005). This is what the AdS/CFT correspondence tells us, as we’ll see next. In addition, spacetime in quantum gravity should emerge as an effective description of the theory of oscillations of a lower-dimensional black-hole horizon. “This suggests that any black hole with appropriate properties, not just strings, would serve as a basis for a description of string theory” (Susskind, 2003). By uncovering more details about black holes we are ultimately led to the conclusion that gravity isn’t fundamental but actually emergent and linked to thermodynamics, and in many ways, hydrodynamics (see Verlinde, 2010).

String theory is a rather large theory, possibly with a huge landscape of vacua, each of which leads to a scenario for the history of the universe which may or may not resemble ours. Given this richness of string theory, it might even be logical to adopt the perspective in which string theory is not a candidate for a unique theory of the universe, but represents instead a natural extension and logical completion of quantum field theory. In this picture, string theory would be viewed—just as quantum field theory—as a powerful technological framework, and not as a single theory. (Horava, 2009; see also Gross, 2014)

The problem Verlinde brings to our attention is that there is no epical integrity to string theory; “ there are a bunch of parts that are important but no larger sense of how they all go together yet ” (2011). As Rickles explains, “ with few exceptions, string theory has so far taken quantum mechanics more or less for granted, rather than further elucidating its foundations, as one would wish for a fundamental theory ” (2013). An event-logic as constructed here is poised to offer such an epical integrity in concert with Whitehead’s AE’s in chapters five and six.

Open/Closed String Correspondence

The heart of gauge/gravity and AdS/CFT can be shown to reside in the open/closed string correspondence. As Baumgartl lists, “ AdS/CFT-correspondence, holography, and gauge/gravity duality are all research directions in string theory which strongly suggest an underlying profound connection between open and closed strings ” (2007). Over the years many examples have been collected showing that open strings are also capable of describing closed string interactions (see Baumgartl, 2008).

Since the early days of string theory it has been presumed that the distinction between open strings and closed strings is not fundamental (Baumgartl, 2008). “This follows already from the observation that closed string poles occur as intermediate states in open string scattering amplitudes” (Baumgartl, 2005). Specifically, the open/closed string correspondence refers to the scenario where the closed string propagator is equivalent to the open-string one-loop amplitude channel (Verlinde, 2011). [Figure 1, after Verlinde 2010, depicts this channel duality.] As Verlinde explains, “the open-string one-loop amplitude diagram exchange between two D-branes is also like a closed-string exchange” (2011). In closed strings it’s classical; in open strings it’s quantum. This indicates that we cannot think about closed strings as truly fundamental, Verlinde explains. The equivalence between open and closed strings gives rise to AdS/CFT where you take multiple D-branes and decouple open strings from closed strings to get to the conformal field theory at the low-energy limit (see Maldacena, 1997).

Gauge/Gravity → AdS/CFT Correspondence

All string theories contain both gravity and gauge theories and therefore those two kinds of interactions are intrinsically unified in string theories (see Greene, 2000). The problem of developing a non-perturbative formulation of string theory was one of the original motivations for studying gauge-gravity and AdS/CFT correspondence (Maldacena, 1998). These dualities represent a major advance in our understanding of string theory, black hole physics, and quantum gravity by providing a formulation of string theory with certain boundary conditions resembling Minkowski space-time (Maldacena, 1997). As such, string theory provides new relations between quantum field theories (gauge theories) on one hand and classical gravity theories on the other. These relations are dualities since they map strongly-coupled gauge theories to weakly-coupled gravity theories. Gauge/gravity duality is a conjectured duality between a quantum theory of gravity in certain cases and gauge theory in a lower number of dimensions (see also Chivukula, 2010). A general phenomenon of the gauge-gravity correspondence is the UV/IR connection (see Skenderis, 2002). As Bernamonti (et al.) explain:

The gauge/gravity duality can be described as a “holographic” correspondence between the 4-dimensional physical space where the gauge theory lives and the 5-dimensional space where the supergravity (weak curvature) approximation of the 10-dimensional string theory is valid. It means qualitatively that the whole information should be the same on both sides of the correspondence, despite the difference in dimensionality. (2011)

This gave the first holographic description of a higher-dimensional object: the 3+1 dimensional type IIB membrane, and thereby resolved the long-standing problem of finding a string description of a gauge theory. The most rigorous realization of the holographic principle (and gauge-gravity duality in general) is the AdS/CFT correspondence, launched by Juan Maldacena in 1997, and referring to the correspondence between gravitational theories and quantum field theories with one fewer dimension.

Maldacena’s AdS/CFT correspondence provided the first, tangible realization of the holographic principle with applications for black holes, locality, and information theory in physics – as well as for the nature of the gravitational interaction (see also Chivukula, 2010). Specifically, it relates string theory to gauge theory by allowing contact with the low-energy model of N = 4 super Yang-Mills in QCD via UV/IR mixing. This correspondence plays an important role in the recent development of string theory and the holographic principle, where it provides a concrete example of a lower-dimensional theory encoding the higher-dimensional physics (see Maldacena, 1998). On the string theory side, a description of certain gauge theories can be proposed in terms of string theory (see e.g., Ishizeki, 2010). More specifically:


AdS/CFT duality can be understood as the statement that a given brane configuration in string theory has two equivalent descriptions, one in terms of open and one in terms of closed strings. The examples in the original work of [Maldacena] are branes that in a certain decoupling limit realize a supersymmetric n + 1 dimensional conformal field theory on their world-volume, the open string description. In the dual language, the theory is described in terms of closed strings propagating in the near-horizon geometry of the corresponding black-brane solution. (Karch and Randall, 2001)

In the case of the black-hole information paradox it shows how a black hole can evolve in a manner consistent with quantum mechanics; specifically, Maldacena noticed that “the low-energy excitations of a theory near a black hole consist of objects close to the horizon, which for extremally charged black holes looks like a five-dimensional anti-de Sitter space” (Zwiebach, 2009). He noted that in this limit the gauge theory describes string excitations near the branes, and followed by hypothesizing that string theory on a near-horizon extremally-charged black-hole geometry (an AdS and a sphere with flux) is equally well-described by the low-energy limiting gauge theory similar in certain cases to quantum chromodynamics: the N = 4 supersymmetric Yang–Mills theory (Aharony et al., 2008). “These particles obey the usual rules of quantum mechanics and in particular evolve in a unitary fashion, so the black hole must also evolve in a unitary fashion, respecting the principles of quantum mechanics” (Maldacena, 2005). In 2005, Hawking published a statement that the paradox had been settled in favor of information conservation by the AdS/CFT correspondence—and suggested a mechanism by which black holes might preserve information.
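
For reference, the sharpest form of the correspondence can be stated as follows (the standard dictionary, summarized rather than quoted from the sources above): type IIB string theory on AdS₅ × S⁵ with N units of five-form flux is conjectured to be exactly equivalent to N = 4 super Yang-Mills theory with gauge group SU(N) in four dimensions, with the identifications g_s ∼ g²_YM and (R/l_s)⁴ ∼ g²_YM N, so that weakly curved, classical gravity on the AdS side corresponds to the strongly coupled, large-N regime of the gauge theory.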

As Dijkgraaf concludes, “ the physical theories that we developed to study the elementary particles seem to be relevant also to describe black holes, but then we really have to apply them into a quantum gravity regime ” (Dijkgraaf, 2012).

2.15 – EMERGENCE of GRAVITY

Verlinde proposes we use AdS/CFT and open/closed strings to account for more general principles that can be compared to actual data (2011); thus, he uses string theory to motivate an account of the more general principles stemming from their dynamics. Can gravity emerge from information? Verlinde says yes. The microscopic degrees of freedom are information. Indeed, it was Verlinde who took the ultimate consequence of the AdS/CFT correspondence, saying that if gravity isn’t really a fundamental force then what we call curvature of spacetime “is in some sense just an illusion because underlying it is this more quantum description. Therefore, perhaps we should stop looking for a fundamental description of gravity” (Dijkgraaf, 2012). As Verlinde explains:

Gravity dominates at large distances, but is very weak at small scales. In fact, its basic laws have only been tested up to distances of the order of a millimeter. Gravity is also considerably harder to combine with quantum mechanics than all the other forces. The quest for unification of gravity with these other forces of Nature, at a microscopic level, may therefore not be the right approach. It is known to lead to many problems, paradoxes and puzzles. String theory has to a certain extent solved some of these, but not all. (2010)


Gravity and space-time geometry are emergent phenomena in Verlinde’s framework, implying that they have no fundamental microscopic definition but arise instead as macroscopic behavior (Chivukula, 2010). He cites the AdS/CFT correspondence as an example of how a non-gravitational theory can give rise to a theory of gravity in order to justify his argument for gravity’s emergence. As Chivukula explains (2010):

Inspired by Bekenstein’s original argument regarding black-hole thermodynamics, Verlinde proposes that the differential change in entropy as a mass crosses a screen is proportional to the mass and the differential displacement: ∆S ∝ m ∆x, and invoking the standard thermodynamic relation F ∆x = T ∆S we find that F ∝ T, where T is temperature of the information of the mass distribution and F is the force experienced by the mass.

To arrive at this conclusion Verlinde invokes the applicability of the holographic principle, and his initial assumptions are that there is some well-defined notion of time; there are space-independent quantities of energy, entropy, and temperature; that the number of degrees of freedom associated with a portion of space are finite (required by the holographic principle); there is an equivalence between energy and matter; and that the energy is distributed evenly over the degrees of freedom in the volume (Chivukula, 2010). These assumptions directly lead to an associated temperature for the specified volume. Verlinde asserts that the product of this temperature with a differential entropy change (resulting from mass displacement) manifests itself as the gravitational force.
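
Compressed into a few lines, Verlinde’s 2010 argument runs roughly as follows (a schematic paraphrase; the factors follow the standard presentation). A particle of mass m approaching a holographic screen changes the screen’s entropy by ∆S = 2π k_B (mc/ħ) ∆x, following Bekenstein; an accelerated screen carries the Unruh temperature k_B T = ħa / (2π c); combining the two in F ∆x = T ∆S gives F = ma, Newton’s second law. For the law of gravitation, one surrounds a mass M with a spherical screen of radius R carrying N = A c³/(Għ) bits, applies equipartition, E = ½ N k_B T, sets E = M c², and solves for T; inserting that temperature back into F = T ∆S/∆x yields F = GMm/R².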

This is substantively different from the orthodox view that gravity (the gravitational field) is a fundamental force (field) transmitted by a graviton. If Verlinde is correct then the graviton is analogous to a phonon: a non-fundamental, quantized macroscopic excitation, a collective quasi-particle.

The copious difficulties with the attempted unification of gravity with quantum mechanics at the Planck scale led Verlinde to propose that perhaps such attempts are misguided. He argues that gravity is not fundamental, but a macroscopic phenomenon that emerges from the thermodynamics of information in a theory without gravity. Consequently, gravity becomes an entropic force caused by gradients of information. (Chivukula, 2010)

Verlinde emphasizes that there are many forces in nature that are not fundamental. For gravity this means that there is no little particle playing the role the photon plays for the electromagnetic force, or the gluon for the strong force, and that we shouldn’t look for an elementary description but instead for an emergent one; thus, there should exist something like an entropic method for re-formulating all of gravity.


2.16 – SUMMARY

The purpose of this chapter has been to provide a detailed narrative of the principles, phenomena, mathematical theories, and observational discoveries in modern physics leading up to emergent gravity. Along the way we also recognized the guiding notion that spacetime is emergent at the smallest scales. This narrative has been woven into the context of: general relativity; quantum theory; the vacuum; the Planck scale; plus the integral motion to fuse these two theories together at vacuum/Planck scale to obtain a quantum theory of gravity. In addition, we explored the UV/IR mixing connection that predicates quantum gravity theories in the form of the black-hole horizon environment at the Planck scale. This leads us to geometrical entropy, statistical mechanics, and Hawking radiation; the black hole information paradox; Wheeler’s it-from-bit; and right on up to the holographic principle. From here we transitioned into string theory via a brief historical development before encountering: emergent bootstrapping and S-matrix approaches; open strings; closed strings; N = 4 SYM; phonons; D-branes; and solitons. This leads to a discussion of the open/closed string correspondence in light of gauge/gravity duality and the AdS/CFT correspondence. This positions us to transition directly into Verlinde’s approach on the basis that it serves as the “most-radical consequence of the AdS/CFT” (Dijkgraaf, 2012).

Looking back over this chapter, we have all but defined the concepts and progression of ideas in physics and cosmology necessary for a lead-in to approaches and phenomena pertinent to Verlinde’s emergent model of gravity. The other major topics to be addressed are Sakharov’s induced gravity, Jacobson’s gravitational thermodynamics, and the holographic renormalization procedure, all of which are covered in the next chapter.


Chapter 3 – Emergent Gravity

Predicated on the early triumphs of Sakharov’s and Jacobson’s non-quantization approaches to quantum gravity—and taken in tandem with foundational breakthroughs in black hole thermodynamics beginning with Bekenstein, Bardeen, Carter, and Hawking—a step-by-step conceptual outline of the ‘emergent gravity’ hypothesis is framed in this chapter around Verlinde’s 2010 paper.

3.1 – Introduction

Alternative approaches to gravity challenge the received wisdom and do not regard gravity as a fundamental force but rather as effective: as merely supervening on fundamental physics. (Wüthrich, 2006)

As we saw in the last chapter, some hold to space and time as fundamental whereas others take them to be emergent abstractions. These differences are representative of two basic approaches to space-time and gravity: the quantization approach to quantum gravity, and the non-quantizational, “induced” and “emergent” approaches. Two of the most substantial non-quantizational approaches are contributed by Sakharov’s induced gravity and Jacobson’s gravitational thermodynamics, both of which set the stage for Verlinde’s 2009 insight that gravity is not a fundamental force but a macroscopic phenomenon emerging as a result of thermodynamic principles applied to the information of mass distributions (Chivukula, 2010). Throughout the course of examining these approaches we will find a description of space and time as emergent features, along with gravity, and perhaps even all matter.

3.2 – Non-Quantization Approaches to Quantum Gravitation

Does the need to find a quantum theory of gravity imply that the gravitational field must be quantized? Physicists working in quantum gravity routinely assume an affirmative answer, often without being aware of the metaphysical commitments that tend to underlie this assumption. (Wüthrich, 2006)

As we introduced in the last chapter, repeated failures to quantize gravity led to a parallel development in which the interpretation of gravity as a fundamental force is replaced with an emergent phenomenon like thermodynamics or hydrodynamics. Non-quantization models for reconciling general relativity with quantum theory are described as emergent approaches; for them the issue of quantizing gravity becomes inconsequential (Banerjee, 2010).

Sakharov’s induced gravity program provides an effective, semi-classical approach to gravity that does not require a quantization of gravity. Similarly, Jacobson’s gravitational thermodynamics conceives of gravity as emergent from the energy flux of unobservable degrees of freedom. These accounts establish that it is at least conceivable that the final theory of gravity may not involve quantization. (Wüthrich, 2006)

As Wüthrich explains, we could oppose quantizing gravity on the grounds that in fundamental physics starting out from a classical field in order to arrive at the quantum structure puts the cart before the horse. This mentality, whose slogan could be something like “quantum without quantization,” is expressed in the quote from Patton and Wheeler on the occasion of the 1975 Oxford symposium on quantum gravity:

However workable the procedure of “quantization” is in practice [...], we know that in principle it is an inversion of reality. The world at bottom is a quantum world; and any system is ineradicably a quantum system. From that quantum system the so-called “classical system” is only obtained in the limit of large quantum numbers. (Patton and Wheeler, 1975)

There exist more promising alternative approaches that offer quantum theories of gravity which do not involve a quantization of the gravitational field. Typically, they understand gravity as an induced rather than a fundamental force. According to this view, gravity is not one of the four fundamental forces; instead, it emerges at a higher level as a result of the fundamental physics. Since gravity is not fundamental, it does not have to be quantized. (Wüthrich, 2006)

Why do we need a quantum theory of gravity at all? Why do we have to quantize gravity for the purpose of finding a quantum theory of gravity? “ It may seem that these two questions can hardly be kept separate since quantum field theory requires that all matter fields be quantized and general relativity teaches that those matter fields are the sources for the gravitational field ” (Wüthrich, 2005). Some thus believe that proceeding by quantization is a principled mistake. At least some approaches to gravity, the so-called semi-classical theories, maintain that coexistence between the two theories is indeed possible. Sakharov’s induced gravity presents just such an example.

3.3 – Sakharov’s Induced Gravity

Sakharov’s 1967 proposal was brought back into view a few years into the new millennium by Visser’s (2002) reconfiguration of the original paper, which led to a second wave of interest in the method. Serving retroactively as a logical precursor to entropic gravity, induced gravity is developed in the context of quantum gravity and statistical mechanics, where the space-time background emerges as a mean-field statistical approximation of underlying microscopic degrees of freedom.

The inversion of logic describing gravity as an emergent phenomenon was first proposed in this context by suggesting that gravity is induced by quantum field fluctuations. Sakharov's idea was to start with an arbitrary background pseudo-Riemannian manifold and introduce quantum fields (matter) on it, but not any explicit gravitational dynamics. This gives rise to an “effective” action in which general relativity appears as an emergent property of the matter fields. This approach is known to be similar to the fluid mechanics (hydrodynamics) approximation of Bose–Einstein condensates (see e.g., Eingorn and Rusov, 2014).

Sakharov's induced gravity assumes a pre-existing metric field; his idea is that (under some very general assumptions) it will automatically lead to Einsteinian dynamics. In emergent gravity, by contrast, the metric emerges naturally, e.g. from a condensed-matter system; it is not assumed to pre-exist (Kleinert, 1987). Many consider the difficulty with emergent gravity to stem from the fact that it does not necessarily obey Einstein’s dynamics (ibid).

Following Sakharov’s method, general relativity is viewed as the hydrodynamic (low-energy, long-wavelength) regime of a more fundamental microscopic theory of space-time, where the metric and connection forms are the collective variables derived from the underlying microscopic degrees of freedom (1967). At shorter wavelengths (and higher energies) these collective variables eventually lose meaning, “much as the vibrational modes of a crystal will cease to exist at the atomic scale” (Hu 2007). If we view GR as hydrodynamics and the metric or connection forms as hydrodynamic variables, quantizing them will only give us a theory for the quantized modes of collective excitations, such as phonons in a crystal, but not a more fundamental theory of atoms or quantum electrodynamics (see Hu, 2002).

Sakharov’s program claims to implement the vision of Lorentz (1899-1900), who contemplated the possibility of gravity as an effective force induced by residual electromagnetic forces. For Sakharov, gravity is thus not a fundamental physical field, but “induced,” i.e. emergent from quantum field theory like hydrodynamics emerges from molecular physics. Nota bene, since the interaction part of the action contains both classical and quantum terms, Sakharov’s account leads to a type of semi-classical quantum gravity. (Wüthrich, 2006)

According to this view most macroscopic gravitational phenomena can be explained as collective modes and hydrodynamic excitations, from gravitational waves as weak perturbations to black holes in the strong regime, as solitons (see Hu, 2006).

Implementing Maldacena’s AdS/CFT conjecture into the induced approach suggests that the microphysical degrees of freedom yield different modal representations: in one case, as strings in the bulk and in the other as conformal drops on the horizon; thus, bulk spacetime arises as an emergent phenomenon of the quantum degrees of freedom that live in the boundary of the spacetime (Maldacena, 2007). An emergent interpretation of gravity says that the gravitational field as a metric, and its dynamics, are not fundamental but arise as an effective theory from some other microscopic degrees of freedom (Hu, 2006). Under this interpretation the gravitational field is seen as a phenomenological coarse-graining of more fundamental fields. As Wüthrich explains:

The general framework for an induced gravity theory in Sakharov’s vein is set up by first assuming a Lorentzian manifold as a background on which to do quantum field theory. This background is a continuous, classical, un-quantized spacetime. It is left free “to flap in the breeze,” i.e. no assumptions regarding its dynamical evolution are made. In particular, no Einstein equations— modified or not—enter the picture. When we do quantum field theory on this background spacetime, it turns out that the effective action at the one-loop level automatically contains terms proportional to the cosmological constant and to the Einstein-Hilbert action of general relativity, as well as higher order terms. Thus, it looks as if Einstein gravity is generated at the one-loop level from the interaction of quantum fields. (Wüthrich, 2006)


One of the philosophical foundations underwriting this approach is that physical/cosmological models with emergent gravity are always possible as long as other things, such as space-time dimensions, emerge together with gravity; as a drawback, however, such models typically predict huge cosmological constants (Kleinert, 1987).

Sakharov observed that many condensed-matter systems give rise to emergent phenomena identical to general relativity. According to him, gravity should be treated as an effective theory, like elasticity to atomic forces. As Wüthrich explains: “ Not unlike the belief in the unity of nature, the belief in a fundamental theory including gravity is exposed by Sakharov as an additional commitment not warranted by the (currently available) resources of empirical physics alone ” (Wüthrich, 2006).

Despite its technical difficulties, this view still sheds critical light on the logic undergirding this study, which holds that “the attempt to deduce a quantum theory of gravity by quantizing the metric should prove to be as meaningful as deducing QED from quantizing elasticity” (Hu, 2006). This underscores the point that there is no fundamental metric; Minkowski’s hypersurface of the present, corresponding to a lightcone, is moved into 5d as a derived element of a causal manifold therein.

3.4 – Jacobson’s Gravitational Thermodynamics

The probabilistic description of gravity traces a history going back to research on black hole thermodynamics by Bekenstein and Hawking in the mid-1970s when foundations were laid “ with the realization that stationary black hole horizons have thermodynamic properties such as temperature and entropy, much like fluids; in fact the generalized 2nd law of thermodynamics treats black hole entropy on par with external matter entropy ” (Hubeny, 2011). 22 The original motivation for treating gravity thermodynamically may be traced to Bekenstein’s observation (1995) that the area of a black hole horizon is proportional to its entropy. These studies suggest a deep connection between gravity and thermodynamics describing the behavior of heat. 23

22 In the early 80’s, analog models of black holes (Unruh, 1981) illustrated the converse notion, that fluids can admit sonic horizons and even the analog of Hawking temperature; indeed they can reproduce the kinematic aspects of black holes (Hubeny, 2011). We explore sonic black holes in the event-logic. 23 “Bill Unruh presented preliminary experimental direct observations of Hawking radiation of a “black hole”. Since creating a black hole is completely beyond our current technological abilities, what he was doing was simulating the event horizon of a black hole by waves propagating against a flowing liquid which was passing over a submerged barrier. What was remarkable was that the wave measurements provide the Bogoliubov coefficients in field theory and to the degree that the model is correct one does observe the actual Hawking radiation which is many-many orders of magnitude smaller than the environment temperature of 300K” (Moldoveanu, 2010). Bill informally reported three tentative experimental facts: (1) the Hawking radiation is thermal, (2) the radiation originates in vacuum fluctuations before the event horizon is formed, and (3) after the event horizon is formed, the radiation outside and inside are and remain correlated after they start to move away from the horizon, although there is no communication possible between them.

Bekenstein’s initial proposal (1973) was based on Hawking’s area theorem (1971), and the connection was bolstered by the laws of black hole mechanics developed by Bardeen, Carter, and Hawking (1973)—“but the analogy didn’t acquire true physical significance until Hawking’s discovery that quantum-mechanical effects allow black holes to radiate with a thermal spectrum” (Roveto and Munoz, 2012). Black hole thermodynamics implies that the maximal entropy in any region scales with the radius squared, not cubed as might be expected. In the case of a black hole, the insight was that the informational content of all objects that have fallen into the hole can be entirely contained in surface fluctuations of the event horizon. As Banerjee explains:

Black holes are to gravity what atoms were to atomic physics so that their study is expected to provide crucial information regarding gravity just as the study of atoms was the pathway to atomic physics. (2010)

Bekenstein argued that black holes are maximum entropy objects—that they have more entropy than anything else in the same volume. In a sphere of radius R, the entropy in a relativistic gas increases as the energy increases. The only limit is gravitational; when there is too much energy the gas collapses into a black hole. Bekenstein used this to put an upper bound on the entropy in a region of space proportional to the area of the region, concluding that the entropy is directly proportional to the area of the event horizon (see Marolf, 2009). Yet we might wonder: how did classical general relativity make it possible to infer that the horizon area of a black hole would turn out to be a form of entropy, and that surface gravity is a temperature? Jacobson answered the question in 1995 by turning the logic around and deriving the Einstein equation of general relativity from the area-scaling property of entropy together with the first law of thermodynamics (Jacobson, 1995). For Wüthrich:

Rather than deriving the four laws of black hole thermodynamics from the classical Einstein equations, as did Bardeen, Carter, and Hawking (1973), Jacobson inverts the derivation by recovering the Einstein equations from the entropy’s proportionality to the horizon surface area of a black hole together with the fundamental thermodynamical relation connecting heat, temperature, and entropy. (Wüthrich, 2006)
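For reference, the area law at issue here is the Bekenstein-Hawking entropy, S = k_B c³ A / (4Għ): a black hole’s entropy is one quarter of its horizon area A measured in Planck units (l_P² = Għ/c³), so the entropy scales with the bounding surface rather than with the enclosed volume.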

Hawking and Page have even shown that black hole thermodynamics is more general: cosmological event horizons also have an entropy and temperature. More fundamentally, 't Hooft and Susskind used the laws of black hole thermodynamics to argue for a general holographic principle at work in nature. In a subtle foreshadowing of the holographic principle, the fifth footnote of Jacobson’s 1995 paper states that: “We shall assume for most of this letter that the entropy is proportional to horizon area. Note that the area is an extensive quantity for a horizon, as one expects for entropy,” adding:

Another argument that might be advanced in support of the proportionality of entropy and area comes from the holographic hypothesis, i.e., the idea that the state of the part of the universe inside a spatial region can be fully specified on the boundary of that region. However, currently the primary support for this hypothesis comes from black hole thermodynamics itself. Since we are trying to account for the occurrence of thermodynamic-like laws for classical black holes it would therefore be circular to invoke this argument. (1995)


Jacobson surmised that the energy flux across a causal horizon represents a kind of adiabatic heat flow—in the sense of Unruh’s famous thought experiment—and that the entropy of the system beyond it is proportional to the area of that horizon (1995). Given that the origin of the large entropy is the vacuum fluctuations of quantum fields, according to the Unruh effect “those same vacuum fluctuations have a thermal character when seen from the perspective of a uniformly accelerated observer.” He takes the temperature of the system to be the Unruh temperature associated with an observer hovering just inside the horizon.

The heat is interpreted as the energy flux across a causal horizon and the temperature as the Unruh temperature relative to an accelerated observer just inside a local Rindler horizon. This heat manifests itself via the gravitational field it generates. As in conventional thermodynamics, where heat is interpreted as energy flux between unobservable degrees of freedom, the underlying mechanics of the energy flux is irrelevant (Wüthrich, 2006).

Assuming a “ cosmic censorship ” of the unobservable microscopic degrees of freedom behind the horizon, Jacobson formulates local gravitational thermodynamics for an observer by means of the boundary causal horizon associated with entropy. “ The system that radiates heat is identified with the degrees of freedom behind the horizon, separated from the observer’s past by a causality barrier and is therefore unobservable ” (Wüthrich, 2006).

Just like in Sakharov’s approach, Jacobson also cautions against quantizing the Einstein equations; specifically, he demonstrates that the Einstein field equations describing relativistic gravitation can be derived by combining thermodynamics with the equivalence principle (1995). Viewed in this way, the Einstein equation is an equation of state rather than a fundamental theory. Wüthrich explains the matter:

As Jacobson shows, this interpretation imposes conditions on the curvature of spacetime such that the classical Einstein equations are implied. Therefore, he suggests that the Einstein equations can be more adequately analogized with the wave equation for sound in a medium, rather than interpreted as the dynamical equations for a fundamental field. These equations, he urges, as higher-level equations of state, should then not be quantized as if the gravitational field were fundamental, despite the fact that they may describe what is ultimately a quantum reality. (Wüthrich, 2006)

In Einstein’s equation “ a time reversal invariant system of local partial differential equations is generated whose solutions include propagating waves analogous to the role of sound in a gas acting as an adiabatic compression wave ” (Jacobson, 1995). As Jacobson explains: “ Since the sound field is only a statistically defined observable on the fundamental phase space of the multi-particle system, it should not be canonically quantized as if it were a fundamental field, even though there is no question that the individual molecules are quantum mechanical ” (1995). Jacobson built off of this description (in 1997) to make the connection between black holes and thermodynamics. As Sheykhi and Sarab explain:

Jacobson put forward a new step and suggested that the hyperbolic second-order partial differential Einstein equation for the spacetime metric has a predisposition to thermodynamic behavior. He disclosed that the Einstein field equation is just an equation of state for the spacetime and in particular it can be derived from the proportionality of entropy and the horizon area together with the fundamental relation δQ = T dS. Following Jacobson, however, several recent investigations have shown that there is indeed a deeper connection between gravitational dynamics and horizon thermodynamics. The deep connection between horizon thermodynamics and gravitational dynamics helps to understand why the field equations should encode information about horizon thermodynamics. These results prompt people to take a statistical physics point of view on gravity. (2012)
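Stated compactly (a standard summary of the argument quoted above, with the usual constants restored): Jacobson requires the Clausius relation δQ = T dS to hold across every local Rindler horizon, with T = ħκ/(2π k_B c) the Unruh temperature of the accelerating observer (κ the proper acceleration) and S = k_B c³ A/(4Għ) the horizon-area entropy; demanding this for all such horizons and all matter fluxes then forces the Einstein field equations to hold as an equation of state.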

The distinction between a higher-level equation of state and a fundamental field amounts to another vote for emergent approaches, suggesting that “it may be no more appropriate to canonically quantize the Einstein equation than it would be to quantize the wave equation for sound in air” (Jacobson, 1995). Subsequently, other physicists, most notably Padmanabhan (2009) and Verlinde (2010), began to explore links between gravity and entropy. Padmanabhan proposed that classical gravity can be derived from the equipartition law (Lee, 2012). While also making use of the equipartition rule, Verlinde’s theory takes these models another step by offering a reason why gravity should emerge in the first place.

3.5 – Distinguishing Verlinde from these predecessor theories

The first cracks in the fundamental nature of gravity appeared when Bekenstein, Hawking and others discovered the laws of black hole thermodynamics. (Verlinde, 2010)

Jacobson's (1995) work builds off the deep connection between gravity and thermodynamics to establish that—assuming the holographic principle and identifying temperature with the Unruh temperature (Davies, 1975)—Einstein’s equation of general relativity derives from the first law of thermodynamics together with the area-proportional Rindler horizon entropy. As Verlinde explains:

This is a remarkable result. Yet it is already 12 years old, and still up to this day, gravity is seen as a fundamental force. Clearly, we have to take these analogies seriously, but somehow no one does. (2010)

Building on these developments, he argued that Newton’s second law of motion and Newton’s law of gravity both have their origin in thermodynamics and can be understood in terms of an entropic force. Methodologically, Verlinde presents a holographic scenario for the emergence of space addressing the origins of gravity and inertia that, as in Jacobson’s model, are connected by the equivalence principle, concluding that a change of entropy is linked to a change of the Newton potential. So far so good, but as Verlinde spells out, there are essential distinctions between his work and those along the same path preceding it; specifically:

We have seen a recent increase in papers following Jacobson, and extending his work to higher derivative gravity, and so on. But from all of these papers, I did not pick up the insights I presented in this paper. What was missing from those papers is the answer to questions like: why does gravity have anything to do with entropy? Why do particles follow geodesics? What has entropy to do with geometry? (2011)

As in Sakharov’s approach, gravity does not represent a fundamental force; rather, “ it emerges as a phenomenon supervenient on the energy flux from causally inaccessible degrees of freedom ” (Wüthrich, 2006).

However, Jacobson’s claim that gravity should not be quantized in this scheme because it represents a collective, higher-order degree of freedom is simply false. Physicists routinely quantize collective degrees of freedom such as sound (with “phonons” as quanta of sound). Whether or not a degree of freedom must be quantized or not does not depend on whether it is collective or individual, but on altogether different considerations. Hence, quantization cannot necessarily be escaped by Jacobsonian gravity; but it is not forced on it either. (Wüthrich, 2006)

Turning to Verlinde, we build off Wüthrich’s assessment of Jacobson to see how:

The derivation of the Einstein equations (and of Newton's law) follows very similar reasoning to Jacobson's. The connection with entropy and thermodynamics is made also there. But in those previous works it is not clear WHY gravity has anything to do with entropy. No explanation for this apparent connection between gravity and entropy has been given anywhere in the literature. I mean not the precise details, even the reason why there should be such a connection in the first place was not understood. (Verlinde, 2010)

As he explains: “ The origin of gravity is an entropic force. That is the main statement, which is new and has not been made before. If true, this should have profound consequences ” (Verlinde, 2010). For instance, “ The statement that gravity is an entropic force is more than just saying that it has something to do with thermodynamics " — it says that motion and forces are the consequence of entropy differences. Verlinde continues to establish the precedent:

My idea is that in a theory in which space is emergent forces are based on differences in the information content, and that very general random microscopic processes cause inertia and motion. The starting point from which this all can be derived can be very, very general. In fact we don't need to know what the microscopic degrees of freedom really are. We only need a few basic properties. For me this was an "eye opener", it made it from obscure to obvious. It is clear to me now that it has to be this way. There is no way to avoid it: if one does not keep track of the amount of information, one ignores the origin of motion and forces. It clarifies why gravity has something to do with entropy. It has to; it cannot do otherwise. (2011)

Even though other authors have proposed that gravity has an entropic or thermodynamic origin, Verlinde’s model adds an important new element to this claim (2010):

Instead of only focusing on the equations that govern the gravitational field, we uncovered what is the origin of force and inertia in a context in which space is emerging. We identified a cause, a mechanism, for gravity. It is driven by differences in entropy, in whatever way defined, and a consequence of the statistical averaged random dynamics at the microscopic level. The reason why gravity has to keep track of energies as well as entropy differences is now clear. It has to, because this is what causes motion!


As Munkhammar explains, “ this led to the conclusion that inertia might be equivalent to the lack of entropy gradients, and conversely, that gravity is due to the presence of them ” (2010). Thus, Verlinde’s proposal is understood as the first to recognize and explain that inertia, and hence motion, is due to an entropic force when space is emergent. Consequently:

This is new, and the essential point. This means one HAS TO keep track of the amount of information. Differences in this amount of information is precisely what makes one frame an inertial frame, and another a non-inertial frame. Information causes motion. This can be derived without assuming Newtonian mechanics. […] By reversing the logic that lead people from the laws of gravity to holography, we will obtain a much sharper and even simpler picture of what gravity is. For instance, it clarifies why gravity allows an action at a distance even when there is no mediating force field. The presented ideas are consistent with our knowledge of string theory, but if correct they should have important implications for this theory as well. In particular, the description of gravity as being due to the exchange of closed strings can no longer be valid. In fact, it appears that strings have to be emergent too. (Verlinde, 2010)

Responding to early critics Verlinde explains that “everyone who does not appreciate that this view is different from previous papers is missing an essential point ” (2010). He continues:

If space is emergent, a lot more has to be explained than just the Einstein equations. Geodesic motion, or if you wish, the laws of Newton have to be re-derived. They are not fundamental. This has not been discussed anywhere, nor ever noted that it is the case.

3.6 – Emergent Gravity: an Outline of Verlinde’s Model

Gravity has given many hints of being an emergent phenomenon, yet up to this day it is still seen as a fundamental force. The similarities with other known emergent phenomena, such as thermodynamics and hydrodynamics, have been mostly regarded as just suggestive analogies. It is time we not only notice the analogy, and talk about the similarity, but finally do away with gravity as a fundamental force. (Verlinde, 2010)

Proposing a reversal of the logic that has carried the last 300 years since Newton, Verlinde set into play a remarkable new idea in 2010 linking classical gravity to entropic forces. Verlinde’s idea recasts gravity as an effective, emergent force that only has a meaningful identity at the macroscopic level (Chivukula, 2010). He proposed the idea of entropic gravity by (1) reversing the logic of black hole entropy; (2) treating the entropy from the information of the microscopic physics as the fundamental object; and (3) recognizing the phenomena of gravity as emergent.

Verlinde’s theory can be modeled using two different methods: thermodynamics and string theory. Within a semi-classical setting we consider gravity to relate to thermodynamics as an entropic force. String theory also appears in the context of the holographic principle (Susskind, 2008; ‘t Hooft, 1999, 2008) and the AdS/CFT correspondence (Maldacena, 1997). Within string theory Verlinde considers gravity as the result of an adiabatic reaction force (2011). We summarize both models now.


Using a polymer-based model, thermodynamics, and a specific interpretation of the holographic principle, Verlinde claims to have shown that gravity is a force solely caused by an exchange of information on a holographic screen. In essence, he concludes that gravity is not a fundamental force, but the result of changes in entropy of an unknown microscopic theory. (Roveto and Munoz, 2012)

Inspired by Bekenstein’s original argument regarding black-hole thermodynamics, Verlinde proposes that the change in entropy as a mass crosses a screen is proportional to the mass times the differential displacement (see 2010). He builds off ideas previously put forward by ‘t Hooft (1993), Jacobson (1995), and Padmanabhan (2009), the latter two both utilizing Rindler space-time to reverse-engineer the framework and arrive at the Einstein field equations for gravity. Distinguishing his method from these predecessors, Verlinde derived Newton’s second law and Einstein’s equation from the relation between the entropy of a holographic screen and the mass inside the screen, designating gravity “not as a [fundamental force], but an emergent phenomenon arising from the statistical behavior of microscopic degrees of freedom encoded on the holographic screen” (Verlinde, 2010).

Basically, by inverting the logic in the derivation of black-hole entropy, he took entropy as the fundamental object and gravity as something that emerges from the microscopic interactions of fundamental particles (Chen, 2010). Verlinde explains how after discarding the usual reference frame of beginning with Newtonian mechanics “gravity follows in a very simple fashion from holography ” (2010). The 2010 paper begins with an identification of the universality concerning the force of gravity:

Gravity influences and is influenced by everything that carries an energy, and is intimately connected with the structure of space-time. […] The universality of gravity suggests that its emergence should be understood from general principles that are independent of the specific details of the underlying microscopic theory. (Verlinde, 2010)

He attributes this universality of gravity in part to the resemblance of the laws of thermodynamics to those of hydrodynamics (2010). As Verlinde explains:

Gravity dominates at large distances, but is very weak at small scales. In fact, its basic laws have only been tested up to distances of the order of a millimeter. Gravity is also considerably harder to combine with quantum mechanics than all the other forces. The quest for unification of gravity with these other forces of Nature, at a microscopic level, may therefore not be the right approach. It is known to lead to many problems, paradoxes and puzzles. String theory has to a certain extent solved some of these, but not all. (2010)

Next he speaks to the emergence of space-time and gravity as popular themes in physics, made more robust by Maldacena’s AdS/CFT correspondence “or more generally, the open/closed string correspondence” (2010). This represents a duality between a theory containing gravity and another without it and therefore “provides evidence for the fact that gravity can emerge from a microscopic description that doesn't know about its existence” (2010). Verlinde considers that the principal notion needed to derive gravity is information.


More precisely, it is the amount of information associated with matter and its location—in whatever form the microscopic theory likes to have it—measured in terms of entropy. Changes in this entropy when matter is displaced leads to an entropic force, which as we will show takes the form of gravity. Its origin therefore lies in the tendency of the microscopic theory to maximize its entropy. (Verlinde, 2010)

The “most important assumption” Verlinde makes is that information associated with a part of space follows the holographic principle. As we have seen, the holographic principle draws its support from black-hole thermodynamics, string theory, and the AdS/CFT correspondence. As Verlinde explains, “These theoretical developments indicate that at least part of the microscopic degrees of freedom can be represented holographically either on the boundary of space-time or on horizons” (2010).

Verlinde presents “a holographic scenario for the emergence of space” (2010) that includes a holographic renormalization procedure wherein the boundary can be moved in accordance with an emergent, foliative dimension. He begins with the premise that the entropy contained within a region of space-time can be mapped to an appropriately chosen holographic screen (see Roveto and Munoz, 2012). 24 As we will see in Whitehead, this emergent dimension correlates closely with the logic of the genetic time of positive prehensions in the phases of concrescence (PR 314). Within this, Verlinde seeks a general framework describing “how space emerges together with gravity” (2010). Such a framework would have to generalize holography into the classical regime in order to account for gravity and why it obeys Newton’s laws.

In fact, when space is emergent, also the other laws of Newton have to be re-derived, because standard concepts like position, velocity, acceleration, mass and force are far from obvious. Hence, in such a setting the laws of mechanics have to appear alongside space itself. Even a basic concept like inertia is not given, and needs to be explained again. (2010)

Verlinde links the origins of gravity and inertia to the equivalence principle. Given a finite number of degrees of freedom associated with a spatial volume:

Starting from first principles, using only space independent concepts like energy, entropy and temperature, it is shown that Newton's laws appear naturally and practically unavoidably. Gravity is explained as an entropic force caused by a change in the amount of information associated with the positions of bodies of matter. […] The energy that is equivalent to the matter is distributed evenly over the degrees of freedom, and thus leads to a temperature. The product of the temperature and the change in entropy due to the displacement of matter is shown to be equal to the work done by the gravitational force. (2010)

This makes for a very simple derivation of Newton’s law. While the holographic principle is deeply-embedded within Newton and Einstein’s laws, when you turn the argument around and begin with the holographic principle, the laws appear “ directly and unavoidably ”

24 Verlinde’s crucial assumption is the existence of an entropy associated with a surface even in the absence of horizons (Nicolini, 2010).

(Verlinde, 2010); as Verlinde explains: “by reversing the logic that leads from the laws of gravity to holography we obtain a much sharper and simpler picture of what gravity is” (2010). Finally, Verlinde mentions that these ideas are also consistent with string theory; “however, if correct, it means that the exchange of closed strings cannot account for gravity, and that these also have to be emergent” (2010). We attend to this after first discussing the key concepts raised in Verlinde’s model and later employed in the comparative chapters.

Entropic Force

An entropic force is an effective macroscopic force that originates in a system with many degrees of freedom by the statistical tendency to increase its entropy. The force equation is expressed in terms of entropy differences, and is independent of the details of the microscopic dynamics. In particular, there is no fundamental field associated with an entropic force. Entropic forces occur typically in macroscopic systems such as in colloid or bio-physics. (Verlinde 2010)

There is no fundamental field associated with an entropic force. Verlinde develops a polymer model to explain this force (2010). In fact, the thought experiment that led Bekenstein to his entropy law is surprisingly similar to the polymer problem. The black hole serves as a heat bath, while the particle can be thought of as the end point of a polymer that gradually returns to equilibrium. The entropic reaction force associated with a polymer takes the form of Hooke’s law; its elasticity is entropic in origin (Bekenstein, 1972). 25

Perhaps the best known example is the elasticity of a polymer molecule. A single polymer molecule can be modeled by joining together many monomers of fixed length, where each monomer can freely rotate around the points of attachment and direct itself in any spatial direction. Each of these configurations has the same energy. When the polymer molecule is immersed into a heat bath, it likes to put itself into a randomly coiled configuration since these are entropically favored. There are many more such configurations when the molecule is short compared to when it is stretched into an extended configuration. (Verlinde, 2010)

“The statistical tendency to return to a maximal entropy state translates into a macroscopic force, in this case the elastic force ” (Verlinde, 2010). Verlinde uses this line of reasoning to claim that gravity is also an entropic force, later specifying that the entropic, elastic force in the context of string theory is described as an adiabatic reaction force (Verlinde, 2011).
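A toy numerical sketch, offered here as my own illustration rather than anything drawn from Verlinde’s paper, makes the polymer example concrete: counting the configurations of a one-dimensional freely jointed chain gives an entropy S(x) = k_B ln Ω(x) that falls as the chain is stretched, and the purely entropic force F = T dS/dx that results is Hooke-like at small extensions.

from math import comb, log

# Toy sketch: entropic (Hooke-like) restoring force of a 1D freely jointed chain.
# Each of N links of length b points left or right; Omega(x) counts configurations
# with end-to-end extension x, S = k_B ln Omega, and the force is F = T dS/dx.

kB = 1.380649e-23            # Boltzmann constant (J/K)
T, N, b = 300.0, 100, 1e-9   # temperature (K), number of links, link length (m)

def entropy(n_right):
    """S = k_B ln Omega for a chain with n_right of its N links pointing right."""
    return kB * log(comb(N, n_right))

def entropic_force(x):
    """F = T dS/dx at extension x, estimated by a central finite difference."""
    n = round((N + x / b) / 2)                          # since x = (2 n_right - N) b
    dS_dx = (entropy(n + 1) - entropy(n - 1)) / (4 * b)
    return T * dS_dx                                    # negative: pulls the chain back in

for x in (10e-9, 20e-9, 30e-9):
    print(f"x = {x:.0e} m:  F = {entropic_force(x):+.2e} N "
          f"(Gaussian estimate {-kB * T * x / (N * b**2):+.2e} N)")

The exact count and the small-extension Gaussian estimate agree closely, which is the sense in which the statistical tendency toward maximal entropy shows up as an ordinary macroscopic, elastic force.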

In the holographic description of this same process, the particle can be thought of as being immersed in the heat bath representing the black hole. This fact is particularly obvious in the context of AdS/CFT, in which a black hole is dual to a thermal state on the boundary, while the particle is represented as a delocalized operator that is gradually being thermalized. By the time that the particle reaches the horizon it has become part of the thermal state, just like the polymer. This phenomenon is clearly entropic in nature, and is the consequence of a statistical process that drives the system to its state of maximal entropy. (Verlinde, 2010)

By pulling out the particle a bit further, one changes its energy by a small amount equal to the work done by the gravitational force. If one then drops the particle into the black hole, the mass M

25 In another work we will also consider alpha and beta protein dimers, given both are polymers. Once we get to microtubules and subcellular eukaryotic biology we’ll see a whole new layer of contact with Verlinde.

increases by this same additional amount. Consistency of the laws of black hole thermodynamics implies that the additional change in the Bekenstein-Hawking entropy, when multiplied with the Hawking temperature must be precisely equal to the work done by gravity. The derivative of the entropy is defined as the response due to a change in the distance of the particle to the horizon.

The mass is defined in terms of the energy associated with the particle's holographic image, which presumably is a near-thermal state. It is not exactly thermal, however, because it is still slightly away from the black hole horizon. We have pulled it out of equilibrium, just like the polymer. One may then ask: what is the cause of the change in energy that is holographically dual to the work done when, in the emergent space, we gradually lower the particle towards the location of the old screen behind the new one? Of course, this can be nothing else than an entropic effect; the force is simply due to the thermalization process. We must conclude that the only microscopic explanation is that there is an emergent entropic force acting. In fact, the correspondence rules between the scale variable and energy on the one side, and the emergent coordinate x and the mass m on the other, must be such that F = T∇S translates into the gravitational force. (Verlinde, 2010)

Information and Storage on Holographic Screens

Verlinde argues that “ the central notion needed to derive gravity is information ” (2010). More:

It is the amount of information associated with matter and its location, in whatever form the microscopic theory likes to have it, measured in terms of entropy. Changes in this entropy when matter is displaced leads to an entropic force, which as we will show takes the form of gravity. Its origin therefore lies in the tendency of the microscopic theory to maximize its entropy. (2010)

Verlinde posits that space is literally a storage space for phase-space information, such that each part of space has a finite information maximum. As he explains, “space is in the first place a device introduced to describe the positions and movements of particles. This information is naturally associated with matter” (2010). This recalls Wheeler’s suggestion that we “regard the physical world as made of information, with energy and matter as incidentals” (Bekenstein, 1972). Quantum information specialists, taking their first pulse from Wheeler, hold “it from bit” as a guiding motto.

Given that the maximal allowed information is finite for each part of space, it is impossible to localize a particle with infinite precision at a point of a continuum space. In fact, points and coordinates arise as derived concepts. (Verlinde, 2010)

This provides an elegant picture of Heisenberg’s Uncertainty Principle. Zeilinger and Brukner have suggested that quantum randomness arises from the discreteness of information (2006). We can picture this information space as memory where each bit of memory has an address and can store data. From this, Verlinde reaffirms that information is stored on screens.

Screens separate points, and in this way are the natural place to store information about particles that move from one side to the other. Thus we imagine that this information about the location of particles is stored in discrete bits on the screens. The dynamics on each screen is given by some unknown rules, which can be thought of as a way of processing the information that is stored on it. (Verlinde, 2010)


Information theory is then developed into the context of the holographic principle (see Wheeler and Bekenstein) to provide a key ingredient and (in fact) starting basis for Verlinde’s emergent gravity model. As he explains:

The holographic hypothesis provides a natural mechanism for gravity to emerge. It allows direct "contact" interactions between degrees of freedom associated with one material body and another, since all bodies inside a volume can be mapped on the same holographic screen. Once this is done, the mechanisms for Newton's gravity and Hooke's elasticity are surprisingly similar. (Verlinde, 2010)

Derivation of Newton’s Laws

Verlinde argues that gravity’s interaction with all fields suggests the mechanism is independent of the specific details of any particular field theory (Chivukula, 2010). The starting point of Verlinde’s derivation is the assumption that whenever we have a sphere with area A, the sphere acts as a storage device for information such that the total number of bits of information stored on the surface of the sphere is proportional to the area (see Mäkelä, 2010). “ The only assumption made here is that the number of bits is proportional to the area. Nothing more ” (Verlinde, 2010). As Vedral explains:

First we acknowledge the fundamental thermodynamical relationship that entropy times temperature equals heat. Heat itself is nothing but a form of energy, which according to Einstein equals mass times speed of light squared. The entropy we assume to be proportional to area (radius squared) a la holographic principle. The temperature is, according to Davies and Unruh, proportional to acceleration, which in turn is force divided by mass (from Newton’s second law). Putting all this together gives us the force equal to the product of masses divided by distance squared, namely Newton’s gravity! And that’s more or less what Verlinde does. (Vedral, 2010)

From here, as Mäkelä summarizes: “Verlinde then identified as the energy E of a system the rest energy Mc² of the mass inside the sphere, and he assumed that the energy is divided evenly over the bits N” such that the temperature is determined by the equipartition rule to render the average energy per bit. As Verlinde notes, “here M represents the mass that would emerge in the part of space enclosed by the screen. Even though the mass is not directly visible in the emerged space, its presence is noticed through its energy” (2010).

After this we need only one more equation: E = Mc². As Mäkelä explains, “if we put E = Mc² in the last equation, identify T as the Unruh temperature of an observer with a proper acceleration a, and use the fact that the area of a two-sphere with radius r is A = 4πr², we find, by means of Newton’s second law F = ma, that the force exerted by a point-like mass M on a mass m at a distance r is F = GmM/r², which is Newton’s universal law of gravitation” (Mäkelä, 2010). From here the rest is essentially straightforward; as Mäkelä guides, first we eliminate E and insert the expression for the number of bits to determine T. Next we use the postulate for the change of entropy to determine the force; and then finally we insert A = 4πR² to obtain Newton’s law describing the force. From this, Verlinde (joyfully) exclaims:

We have recovered Newton's law of gravitation, practically from first principles! These equations do not just come out by accident. It had to work, partly for dimensional reasons, and also because the laws of Newton have been ingredients in the steps that lead to black hole thermodynamics and the holographic principle. In a sense we have reversed these arguments. But the logic is clearly different and sheds new light on the origin of gravity: it is an entropic force! (Verlinde, 2010)
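For reference, the full chain Mäkelä describes can be written out schematically (a reconstruction using Verlinde’s own relations, with the constants restored): the number of bits on a spherical screen of area A = 4πR² is N = Ac³/(Għ); equipartition gives E = ½ N k_B T; setting E = Mc² fixes the screen temperature T; and combining this with the entropic postulate F ∆x = T ∆S, where ∆S = 2π k_B (mc/ħ) ∆x, yields F = GMm/R², Newton’s law of gravitation, with Newton’s constant G entering through the proportionality in N.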

In making this distinction between his work and predecessor theories, Verlinde is trying to show what his model adds to the picture. Mincing no words, he clarifies that gravity, if not ultimately entropic, is at least emergent. To recapitulate, Verlinde recalls how:

Our starting point was that space has one emergent, holographic direction. The additional ingredients were that (i) there is a change of entropy in the emergent direction; (ii) the number of degrees of freedom are proportional to the area of the screen; and (iii) the energy is evenly distributed over these degrees of freedom. After that it is unavoidable that the resulting force takes the form of Newton's law. (Verlinde, 2010)

Namely, the first equation represents the area law; the second sets up equipartition; the third is a statement of Einstein’s equivalence between energy and matter; then, by going through the steps listed above, the final outcome is Newton’s law (Mäkelä, 2010). In addition, given the even distribution of energy, we are led to a statistical analysis of Verlinde’s derivation of Newton’s law which, “at the very least, offers a strong analogy with a well-understood statistical mechanism. Therefore, this derivation opens a new window to understand gravity from first principles” (Sheykhi and Sarab, 2012). We will follow this line of reasoning into Whitehead’s AE’s in the next chapter.

Holographic Dimension of Emergent Space

Verlinde begins with the claim that space has one emergent, holographic direction. Liu offers a follow-up model in which the emergent dimension is generalized to all directions, not just one (2010). That said, I think Liu is really only building on something Verlinde was already saying. Verlinde states specifically that the same reasoning can be extended to arbitrary dimensions (2010), and from this we infer he means arbitrary directions, where each direction is related to a degree of freedom.

Let us also assume that like in AdS/CFT, there is one special direction corresponding to scale or a coarse graining variable of the microscopic theory. This is the direction in which space is emergent. So the screens that store the information are like stretched horizons. On one side there is space, on the other side nothing yet. (Verlinde, 2010)

As we’ll see in the next chapter, Whitehead’s phases of concrescence, creatively growing into genetic coordinates rather than into extended space or time (see PR 52, 65), are remarkably similar to Verlinde’s development above of coarse-graining into an emergent dimension of space. In fact, this is one of the key pieces of his model: it distinguishes his account from the AE process in presentation while not amounting to a difference in the process itself. To these ends, the emergent, spatial dimension “X” appears as a book-keeping device in nature: it speaks to the mechanism responsible for keeping track of the values integrated out of the matrix, in Verlinde’s model, or negatively prehended and excluded from the phases of concrescence, in Whitehead’s AE’s.

Coarse Graining of General Matter Distributions

Coarse graining is a renormalization procedure in quantum field theory in which certain bits are removed while others conglomerate and grow through subsequent stages, or foliations. Verlinde’s model demonstrates how space emerges at a macroscopic level only after coarse graining (2010). Hence, there will be a finite entropy associated with each matter configuration. “This entropy measures the amount of microscopic information that is invisible to the macroscopic observer. In general, this amount will depend on the distribution of the matter.” As he explains:

The information is being processed by the microscopic dynamics, which looks random from a macroscopic point of view. But to determine the force we don't need the details of the information, nor the exact dynamics, only the amount of information given by the entropy, and the energy that is associated with it. If the entropy changes as a function of the location of the matter distribution, it will lead to an entropic force. (Verlinde, 2010)

The Newton potential (Φ) keeps track of the depletion of the entropy per bit. It is therefore natural to identify it with a coarse-graining variable, like the (renormalization group) scale in AdS/CFT (Verlinde, 2010). Verlinde proposes a holographic scenario for the emergence of space in which the Newton potential precisely plays that role.

Space cannot just emerge by itself, though. It has to be endowed by a book-keeping device that keeps track of the amount of information for a given energy distribution. It turns out that in a non-relativistic situation this device is provided by Newton's potential, Φ, and the resulting entropic force is called gravity. (Verlinde, 2010)

Verlinde assumes that, as in AdS/CFT, there is one special direction corresponding to scale, or a coarse-graining variable of the microscopic theory. This is the (holographic) direction in which space is emergent. The screens that store the information are like stretched horizons: “on one side there is space, on the other side nothing yet” (Verlinde, 2010). Holographic screens are located at (and correspond with) equipotential surfaces. The information on the screens is coarse-grained in the direction of decreasing values of the Newton potential. Maximal coarse graining occurs at a black-hole horizon, when Φ/(2c²) = −1.

Acceleration is related to an entropy gradient. Inertia is a consequence of the fact that a particle at rest will stay at rest because there are no entropy gradients. Given this fact it is natural to introduce the Newton potential, Φ, and write the acceleration as a gradient, a = −∇Φ (Verlinde, 2010). This is one of the most important statements: when we come to Whitehead’s AE’s it will provide a direct link to the role of negative prehension, a role that cannot be fully clarified from within Whitehead alone but to which Verlinde’s model gives a precise function, as we’ll see in chapter six. As Verlinde explains:


We thus reach the important conclusion that the Newton potential (Φ) keeps track of the depletion of the entropy per bit. It is therefore natural to identify it with a coarse-graining variable, like the (renormalization group) scale in AdS/CFT. […] The amount of coarse graining is […] a dimensionless number that is always between zero and one. It is only equal to one on the horizon of a black hole. We interpret this as the point where all bits have been maximally coarse-grained. Thus the foliation naturally stops at black hole horizons. […] This coarse graining can be achieved through averaging, a block spin transformation, integrating out, or some other renormalization group procedure. At each step one obtains a further coarse-grained version of the original microscopic data. (Verlinde, 2010)

This speaks to Whitehead’s view of prehension. Prehension is what Whitehead would have called coarse graining, had the notion (qua renormalization) been around in the mid-twentieth century.

The coarse-grained data live on smaller screens obtained by moving the first screen further into the interior of the space. The information that is removed by coarse graining is replaced by the emerged part of space between the two screens. In this way one gets a nested or foliated description of space by having surfaces contained within surfaces. In other words, just like in AdS/CFT, there is one emerging direction in space that corresponds to a "coarse graining" variable, something like the cut-off scale of the system on the screens. (Verlinde, 2010)
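As a toy analogue of the nested, foliated coarse graining just described, and purely as my own illustration (Verlinde’s screens are not literal lattices), one can repeatedly block-average a grid of microscopic “bits”: each pass integrates out fine-grained detail and leaves a smaller, coarser screen, in the spirit of a single renormalization-group step.

import numpy as np

# Toy block-spin coarse graining: random microscopic bits on a 2D lattice are
# repeatedly averaged over 2x2 blocks. Each pass discards microscopic detail
# (the information "integrated out") and yields a smaller, coarser description.

rng = np.random.default_rng(0)
screen = rng.integers(0, 2, size=(64, 64)).astype(float)   # microscopic bits

def block_spin(field):
    """Average non-overlapping 2x2 blocks into single coarse-grained values."""
    n = field.shape[0] // 2
    return field.reshape(n, 2, n, 2).mean(axis=(1, 3))

while screen.shape[0] > 4:
    screen = block_spin(screen)
    print(f"coarser screen {screen.shape[0]}x{screen.shape[0]}: "
          f"{screen.size} remaining degrees of freedom")

Each successive grid plays the role of a screen moved “further into the interior,” with the removed detail standing in for the emerged part of space between screens.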

Chevalier and Debbasch (2008) suggested that “matter and charge may be properties of the space-time which only emerge after a certain coarse graining has been performed.” We endorse this view and lodge it as a parallel narrative to Whitehead’s phases of prehension and concrescence, as we’ll see in the next chapter. Out of both descriptions emerges a final value, or force: in Whitehead’s case, the satisfaction, like the gravitational self-energy of a Coulomb group in a matrix model, emerges at the maximization of prehension, while in Verlinde’s model the maximization of coarse-graining leads to the emerged part of space, and to emergent gravity.

3.7 – String Theoretic Approach

My result calls into question almost all of the work done on quantum gravity, since the discovery of quantum mechanics. For gravity, there is no longer necessity for a graviton. In the case of string theory, the principal motivation for the profound and historical suggestion by Scherk and Schwarz that string theory be reinterpreted, not as a theory of the [strong interactions], but instead as a theory of the gravitational interaction, came from the natural appearance of a massless graviton in the closed string sector. I am not saying that string theory is dead. What I am saying is that string theory cannot be a theory of the fundamental gravitational interaction, since there is no fundamental gravitational interaction. (Easson and Frampton, 2010)

In this section we’ll discuss string theory and matrix theory in the capacity of emergent gravity/inertia, as an adiabatic reaction force emerging from a hidden phase space of information. To these ends, Verlinde explains how we don’t need to use the full details of string theory, but only to consider it as based on (or following from) general principles having to do with the real world. This sounds remarkably like the basis for an experiential ontology, as we’ll see in the next chapter. Verlinde uses string theory “as a source of inspiration to think about these general principles and apply them to the real world” (2011). One of the motivations for doing string theory is to learn more about gravity.

For one, gravity is a macroscopic force that dominates in the IR (at long distances), yet it also knows about the UV microscopic states; “therefore, there must be a principle at work” (Verlinde, 2011). The open/closed string and AdS/CFT correspondences are attributed to the UV/IR mixing phenomenon that is present in string theory, black holes, and quantum gravity. Harking back to chapter two, this is the regime in which the smallest and largest scales interact; “this connection implies that short and long distance physics cannot be seen as totally decoupled” (2010). This means gravity must be distinguished from a conventional effective field theory. As Verlinde explains:

Gravity is a long distance phenomenon that clearly knows about short distance physics, since it is evident that Newton's constant is a measure for the number of microscopic degrees of freedom. String theory invalidates the "general wisdom" underlying the Wilsonian effective field theory, namely that integrating-out short-distance degrees of freedom only generate local terms in the effective action, most of which become irrelevant at low energies. If that were completely true, the macroscopic physics would be insensitive to the short-distance physics. (2010)

The way we visualize this is to consider the molecular force between nuclei as influenced by the hidden phase space of the electrons. Gravity, Verlinde says, works in much the same way. The holographic principle is implicit in this move. Indeed, the scenario described in his 2010 paper “has certainly been inspired by the way holography works in the AdS/CFT and open/closed string correspondences” (Verlinde, 2010). The AdS/CFT correspondence provides insight by rendering “dualities between closed string theories that contain gravity and decoupled open string theories that don't” (2010). Crucially, Verlinde describes how “in string language the holographic screens can be identified with D-branes, and the microscopic degrees of freedom on these screens are represented by open strings” (2010); further, the microscopic theory is effectively described by a string theory consisting of open and closed strings. Both types of strings are cut off in the UV.

The open and closed string cut-offs are related by the UV/IR correspondence: pushing the open string cut-off to the UV forces the closed string cut-off towards the IR, and vice versa. The value of the cut-offs is determined by the location of the screen. Integrating out the open strings produces the closed strings, and leads to the emergence of space and gravity. Note, however, that from our point of view the existence of gravity or closed strings is not assumed microscopically: they are emergent as an effective description. (Verlinde, 2010)

This allows a direct link with the event-narrative describing a holographic snapshot and frozen strand “samples,” as we’ll see in chapter five. In this way, the open/closed string correspondence supports the interpretation of gravity as an entropic force. Verlinde adds:

Many still see the closed string side of these dualities as a well-defined fundamental theory. But in our view gravity and closed strings are emergent and only present as macroscopic concept. It just happened that we already knew about gravity before we understood it could be obtained from a microscopic theory without it. We can't resist making the analogy with a situation in which we would have developed a theory for elasticity using stress tensors in a continuous medium half a century before knowing about atoms. We probably would have been equally resistant in accepting the obvious. Gravity and closed strings are not much different, but we just have to get used to the idea. (Verlinde, 2010)

Integrating out off-diagonal open string degrees of freedom between two D-branes induces gravity and closed strings, “leading to the emergence of space and gravity” where “the emerged part of space is occupied by the closed strings” (Verlinde, 2010). Coordinates become matrices and we lose the sense of XT itself (Verlinde, 2011u). This can be compared to the prehension values that get integrated out (negatively prehended) and thereby excluded from the phases of concrescence, in Whitehead. As Verlinde explains, in string theory:

The number of high-energy open string states is such that integrating them out indeed leads to long-range effects. Their one-loop amplitudes are equivalent to the tree-level contributions due to the exchange of closed string states, which among others are responsible for gravity. This interaction is, however, equivalently represented by the sum over all quantum contributions of the open string. In this sense the emergent nature of gravity is also supported by string theory. (Verlinde, 2010)

Adiabatic processes generally do not give off heat. In an adiabatic process, gradually changing conditions allow the system to adapt its configuration, so the probability density is carried along smoothly by the process rather than jumping between states. Unruh’s thought experiment was to accelerate a thermometer through the empty vacuum and realize that, as it interacts with quantum fluctuations, the accelerated probe will register a heat signature. In this sense the adiabatic force leads to an entropic force qua Born–Oppenheimer force. 26
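For a sense of scale, Unruh’s law can be evaluated directly. The short sketch below is a minimal illustration of ours: it computes the temperature T = ħa/(2π c k_B) an accelerated detector would register, and the accelerations are arbitrary examples.

# A quick numerical illustration of Unruh's law, T = hbar*a / (2*pi*c*k_B):
# the temperature an accelerated detector would register in the vacuum.
# The accelerations below are arbitrary examples chosen for illustration.
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
c    = 2.99792458e8      # speed of light, m/s
k_B  = 1.380649e-23      # Boltzmann constant, J/K

def unruh_temperature(a):
    """Unruh temperature (K) for proper acceleration a (m/s^2)."""
    return hbar * a / (2 * math.pi * c * k_B)

for a in (9.8, 1.0e20):
    print(f"a = {a:.1e} m/s^2  ->  T = {unruh_temperature(a):.3e} K")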

Hidden Phase Space

There is hidden phase space in the universe; Clifford algebras also model hidden (emergent) phase spaces and degrees of freedom (see e.g., Finkelstein, 2004). Hidden phase space in the universe is not used in everyday descriptions of physics but is important for understanding where gravity and other forces come from, so it is a space that we quantify (see Verlinde, 2011). This is like quantifying Clifford spaces. The phase space volume is an adiabatic invariant that defines an entropy. “Eigenvalue crossing” could relate to concrescence operations, which would also be related to instantons. Eigenvalue crossing numbers also give rise to a gauge field (see Verlinde, 2011). String theory also uses these principles.

Inertia and gravity as adiabatic reaction forces

Inertia and gravity can be interpreted as adiabatic reaction forces. The adiabatic property means that they do not generate heat in movement. In Verlinde’s proposal gravity is taken as an adiabatic reaction force (see 2011). This means that an entropic force doesn’t lead to a change in entropy but instead acts adiabatically: entropy is an adiabatic invariant. Inertia is therefore also an adiabatic reaction force.

Verlinde divides this scenario in the matrix model into two phase spaces: a fast and large Higgs branch and a smaller, slower Coulomb branch. As he explains, the fast system goes into the slow system, and this is the open/closed string correspondence. The fast dynamical system underlying the universe and inertia is the first-order leading force. Inertia is a force and gravity is an adiabatic reaction-force. The inertial force follows from the response of the phase space volume of the underlying dynamical system under virtual displacements. The strings that connect a single eigenvalue to the rest of the system represent the reaction force. This is the heat bath of a polymer. There is only a thermodynamic interpretation at horizons; otherwise it is Born–Oppenheimer (see 2011).

26 The Born–Oppenheimer approximation is ubiquitous in quantum chemical calculations of molecular wavefunctions, allowing the wavefunction of a molecule to be broken into electronic and nuclear (vibrational) components.
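To make the Born–Oppenheimer analogy concrete, the following toy sketch, our own illustration and not Verlinde’s matrix-model calculation, integrates out a “fast” oscillator whose frequency depends on a “slow” coordinate R; the frequency profile and scales are assumed purely for demonstration. The gradient of the resulting effective potential acts as a reaction force on the slow coordinate, which is the sense in which a hidden, faster phase space can induce a force on what remains.

# A schematic Born-Oppenheimer-style toy (our own illustration, not Verlinde's
# matrix-model calculation): a "fast" oscillator whose frequency omega depends
# on a "slow" coordinate R. Integrating out the fast degree of freedom leaves
# an effective potential E0(R) = 0.5*hbar*omega(R); its gradient acts as a
# reaction force on R, showing how a hidden, faster phase space can induce a
# force on the slow variables that remain.
import numpy as np

hbar = 1.054571817e-34  # J*s

def omega(R, k=1.0e15, R0=1.0e-10):
    """Assumed toy frequency profile (rad/s) for the fast oscillator."""
    return k * np.exp(-R / R0)

def effective_force(R, dR=1.0e-13):
    """Reaction force F = -dE0/dR from the fast oscillator's ground-state energy."""
    E0 = lambda r: 0.5 * hbar * omega(r)
    return -(E0(R + dR) - E0(R - dR)) / (2 * dR)

for R in (0.5e-10, 1.0e-10, 2.0e-10):
    print(f"R = {R:.1e} m  ->  F_eff = {effective_force(R):.3e} N")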

3.8 – The End of Gravity as a Fundamental Force

The results of Verlinde’s paper offer an argument endorsing gravity arising as an entropic force after space and time have emerged. We consider this to result from something like large-N gauge theories and the 1/N expansion in Yang–Mills theory; as Verlinde explains, “if gravity and space time can indeed be explained as emergent phenomena, this should have important implications for many areas in which gravity plays a central role” (2010). Extending this further:

Einstein's geometric description of gravity is beautiful, and in a certain way compelling [...] Presumably this explains why we, as a community, have been so reluctant to give up the geometric formulation of gravity as being fundamental. But it is inevitable we do so. If gravity is emergent, so is space time geometry. Einstein tied these two concepts together, and both have to be given up if we want to understand one or the other at a more fundamental level. (Verlinde, 2010)

We will see this quote again in later chapters as it represents one of the strongest links between Verlinde and Whitehead’s theories of gravity. With that said, the following quote is one of the most important to follow from Verlinde’s initial paper on emergent gravity.

Other authors have proposed that gravity has an entropic or thermodynamic origin, see for instance (Padmanabhan). But we have added an important element that is new. Instead of only focusing on the equations that govern the gravitational field, we uncovered what is the origin of force and inertia in a context in which space is emerging. We identified a cause, a mechanism, for gravity. It is driven by differences in entropy, in whatever way defined, and a consequence of the statistical averaged random dynamics at the microscopic level. The reason why gravity has to keep track of energies as well as entropy differences is now clear. It has to, because this is what causes motion! (2010)

Here Verlinde spells out the main updates and consequences of his model, but not without due caution in admitting what he sees as possible drawbacks from a physics perspective, given the generality and heuristics of his model; however, it is precisely this generality of speculative heuristics that places it into the camp of, and into principled alignment with, a process and event ontology predicated on the generative dynamics of Whitehead’s AE’s. We give Verlinde the last word:

The presented arguments have admittedly been rather heuristic. One cannot expect otherwise, given the fact that we are entering an unknown territory in which space doesn’t exist to begin with. The profound nature of these questions in our view justifies the heuristic level of reasoning. The assumptions we made have been natural: they fit with existing ideas and are supported by several pieces of evidence. We gather more supporting evidence from string theory, the AdS/CFT correspondence, and black hole physics. (2010)

Over the course of the next chapters we will see how Verlinde’s model can be recognized and developed alongside Whitehead’s AE’s to the extent that both models, taken together, represent a distinction (between force- and value-products) while still both belonging to the same process. In fact, they are considered to (inter)act cooperatively.

3.9 – SUMMARY

In this chapter we framed a step-by-step summary of Verlinde’s emergent gravity hypothesis. With Verlinde’s proposal the notion of emergent gravity receives a solid conceptual foundation using minimal equations to grasp the idea in an accessible way; thus, it is presented as a general theory on the order of a speculative proposal. While it is generally understood that Verlinde’s model is not exacting in all mathematical details and further work remains, the generality of Verlinde’s approach provides an ideally suited perspective for comparing essential features with Whitehead’s speculative model. Emergent gravity is essentially a conceptual hypothesis describing gravity as the result of an entropic force: a probabilistic consequence of physical systems tending towards an increase in entropy. This distinguishes it from a fundamental interaction mediated by a quantum field theory and a gauge particle, like the photon for the electromagnetic force or the gluons for the strong nuclear force, while still remaining “consistent with quantum mechanics and supported by various results in string theory” (Verlinde, 2010). Verlinde offers three main points encompassing his (2010) proposal:

1. Space is emergent; the part of space which has not yet emerged is enclosed by a holographic screen, and the entropy is proportional to the area of the screen.

2. Gravity is an entropic force, just like any other generalized force entering in the first law of thermodynamics. More concretely, gravity is caused by the change of entropy behind the holographic screen due to the emergence of space.

3. The temperature is either related to the acceleration of the observer via Unruh’s law, or related to the total energy of the system via the equipartition law, where the total energy of the system is also equal to the mass behind the holographic screen times c², i.e., E = Mc². Using Unruh’s law for temperature leads to Newton’s law of classical mechanics (F = ma), and using the equipartition law with E = Mc² gives rise to Newton’s law of gravitational force (see the numerical sketch below).
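These two derivations can be checked numerically. The sketch below is a minimal illustration following the chain of relations Verlinde outlines in the 2010 paper; the masses and radius (roughly Earth–Sun values) are our own, purely illustrative inputs. It recovers F = ma from Unruh’s law and Newton’s gravitational law from the equipartition relation on a holographic screen.

# A minimal numerical check of the two derivations alluded to in point 3,
# following the chain of relations Verlinde outlines (2010). The masses and
# radius (roughly Earth-Sun values) are our own illustrative inputs.
import math

G    = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s
hbar = 1.054571817e-34   # reduced Planck constant, J*s
k_B  = 1.380649e-23      # Boltzmann constant, J/K

M, m, R = 1.989e30, 5.972e24, 1.496e11   # kg, kg, m (illustrative)

# (i) Unruh's law plus the entropy gradient dS/dx = 2*pi*k_B*m*c/hbar gives F = m*a.
a = 9.8
T_unruh = hbar * a / (2 * math.pi * c * k_B)
F_entropic = T_unruh * (2 * math.pi * k_B * m * c / hbar)
print(F_entropic, m * a)               # the entropic force reproduces F = m*a

# (ii) Equipartition on a holographic screen of area A = 4*pi*R^2 holding
# N = A*c^3/(G*hbar) bits, with E = M*c^2, gives Newton's law of gravitation.
A = 4 * math.pi * R**2
N = A * c**3 / (G * hbar)
T_screen = 2 * M * c**2 / (N * k_B)    # from E = (1/2) * N * k_B * T
F_gravity = T_screen * (2 * math.pi * k_B * m * c / hbar)
print(F_gravity, G * M * m / R**2)     # the entropic force reproduces Newton's law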

With this, the serendipitous insight of 2009 is given provisional axioms for the simple derivation of Newton’s law of gravity as the result of an emergent force. As will become increasingly apparent, this offers a conceptual revision to classical and Einsteinian gravity, placing it precisely into the domain and range of Whitehead’s event- and process ontology. In the next chapter we will turn to an introduction of Whitehead’s AE’s and in the process begin to recognize the concomitance between these theories, to be more precisely situated within such a framework (events) in the following chapters, five and six.

Chapter 4 — The Actual Occasions

Whitehead’s fundamental divergence from the philosophical tradition lies in his thoroughgoing acceptance of process as a basic metaphysical feature of actuality. (Leclerc, 1958)

In this chapter we’ll begin by introducing Whitehead’s core idea: the philosophy of organism—and how it’s best suited for a mathematical physics interpretation—before looking at the shift from substance to event metaphysics clarified by Eastman (2008) and tailored by Whitehead’s approach, enabling him to describe the categoreal scheme of the AE’s as the predicate logic of speculative philosophy: a process and event ontology. In addition, a basic framework for the elements discussed in the comparative chapters is given by way of a collection of values.

Imagine you acquire a plethora of little pieces that you are told all go to one puzzle; however, you aren’t told what the picture on the puzzle is, and in general the pieces are so small that it’s impossible to foretell the picture from any one individually. This is how I consider Whitehead’s PR, wherein the actual entities are rigorously introduced and described. Taking this in combination with Whitehead’s own belief that his work wouldn’t make sense until later generations, we find no reason for approaching his work with anything but an eye for admirable traits to be redeveloped into an epical narrative bringing resolution to what, we presume, Whitehead sought in the first place. The basic idea is that Whitehead has many of the correct pieces, but neither in the right place nor in the proper ordering. In fact, Whitehead arguably has too many pieces and makes the picture more complicated than it needs to be. When we apply a geomodal narrative to Whitehead’s work we discover a way to simplify the method into one that also brings it closer in line with string theory.

4.1 — The Philosophy of Organism

The philosophy of organism is a narrative Whitehead began in SMW whereby the world and all values are viewed as processes and organisms. The distinction drawn out in Whitehead’s work is marked by a resolute determination to approach entities in the universe based on a philosophy of organism, and not on a philosophy of material. To these ends, Whitehead develops an organic philosophy where he considers “actual entities” to represent the building blocks of the universe. On this view, actual entities replace Democritus’ “ inert bits of stuff ” with an organic approach whereby entities “ grow, mature, and perish ” (Sherburne, 1966). Such a view also heralds the notion of “becoming” over “being.” As Whitehead explains, “ The positive doctrine of these lectures is concerned with the becoming, the being, and the relatedness of actual entities ” (PR, viii). On this process view, “the process itself is the constitution of the actual entity ” and its being is determined by its becoming to the extent that “ how an entity becomes determines what it is ” (PR 219). As he continues, “ This doctrine of organism is the attempt to describe the world as a process of generation of individual actual entities, each with its own process of self-attainment ” (PR 60).

Whitehead is clear in defining AE’s as the most fundamental of all theories of value; as he explains, “ actual entities…are the final real things of which the world is made up. There is no going behind actual entities to find anything more real ” (PR 18). Thus, the actual entities are the drops of experience themselves. To these ends Whitehead expressed how “ each actual entity is conceived as an act of experience arising out of data ” (PR 40).

In general, we recognize admirable traits in Whitehead’s work that stand to be redeveloped into a new narrative very similar to the one he proposes. This means we invite the reader to forget any previous associations they have with Whitehead’s work (if none, all the better!) and to consider his works as pieces of a puzzle that stand to be re-developed in a logic consistent with modern mathematical physics. We suggest Whitehead had the right pieces, but not the right overall narrative. We readily admit it takes modern physics to spell out the logic required to provide the appropriate correspondence to recognize Whitehead’s pieces under the proper narrative.

Whitehead’s philosophy is at the deepest level a philosophy of organism:

The philosophy of organism is a cell-theory of actuality. The cell is exhibited as appropriating, for the foundation of its own existence, the various elements of the universe out of which it arises (PR 242).

Whitehead presented the philosophy of organism as a categoreal framework of actual occasions underwriting experience whereby “nature is a structure of evolving processes. The reality is the process” (SMW 102). The basic features of the philosophy of organism can be grouped around three topics, each replacing some aspect of the scientific materialism Whitehead found inadequate: (1) the notion of simple location is replaced by prehensive unity; (2) materialistic mechanism is transformed into “organic” mechanism involving the evolution of increasingly complex organisms; and (3) consciousness is reconceived so that it is no longer an independent actuality but a function of bodily events.

Point one predicts how, at the smallest scales, space and time become derived features of a prior logic founded on the basis of “events” as a ‘prehensive unity’ or “duration.” Here, simple location eventually dissipates into the prior logic of events at the smallest scale. The second point is needed to shift the metaphysical paradigm from materialistic and substance-based to organic and process-based. The third point recognizes consciousness as embodied; like quarks (generally), consciousness also exhibits “confinement” (see Witten and Jaffe, 1999) in the three dantiens of the body (see e.g., Yang, 1989; Cohen, 1999). Taken as such, this study will require the addition of one more category, namely the vacuum-level connection, in order to complete the list begun with the first three points of Whitehead’s philosophy of organism. This places us into the realm of an experiential metaphysics, or speculative philosophy.

4.2 — Experiential Metaphysics & Speculative Philosophy

Whitehead’s summary contribution to the history of Western philosophy is an attempt to complete the revolution initiated by Copernicus, by offering a metaphysical account of the nature of things that can simultaneously take seriously both nature and human experience. (Lucas, 1989)

For Whitehead, the task of metaphysics is the conceptualization of reality in general as inclusive of, and signified by, experience in general (Nobo, 2004). Whitehead’s method is a method of speculative reason from speculative philosophy. His method is an attempt to transcend the limiting methods of particular disciplines. Emphasis is not given to philosophy but to speculative philosophy (Peden, 1981).

We recognize the natural world around us on a foundational level in terms of particle physics and string theory, on the smallest scale, and in gravitational theory and cosmology, on the largest scale. In addition, from an experiential basis we can recognize the smallest elements pertaining to proto-consciousness arising on a subcellular level in the capacity of proto-filaments in cytoplasmic fluid and microtubules (see Hameroff, 2014; inter alia). These two examples comprise the foundational elements of the naturalistic world from the experiential basis of a process metaphysics.

Every act of theorizing begins with human experience [and] addresses some problem, puzzle, or difficulty encountered in human experience….this fact requires the recognition that human experience has primacy for the construction of any theory whatsoever. But truly to recognize the primacy of human experience is to recognize that human experience must itself be adequately conceptualized and elucidated, and thus must be the subject of a theory more fundamental and more comprehensive than any other. (Nobo, 2004)

Indeed, this idea has deep-seated roots, going all the way back to the late eighteen-hundreds; here we find a passage from Murray (1896), who notes in one of the first editions of the Philosophical Quarterly how:

Underlying all experience - all experiential science - there must be some truth which forms the criterion and foundation of experience itself; but that primordial truth cannot be merely a fact found in experience, that is, found by the method of experiential science.

Nobo draws on Whitehead to describe how, in metaphysics, “what is ultimately sought is ‘a generality transcending any special subject matter’ (PR 10) and indispensable for the elucidation and interpretation of any concrete moment of experience” (Nobo, 2004). He goes on:

For Whitehead, the task of metaphysics is the conceptualization of reality in general as inclusive of, and signified by, experience in general […] Whitehead’s metaphysical experientialism is no more and no less than a plausible working postulate on which to essay the construction of a scheme of metaphysical ideas adequate for the successful interpretation of our experience and of our world.

“By ‘metaphysics’ I mean the science which seeks to discover the general ideas which are indispensably relevant to the analysis of everything that happens ” (Whitehead, RM , p. 84).

Metaphysics has no choice but to derive its ultimate notions from actual human experience and from the one actual world therein revealed. Metaphysical ideas, Whitehead holds, can be obtained only by generalizing from the essential features of all our experiences (Nobo 2004).

For Whitehead, the task of speculative philosophy is “ the endeavor to frame a coherent, logical, necessary system of general ideas in terms of which every element of our experience can be interpreted ” (PR 3). “ Whitehead’s method is a method of speculative reason from speculative philosophy. His method is an attempt to transcend the limiting methods of particular disciplines. Emphasis is not given to philosophy but to speculative philosophy ” (Peden, 1981). Nobo explains that speculative philosophy:

Must address our experience as a concrete whole, and must exhibit all abstractions from it in their proper relationships, each to the others and all to our experience as a whole […] Speculative philosophy must seek to exhibit reality in general and experience in general as interdependent natures. To accomplish this broader task it must restrict reality to experience. (2004)

Here we consider how Whitehead’s speculative philosophy can be developed as a process and physical ontology. Whitehead’s speculative philosophy is the heir of natural philosophy and natural history.

4.3 — From Substance to Event (Process) Ontology

Nature is a structure of evolving processes. The reality is the process (SMW 102).

The last hundred years have borne active witness to a steady transformation in scientific knowledge, spanning scales previously inaccessible to direct observation. Perhaps counterintuitively, as technology and experimentation have advanced, so too have our models of the world slowly shifted to reflect a patently more organic and processual view of nature. In a 2006 article, “Our Cosmos, From Substance to Process,” Eastman recounts the last three hundred years of physics in an effort to highlight the evolution of classical to modern paradigms stemming from an update of substance to event (process) logic.

Substance metaphysics proceeds from the intuition—first formulated by Parmenides—that being should be thought of as simple, internally undifferentiated, and without change. Indeed, philosophy since Aristotle has been substance-based. Signs of a substance metaphysics are witnessed in this study under the heading of options whereby objects like gravitons and the Higgs boson are considered as fundamental values, rather than as emergent quasi-particles and collective values consistent with a process and event logic.

Substance metaphysics models are predicated on a global realism, with the claim that we could exactly model the behavior of the universe if we had all the data. Undercutting this global realism, modern physics has established certain limitations and approximations (Heisenberg’s ‘uncertainty’ and Gödel’s ‘incompleteness’) as fundamental in contributing to the evolution of relativity theory and quantum theory. “Concepts may be global but concrete facts are local, not global” (Hansen, 2004). Given we live in a perceptual reality dominated by objects and vision, Eastman insightfully identifies the natural proclivity to “adopt a worldview of perceptual objects in which the world is simply constituted by a multiplicity of discrete objects, from atoms to galaxies, all of which are ultimately like classic substance: hence, a substance metaphysics” (Eastman, 2006).

No longer satisfied with a strictly substance view of the world that omits the role of subjective experience in nature, this burgeoning process paradigm recognizes integrated networks, fields, and relationships as the fundamental description of local operators and event-dynamics in an experientially predicated universe. In contrast to a substance universe of being, process philosophers analyze “becoming” and what is occurring, as well as ways of occurring, on the configurative order of quantum field theory and relativity, subsequently developing the world as a plenum of events.

Eastman explains how such a view is also more intuitive with modern physics, explaining how: “ In contrast to a substance universe, quantum field theory shows the world as a plenum of events at multiple scales, now extended in networks of relationships to cosmic scales—in other words, as a process universe ” (2008). To accompany this, Jungerman explains how in quantum theory “macroscopic objects are complex integrations of particles and fields, which in turn are constituted by a plenum of events at multiple scales ” (2000). To bolster this claim, recent research in nonlinear dynamics and ecology “ also demonstrate the emergence of new structures and entities in multiply-interconnected systems ” (Laszlo, 2008).

Out of this Newtonian, mechanistic substance metaphysics, event metaphysics begins to take root in the early- to mid-twentieth century as a viable alternative to the substance paradigm. Eastman describes “a need to go beyond forms of modernism that presuppose a substance universe and move on to a more integrated, ecological worldview, or process view of nature” (2006). Whitehead reacted to this Aristotelian paradigm with the alternative posture that the foundational elements of nature and reality are in fact events, not substances. As Clayton puts it, “experiential units, constitutive of both processes and objects, are the basic elements of reality” (Clayton, 2004). Out of this Whitehead presented the philosophy of organism and a categoreal framework for the actual occasions underwriting experience whereby “nature is a structure of evolving processes. The reality is the process” (SMW 102).

Process philosophy speculates that these momentary events, called “actual entities,” are essentially self-determining, experiential, and internally related to each other. Indeed, the postulation of actual entities (or occasions of experience) as metaphysically basic is what Whitehead refers to as the working hypothesis of his metaphysics. As Eastman explains: “ the event metaphysics, or process framework, that has been rapidly gaining ground since the late 20th century provides a fundamental alternative to substance metaphysics. An event- metaphysics treats events and processes as more fundamental than things ” (2006).

In this light we recognize the opposition of process philosophy to the ‘substance metaphysics’ that has been dominant in Western philosophy since Aristotle. Alternatively, “substance metaphysics proceeds from the intuition—first formulated by the pre-Socratic Greek philosopher Parmenides—that being should be thought of as simple, hence as internally undifferentiated and unchangeable” (Seibt, 2013). Substance metaphysicians frame the position that the primary units of reality must be static—they must be what they are at any instant in time, without change.

To reconcile this distinction between substance and event, objectivity and subjectivity, Whitehead determined that the final real things in the universe are "experiential" in nature, made up of momentary events of experience rather than enduring material substances. As Clayton explains, “ the core proposition of event metaphysics is that the basic unit of reality is not a thing at all but rather an event—an occurrence or happening. Only an ontology of events can do justice to this insight, and ‘thing’ language must be carefully subordinated to event language ” (Clayton, 2003). McHenry reiterates the point, explaining how “Whitehead’s metaphysics is the most advanced and sophisticated version of process philosophy, an ontology that takes events rather than enduring substances as the basic units of reality ” (2009). As Hustwit narrates:

The most counter-intuitive doctrine of process philosophy is its sharp break from the Aristotelian metaphysics of substance, that actuality is not made up of inert substances that are extended in space and time and only externally related to each other. Process thought instead states that actuality is made up of atomic or momentary events. These events, called actual entities or actual occasions, are “ the final real things of which the world is made up ,” (Whitehead, Process and Reality , 18). They occur very briefly and are characterized by the power of self-determination and subjective immediacy (though not necessarily conscious experience). In many ways, actual occasions are similar to Leibniz’s monads, except that occasions are internally related to each other. (2015)

We agree with Hustwit in defining events as ontologically prior to conscious awareness in an autopoietic mode. Whereas Leibniz’ monads are windowless, we acknowledge the creative collectivity of AE’s as central to the generative dynamics underwriting this mode. Clayton drives the point farther:

The core proposition of event metaphysics is that the basic unit of reality is not a thing at all but rather an event—an occurrence or happening. Only an ontology of events can do justice to this insight, and “thing” language (on this view) must be carefully subordinated to event language. (Clayton, 2004)

As Whitehead explains, “I give the name ‘event’ to a spatiotemporal happening […] Without related objects there can be no event. On the other hand, related objects signify events, and without such events there are no such objects” (PRel, 1922). 27 As he explains in an earlier work:

An object is an ingredient in the character of some event. In fact the character of an event is nothing but the objects which are ingredient in it and the ways in which those objects make their ingression into the event. (1920, 143-4)

27 In an event-logic we would define ‘related objects’ in concert with string theory as open-strings on a D-brane in the decoupling limit. The events are snapshots and without snapshots there are no such events.

The event is the standard process that each generational cycle of and AE undergoes in its passage into the real and actual world of values (pertaining to that living, agential organism, a speculative/first philosophy adds). All the little components of the event are parts of the process and go into the process of the formation of the AE as a satisfaction: a final concrescence, a final real value. The AE is that enumerative and concrescent event of collective constituents, itself, as a totality of values. As Whitehead explains:

Nature presents itself to us as essentially a becoming, and any limited portion of nature which preserves most completely such concreteness as attaches to nature itself is also a becoming and is what I call an event. By this I do not mean a bare position of space-time. Such a concept is a further abstraction. I mean a part of the becomingness of nature, colored with all the hues of its content. (PRel 21)

We chart this description to the fact that events arise or derive from snapshots of the sea of strands. Scaffolding this claim, systems generally exemplify a combination of reduction (exclusively focused on efficient causes) and emergence with both top-down as well as bottom-up (efficient) causality (see Murphy and Ellis, 1996). These new results also suggest that all entities are constituted by networks of relationships, in contrast to substance views that ground all things in some type of immutable substance (see Eastman, 2008).

Without the shift from substance to event logic we cannot get to emergence. Verlinde is tacitly promoting an event metaphysics without realizing it. When we move to the interpretation of Verlinde, (philosophy of physics) we immediately realize that not substance but event metaphysics is what it will take to interpret him. This shift from substance to event is uniquely suited to mathematical physics.

Eastman reinforces the fact that a new physics is steadily emerging as a viable distinction from classical substance views (see 2008). In this study we formalize a method for event- ontology predicated on a process-cycle of generative dynamics capable of aligning both AE’s and EG into one narrative. What we mean by an event-ontology is an understanding of the world where first materials are not fundamental substances, but instead events representing the initial conditions of a generative cycle of a foundational process. This line of reasoning can be traced all the way back to Whitehead’s 1922 philosophy of organism, as well as more recently in constructs underwriting emergent theories of space- time (see e.g., ‘t Hooft, 2008; Berenstein, 2006; Seiberg, 2006). As Whitehead explains:

Nature is a becomingness of events which are mutually significant so as to form a systematic structure. We express the character of events in terms of space and time. Thus space and time are abstractions from this structure (PRel, 1922).

In our present study we will develop events as representing “multiplicities of data,” Large- gauge theories, or Dn-branes of open strings, as the initial conditions of a process. Taking this supposition we will identify the initial conditions of an event-cycle as a duration of relationships. To these ends the identification of an event metaphysics is the most important precursor to initiating a new line of logic underwriting both material and experiential dynamics. We explore this line of reasoning at length in the later chapters.

4.4 — Uniquely Suited to Mathematical Physics

“Whitehead’s process invokes a concept whose precise definition requires mathematical language of a type developed by physicists, not philosophers.” (Chew, 2004)

“In contradistinction to the German Idealists and British Neo-Idealists who preceded him, Whitehead took the results of physics as his major starting point.” (Clayton, 2004)

It is the position of this dissertation that Whitehead’s P&R, through the lens of nineteenth- to twenty-first-century physics, yields a retroactive speciation of the actual entities/actual occasions as the ultimate, proto-experiential constituents of experience. Given that AE’s are said to persist at the smallest of scales, it is natural to also consider them in the context of Planck-scale dynamics and string theory. For this reason, it is not beyond the scope of any Whitehead scholar to consider that advances in physics clarified only in the last few decades could hold the key for rigorously linking and redeveloping Whitehead’s programme within mathematical physics, cosmology, and general relativity.

Discoveries in particle physics late in the twentieth century led many scientists to view the world as a plenum of events at multiple scales. (Jungerman, 2000)

This drives home the point that the landscape of science has dramatically altered since Whitehead’s time, and that this modern enumeration of the scientific narrative holds the more promising link to Whitehead’s original vision for science in a process and event ontology. With that said, certain fundamental predicates remain consistent; for instance, as McHenry explains, Whitehead’s view of the physical world offers “an ontology of events consistent with the four-dimensional structure of space-time” (2002). As Lucas explains:

The Whiteheadian school represents a form of cosmology influenced principally by mathematical physics (primarily relativity theory and, to a lesser degree, quantum mechanics), and only to a minor degree by evolutionary biology. (Lucas, 1989, p. 70)

By placing his emphasis upon cosmology and speculative philosophy, within the context of his philosophy of experience, Whitehead asserts an empirical emphasis which “ attempts to go beyond the superficial level of sense-experience expressed in dogmatic empiricism, such as Locke and Hume ” (Peden, 1981). This suggests moving to the subtle sense of phonons. As Kallfelz (1997) concludes: “ Whitehead’s prophetic conceptual structure and the more experimentally founded and mathematically structured developments in physics will continue to illuminate each other for some time to come .”

4.5 — The Actual Entities

“Every occasion is a synthesis of being and not-being” (SMW 163).

“Actual entity” (AE) is a term coined by Whitehead to describe the basic realities that shape all things. Actual entities are clusters of events that shape reality; they do not deal with the substance of things but rather with how something ‘becomes’, or happens. The AE generational process proceeds by an event-based ontology. As Whitehead explains, “an actual entity is a process” (PR 41).

Indeed, actual entities reside at the heart of Whitehead’s ontology, having a deep history in physics and mathematics, and being formulated in such a background. Substantively, AE’s formed out of a dissatisfaction with the warring substance paradigms of idealism and materialism, as predicated on Descartes’ dualism (see Eastman, 2008, 2009). Whitehead upended the entire substance paradigm with a process/event metaphysics as a new paradigm, and this was also needed to ground the details of the physical field theory leading to it in PRel, qua AE’s.

According to William Ernest Hocking’s notes, the term “actual occasion” was first used in Whitehead’s lectures for 30 April 1925 (Ford, 1984). “ In Whitehead’s metaphysical system, the category of actual entity—also termed actual occasion or occasion of experience—is the basic metaphysical genus of existents ” (Eastman, 2004). “ In Whitehead’s metaphysics, actual occasions are the ultimate discrete indivisible realities constituting the actual world ” (Nobo, 2004).

The positive doctrine of the Gifford lectures is concerned with the becoming, the being, and the relatedness of actual entities. ‘Actual entities’—also termed ‘actual occasions’— are the final real things of which the world is made up. There is no going behind actual entities to find anything more real. (PR viii)

Whitehead’s metaphysics is specifically based on the hypothesis that “ the ultimate individual actualities of the universe have the metaphysical characteristics of concrete moments of human experience, suitably generalized ” (AI 221). These ultimate individual actualities are what Whitehead refers to as the actual entities. As he puts it, actual occasions are the “final real things of which the world is made ” (PR 27). “ The final facts are, all alike, actual entities, and these actual entities are drops of experience, complex and interdependent ” (PR 18). Thus, the actual entities are the drops of experience themselves.

Whitehead describes the history of the universe in terms of the process of a creative advance into novelty: “this advance is produced by a collection of happenings called actual occasions, or actual entities. Each actual entity has an associated actual world, and arises from its own peculiar actual world ” (PR 284).

The actualities of the Universe are processes of experience, each process an individual fact. The whole Universe is the advancing assemblage of these processes. The Aristotelian doctrine, that all agency is confined to actuality, is accepted. So also is the Platonic dictum that the very meaning of

existence is ‘to be a factor in agency’, or in other words ‘to make a difference.’ Thus, ‘to be something’ is to be discoverable as a factor in the analysis of some actuality. (AI 197)

“How an actual entity becomes constitutes what that actual entity is; so that the two descriptions of an actual entity are not independent. Its ‘being’ is constituted by its ‘becoming’ ” (PR 23). As Lango explains:

The fundamental entities of Whitehead’s ontology—the actual entities—are purely hypothetical, postulated by his metaphysics, but not known in any other way. We cannot observe an actual entity with our senses; we cannot infer an actual entity with empirical theories; we cannot consciously apprehend an actual entity through introspection. We must therefore obtain an understanding of actual entities through speculation, by using metaphors derived from the societies of actual entities that we can observe, infer, or introspect. (1972)

In the Whiteheadian process logic, “ the world of fixed and settled facts ” (PR 108, 242) grows via a (non-linear) sequence of actual entities. These actual entities form “ a growing set of non-overlapping regions filling-out a growing spacetime region that advances into the still uncreated and yet-to-be-fixed future ” (PR 89). Each “event” has experiential and physical aspects, which are both needed in order to generate the requisite advance into the future (PR 67). As Whitehead explains, the actual entities atomize the extensive continuum. This space-time continuum is in itself merely potentiality for division (PR 67).

The past actualities generate potentialities for the next actual entity, which is tied to a new space-time standpoint from which the potentialities created by the past actualities will be prehended (grasped) by the current entity. This basic, autogenetic process creates the new actual entity which, upon the completion of its creation, contributes to the potentialities for the succeeding actual entities (Stapp, 2007).

These are the basic units of becoming. For Whitehead, actual occasions are “drops of experience,” and relate to the world into which they are emerging by “feeling” that relatedness and translating it into the occasion’s concrete reality. As he explains, “ In his first meditation, Descartes uses the phrase res vera in the same sense as that in which I have used the term ‘actual.’ It means ‘existence’ in the fullest sense of that term, beyond which there is no other ” (PR 100). Nobo elaborates the point: “ In Whitehead’s metaphysics, actual occasions are the ultimate discrete indivisible realities constituting the actual world ” (Nobo, 2004). Eastman frames it nicely in his 2003 book:

In Whitehead’s metaphysical system, the category of actual entity —also termed actual occasion or occasion of experience—is the basic metaphysical genus of existents. The other metaphysical genera of the system consist of existents that either are generic operations, features, or relationships of actual entities [see contrast, nexus, prehension, proposition , and subjective form ] or are uncreated ontological presuppositions of all actual entities [see creativity, envisagement, eternal object, and extension ]. Actual entities are the final real constituents of the actual world. They are discrete and interconnected, and their generic properties are those Whitehead deemed essential to any discrete moment of experience.

One natural consequence of this description is that it positions the categoreal scheme of Whitehead’s AE’s at the most fundamental levels of reality and we are therefore led to consider theories at the Planck scale in the context of string theory, black hole thermodynamics, cosmology, and the holographic principle.

The postulation of actual entities—or occasions of experience—as ontologically basic components of an underwriting process is a refined version of what Whitehead refers to as the working hypothesis of his metaphysics. That is, based on the hypothesis that “ the ultimate individual actualities of the universe have the metaphysical characteristics of concrete moments of human experience, suitably generalized ” (AI 221). As Whitehead explains, each actual entity “ viewed in its separate individuality, is a passage between two ideal termini, namely, its components in their ideal disjunctive diversity passing into these same components in their concrete togetherness ” (AI 303). Whitehead further develops the point in his later work:

Actual entities are the final real constituents of the actual world. They are discrete and interconnected, and their generic properties are those Whitehead deemed essential to any discrete moment of experience, human or non-human. Actual entities are drops of experience, complex and interdependent. (PR 18)

According to Whitehead, the occasions of experience are constituted of four grades:

The first grade comprises processes in a physical vacuum such as the propagation of an electromagnetic wave or gravitational influence across empty space. The occasions of experience of the second grade involve just inanimate matter. The occasions of experience of the third grade involve living organisms. Occasions of experience of the fourth grade involve experience in the mode of presentational immediacy, which means more or less what are often called the qualia of subjective experience. So far as we know, experience in the mode of presentational immediacy occurs in only more evolved animals; consequently, it is inessential that an occasion of experience have an aspect in the mode of presentational immediacy; occasions in the grades one, two, and three, lack that aspect. (Cobb and Griffin, 1976 )

As Wüthrich explains, “an ontological unification reduces the ontologies of all theories to the one monolithic ontology of what is considered to be the most fundamental theory. In one of its more radical incarnations, ontological unification requires that all basic entities are of one kind only” (2006). For physics this is a matter of course: everything that exists is built out of the same kind of matter/energy. Compare this to Whitehead:

The presumption that there is only one genus of actual entities constitutes an ideal of cosmological theory to which the philosophy of organism endeavors to conform…but though there are gradations of importance, and diversities of function, yet in the principles which actuality exemplifies all are on the same level. (PR 135)

One natural consequence of this description is that it positions the categoreal scheme of Whitehead’s AE’s at the most fundamental levels of reality. We are therefore led to consider theories at the Planck scale in the context of mini black holes in a black hole cosmology and holographic principle (see: ‘t Hooft, Susskind, Hawking, Bekenstein).

In Whiteheadian process logic the world of fixed and settled facts grows via a sequence of actual entities. “ These actual entities form a growing set of non-overlapping regions filling out a growing space-time region that advances into the still uncreated and yet-to-be-fixed future ” (Stapp, 2007). Each happening has both experiential and physical aspects, which are both needed in order to generate the requisite advance into the future. As Whitehead explains, the actual entities atomize the extensive continuum. This space-time continuum is in itself merely potentiality for division (PR 67).

Each actual entity is conceived as an act of experience arising out of data. The objectifications of other actual occasions form the given data from which an actual occasion originates. Each actual entity is a throb of experience including the actual world within its scope. It is the process of ‘feeling’ the many data, so as to absorb them into the unity of one individual ‘satisfaction.’ Here ‘feeling’ is the term used for the basic generic operation of passing from the objectivity of the data to the subjectivity of the actual entity in question. Feelings are variously specialized operations, effecting a transition into subjectivity. They replace the ‘neutral stuff’ of certain realistic philosophers. An actual entity is a process, and is not describable in terms of the morphology of a ‘stuff.’ (PR 65)

Just as with string theory, the AE’s as put forward in Whitehead’s PR draft are characteristically unordered, or, to borrow the phrase from Emerson, lacking in any ‘epical integrity’ in their organization. Instead Whitehead leaves this for future scholars to put together, perhaps in some way intuitively realizing that physics would still be a few decades from catching up to his insights.

For the purposes of this study we find a basic resemblance of procedure in Sherburne’s approach to PR as an attempt to bring an ordering to the descriptions of Whitehead’s AE process. While the narrative model constructed in this study will fine-tune the ordering of some components, and emphasize certain ones over others (relative to Sherburne), it is this approach that justifies the choice of building a narrative distinct from Sherburne’s (1966) effort. To these ends we now list a set of components chiefly implicated in the generation of AE’s, identifying and then describing each in the context of Whitehead’s descriptions, along with commentary from other scholars. All together we will discuss: prehension, subjective aim, subjective form, concrescence, feeling, satisfaction, and unity. While this list isn’t exhaustive, the further concepts introduced in the next two chapters are predicated on the background provided by those listed here and can be incorporated in stride.

Prehension

This is one of the key philosophical concepts that will help make sense of Verlinde’s work throughout the rest of this chapter. Prehension refers to the analysis of components that arise from their physical data; concrescence refers to the way in which positive prehensions creatively coalesce to form the satisfaction of that actual occasion.

Whitehead states that “there can be no duplication of any element in the objective datum of the satisfaction” (PR 39). As Christian (1977) emphasizes, “in the satisfaction each entity will be felt only once.” Echoing Leibniz’ theory of the indiscernibles, he explains how “the same entity, be it an actual entity or eternal object, cannot be felt twice in the formal constitution of one concrescence…thus objective identity requires integration of the many feelings of one object into the one feeling of that object” (qtd. in PR 347). Thus, “the process must produce finally a single feeling in which every object felt in earlier phases will have unambiguous status” (PR 344-347).

John Cobb esteems prehension even more strongly, emphasizing that it “may be the single most important and original concept in Whitehead’s philosophy” (WWB, 2008; 31). In Verlinde, this describes the factor that distinguishes between the values retained in the eigenvalues of the matrix and those excluded, or integrated out. Since this is the mechanism Verlinde purports to give rise to gravity, the importance of these dynamics is not lost on emergent gravity, either. In one of Whitehead’s more notable passages he explains:

The philosophy of organism is a cell-theory of actuality. The cell is exhibited as appropriating, for the foundation of its own existence, the various elements of the universe out of which it arises. Each process of appropriation of a particular element is termed a prehension. I have adopted the term ‘prehension’ to express the activity whereby an actual entity effects its own concretion of other things. In Cartesian language, the essence of an actual entity consists solely in the fact that it is a prehending thing—i.e. a substance whose whole essence or nature is to prehend. (PR 218)

Cobb scaffolds this description in noting that:

A prehension is an internal relation. That is, it is internal to the prehending occasion while being external to the prehended occasion. The prehension does not change what it prehends, but the subject of the prehension becomes what it becomes through its prehensions. (Cobb, WWB, p.31)

This can be visualized in a corresponding logic with the example of two notes and their interacting waveforms. For instance, if I strike one note on a piano while holding the sustain pedal, and then another note right after it, the general tendency is for the two notes to become as one. The second note can be said to morph into the first, and together they form the simplest chord, an interval, in which both notes must feel each other. “The cause passes on its feeling to be reproduced by the new subject as its own, and yet as inseparable from the cause” (PR 355).

Whitehead defines each prehension as consisting of three factors: (a) the ‘subject’ which is prehending, namely, the actual entity in which that prehension is a concrete element; (b) the ‘datum’ which is prehended; and (c) the ‘subjective form,’ which is how that subject prehends that datum (PR 35). These three factors combine to characterize Cobb’s description of a prehension as “the bond between two actual occasions. The past occasion shares in the constitution of the new occasion” (WWB, 32). Thinking back on the example of two notes interacting by virtue of succession, the two waveforms generate either a harmonic or a cacophonous chord-value. Accordingly, Whitehead distinguishes between two types of prehensions: those of a positive variety and a negative variety. He states:

There are two species of prehensions, the ‘positive species’ and the ‘negative species.’ A ‘feeling’ belongs to the positive species of prehensions. An actual entity has a perfectly definite bond with each item in the universe. This determinate bond is its prehension of that item. A negative prehension is the definite exclusion of that item from positive contribution to the subject’s own real internal constitution. A positive prehension is the definite inclusion of that item into positive contribution to the subject’s own real internal constitution. This positive inclusion is called its ‘feeling’ of that item. All actual entities in the actual world, relatively to a given actual entity as ‘subject,’ are necessarily ‘felt’ by that subject, though in general vaguely. (PR, 66)

In the case of the two notes, values that are positively prehended by the ‘initial aim’ proceed as positive values participating in the phases of concrescence, while negatively prehended values are excluded from the phases of concrescence that ultimately form a ‘satisfaction’. Positive prehensions are called by Whitehead “feelings.” In terms of the selection and synthesis of values, this is referred to in renormalization theory as the coarse-graining procedure, a very important concept, as we saw in chapter three.

Prehension begins by supposing a subject (X) that prehends. X prehends the initial data giving rise to the objective datum (B), and each X,B relation forms a unique prehension value that is either positive or negative, depending on the initial aim of the initial datum (of the initial data). If it is a positive prehension, then it is called a feeling; if it is a negative prehensive value, then it affords only noise and is thus excluded from the phases of concrescence. This collection of feelings, as one amalgamated condensate, or concrescence, forms the final actual entity. Within the phases of concrescence, feelings link up with (or follow sequentially from) other feelings in order to form simple/complex conformal and comparative feelings.
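Since the procedure just described has an algorithmic flavor, a toy rendering may help fix the logic. The sketch below is purely illustrative, our own analogy and not Whitehead’s formalism; the data values and the “aim” rule are arbitrary. A subject assesses each item of the initial data, positively prehended items become feelings and enter concrescence, negatively prehended items are excluded, and the feelings are integrated into a single satisfaction.

# A toy, purely illustrative rendering of the procedure described above (our
# own analogy, not Whitehead's formalism; the data and the "aim" rule are
# arbitrary): a subject assesses each item of the initial data, positively
# prehended items become "feelings" and enter concrescence, negatively
# prehended items are excluded, and the feelings are unified into a single
# "satisfaction".
from dataclasses import dataclass

@dataclass
class Prehension:
    datum: float       # the objective datum (B)
    positive: bool     # True = positive prehension ("feeling"); False = excluded

def prehend(initial_data, subjective_aim):
    """Assess every datum; the (assumed) aim decides inclusion or exclusion."""
    return [Prehension(d, subjective_aim(d)) for d in initial_data]

def concresce(prehensions):
    """Integrate the feelings (positive prehensions only) into one satisfaction."""
    feelings = [p.datum for p in prehensions if p.positive]
    return sum(feelings)   # a crude stand-in for the unified final value

initial_data = [0.3, -1.2, 0.8, -0.1, 2.4]
satisfaction = concresce(prehend(initial_data, subjective_aim=lambda d: d > 0))
print(satisfaction)        # only the positively prehended data contribute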

In this capacity, prehension presupposes a subject, the initial datum, with an initial aim that prehends the set of ‘objective data’ (of the ‘initial data’) toward either a positive or a negative outcome of prehension. As data are positively selected for inclusion into the concrescence, the subjective aim guides them into conformational and supplementary phases of subjective form (see PR 114, 127, 133).
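
To make this procedural reading explicit, the toy sketch below (our own illustrative construction, not Whitehead's) models prehension as a filter that routes each datum into either a positive ‘feeling’ or a negative exclusion, with the surviving feelings fused into a ‘satisfaction’; the threshold predicate standing in for the initial/subjective aim is entirely hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Prehension:
    datum: float
    positive: bool              # True: a "feeling"; False: excluded as "noise"

def concresce(initial_data: Iterable[float],
              aim: Callable[[float], bool]) -> float:
    """Toy sketch: sort each datum by the aim, keep only the positive
    prehensions (feelings), and fuse them into one 'satisfaction'."""
    prehensions: List[Prehension] = [Prehension(d, aim(d)) for d in initial_data]
    feelings = [p.datum for p in prehensions if p.positive]
    return sum(feelings)        # the amalgamated condensate of feelings

# Hypothetical aim: admit only data above a threshold.
satisfaction = concresce([0.25, 1.0, -0.5, 0.75], aim=lambda d: d > 0.5)
print(satisfaction)             # 1.75
```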

Simple Physical Feelings

Physical feelings are always derived from some antecedent ‘experient.’ (PR 336)

Whitehead cites physical feelings in the capacity of: 1) the actual occasions felt, 2) the eternal objects felt, 3) the feelings felt, and 4) its own subjective form of intensity (PR, 211). As he explains, “ in the process of concrescence the diverse feelings pass on to wider generalities of integral feeling ” (PR 211). Included are elements of ‘identity’ and ‘contrast.’

This datum, which is the primary phase in the process constituting an actual entity, is nothing else than the actual world itself in its character of a possibility for the process of being felt. This exemplifies the metaphysical principle that every being is a potential for a becoming. The actual world is the objective content of each new creation [...] A simple physical feeling has the dual character of being the cause’s feeling re-enacted for the effect as subject. But this transference of feeling effects a partial identification of cause with effect, and not a mere representation of the cause. It is the cumulation of the universe and not a stage-play about it […] Simple physical feelings embody the reproductive character of nature, and also the objective immortality of the past. In virtue of these feelings time is the conformation of the immediate present to the past. Such feelings are conformal feelings. The conformal stage merely transforms the objective content into subjective feelings. (Sherburne, 1966, p.40-41)

Feeling here is not about the cognitive, conscious states of an organism; it is not subjective in the ordinary sense. It concerns ontology and is part of physical theory. Contrary to the colloquial sense of the term, “feeling” is nuanced in Whitehead’s work and carries a physical denotation; in fact, we will see that feeling is at the heart of Verlinde’s string-theoretic analysis of open-string one-loop amplitudes.

To be positively prehended is to become a feeling of a new subject for later phases of concrescence. Surprisingly, Whiteheadian feelings will help us understand strings. In Verlinde’s model this means that open strings which are not integrated out represent positively prehended values akin to gravitational self-energy. As Whitehead describes: “Each actual entity is a throb of experience including the actual world within its scope. It is the process of ‘feeling’ the many data, so as to absorb them into the unity of one individual satisfaction” (PR 117).

A positive prehension is termed a feeling. If a datum is prehended negatively it is simply excluded; a positive prehension, however, yields that content as data that go into the formation of the satisfaction. From this we can also conclude that not all elements of the data, and not even necessarily a majority, are used to form the (eventual) satisfaction. However, all are assessed, and the ones not used are called negative prehensions and are withheld from the combinatorial phases of concrescence of that actual occasion. 28

A ‘simple physical feeling’ entertained in one subject is a feeling for which the initial datum is another single actual entity, and the objective datum is another feeling entertained by the latter actual entity. (PR 236)

Whitehead describes how “ simple physical feelings embody the reproductive character of nature, and also the objective immortality of the past. In virtue of these feelings time is the conformation of the immediate present to the past ” (PR 214). Such feelings are conformal feelings. The conformation can be described like two notes mixing into one waveform to form a chord.

A simple physical feeling is an act of causation. The actual entity which is the initial datum [A] is the ‘cause,’ the simple physical feeling [X] is the ‘effect,’ and the subject entertaining the simple physical feeling [B] is the actual entity ‘conditioned’ by the effect. This ‘conditioned’ actual entity [B] will also be called the ‘effect.’ All complex causal action can be reduced to a complex of such primary components. Therefore simple physical feelings will also be called ‘causal’ feelings [or feelings of causal efficacy]. The ‘power’ of one actual entity on the other is simply how the former is objectified in the constitution of the other. (PR 259)

28 We will compare this language with the processes described in Verlinde’s proposal in the next two chapters.

This can be taken as a possible explanation of the simple dynamics of striking one note on a piano, followed by another, and how they interact to form a “dual resonance” said to be either harmonic or anharmonic. Next Whitehead links simple physical feelings to conformal feelings. As he explains:

A simple physical feeling has the dual character of being the cause’s feeling re-enacted for the effect as subject. By reason of this duplicity in a simple feeling there is a vector character which transfers the cause into the effect. It is a feeling from the cause which acquires the subjectivity of the new effect without loss of its original subjectivity in the cause. Simple physical feelings embody the reproductive character of nature, and also the objective immortality of the past. In virtue of these feelings time is the conformation of the immediate present to the past. Such feelings are conformal feelings. (PR 245)

Causal feelings are also identified with simple physical feelings. As Ford explains, they are understood as “not ‘reaching out beyond’ (PR 236) the occasion to other actualities; they are wholly within the occasion” (Ford, 1984, p.218). In another passage, Whitehead further refers to the initial datum as the cause, A, and to the simple physical feeling, X, as the effect. The “subject entertaining the simple physical feeling, B, is the actual entity conditioned by the effect” (PR 363). As he continues:

Simple physical feelings will also be called ‘causal’ feelings [or feelings of causal efficacy]. The ‘power’ of one actual entity on the other is simply how the former is objectified in the constitution of the other. (PR 363)

Whitehead explains that the term “feeling” is used as a neologism, but that “it has been chosen to suggest that functioning through which the concrescent actuality appropriates the datum so as to make it its own ” (PR 220). In one of his more famous passages he develops the notion of a feeling as “ a transition effecting a concrescence analyzable into five factors that express what the transition consists of, and effects ” (PR 221). These are:

(i) the subject which feels;
(ii) the ‘initial data’ which are to be felt;
(iii) the ‘elimination’ in virtue of negative prehensions;
(iv) the ‘objective datum’ which is felt;
(v) the ‘subjective form,’ which is how that subject feels that objective datum.

Here ‘feeling’ is the term used for the basic generic operation of passing from the objectivity of the data to the subjectivity of the actual entity in question. Feelings are variously specialized operations, effecting a transition into subjectivity. They replace the ‘neutral stuff’ of certain realistic philosophers. A feeling appropriates elements of the universe, which in themselves are other than the subject, and absorbs these elements into the real internal constitution of its subject by synthesizing them in the unity of an emotional pattern expressive of its own subjectivity . (KPR, 8)

From the perspective of the initial aim of the initial datum, only the positive feelings are chosen, based on the initial aim and subjective aim. As feelings take shape within the phases of concrescence, the subjective form is said to emerge and take shape as well, helping to guide and orchestrate the feelings into a unified bundle of values that, like all sound, also has a certain affective valence linked to emotion. The ‘subjective aim’ (PR 133, 212) is a feeling of what the process may achieve, namely the satisfaction possible to it, together with ‘appetition’ toward the realization of this relevant ideal; as Christian explains, “the envisagement of and tendency toward satisfaction gives unity to the many feelings in the concrescence” (1959).

In addition, Whitehead adopts a mathematical frame for the prehension process, referring to “feelings” as vectors: “for they feel what is there and transform it into what is here. We thus say that an actual occasion is a concrescence effected by a process of feelings” (KPR, 8). In Verlinde’s model we consider vectors like the one-loop amplitudes running between two D-branes in a matrix model. Given Whitehead’s description of the vectorial analysis of prehension, as “a feeling from the cause which acquires the subjectivity of the new effect without loss of its original subjectivity in the cause” (KPR, 11 ff. 12), it is not much of a stretch to link this to an open string in the off-diagonal of the matrix gaining an expectation value in the Higgs branch of phase space and being lifted into the Coulomb branch. We consider this later, in chapter six.
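
For readers unfamiliar with the matrix-model language invoked here, a schematic and simplified way to display the distinction is to write the position matrix for a two-D-brane system with its diagonal and off-diagonal entries separated; the labels below are illustrative only:

\[
X \;=\;
\begin{pmatrix}
x_{1} & w \\
\bar{w} & x_{2}
\end{pmatrix},
\]

where the diagonal entries \(x_{1}, x_{2}\) label the positions of the two branes and the off-diagonal entries \(w, \bar{w}\) stand for the open-string modes stretching between them. On the reading proposed above, integrating out these off-diagonal modes corresponds to negative prehension, while retaining them as contributing degrees of freedom corresponds to positive prehension.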

Subjective Forms

An actual entity, on its subjective side, is nothing else than what the universe is for it, including its own reactions. The reactions are the subjective forms of the feelings. (PR, 378)

The subjective form is largely entailed in the description of a feeling. Whitehead cites several examples: “emotions, valuations, purposes, adversions, aversions, consciousness” and more (PR). In the process of subjective form origination the many initial data become a complex objective datum whereby:

The subjective form originates and carries into the feeling the way in which the feeling feels. The way in which the feeling feels expresses how the feeling came into being. It expresses the purpose which urged it forward, and the obstacles which it encountered, and the indeterminations which were dissolved by the originative decisions of the subject. (PR 232)

When one entity prehends the other, how they feel each other determines their shared value. When two notes are compared to each other, the interval created between the two is a synthesis of both notes in a chord. The chord is greater than either note in itself and more than the sum of both: it is a synergetic property made possible by the laws of music and harmony; and yet, the chord still cannot be said to exist without the expression of both (all) of its basic notes. As Whitehead explains:

The essential novelty of a feeling attaches to its subjective form. The initial data, and even the objective datum, may have served other feelings with other subjects. But the subjective form is the immediate novelty; it is how that subject is feeling that objective datum. There is no tearing this subjective form from the novelty of this concrescence. It is enveloped in the immediacy of its immediate present. The subjective form is the ingression of novel form peculiar to the new particular fact, and with its peculiar mode of fusion with the objective datum. In the becoming, it meets the ‘data’ which are selected from the actual world. (PR 233)

More concisely, Cobb explains how “a prehension consists in an objective datum as well as a subjective form. The objective datum is what it prehends. The subjective form is how it prehends it ” (WWB, 31).

The subjective form is the immediate novelty; it is how that subject is feeling that objective datum. There is no tearing this subjective form from the novelty of this concrescence. It is enveloped in the immediacy of its immediate present (Ibid).

This passage can be taken to signify the confining structure of the chiral bag as the environment in which the objective data are each prehended individually, under the asymptotic-freedom conditions of one state in particular as described by that bag model, as a holographic dual of an initial snapshot. We see both descriptions play out in the thought experiment of two notes played in a space where they can be analyzed in comparison with each other, as in musical set theory.

Initial/Subjective Aims and Decision

The initial aim provides the causality of EO’s. As Cobb explains:

Whitehead affirms that purposiveness characterizes the subjective existence of all occasions. They all aim at a creative synthesis of the prehensions that arise from their physical data. This aim does not arise from the actual occasions that constitute its actual world , although it is directed at the particular value that is possible given the actual world of the occasion. (Cobb, WWB, 72)

It is generative and emergent, and it directs the metric/scale for the creative synthesis of the prehensions. I would compare it to the tonic note of a musical scale. Thus, if the tonic is C, you know the parameters of the scale: there are no flat or sharp notes in it, and you also know its basic chord structures once you know the tonic. The tonic is thus compared to the initial aim, according to Whitehead and Cobb’s descriptions: it offers the acoustic parameters of that vector. “This [subjective] aim doesn’t derive from the actual occasions” (WWB, 72) because the AE’s represent the data assessed for value given the tonic of the initial datum and initial aim. The subjective aim operates, however, only on those values that are positively selected during prehension to enter into the phases of concrescence, where it guides and characterizes the progressive evolution of the concrescing AE.

In his Whitehead Word Book, Cobb begins the entry on Subjective Aim and Decision with a mention of teleology and purposiveness in nature (WWB, 57). As Cobb explains, “For Whitehead, all experience is purposive.” As such, the subjective aim depends on the feelings (positive prehensions) selected for the phases of concrescence. The subjective aim is what that occasion becomes given the elements/feelings added to it (qua positive prehension) in each successive phase of concrescence. Each new datum (feeling) added into the phases of concrescence adapts and is adopted so as to influence, characterize, and further contrast the subjective aim of that occasion throughout its phases of concrescence, leading to the final satisfaction.

This describes a creative process selecting structures on the basis of the components available in each phase, for the sake of the maximal potential (intensity) of each set of data per phase of concrescence. The set of positively prehended values continuously changes as each new feeling and contrast is added to the mix in successive phases, until the final concrescence gives rise to the satisfaction of that occasion.

The subjective aim guides the organization and assembly of feelings (as “self-determined” decisions) over the phases of concrescence. The subjective aim also describes the feelings as they progressively find a place within the generative synthesis over concrescent phases, for the sake of the satisfaction of each occasion.

Incorporating a principle of harmony into the narrative of AE’s we could also develop the subjective aim in terms of a harmonic potential. Harmony provides the “loose harness” of the subjective aim, to borrow the phrase from Robert Frost. The subjective aim is like the harmony spectrum of dynamic parameterization within the phases of concrescence per that evolving (creative) occasion.

Each occasion “ aims ” at achieving some value. Indeed it aims both at realizing some value in and of itself and also at contributing some value to future occasions. Every occasion has this “subjective aim.” (WWB, 58)

All decisions are made in light of, and in accordance with, the subjective aim, whose harmonic parameters allow for more than one decision in progressively concretizing its subjective form. Such dynamics can also be exemplified in Robert Frost’s definition of freedom as “running easy in harness.” The harness is like the subjective aim, where each decision is self-determined within a small set of alternatives, given the initial aim of the subjective aim. What it becomes within these parameters speaks to the progressive contextualization of its subjective form.

The aim of an occasion is not at value in general, but at some particular realization of value that is possible in that situation. In general, value is attained by generating and heightening “contrasts.” (Cobb, WWB, 58)

Thus, each physical datum positively prehended by the initial aim into a feeling adds a contrast and limiting harmonic function to the set of harmonic potential. The harmonic potential is what gets contrasted with each additional note to narrow down the harmonic spectrum of creativity, per occasion, the more it contextualizes itself causa sui. “Every occasion is in part causa sui” (Cobb, WWB, 31). Thus, the subjective aim is what guides the process in its evolving expectation potential per associative components selected during the phases of concrescence qua positive prehensions, culminating in a satisfaction.

To put it in terms of an old TV game show from the 1980s, “Name That Tune,” the subjective aim refers to the “reason for” the ability we have to name a song from hearing only a few notes played in sequence. The subjective aim is what allows you to name the tune from hearing just a few notes. In another example, as any good soloing instrumentalist knows, you can listen to a song you’ve never heard before and suddenly be able to play along with it and predict its local harmonic series, if not the exact notes. Thus, the subjective aim can also be recognized as the reason and platform for improvisational accompaniment, or soloing, as an instrumental and creative capacity of a musician. Therefore, the subjective aim must represent a real phenomenon in order for us to partake in these capacities of music-making and spontaneous analysis based on autonomic cognitive computations of acoustic phenomena; thus:

The aim of most occasions is completely unconscious. Even in conscious occasions, the aim is generally not consciously in view. Nevertheless, it influences the whole process of concrescence. (WWB, 58)

The computations required to be able to improvise or recognize a song just based on listening to its tones are largely handled on the autonomic brain-scale. However, the ability to become familiar with this on an affective and known scale is to be able to then solo, improvise, and recognize songs from just a few notes. The subjective aim and subjective form (plus initial aim) of each entity can all be said to describe unique elements in providing an aim and specifying an actual value within that aim via contrasts to the (expectation) potential.

The subjective aim guides the positive expectation values (feelings) given the subjective form, or the substance of the song qua melody, etc., based on the initial aim. The initial aim of the AO comes from the initial datum and characterizes the coupling constant (in physics) or “sets the tone” for combinatorial dynamics of the objective data during prehension and the phases of concrescence. Out of the initial aim a subjective form is creatively constructed, as guided by (and contrasted from) the subjective aim.

Concrescence

“An actual occasion is a concrescence effected by a process of feelings.” (PR 211) “It repeats in microcosm what the world is in macrocosm.” (PR 215)

Explicating a synthesis of terms, Whitehead describes how “the first analysis of an actual entity into its most concrete elements discloses it to be a concrescence of prehensions which have originated in its process of becoming” (PR 35). This points towards Whitehead’s “ontological principle,” which states that “every reason for what an occasion becomes is found in some actual entity” (Cobb, WWB, 31). Thus, prehension enumerates the “live options” (James) of potential data for the phases of concrescence qua a substantive selection process that enumerates components to underwrite the formation of a maximally harmonic amalgam of organized entities: the satisfaction of that occasion.

“All further analysis is an analysis of prehensions” (PR 35).

Therefore, the driving operation during the phases of concrescence is found to bear in the analysis of prehensions. Whitehead explains how: “ There are two species of process, macroscopic process and microscopic process. The macroscopic process is the transition from attained actuality to actuality in attainment; while the microscopic process is the conversion of conditions which are merely real into determinate actuality ” (PR 214). The microscopic describes the phases of concrescence.

The phases of concrescence are where the positively prehended feelings organize into optimal combinations and sequences of conformal and comparative feelings capable of reordering with each successive phase of concrescence, given new data that accompany it. By the time of Process and Reality, Whitehead conceives of concrescence as “ a process of unification requiring an initial many to unify ” (Ford, 1984, p.213). More specifically, it is “ a process of unification of simple physical feelings of a multiplicity of past actual occasions: a process presided over by a subjective aim ” (Ford, 1984). As Cobb explains, “ Everything that happens in the process of concrescence presupposes the unity that is its outcome ” (2008). Whitehead establishes the original sentiment, explaining how:

An actual occasion is nothing but the unity to be ascribed to a particular instance of concrescence (PR 212). This concrescence is thus nothing else than the real internal constitution of the actual entity; to use Locke’s terms, “it is the ‘real internal constitution’ of the actual entity.” (PR 210)

Whitehead uses the word concrescence to signify this process of becoming which constitutes the new actual entity. “ The word concrescence is a derivative from the familiar Latin verb, meaning growing together... [and]… also has the advantage that the participle ‘concrete’ is familiarly used for the notion of a complete physical reality ” (AI 303). As he explains:

Concrescence is the name for the process in which the universe of many things acquires an individual unity in a determinate relegation of each item of the ‘many’ to its subordination in the constitution of the novel ‘one.’ An actual occasion is nothing but the unity to be ascribed to a particular instance of concrescence. (PR 211)

The process of concrescence is divisible into an initial stage of many feelings, and a succession of subsequent phases of more complex feelings integrating the earlier simpler feelings up to the satisfaction which is one complex unity of feeling (Sherburne, 1966; 36).

The primary stage in the concrescence of an actual entity is the way in which the antecedent universe enters into the constitution of the entity in question, so as to constitute the basis of its nascent individuality. A simple physical feeling is one feeling which feels another feeling. But the feeling has a subject diverse from the subject which feels it. A multiplicity of simple physical feelings constitutes the first phase in the concrescence of the actual entity which is the common subject of all these feelings. All the more complex kinds of physical feelings arise in subsequent phases of concrescence, in virtue of integrations of simple physical feelings with each other and with conceptual feelings. (Sherburne, 1966; p.40)

Whitehead refers to this first phase as the “ phase of pure reception of the actual world in its guise of objective datum for aesthetic synthesis ” (PR 212). “ In this phase there is the mere reception of the actual world as a multiplicity of private centers of feeling, implicated in a nexus of mutual presupposition ” (PR 212).

Concrescence is simply the process of becoming concrete. Concrete means fully actual. As Cobb explains, the term ‘concrescence’ places emphasis on the idea that even these momentary flashes of actuality Whitehead calls actual occasions are processes (2008).

Concrescence is nothing else than the real internal constitution of the actual entity […] this is a theory of monads; but it differs from Leibniz’ in that his monads change. In the organic theory, they merely become . Each monadic creature is a mode of the process of ‘feeling’ the world, of housing the world in one unit of complex feeling in every way determinate. Such a unit is an ‘actual occasion’; it is the ultimate creature derivative from the creative process. (PR 105)

“Concrescence is not in time, rather time is in concrescence ” (Sherburne, 1966; 38). Put differently, concrescence is said to take place all at once and yet is also told in order of phases. This is referred to as genetic time by Whitehead. As Cobb explains:

Even if a concrescence occurs, temporally speaking, all at once, to understand it requires analysis into the stages or phases of its becoming. Much of Process and Reality is an account of these phases. The concrescence can be analyzed genetically in more than one way, resulting in naming and counting the phases differently. Given the fact that these phases have no separate existence, we do not have correct and incorrect analyses. For different purposes we may analyze the concrescence somewhat differently. Whitehead calls this the genetic analysis of the occasion. (WWB, 2008)

As noted, the concrescence can be analyzed genetically in more than one way; Whitehead calls this the genetic analysis of the occasion (Cobb, 2008). This seems to anticipate the multiple variations of renormalization techniques, chief among them coarse graining and foliation, as described and modeled in this study.

In Sherburne’s 1966 book, A Key to Whitehead’s Process and Reality (KPR), he offers a diagram outlining the phases of concrescence. Here he classifies the phases of concrescence into five distinctions: 1) conformal feelings, 2) conceptual feelings, 3) simple comparative feelings, 4) complex comparative feelings, and 5) satisfaction. He references the conformal feelings as the initial phase, and the conceptual and comparative feelings as the supplementary phase.

In a recent personal conversation (2013), John Cobb, reiterating his entry in the Whitehead Word Book (2008), simplified Sherburne’s phases into 1) the initial, conformal phase, 2) the supplementary, responsive phase, and 3) the satisfaction. This calls for a brief description.

The conformal phase is that in which the new occasion reenacts the past. This is the causal efficacy of other actual entities for the present concrescing occasion. Physical feelings, both pure and hybrid take place in this phase. In most of the world this is the dominant factor. It establishes the endurance of things. (Cobb, WWB, 61)

Actual occasions are only definable near the end of the cycle, once the transition from final concrescence to satisfaction takes place. Before that, an occasion is defined by its phases of genetic formation, as demonstrated in Whitehead’s description as well as, associatively, through Verlinde’s emergent gravity framework.

Satisfaction

Perhaps the best way to analyze the internal structure of actual occasions is to study the “satisfaction” of an occasion, in which its unity is achieved. (Christian, 1959)

Sherburne describes the process of concrescence as “ the process of integrating the initial welter of many simple physical feelings into the one complex unity of feeling that is the satisfaction” (KPR, 41). He explains how this proceeds in accordance with several Categoreal Obligations, or Categories of Obligation (KPR 41).

There is the actual occasion in the process of becoming, and then there is the completed occasion. Whitehead calls this completion the satisfaction of the actual entity. The term emphasizes that this process of becoming is characterized by subjectivity (Cobb, 2008). Whitehead defines a satisfaction as the final phase of concrescence (or the process of integration of feeling) in which prehensions are integrated into a concrete unity. The term first appears in his category of explanation xxv in the following passage:

The final phase in the process of concrescence, constituting an actual entity, is one complex, fully determinate feeling. This final phase is termed the ‘satisfaction.’ It is fully determinate (a) as to its genesis, (b) as to its prehension—positive or negative—of every item in its universe. (PR 38)

“It is clear that the satisfaction is a feeling” (Christian, 1959). Whitehead is shown to have described the satisfaction as a feeling in numerous other passages as well (see also PR 38-9, 66, 71, 434; AI 298). The satisfaction is a single complex feeling, unifying all the component prehensions in the concrescence (Christian, 1959, p.25). As Whitehead states, the satisfaction is “the concrete unity of feeling” obtained by the process of integration (PR 322). It is an experience which has “intensity” (PR 129) or, more specifically, “quantitative emotional intensity” (PR 177). It is “subjective” (PR 82) and it is “immediately felt” (PR 235). As Whitehead explains: “There is a mutual sensitivity of feelings in one subject, governed by categoreal conditions. This mutual sensitivity expresses the notion of final causation in the guise of a pre-established harmony” (PR 221). We can also link this to the feelings (positive prehensions) that are mutually sensitive qua differentially harmonic with each other, in the formation of each satisfaction as a harmonic packet of maximal intensity. One condition of “intensity,” as Christian explains, is that:

Feelings should not inhibit but should mutually enhance one another. Achievement of the maximum intensity possible to an act of experience therefore would require a single integrative feeling in the immediate subject, in which all the component feelings would be adjusted to one another in a pattern of contrasts. It seems clear therefore that whatever else the satisfaction of an actual occasion may be, it is a feeling immediate to its subject. It is an emotional experience of some positive intensity” (PR 177) and is felt by its subject. (p.25; 1959)

The subjective aim is a feeling of what the process may achieve, namely the satisfaction possible to it, together with ‘appetition’ toward the realization of this relevant ideal. The envisagement of and tendency toward satisfaction gives unity to the many feelings in the concrescence (Christian, 1959).

In the conception of the actual entity in its phase of satisfaction, the entity has attained its individual separation from other things; it has absorbed the datum, and it has not yet lost itself in the swing back to the decision whereby its appetition becomes an element in the data of other entities superseding it (PR 233). It provides the individual element in the composition of the actual entity. (PR 129)

The [feeling of] satisfaction is what becomes and perishes. Though it does not exist except in the context of processes of becoming and perishing, it is not itself such a process. As Christian explains, in a sense the satisfaction “is present in the process from the beginning, but only in the sense that it is the ideal at which the process aims and by which the process is guided” (Christian, 1959). In Verlinde’s account this is linked to a re-definition of the polymer on the horizon after thermalization.

The “ideal of itself” (Gifford lectures) is the final conceptual pattern of the satisfaction, progressively defined over the concrescence (Ford, 1984, p.221). As Whitehead states, “the progressive definition of the final end is the efficacious condition for its attainment” (PR 150). We can take this to cross over into Verlinde’s model to represent the maximum of coarse graining, which remains potential until realized during the prehension phase. Once the maximal amount of coarse graining has taken place, all the component feelings of the concrescence thereby reach final form and assembly into a satisfaction. As Whitehead explains, “In its self-creation the actual entity is guided by the ideal of itself as individual satisfaction and as transcendent creator. The enjoyment of this ideal is the ‘subjective aim’ by reason of which the actual entity is a determinate process” (PR 130, 341-2).

The satisfaction is not, however, merely a goal, pursued but not achieved; the process actualizes its ideal (Christian, 1959). The notion of an actual entity involves “an attainment which is a specific satisfaction” (PR 130). The concrescence “reaches the goal” (PR 251) of satisfaction. An occasion “enjoys its decisive moment of absolute self-attainment as emotional unity” (AI 227). Christian sets the matter plainly: “Whatever else the satisfaction of an occasion may be, it is clearly something not only aimed at, but achieved. It is indeed, in the earlier phases of a process of experience, an ideal. But it also becomes a fact” (Christian, 1959).

Unity and Determinateness

By a process of progressive integration (Christian, 1959) “completion is arrived at—at least, such ‘formal’ completion as is proper to a single actual entity” (Nobo, 1986; see also PR 44, 248, 322-3, 327, 373; AI 247; MT 123). In the feeling of satisfaction this process of progressive integration culminates and all indeterminations have ‘evaporated’ (PR 71). As Whitehead states, “all indeterminations respecting the potentialities of the universe are definitely solved so far as concerns the satisfaction of the subject in question” (PR 234). Christian follows this, adding that “there is no longer any ambiguity about the relation of the occasion to other entities. The categoreal obligation of objective identity is satisfied” (1959). As Whitehead explains, “the datum of the satisfaction is a complex unity of actual entities, eternal objects, and propositions, felt with corresponding complex unity of subjective form” (PR 434). In this datum every item in the universe of its subject is implicated (see PR 38).

The determinateness of satisfaction is required by the categoreal scheme. The categories of subjective unity and objective identity require that the occasion achieve a completely definite and consistent character. As Whitehead quips, “Becoming is the transformation of incoherence into coherence” (PR 38). Those same categoreal obligations taken together with the principle of relativity require further that the satisfaction be completely determinate in relation to all other entities. The principle of relativity is that “every item in its universe is involved in each concrescence” (PR 33). The satisfaction must then be determinate and self-consistent, and related to every other entity in its universe. This is possible only if its relation to every other entity is determinate (Christian, 1959).

The satisfaction therefore establishes the actual occasion as a completely definite and self-consistent whole, with definite and unambiguous relations to every other entity in its universe. Nothing can henceforth be added to the experience of the occasion; as Whitehead describes, “The final satisfaction of an actual entity is intolerant of any addition” (PR 71). No further adjustments within the experience of the occasion are necessary or possible. With the achievement of the satisfaction the individual character of the occasion has been finally determined (Christian, 1959). The satisfaction is a complete feeling (Christian, 1959). “It is by virtue of this feeling of satisfaction, aimed at and attained, that any actual occasion is a real individual, a genuine term in the universal scheme of relatedness” (Christian, 1959). The feeling of satisfaction involves no internal process within itself. It is the final act of “decision” (Ibid). The satisfaction is the outcome of the internal process of becoming, and it leads to the transition to the future (Christian, 1959). The satisfaction of an actual occasion is related to the process of concrescence as the “outcome” (SMW 247) of this process. It is that state of complete coherence of feeling in which the becoming of the actual occasion terminates (PR 71). The satisfaction is often described as “the final phase in the process of concrescence” (PR 38, 227-8, 323). In its complex subjective form the satisfaction embodies the history of its own becoming. As Whitehead says, “the genesis and internal history of the concrescence have left their mark upon the final feeling” (PR 354).

The satisfaction represents a pause in the midst of the flux. The pause is not empty; it is occupied by a single complete feeling. It is the “halt for attainment” (MT 139). As Whitehead explains, “the attainment of a peculiar definiteness is the final cause which animates a particular process; and its attainment halts its process” (PR 223; see also p. 340).

Taken as such, the AE’s reflect a generative process. Within this process there is some interpretive ambiguity as to whether concrescence begins from a primary datum or from an initial multiplicity of data. Indeed, Whitehead propounds both options over the course of the Gifford draft and up to his final PR draft. By the end he leans towards the initial-multiplicity model. In the next chapter we’ll explore how we can reconcile this “data/datum dilemma” through an appeal to string-theoretic modeling of holographic black-hole dynamics. This stands to update the AE’s in light of the most current models of physics and to bring resolution and coherence to the internal issue concerning the initial components of the concrescence process.

4.7 — SUMMARY

This chapter began with the intent to portray the ambit scenario of a holographic process in which the AE’s can be located: an experience-based process paradigm and philosophy of organism within a speculative ontology. Eastman’s update from a substance- to an event-metaphysics was also described. In addition, a basic framework for the elements discussed in the comparative chapters has been provided by way of a set of categoreal components of the AE’s, each identified in Whitehead and developed in the context of this project. In the next chapter Whitehead’s “corrected categories of existence” (as just developed) will be shown to indicate the emergent nature of “multiplicities” in a way that proves to overlap nicely with Verlinde’s recognition of string theory as emergent.

Chapter 5 — Origination, Emergence, & Reenactment

The purpose of these next two chapters is to provide a comparative analysis of Whitehead’s and Verlinde’s models. We assumed at the outset (in chapter one) that Whitehead’s and Verlinde’s models hold the potential to be interrelated in a productive manner; that is, if they could be integrated, we would have a philosophical grounding for emergent gravity as well as a physical and gravitational grounding for actual entities. The goal is to reconcile their dynamics within a conceptual framework grounded ultimately in processes and events. Within this, space-time is also derived from the same ultimate source (an event-logic). Thus, nature begins with a process, not a substance.

This chapter aims to correlate four connections between Verlinde’s and Whitehead’s models: (1) Whitehead’s “eternal objects” in correspondence with Verlinde’s ‘microscopic information’ qua ‘pre-event strands’ (Chew, 2008). We consider this like a sea of bosonic strands. This initial environment will be shown to yield a clarification of the ‘measurement problem’ in physics, not as a direct collapse of the wave-function, but as a sampling of “strands” by virtue of a snapshot of a manifold. This gives rise to initial conditions defined in the capacity of (2) a “multiplicity” of “initial data” (in Whitehead) qua “open strings” on a “D-brane,” leading to the formation of (3) a “primary datum” (in Whitehead) correlating with a “closed string,” or a “phonon.” In (4), recognition of the “snapshot” as holographic leads to the modeling of a dual projection. This is shown to resolve what this study calls the “data/datum dilemma,” representing the historical tension in Whitehead over whether ‘concrescence’ begins with a ‘primary datum’ or a ‘multiplicity’ of ‘initial data.’ The extensive claim to be demonstrated is that by constructing the dynamics of an event-logic we arrive at a clarified understanding of string theory and a platform for Verlinde’s EG in conjunction with Whitehead’s AE’s. Both physics and philosophy are (re)united in this move.

We find it important to remember that Whitehead’s AE’s are constructed in a philosophical framework without direct connection to any specific scientific framework, and thus inevitably require some adaptation for the interpretation of particular scientific results. For this reason we trust that many Whitehead scholars will welcome the consideration of advances within mathematical physics, cosmology, and gravitational physics, clarified only in the last few decades (or even years), as providing potential clues for fine-tuning Whitehead’s programme.

Verlinde emphasizes how string theory lacks a concrete narrative accounting for how all the pieces fit together. As Gross explains: “string theory awaits something that changes its dynamics in a fundamental way” (2014). Verlinde notes that string theory has useful elements, but they are not presented (as Emerson says) with any ‘epical integrity.’ The same goes for Whitehead’s AE’s: they are described but not put together. In both cases what is missing is an overall integrity tying the pieces together in a unified order or logic. Whitehead leaves this for future generations, perhaps understanding that it would take a few before the required physics would be adequately developed.

Accordingly, the next two chapters promote a candidate renaissance in string theory and AE’s by situating and developing both within an event-based narrative. Such a rendering illustrates the role (the objects of) string theories can play in the larger context of this basic event-logic underwriting nature and experience. To these ends, the narratives of string theory and Whitehead’s AE’s are brought into a uniform, epical integrity purchased through the development of this underlying process-cycle, also fulfilling Whitehead’s foreshadowing of an experiential foundation for ontology.

Indeed, for Whitehead’s metaphysics to hold, experience must also hold a fundamental status in cosmos and nature. Here we describe the processual dynamics of an experiential metaphysics distinguished from substance metaphysics. As Leclerc highlights: “Whitehead’s fundamental divergence from the philosophical tradition lies in his thoroughgoing acceptance of process as a basic metaphysical feature of actuality” (Leclerc, 1958). At the same time, both elements of being and becoming, relations and relata, subject and object are foundational within the Whiteheadian framework, not simply “process” over against “substance.” Here, “feeling” and “experience” function, at least in part, as metaphor, and:

Despite the implications of its name, the process philosophy of Alfred North Whitehead does not opt for the Heraclitean alternative; nor does it attempt to make permanence in any way the subservient member of the pair. Rather, it asserts that being and becoming, permanence and change must claim coequal footing in any metaphysical interpretation of the real, because both are equally insistent aspects of experience. (Krause, 1997; p.1)

The clear emphasis on an “event” and “process” logic is motivated, in part, by basic philosophical arguments and by the historical context that the substance-thinking dominant in the Western tradition for millennia needed to be balanced by an emphasis on the processual aspects of physical existence and human experience (see also Rescher, 1996 and 2003, on process metaphysics and experience).

Following Aristotle’s ‘first’ philosophy and Whitehead’s speculative approach (see Ramal, 2003), the process developed here represents an anchor from which the two models are shown to trace a parallel course underwriting the foundational dynamics of physics, cosmology, and gravitation, encountered in the framework of Verlinde’s emergent gravity and Whitehead’s AE’s. As such, the descriptions from Verlinde’s scientific model and Whitehead’s philosophical model represent two aspects that, when combined, narrate one effective cycle of a process underwriting both material and experiential systems.

To these ends, we will link the integrated-out modes of open strings in the matrix with negative prehension. The corresponding claim to establish is that Whitehead’s ‘positive prehensions’ supply the corollary logic underwriting the generative dynamics of gravitational self-energy within the string-theoretic description of Verlinde’s emergent gravity. Even stronger, we place ourselves in a position to reconcile both models as attributes of one, deeper process. This further serves to leverage a transition from a substance to a process paradigm in an experientially predicated metaphysics.

To achieve the desired overall integrity of this event-cycle, we begin with a rudimentary identification of basic constructs from both scientific and philosophic guidelines and constraints regarding foundational, underlying process in an event-logic. Given Whitehead’s ‘speculative’ philosophical notion of taking “experience” as the first mode through which the universe is ordered and understood, we regard this process as underwriting both conscious and material systems from ‘first’- and ‘second-person’ approaches. 29 Within this we identify an initial mode at “vacuum” scale wherein a process begins. In this same way we also approach Verlinde’s physics.

5.1 — THE GEOMODAL CONSTRUCT

Geometry and Physics: Two Metrics, not One

Whitehead also developed a principle of relativity, serving as a mathematical and philosophical rival to Einstein’s, as we will explore in chapter seven. Einstein rendered XT equivalent to geometry; thus, curvature is equivalent to gravity, such that gravity is not a real force but a pseudo-force. Whitehead, on the other hand, instead of combining physics and geometry into one metric, retains the more Newtonian and Maxwellian position, keeping the two separate and treating gravity as ‘real.’ This enables him to develop two metrics within his alternate rendering. As Tanaka explains:

Whitehead was convinced that geometry should be distinguished from physics. Geometry represents the uniform relatedness of nature, especially of spatiotemporal relations. Physics treats the contingent properties of nature. These convictions were related to his rejection of scientific materialism and of the bifurcation of nature. The theme of physics, according to him, is not the material things themselves cut off from the perceptual data but the perceived phenomena which show themselves "contingently" in the uniform framework of space-time. (1987)

Thus, for Einstein we speak of STR and GTR, whereas for Whitehead we refer to the first and second metrics (see Fowler, 1975). This connection can be taken one step farther: Einstein’s STR is shown to be equivalent to Whitehead’s first metric (Fowler, 1975). 30

Minkowski’s Lightcone

Einstein’s STR is based on the Minkowski lightcone to such an extent that the lightcone provides the background, or amphitheater, for all values and activity therein; indeed, STR is simply a theory of events linked by finite-speed light signals. This makes Whitehead’s first metric equivalent to the Minkowski lightcone qua local, relativistic SR frame of an experiential system. As Fowler explains: “ Whitehead’s approach is very similar to Minkowski’s in that Whitehead accepts Minkowski spacetime as the arena for explanation ” (1975, p.60). We find Coleman also explains this, paraphrasing Synge’s classic paper:

29 See Nathalie Depraz on second-person phenomenology.
30 See also Russell and Wasserman within Eastman’s “Process Studies Supplement,” in conjunction with the “Resource Guide for Physics and Whitehead,” 2004.

The first essential thing to observe about Whitehead’s theory is that it uses the space-time of the Special Theory of Relativity, or, more correctly, the space-time of Minkowski. Mathematically, this means that we consider a four-dimensional continuum of events and in it certain privileged systems of coordinates (x,y,z,t) related to one another by Lorentz transformations. (Synge, 1951; Coleman, 2005)

Elements of Minkowski space are called events or four-vectors (see e.g., Jancewicz, 1988; Reddy, 1994; Heald and Marion, 2012). We will consider this in light of Whitehead’s “event ontology.” Strictly speaking, the use of Minkowski space to describe physical systems over finite distances applies only in the limit of systems without significant gravity. When dealing with significant gravitation, however, spacetime becomes curved and we must shift from special relativity to the full theory of general relativity (see e.g., Walter, 1999). In the limit of negligible gravity, XT becomes flat and looks globally, not just locally, like Minkowski space; it is then referred to as flat spacetime.
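
For reference, and in standard notation (the signature convention here is our own choice), the Minkowski line element and a Lorentz boost along the x-axis take the familiar forms

\[
ds^{2} = -c^{2}\,dt^{2} + dx^{2} + dy^{2} + dz^{2},
\qquad
x' = \gamma\,(x - vt),\quad
t' = \gamma\!\left(t - \frac{vx}{c^{2}}\right),\quad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}},
\]

with events represented by four-vectors \((ct, x, y, z)\) and the interval \(ds^{2}\) left invariant under such transformations.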

Conventionally, the 4d lightcone of XT consists of two lightcones, one forward-facing and one backward-facing. The hypersurface of the present appears like a vector through the central crossing of the two (opposite-facing) cones. By turning these cones inwards, instead of a “point” we arrive at a “region” of space demarcated by the intersection of the two lightcones. Here, the hypersurface appears in one higher dimension: 5d anti-de-Sitter space (AdS).

Figure 2 c/o Stib, 2007

Given that we can locate this principal basis within the definite structure of an inverted lightcone (following Lockwood’s treatment; see Jammer, 1999), it is also realized that this represents a uniform structure. By inverting Minkowski’s lightcone, Lockwood argues that we acquire the emergent field of consciousness and come to recognize the space and time of the mind in the spirit of his first procedure. 31 From this we acquire the space and time of conscious (operational) systems.

Every act of theorizing begins with human experience [and] addresses some problem, puzzle, or difficulty encountered in human experience….this fact requires the recognition that human experience has primacy for the construction of any theory whatsoever. But truly to recognize the primacy of human experience is to recognize that human experience must itself be adequately conceptualized and elucidated, and thus must be the subject of a theory more fundamental and more comprehensive than any other. (Nobo, 2004)

31 Lockwood solves for and interprets ‘the mental event’ as emerging in between, or at the intersection of, two light-cones, in a region that Lockwood determines must be called the space and time of the mind.

In our geomodal framework this includes 5d via the redistribution of the hypersurface of the present into AdS. Minkowski’s 4d local, relativistic lightcone is also inverted in the geomodal model to represent Whitehead’s first metric in the context of the sensory-present of a conscious experience or individual. Correspondingly, the second metric is represented by a uniform manifold in background “vacuum” space.

Second Metric, dJ², gravitational

First Metric, dG², qua special uniformity of XT

The second metric will be developed in the capacity of a derivative metric, not a fundamental one. While the manifold of the hypersurface is considered fundamental qua soular, the individual ‘durations/events,’ or snapshots, are patently derived from instants of topological measures involving the simultaneity of eternal objects in manifold space. Thus, the hypersurface of the present provides a derivative metric, not a fundamental one. By comparison, the first metric represents a uniform geomodal element of consciousness from a conceptual, first-person vantage. This provides a uniform, virtual metric for first-person experience as part and parcel of the universe.

Symbolically, the classical arrangement of two outward-facing Minkowski lightcones meets in the middle at a point, or intersection. This issues a sign or symbol for the point-like classicality of a substance metaphysics seeking ultimate exactness. When we invert the lightcones, however, instead of a point we immediately recognize an interior region of space that emerges, shifting from a substance metaphysics to an experiential, process metaphysics. Thus, by inverting the lightcones we geo-symbolically bring the model into a process and emergent alignment consistent with the burgeoning interpretation of modern physics, string theory, and cosmology (see Eastman, 2008).

The inverted Minkowski lightcone represents the essential construct for consciousness in the geomodal model. Here we attribute the emergent, toroidal region of the model with the primary field of sensory-perceptive awareness/consciousness and experience in any given individual. When we speak about our field of consciousness and awareness, therefore, we are referring to the inverted (special) uniformity of Minkowski’s lightcone formalizing the space and time of consciousness, as Kant sought to define. Here, the observer is not at a point, but in a region of spacetime.

Hypersurface of the Present and Manifold

In geometry, a hypersurface is a generalization of the concept of hyperplane. Suppose an enveloping manifold M has n dimensions; then any sub-manifold of M of n − 1 dimensions is a hypersurface. 32

32 See Kobayashi and Nomizu, 1969.

Normally we consider both hypersurfaces and hyperplanes as existing in one fewer dimension than their associated space (G.O. James, 2010). When we invert the lightcone it is possible that we also transform the hypersurface into a five-dimensional object. We consider such an arrangement here. 33
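
A standard concrete example may help fix the definition: the unit sphere is a hypersurface of Euclidean 3-space, realized as a regular level set,

\[
S^{2} \;=\; \{\, x \in \mathbb{R}^{3} : \|x\|^{2} = 1 \,\},
\qquad
\dim S^{2} = 3 - 1 = 2,
\]

and, more generally, any regular level set \(f^{-1}(c)\) of a smooth function \(f : M \to \mathbb{R}\) with nonvanishing differential is a hypersurface of dimension \(n-1\) inside an \(n\)-dimensional manifold \(M\).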

A simple inversion of the lightcone (as in Lockwood’s model) fails to account for the hypersurface of the present. This is because the inversion of the lightcones leads to an emergent space/field that Lockwood calls a “mental space” and a decision space, and that the geomodal model develops as the field of consciousness/awareness/intuition/insight. Given the emergence of “mental space” → “field of consciousness/awareness” as an elliptical field, this pushes the hypersurface of the present up into the pre-material, or potential, “eternal objects” realm where it acts as a frame of reference capable of converting values from potential into actual. In this setting, the manifold gives mass; the horizon makes values real. We’ll come back to this slogan later.

[Geomodal diagram: the field of (conscious) awareness shown in relation to the hypersurface of a manifold]

The field of (conscious) awareness reflects the agential “now,” or present tense. The hypersurface of the present reflects the present in “vacuum”-connection, as a privileged (unique) spatial frame-of-reference. The hypersurface is derived from a manifold in 5d.

Whitehead’s second-metric is considered as a derivative hypersurface of a manifold, or “representational apparatus” (Schwanauer, 2005): a snapshot-taking mechanism (in the geomodal description). This is logically achieved through the realization that both the first and second metrics are like holographic correspondences belonging to the identity and existence of experiential systems (and their operators). In this sense, the hypersurface gives each system a unique connection (frame of reference) in “vacuum” space—as our “vacuum”-level connection to the universe and others, collective consciousness, history, etc. (see Chew, 2008). This takes place at the microscopic domain underwriting matter and conscious systems in “vacuum” 5d background space.

The manifold represents a frame-generator of hypersurfaces (snapshots, D-branes). Schwanauer describes these as ‘representational frames’ of a ‘representational apparatus’ (2003, 2005). Aside from grounding our metaphysics in an experiential platform, this move also places experiential systems within nature. The second metric, however, does not appear within conscious awareness (PRel, 1922), as Whitehead himself held; it is instead considered ontologically prior, in distinguished immediacy, to the sensory-present, only made available in these “snapshots” representing the starting points of the AE process as well as a basis for Dn-branes in string theory.

33 AdS-space also involves a hypersurface.

The hypersurface defines a snapshot of the frame of the ontological present of immediacy of the background kinematic elements of the physical field (dJ²) in Whitehead's Principle of Relativity (PRel). As such the hypersurface is a derived value, like a snapshot: it is a sampled or captured moment. Geomodally, the manifold is like the frame of a camera constantly generating hypersurfaces (snapshots) of the contingent contents streaming through it. In Whitehead's terminology the manifold is an event-generator.

The hypersurface represents the ontological present, 5d

The ‘X’ represents the causal nexus between 5d and 3/4d

The intersection of two lower cones represents the sensory-now, 4d of 3d

A uniform basis for factual experience (consciousness of the world) is presented in the elliptical region of the geomodal model, while the proto-constituent values of experience bear in the hypersurface connection in "vacuum"-mode of ontological immediacy, as dynamical strands. Symbolically, we recognize the natural distinction between the two metrics in the geomodal model qua basis of the X [here "X" denotes a region of intersection vs. earlier usage within this book of denoting a spatial dimension] that emerges in the layout. This X is taken as the symbol for Schwanauer's "causal nexus" (2005) as an intermediary "representational-apparatus" between the two modes of higher- and lower-dimensional representations, distinguishing between the ontological immediacy of the hypersurface of the present in the microscopic setting and the sensory-conscious "now" of the present, in conscious systems.

Ontological Immediacy v. Sensory-Conscious Present

The interesting thing about Whitehead's approach is that, while his epistemology is developed through an appeal to our immediate sense-awareness in the analysis and structure of nature, in the geomodal method AE's are posited as modes-of-nature that persist, given their ontological immediacy, at a phenomenal time-scale just prior to the (ordinary) awareness spectrum in human experience. 34

The way we maintain both is to recognize that immediate sensory experience refers to the sensory qua physiological and affective, as drops of pure experience in phonons 35 that make immediate impact at the lowest physiological levels, just below the (ordinary) neural threshold of awareness. In the geomodal model this describes the ontological immediacy of phonons making interoceptive, affective, and physiological impacts just below the (common) threshold level of awareness.

34 Even though still posited to make a sensory impact directly on physiological and affective systems, and putting the two together, into qualia of the neurovisceral axis, perhaps preferentially exuded in the 2x2 matrix of higher cortices (see Craig, 2009; Allman, 2010; Mayer, 2012). 35 Referring back to the earlier discussion of "phonon" propagation in this fundamental information-theoretic substructure.

As such we must qualify Whitehead's claim that "nature is given in sense awareness as now present" (Fowler, 1975, p.63) to offer instead that nature is given in AE's as ontologically-present and therefore prior to brain-signaling. This precludes the vast majority of fundamental "feelings" and values from being consciously witnessed in the 'now' of sensory systems, and only dimly felt, if at all. 36 To explain this simply we need only to recognize that consciousness is more than we see in visual experience. We are conscious of a 'now' moment of the ever-streaming present, but in fact, there is also a more-subtle mode of consciousness whose values transpire processually in ontological immediacy, signaling prior to the 'presentational immediacy' (PR 192, 198) of sensory-systems.

Translated into Taoism: the 'now' that can be named (experienced) is not the true 'now'—there is a hidden order of dynamics and values persisting before this. The 'actual' now pertains prior to signal conductance and neural complexity; thus, as the East ascribes, the perceived world is a persistent illusion: it's real, no doubt, but not where the true mode-of-values associated with experiential-dynamics occur. 37 This describes the distinction between the modes of actual, ontological-immediacy corresponding to an individual's "vacuum"-level connection, and the 'specious present' ('now') perceived through sensory systems. The geomodal model clarifies that immediate-actuality takes place at a scale of dynamics once-prior to agential 38 sensory-awareness by some temporal lapse (see Peirce, Einstein). To these ends, qualia are sensory, but at the lowest-level of physiology and experience ordinarily below our conscious-threshold of awareness.

There are thus two 'presents' of consciousness described in the geomodal model: the ontological (operational) and experiential (agential). Here, sensory experience is perceived at some degree of ontological delay from actual occasions to the extent that ontological immediacy takes place in a phase prior to sensory experience. 39 As such we also might imagine two time-systems associated with Whitehead's first and second metrics (PRel, 1922). Geomodally, "ontological immediacy" is derived through snapshots of strands. This qualifies experience as an effective present, like only seeing the sun's light eight minutes after it shines. This inherent delay predicates the temporal distinction between actual, ontological immediacy and the sensory-conscious present of experience (represented in the toroidal region of overlapping lightcones). We thus retain Whitehead's appeal to an 'immediacy of sense-awareness' but reconstruct it to recognize an 'immediate slab of nature' as once-removed from the sensory-present and not readily available to human perception, 40 though possibly underwriting the interoceptive faculty.

36 Eastman considers "feeling" (positive physical prehension) in Whitehead's framework with reference to biologist Charles Birch's book Feelings, written in 1995. 37 This line of logic, "real but not true," comes from George Kollias, high school humanities teacher, 1997. 38 Agential = 1st person; Operational = 2nd person, of autopoietic operations below the neural threshold. 39 This is characteristically exhibited in the famous Bereitschaftspotential of motor activity in neuroscience. 40 This proves advantageously so; otherwise it would be too complex to make out objects in the world.

5.2 — WHITEHEAD & VERLINDE SIGNATURES in an EVENT-NARRATIVE

How an actual entity becomes constitutes what that actual entity is; so that the two descriptions of an actual entity are not independent. Its ‘being’ is constituted by its ‘becoming’. (PR 23)

The inversion of Minkowski’s 4dxt lightcone generates the symbolic space-time construct of consciousness in the geomodal narrative. Minkowski’s hypersurface of the present, in the inverted mode, becomes the connective-token bearing each individual a unique residence in the “vacuum,” 5d space. This generates the following figure:

This provides the basic, geomodal construct describing consciousness and the pre-material (soular) connection from a first-person, experiential metaphysics. The X in the model symbolizes a causal nexus signifying the earmark of a process (Schwanauer 2005). Further, the arc represents the AdS horizon. This locates the 5d local "vacuum" connection of the soular system of each conscious individual in a black hole construct.

While the lightcone (experiential, sub-cellular, neurobiological) is well-established from a theoretical perspective, it does not enter directly into the mainstay of this study; rather, we will focus instead on the development of a primordial process (in the upper cone) whose applied dynamics are shown to bear equivalently across the domains of EG, string theory, and the AE’s. As such, we will only attend in this study to the upper manifold: the ontological, cosmological, and physical components (Whitehead’s ‘physical pole’). 41 Through this construct is substantiated a process phasing from the upper diamond into values in AdS propelled to the horizon. This refers to a foundational process underwriting conscious and material systems, as the cornerstone of experiential metaphysics in a process paradigm.

The first installment of this event-logic can be categorized into four topics guiding the conceptual progression of a narrative sequence. Accompanying these are four pre-linguistic tokens representing the kinematic action of an event/process-cycle read into string theory, emergent gravity, and the actual occasions. This begins with: 1) sea of strands; 2) snapshot; 3) phonon; 4) holographic dual of snapshot. The following exposition will provide an effective overview of synchronized correspondences between the objects, properties, and principles found both in Whitehead's AE's and Verlinde's emergent gravity models. The basic scenario is one where an intermediary process spans between two modes of being—from the pre-spacetime scenario of a holographic black hole, to a real value added to the horizon (like a polymer, see Verlinde, 2010). Verlinde also takes a holographic black hole scenario as one of his starting points for emergent gravity (the other being 'information theory'). We develop this intermediary process in the context of an event-logic utilizing both Whitehead's AE's and Verlinde's EG models.

41 Experiential and physical poles form the basis of Whitehead's AE's as a 'dipolar' process.

I. SEA of STRANDS — streaming through local background ("vacuum") of physical metric, dJ²; kinematic elements; microscopic data underwriting gravitational force; continuous potentialities into atomic actualities

This process begins with the description in a 5d "vacuum" environment of a sea-of-strands streaming in continuous flux through a local background space. Strands appear in the basic semblance of those postulated by Chew (2004) and are in this study further depicted as evolving potentialities of dynamical information whose values are attributed to the organization of internal parameters. Visually these appear like strands of digits with slowly-evolving cores and fast-fluctuating tails, teased out in the following images:

The tails bear the property of appearing in a state of dynamic-instability, or perpetual flux. Given the constant flux of strands, they are not attributed a definiteness of value; as Whitehead explains, "in the essence of each eternal object there stands an indeterminateness which expresses its indifferent patience for any mode of ingression into any actual occasion" (SMW 171). They are like coded digits of coefficient decimals with dynamical tails, and as such they can also be associated with massless particles, i.e. bosons. Such a basis for a sea of strands in "vacuum" is at least plausible, given Eastman's account of how "recent research results (primarily within the past three decades) have dramatically shown how energetic particles, fields, and space plasmas permeate the cosmos at all levels; that is, there is no such thing as a pure 'vacuum' anywhere" (Eastman, 2004). Thus, our modern concept of nature revises from a static universe to reflect that "all things are constituted ultimately by networks of relationships, from the microscopic to macroscopic and cosmic scales" (Eastman, 2008).

All strand values moving through the background plenum of events can travel freely through the manifold as if it were not there. The only time the manifold 'takes mass' is in a snapshot wherein all strands at that instant are captured. Even though any strand is eligible for inclusion in a snapshot, only those found in the frame at that moment are included in the "actual world" (PR 73) and transformed from a "continuous potentiality" into a constituent of "atomic actuality" — as the initial conditions of a process. Similarly, Chew calls for the identity of pre-event, pre-material, "vacuonic strands," where 'material' refers to all matter, including strings (see Eastman, 2004). As Chew states:

Interpretation needs to be found for ‘loose vacuonic strands’ of history that lie outside material fabric even though comprising pre-events that, individually, are similar to those building material reality. The vast majority of history consists of such loose strands, which influence material history, although not in the manner, according to the laws of materialistic physics, by which one piece influences another. (2004)
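To fix ideas before turning to Verlinde's microscopic data, the following is a minimal, purely illustrative sketch of the strand/snapshot convention described above. The names Strand, Manifold, and snapshot, and every numerical choice, are hypothetical conveniences of this exposition; they are not drawn from Chew's, Whitehead's, or Verlinde's formalisms.

```python
# Illustrative sketch only: a toy rendering of the geomodal "sea of strands" and the
# manifold "snapshot." All names and parameters here are hypothetical conveniences.
import random
from dataclasses import dataclass, field

@dataclass
class Strand:
    """A strand: a slowly evolving core with a fast-fluctuating tail."""
    core: float                                          # slowly evolving value (kinematic element)
    tail: list = field(default_factory=list)             # fast-fluctuating digits (eternal objects)

    def evolve(self) -> None:
        self.core += random.gauss(0, 0.01)               # slow drift of the core
        self.tail = [random.random() for _ in range(5)]  # tail in perpetual flux

class Manifold:
    """An 'empty' frame: strands stream through it; it only 'takes mass' in a snapshot."""
    def __init__(self, window):
        self.window = window                             # region of the frame at this instant

    def snapshot(self, sea):
        lo, hi = self.window
        # Only strands inside the frame at this instant are captured (the "actual world");
        # each is frozen into a static sample, a 'virtual' approximation of the strand.
        return [(s.core, list(s.tail)) for s in sea if lo <= s.core <= hi]

if __name__ == "__main__":
    sea = [Strand(core=random.uniform(0, 1)) for _ in range(100)]
    for s in sea:
        s.evolve()
    frame = Manifold(window=(0.4, 0.6))
    frozen = frame.snapshot(sea)                         # the "multiplicity of initial data"
    print(f"{len(frozen)} of {len(sea)} strands captured in this snapshot")
```

The only point of the sketch is the asymmetry it encodes: strands evolve continuously, while a snapshot is a derived, frozen sample of whatever happens to be in the frame.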


Verlinde’s Microscopic Data

In Verlinde’s model, the fluctuating-strands are considered as microscopic data. Strictly speaking, however, Verlinde’s model is independent of the microscopic theory (Verlinde 2010; Chen 2010). Given the holographic theory, “ the dynamics on each screen [are] given by some unknown rules, which can be thought of as a way of processing the information that is stored on it. Hence, it does not have to be given by a local field theory, or anything familiar. The microscopic details are irrelevant for us ” (Verlinde, 2010). This means the theory of emergent gravity should not depend on the underlying details of the microscopic theory from which it emerges, only the total entropy, making it an ‘effective’ theory. Verlinde explains:

To determine the [gravitational acceleration] force we don't need the details of the information, nor the exact dynamics, only the amount of information given by the entropy and the energy that is associated with it. If the entropy changes as a function of the location of the matter distribution, it will lead to an entropic force. (Verlinde, 2010)
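As a worked illustration of the passage just quoted, the heuristic chain of Verlinde's argument can be written compactly: an entropic force times a displacement equals temperature times the entropy change near a holographic screen, and the Unruh relation ties that temperature to acceleration, recovering Newton's second law (following Verlinde, 2010):

```latex
F\,\Delta x \;=\; T\,\Delta S, \qquad
\Delta S \;=\; 2\pi k_{B}\,\frac{m c}{\hbar}\,\Delta x, \qquad
k_{B} T \;=\; \frac{1}{2\pi}\,\frac{\hbar a}{c}
\;\;\Longrightarrow\;\; F \;=\; m a .
```

Only the entropy change and the energy associated with it enter; no microscopic detail of the screen's degrees of freedom is needed, which is the point the quotation makes.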

As such, Verlinde's model is able to leave out the microscopic theory as inessential, and thereby also the philosophical roots that locate the theory in metaphysics and ontology. We realize in this that science begins from an 'effective' level, and thus so too does theory. Echoing Emerson: "something is wanting in science until it has become humanized" (1838). We apply Whitehead's AE's within an event-logic with the intent of filling this "vacuum".

In terms of a string-theoretic correspondence, when we turn to the very first, bosonic string theory we find a possible connection involving tachyons. Specifically, ‘continuous strands’ (Chew, 2004) could be framed in the context of (bosonic) tachyons and this would account for why they aren’t witnessed (as such) in the material sector: they describe not ‘strings’ but ‘strands.’ Whereas Pauli’s exclusion principle qua fermions implies exact or critical strings, the bosonic (massless) strands demonstrate the ability to contain several coexisting values. In fact, fermionic strings might only exist in snapshots. 42

Whitehead’s ‘Continuous Potentialities’

Whitehead drew a basic distinction upon which his ontology can be based: continuous potentialities versus atomic actualities. "Continuity concerns what is potential; whereas actuality is incurably discrete" (PR 61). As he continues: "actuality is the exemplification of potentiality, and potentiality is the characterization of actuality" (MT 96). "Actual entities make real what was antecedently merely potential" (PR 72). 'Pure potentialities' represent microscopic data of the physical field/metric, dJ² (PRel, 1922), operating in a pre-space-time, pre-material, pre-quantum mode corresponding to a plenum of values in "vacuum".

We can make a further link between the 'continuous potentialities' of microscopic data (strands) and the kinematic elements of Whitehead's second metric, dJ² (PRel, 1922).

42 As open-strings of a Dn-brane.

Kinematic elements, similar to eternal objects, may be termed “ pure potentials for the specific determination of fact, or forms of definiteness ” (PR 22). Here, Whitehead’s “continuous potentialities” are taken to refer to evolving strands streaming through the “vacuum” of Minkowski-space like quantum fluctuations in an evolving superposition.

Whitehead explains how “ entities in their capacity as infinite aggregates cannot be the termini of sense-awareness ” (CN 92). This speaks to the notion that strands in their continuous mode are not the termini of sense-awareness, either: we do not witness strands in their strand- mode, they are once-removed from sense-awareness in the “vacuum” mode. Here we might be led to follow Pythagoras to say that in this dimension, all appears in the mode of number. Whitehead also characterizes these potentials as ‘eternal objects’:

One type of potential entity, namely the pure potentials or eternal objects, while in their essential natures are eternal, by the ontological principle cannot exist except as implicated in some particular actual entities. (Leclerc, 1958, p.100ff.)

Within Whitehead scholarship the question emerges whether 'continuous potentialities' have their own realm or not (Christian, 1959; Nobo, 2004; Ford, 1984). Whitehead does not accept the doctrine of Platonic realism in the sense that it would suppose eternal objects as a kind of actuality existing independent from actual entities; instead, 'pure potentialities' are the microscopic data of the physical field/metric, dJ² (PRel, 1922); thus, they are not purely independent like Plato suggests. Leclerc attends to this by pointing out that potentiality is potentiality for actuality: "in their nature as potentials, eternal objects must be given to actual entities [...] Potentiality implies the capacity for givenness, given the role of indeterminateness in potentiality" (1958, p.98).

Here we attempt to update both Plato and Aristotle/Whitehead's views by re-envisioning the distinction between potentiality and actuality, like Chew (2004), under a modified scenario operating at a sub-Planckian pre-space-time mode. This enables us to describe how eternal objects (EO's) qua strands do have their own realm in the 5d wing; and that kinematic elements (KE's), like frozen strings of strands in snapshots, are also unique, representing the values initiating the sequence of a process/event logic.

In this way, Plato’s perspective that the two are independent is taken to indicate the fundamental difference in modality ; meanwhile, Aristotle’s relational perspective is accounted for by the way in which EO’s evolve over phases of the process. Thus, eternal objects and actual entities are not independent of each other, but related in the sense that the forms of definiteness of the EO’s exist “ as pure potentials in need of actual entities to turn their potentiality into actuality ” (Krause, 1997; p. 49).


II. SNAPSHOT — samples of frozen strands; the Measurement Problem; multiplicity of initial data; Dn-brane of open strings; continuous potentialities → atomic actualities; instantaneous spread of elements; actual world; duration; events

Each actual entity is conceived as an act of experience arising out of data. The objectifications of other actual occasions form the given data from which an actual occasion originates. (PR, 65)

The central topic to be addressed now is how this sea of strands, as "continuous potentialities," modally transforms into atomic actualities. In order to provide integrity to both Whitehead's and Verlinde's models, some mechanism or operation must be added (to AE's and string theory) in order to convert the streaming strands in "vacuum"-mode into initial conditions of the material sector. To answer this is also to provide substance to a narrative and description of the measurement problem in physics.

This section is organized around two topics: the snapshot of a manifold (in a sea of strands) as a “mechanism” for the measurement problem; and the snapshot as an “event”: a photograph of frozen strands depicting the “vacuum” elements (of a manifold) on a derived hypersurface. We reconstruct the same logic in string theory from a “Dn-brane of open strings,” and in the AE’s from a “multiplicity of initial data” — both taken as the initial conditions of a generative event-cycle of AE’s and foundations of physics.

Snapshot as Mechanism: offers candidate resolution to Measurement Problem

Classically, the measurement problem refers to the conflict between the linear dynamics of quantum mechanics and the postulate that, during measurement, a non-linear collapse of the wave-packet occurs (see Krips, 2013). Albert elaborates the point:

The dynamics and the postulate of collapse are flatly in contradiction with one another [...] the postulate of collapse seems to be right about what happens when we make measurements, and the dynamics seems to be bizarrely wrong about what happens when we make measurements, and yet the dynamics seems to be right about what happens whenever we aren't making measurements. (Albert 1992, 79)

We take this to indicate that the conceptual phenomenon of measurements is real, but understood in the wrong context, and describing the wrong elements. In order to comprehend the measurement problem we have to instead re-envision autopoietic measurements as snapshot-samples taking place one-level removed from human ability.

A few key properties emerge that prove useful for reinforcing descriptions of the strands; namely, the wave-function, superposition, and Heisenberg's uncertainty principle. We attend to these below. Converting the QM discourse into philosophical language, we recognize our arena within the classical distinction between 'potential' and 'actual.' Here, measurements are the mechanism that 'makes actual' the potential values; thus, we must also posit a passage, or process, from one mode to the other. Whitehead refers to this under the distinction between "continuous potentialities" and "atomic actualities."


We are reminded that in the geomodal metric the 4d hypersurface, as a foundational metric, disappears in the inversion into geomodal mode. To remedy this, the geomodal model in 5d is derived from a causal manifold, and the measurement problem is refashioned on the order of a camera scenario: the ultimate paparazzo.

This study considers a snapshot mechanism used to measure samples of strands, representing the "diversification" of a set of facts: an event. This means that strands cannot come into existence in the material sector except as sampled approximation-values qua frozen strings. These frozen moments of "continuous potentials" are what provide the samples for the "specific determination of fact" (PR 48) and 'forms of definiteness'. They pass from potential to actual by virtue of being captured in a moment.

Instead of a direct operation on the wave-function, collapsing it in a measurement, our 'access' to the wave-function is reconsidered as always in an indirect mode; to these ends, we adopt the language of a field biologist and speak of taking "samples" of the sea of strands. This describes the microscopic data of "continuous potentials" streaming through the "vacuum" of the physical field, dJ². Snapshots are fashioned from samples of these continuous potentialities, like hydro-forensic reports. The contents only appear to collapse (freeze) in the frame, but the wavefunction itself as a global property, which continues to represent possibility space (see Bub, 1999), does not actually collapse.
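A minimal sketch of this 'sampling without collapse' reading, assuming we model a strand as a normalized superposition with complex amplitudes and a snapshot as a Born-rule sample that records an outcome while leaving the stored amplitudes untouched; this is an interpretive aid for the paragraph above, not a claim about any cited formalism:

```python
# Hedged illustration: repeated "snapshots" as Born-rule samples of a superposition.
# The stored amplitudes (the global wavefunction) are never modified by sampling.
import random

def born_sample(amplitudes):
    """Return an outcome index with probability |c_i|^2 (amplitudes assumed normalized)."""
    probs = [abs(c) ** 2 for c in amplitudes]
    r, acc = random.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1

psi = [complex(0.6, 0.0), complex(0.0, 0.8)]          # |c0|^2 = 0.36, |c1|^2 = 0.64
samples = [born_sample(psi) for _ in range(10_000)]   # many independent "snapshots"
print(samples.count(1) / len(samples))                # ~0.64, while psi itself is unchanged
```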

Visually, this describes the snapshot and its data as an initial “event” that sets the process into motion. Snapshots are defined as approximations of exact strings qua frozen moments of strands with some degree of “ ultimate indeterminateness and indifference ,” as Whitehead describes (PR 113, 281). 43 Here we regard snapshots as derived hypersurfaces of a manifold in 5d. As such the snapshots are not fundamental but rather, emergent, virtual-abstractions. Whitehead explains in his earlier work, “ I have always insisted that our lowest, most concrete, type of abstractions whereby we express the diversification of fact must be regarded as ‘events,’ meaning thereby partial factors of facts retaining process ” (PRel, 1921). Tanaka draws out the point further:

According to Aristotelian ontology, Being precedes Becoming because the former is the actuality of the latter. The opposite is the case with Whitehead. Becoming is the actuality of Being: what has been thought to be substantial Being must be re-interpreted as derivative from Becoming. Therefore the most fundamental category of nature should be found in "events", and not in "substance." (Tanaka, 1987)

Whitehead also correlates an event (or 'duration') with an "instantaneous moment of time," herein describing it as "a slab of time with temporal thickness, and the final fact of observation from which moments and configurations are deduced as a limit which is a logical ideal of the exact precision inherent in nature" (PRel 7). As he explains: "a relativistic view of time is adopted so that an instantaneous moment of time is nothing else than an instantaneous and simultaneous spread of the events of the universe" (PRel 7). This means that we can further our understanding of an event (qua duration) as an 'instantaneous and simultaneous spread of the events' in an actual world (universe)—that is, in a snapshot. Formalizing it:

43 In QM we consider this in light of Heisenberg's Uncertainty Principle.

Our sense-awareness posits for immediate discernment a certain whole, here called a ‘duration’ [...] a duration is discriminated as a complex of partial events [...] a duration is a concrete slab of nature limited by simultaneity which is an essential factor disclosed in sense-awareness. (CN, 53)

The metaphysical key to a better understanding of Whitehead’s physics is his claim that “the spatiotemporal and lawful texture of the external world is an expression of the relational texture of the internal worlds of its elementary events ” (Desmet, 2010).

The concept of events as four dimensional structures plays the role of mediation between space and time. Both matter as a self-identical substance and space-time as a fixed framework of physics are to be deconstructed to the interrelation of becoming events. (Tanaka, 1987)

This speaks to the precursor logic to space-time that Moskowitz describes as the “ smaller ingredients that exist on a deeper layer of reality” where “ space-time’s properties emerge from the underlying physics of its constituents ” (2014). To these ends, Whitehead’s ‘instants of time’ are not point-like, but event-like collections of frozen strands in a snapshot. This also validates Whitehead’s resolve in process philosophy to update from a point to an event logic; as he explains, we must “ describe what a point is [by showing] how the geometric relations between points issues from the ultimate relations between the ultimate things which are the immediate objects of knowledge ” (PNK 5). Grünbaum paints the scenario:

On the relational theory of space demanded by relativity, geometry is the science not of absolute 'container'- space, but of the complex relations obtaining directly between physical things. […] We must therefore abandon the nineteenth-century approach to the foundations of geometry, an approach which proceeded from the assumption of points as ultimate given entities. (1953)

To describe the snapshot scenario in string theory we turn to the bosonic theory, this time to Sen’s solution for tachyonic condensation (1998), and describe a two-part interpretation within an event-narrative speaking to how the snapshot mechanism interacts with strands (as tachyons).

It turns out that the precise closed string states that are emitted from the brane depend on the physical interpretation of the Sen’s solution. The first is the one […] where one thinks of a tachyon field coming up from its true minimum to a point close to the local maximum associated to the unstable D-brane and then back to its true minimum. (Maldacena, 2007)

Within the event-scenario so far postulated, we are led to interpret Sen’s solution as a first-indication of the nature of a snapshot whereby a collection of (tachyonic) contents— flowing inertially through 5d “vacuum” as a ‘sea of strands’—are locally contained in a hypersurface frame and lifted from a minimum to a local maximum state associated with a snapshot (D-brane), and then back to a minimum in inertial flow through “vacuum.”


The snapshot is described as an infinitesimal "halting" of field tachyons in the frame, raised from a minimal value, streaming inertially through the "vacuum" background, to a local maximum, where a 'snapshot' is taken; the values are then released back into their inertial flow. This sets up the logic for the second interpretation of Sen's solution for tachyonic condensation; as Maldacena explains (2007):

The second interpretation of the solution is more closely related to the Euclidean computations. Here we cut the Euclidean path integral at t = 0 and paste it to the Lorentzian path integral. In this way the Euclidean path integral is viewed as a prescription for setting up initial conditions at t = 0. […] In this case the closed string radiation that comes out of a decaying brane is basically identical to the state at t = 0.
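A schematic form of the 'cut at t = 0' prescription mentioned in the quotation, offered only as background on how a Euclidean path integral can prepare initial data that then seed Lorentzian evolution (the boundary conditions shown are generic, not specific to Sen's solution):

```latex
% Euclidean half-space path integral as a wavefunctional at t = 0:
\Psi\big[\phi_{0}\big] \;=\;
\int_{\phi(t_{E}\to -\infty)\,=\,\mathrm{vac}}^{\;\phi(t_{E}=0)\,=\,\phi_{0}}
\mathcal{D}\phi \;\; e^{-S_{E}[\phi]},
\qquad \text{with Lorentzian evolution then proceeding from } \Psi \text{ at } t = 0 .
```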

It is 'basically identical' because it represents a collective value, or quasi-particle. Within a geomodal construct we take the first sentence to represent the corresponding property of a snapshot as a 'cut' (or partition function) of the Euclidean path integral. The Lorentzian evolution results from these initial conditions in the same sense that the principal process results from a Dn-brane of open strings, as the initial conditions. Furthermore, we also recognize the logic for the phonon as a large, 'closed-string radiation' bursting off the decaying brane of the snapshot. We describe this now.

Snapshot as Contents

Whitehead's description of a "duration" as "a complex of partial events" (PRel 7, 35) fits nicely into the postulation of a snapshot of dynamic strands frozen into samples as representing the initial conditions of an event-cycle. Here, 'partiality' corresponds with the 'virtual' nature of frozen strands to appear as if exact strings, only partially reflecting each strand sampled.

In string theory we also witness the fact that bosons and fermions are related to each other and call this supersymmetry, developing the axiom that for every boson there is a fermion, and vice versa. Here we would also suggest a relationship between bosons and fermions; however, we offer an alternative explanation for how this dynamical relation comes to pass. In doing so we take the fermionic strings as open strings on a Dn-brane and suggest that these are the corresponding fermion states to initial bosons—and in fact, that to consider a simple 1:1 correspondence could be to misunderstand the nature of bosons as a superposition of values. 44 Instead we develop the fermion correspondence to bosonic strings as the open strings on a Dn-brane spontaneously arising from snapshots of strands in a causal manifold. 45

Within this model, it is important to recognize how only a very small amount of the total strands are even realized; for one, we never capture a full strand in a snapshot. Strands streaming through "vacuum" mode each represent a superposition of values; this is also seen on account of the dynamical tail of each strand. In any given snapshot, only one moment of a superpositional strand of digits is portrayed. This is like taking a measurement of a quantum system: you don't get the whole system but only a moment or sample of the system. In QM, the property of "superposition" describes states with complex coefficients carrying phase information and producing interference effects (see e.g., Dantas et al., 2000; Mastriani, 2015). Secondly, only strands captured in a snapshot pass into material forms. This greatly constrains how much interaction there is with strands in the first place, possibly speaking to a model for dark matter, dark energy, and an alternative to a multiverse. 46 Snapshots represent the starting-basis for the AE process; as a mechanism we consider them a theoretical resolution to the measurement problem.

44 That is, how could a superpositional boson be represented by one fermion? 45 We might consider snapshots to take place under the logic of Penrose's OOR where intrinsic information-density levels in frame are what initiate QG collapses.

Frozen Strands in a Snapshot

Snapshots are like the first bits of experience rendering raw samples of the universal mode of continuous potentialities (strands) into frozen moments as (nearly) exact strings. When snapshots are taken, values in the frame become like frozen strands or open-strings attached to a Dn-brane. We could also consider these strands as bosonic strings.

A snapshot conforms to Whitehead’s description of a uniform, instantaneous spread of the universe qua kinematic elements 47 of the initial data of an occasion. As a sampling of dynamical strands in a snapshot, each strand is defined by a kinematic element and eternal object, as a glimpse into the values of that strand. The first earmark, or initial conditions of the AE process appear as virtual samples of KE’s/EO’s captured in the frozen moment of a snapshot: as the frozen strings of strands. To these ends Whitehead explains how “ actuality is the exemplification of potentiality, and potentiality is the characterization of actuality ” (MT 96); “ actual entities make real what was antecedently merely potential ” (PR 72) and “every occasion is a synthesis of being and not-being” (SMW 163).

This suggests the way in which superpositions → sampled measurements. The initial data represent the "antecedent experients" (PRel 10, 17) whose antecedence is defined in relation to the initial data and subsequent, 'objective data' (PR 190, 254, 263). In Whitehead we link KE's to the core of a strand, and EO's to the frozen moments or glimpses into the contents of that strand, as an event in the actual world of that strand. With the advent of a snapshot, EO's partially objectify in KE's in the sense that each KE reflects EO's in its tail value. The KE's in flux represent historical-time (Chew, 2008) while snapshots represent an instant of time. The configuration of frozen KE's/EO's resembles the 'logical ideal of exact precision' by virtue of appearing like exact strings of the strands.

In the following diagram, if formatting permitted, you would see a coefficient decimal:

46 Recalling Chew, the vacuonic strands make up the majority of values in the universe. 47 Kinematics = the study of motion excluding the effects of mass and force. Whitehead's "kinematic elements" also resemble "bit-strings" (see Noyes, Parker-Rhodes) of information.

Kinematic elements = the core of each strand; Eternal Objects refer to the dynamic tail values in the strand.

The dynamical content of each tail of a frozen strand is compared to an EO as a factor; the collection of them in a snapshot is compared to “ a collection of factors in contingent relationships ” (PRel 17). The dynamic, continuous strand itself is the fact, or kinematic element (PRel 18). “ What is in fact given to a concrescing actual entity is not pure potentials as such, but antecedent actualities as formed or determined by certain eternal objects ” (Leclerc, 1958). This can be shown to correlate with the setup of event-logic where pure potentials = continuous potentialities. This means that the snapshot data represent the antecedent actualities formed by certain eternal objects. We take this here to represent the antecedent actualities as open-strings on a Dn-brane determined by ‘certain’ eternal objects, or strands in the frame of a snapshot. Leclerc spells out the point in the following entry on EO’s: “ the antecedent entities are given through the mediation of eternal objects, and the eternal objects are given as implicated in the antecedent actual entities ” (Leclerc, 1958). The data are thus complex and Leclerc advises that we therefore look carefully into the complexity of this aspect. As he explains:

Eternal Object: in Whitehead, the only category of existence whose members are not created by the becoming of actual entities. A geometrical form, a shade of blue, an emotion, and a scalar form of energy are all examples of eternal objects. Basically they are qualia and patterns whose reproduction within, or ingression into, an actual entity render determinate the latter’s objective and subjective content. Thus, relative to actual entities, they are said to be forms of definiteness or pure potentials for the specific determination of fact. (Leclerc, 2009)

We take this to refer to a strand in a snapshot where the bulk represents the “fact,” and a “factor of a fact” refers to the EO that in that moment characterizes the value of the tail. As Whitehead explains, “ a factor will be said to be an adjective pervading a route when it is an adjective of every stretch of the route. Such a factor will be called a pervasive adjective, or uniform object ” (PRel 32). In this way, an “adjective” refers to a factor within a snapshot, and each factor, as the momentary product of the tail, is necessarily that factor on account of its membership in the general ‘fact,’ or bulk of the strand itself. As Whitehead states, “ Fact is a relationship of factors. Every factor of fact essentially refers to its relationship within a fact. Apart from this reference it is not itself. Thus every factor of fact has fact form its background, and refers to fact in a way peculiar to itself ” (PRel 14). The adjectives captured in a snapshot are “pervasive” (PRel 32) over the ensuing course of dynamics in the process.

While any eternal object can be part of an ingression into an actual entity, only some are, others being incompatible by virtue of not being contained in the frame of that snapshot. As Whitehead explains, not all EO’s everywhere are meant, but that “ any one-occasion prehends only a selection of the eternal objects ” (PR 31f). This refers to prehension of only those EO’s simultaneous in a snapshot.

Adopting the term "selection of EO's" fits nicely into the logic of a snapshot, while also corresponding with Whitehead's notion of the "actual world of an occasion;" thus, only the strands captured in a snapshot are valued into open strings on a Dn-brane. Whitehead also describes this by the fact that "every item in an actual entity's universe is involved in each concrescence, but not all entities are items in every entity's universe" (Lango, 1972; p.23). "The universe of an actual entity, its universe, is its actual world" (PR 33-34). Thus, rather than a snapshot of "the entire universe," we instead recognize it as covering only a hypersurface. We thus resolve Whitehead's notion of an "instantaneous configuration of the universe" (PRel 7) as belonging to the "actual world of an occasion."

Open Strings on a Dn-Brane

Dirichlet boundary conditions are distinguished from Neumann boundary conditions in that strings satisfying Dirichlet conditions have their endpoints fixed on a D-brane. As we saw in chapter two, D-branes are objects in string theory to which open strings attach (Polchinski, 1997). As Polchinski (et al.) explain: "in weakly coupled type II string theory, D-branes are defined by the property that fundamental strings can end on them" (Polchinski, Chaudhuri, Johnson, 1996; Kutasov, 1998). In fact, Polchinski's original accomplishment involved recognizing D-branes as equivalent with "black p-branes" (Kutasov, 1998).

A Dirichlet p-brane stretched in the hyperplane, located at a point, is defined by including in the theory open strings with Neumann boundary conditions for [the hyperplane] and Dirichlet boundary conditions for [the point elements]. (Kutasov, 1998) 48
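For reference, the endpoint conditions invoked in this passage can be written explicitly in the standard convention, with σ the coordinate along the open string and p the spatial dimension of the brane:

```latex
% Neumann along the brane (endpoints free), Dirichlet transverse (endpoints fixed on the brane):
\partial_{\sigma} X^{\mu}\big|_{\sigma=0,\pi} = 0 \quad (\mu = 0,\dots,p), \qquad
X^{i}\big|_{\sigma=0,\pi} = x^{i}_{0} \quad (i = p+1,\dots,D-1).
```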

Verlinde's description of emergent gravity also includes, on a string-theoretic level, Dn-branes of open strings stacked on top of each other. 49 As Verlinde explains, "In string language, the holographic screens can be identified with D-branes, and the microscopic degrees of freedom on these screens represented as open strings defined with a cut off in the UV" (2010).

This comes close to describing the type of dynamics Verlinde simplifies in the decoupling of open and closed strings. Within the context of a snapshot we consider a Dn-brane of open-strings attached at one end. Instead of stacked, we place them all on one complex surface. Verlinde is convinced, however, that there must be something more behind open-strings.

The event-narrative built up so far validates Verlinde's intuition through recognition of Dn-branes as the first components of the process corresponding to a "snapshot" of "strands." The data of strands generate the illusion of "enduring objects" (PRel 32, 56) qua frozen moments of otherwise dynamic and superpositional strands; we can thus resolve that open strings are not "really-real" but instead, virtual approximations of the frozen moments of strands; thus, not entirely emergent, but not representative of the full strand value, either. Open strings, therefore, are not recognized as fundamental values, but as proximal and partial values emanating from frozen moments of strands in a hypersurface.

48 Some approximations of black branes might lead to entropy consistent with energy from E=mc². 49 Open strings on a Dn-brane also resemble the 1/n expansion in YMT.

We can classify this particular relationship as found in the principal element of Buddhist metaphysics and worldview qua 'dependent origination'. In fact, this is precisely the way we clarify and contextualize the nuanced identity of open strings as distinguished from emergent (and holographic) closed strings. As such, open strings in an event-logic are shown to originate from the frozen moments of otherwise streaming strands in flux through the background "vacuum" dimension (in 5d). These arise via snapshots of all the strand elements within the manifold frame, like a camera frame. This means that open strings 'depend' for their 'origination' on frozen moments of strands.

Multiplicity of Initial Data

For Whitehead's counterpart, we identify a snapshot of frozen strands as open-strings on a Dn-brane relating what Whitehead calls a "multiplicity of initial data" (PR 20, 244). He also refers to KE's appearing in the instantaneous spread of a "multiplicity of initial data." From the last discourse we acknowledge that this is not a fundamental entity, either. In fact, Whitehead was the first to acknowledge this himself, when he corrected his "categories of existence" (PR 14, 48, 50) from a list of eight features, including the "multiplicities," into a list of six, excluding them (PR 48). In so doing, he concludes that multiplicities are real—representing the initial configuration of the process—but not persisting in themselves.

A multiplicity of simple physical feelings constitutes the first phase in the concrescence of the actual entity which is the common subject of all these feelings. All the more complex kinds of physical feelings arise in subsequent phases of concrescence, in virtue of integrations of simple physical feelings with each other and with conceptual feelings.

The reasoning for this follows from (his) “reformed subjectivist principle ” (PR 18, 214); this is the idea that “ data, as a sheer multiplicity, cannot be felt, and there can be no togetherness except togetherness in experience ” (PR 189f). As Ford explains, “ a multiplicity is not a proper entity; in fact it is difficult to see how it is any entity at all ” (EWM 214). By Whitehead’s correction of his ‘categories of existence’ we witness precisely the same thinking. As he states: “ The many feelings which belong to an incomplete phase in the process of an actual entity, though unintegrated by reason of the incompleteness of the phase, are compatible for synthesis by reason of the unity of their subject ” (PR 223). This leads to the description of a phonon as an emergent and collective-value. Open strings identify in event-logic as virtual features of snapshots, not fully emergent but not fundamental either. Instead, each value is causa sui (PR 33, 111, 113) or for itself, nominating a “relational essence” of data predicated on membership within the mutual adherence of a snapshot.

The notion of “ data as a sheer multiplicity ” candidly refers to a snapshot of frozen strands as a “sheer” set of values happening to be in the same frame at the same time. Neither are fundamental; instead, the ‘togetherness of experience’ is encapsulated by the collective- value of a phonon, as we will see in the next section. As Ford reinforces, a multiplicity “has only the spatiotemporal internal relatedness of its many members expressed by their mutual prehensions of one another. If the multiplicity can only be felt as a unity, then the multiplicity

specifies only what is to be felt, not what is finally felt" (Ford, EWM 214f). The phrase "what is to be felt" hints at the holographic dual of initial data, to be noted in the fourth phase.

Whitehead also refers to this as the “initial data” of the “actual world” of an occasion. The “actual world” relates to open strings as frozen strands qua virtual strings in the hypersurface of a snapshot. In the moment of pause, contents therein represent a spread of events; this represents the information-content of the actual world (snapshot). This initial predicate: ‘a snapshot of strands,’ represents the initial conditions for the phases of concrescence, and the proposed starting-basis for string theory in an event-narrative.

Emptiness and Dependent-Arising/Origination

The nature of acquiring the 'initial conditions' of a process is linked to the principal Buddhist tenet of 'dependent origination' or 'dependent arising.' In the native language this is known as Pratītyasamutpāda and refers to the central tenet that all things arise in dependence upon multiple causes and conditions. In an event-logic, open strings are modeled as arising in dependence upon the multiple strands captured in momentary conditions of a snapshot, where each strand is defined as 'causa sui,' or 'for itself' (PR 113). The famous Buddhist monk Tsongkhapa explained that emptiness and dependent origination are like two sides of the same coin. The event-logic constructed in this study provides a narrative for both items and shows how they can be viewed as co-termini of the same process.

Within the event-logic modeled here, we recognize the causal manifold in “vacuum” as an empty frame through which all values flow. It’s empty because it holds nothing: its walls are opaque and strands move through them as if not even there. Holding onto nothing, everything passes through it; because it holds onto nothing we call it empty. In a similar sense, physics up to the twentieth century regarded the “vacuum” as empty; however eventually it became clear that the “vacuum” is a plenum of fluctuations, virtual particles, and dark energy teeming like a hidden sea (see chapter two). The strands in background are the plenum of values flowing through the ‘empty’ manifold. In a paradoxical sense well-suited to the Buddhist tradition, even in its emptiness, the manifold is always full of streaming strands.

Buddhism encourages us to imagine our minds like the manifold, not holding onto anything or creating anything. Given the link between mind and manifold, as we minimize mind so do we therefore draw closer to the activity or 'true nature' (of AE's) bearing from the snapshot samples of the manifold in hypersurface frames. These themes are also found in the Tao Te Ching. The Tao refers to these dynamics as the "workings of all creation" (verse 16).

When we condition mind into the likeness of the manifold conditions (a sea of strands) we adhere to a patently Taoist approach to first-person consciousness. Consider the quote

from the Tao Te Ching: "Be totally empty, embrace the tranquility of peace. Watch the workings of all creation, observe how endings become beginnings" (verse 16). When we establish no-mind we correspondingly cultivate the clarity needed to approach the presence of the inner-workings underwriting experience. Here we reinforce the fundamental status of 'dependent origination' in the Buddhist tradition and demonstrate how it aligns to a nominative property describing the set-up and initial conditions of a process and event-cycle. This same logic can be found in the initial conditions of Whitehead's generative AE process, as well as in providing string theory with an opening narrative, as we will pursue.

Within Whitehead's process-generative logic of the AE's, the 'multiplicities of initial data' represent the initial conditions for the phases of concrescence. As he explains: "a multiplicity of simple physical feelings constitutes the first phase in the concrescence of the actual entity which is the common subject of all these feelings" (PR 259). Applying this logic to string theory we recognize the multiplicity of initial data as a large Dn-brane of open-strings. To these ends we also regard the Dn-brane of open strings as providing the initial conditions of a foundational process. In all cases the category for explaining how the fermionic, open strings arise from bosonic strands is correspondingly attended to by the central Buddhist notion of 'dependent origination.' Thus, open strings and 'initial data' represent the dependent arising of contingent and derived values from a sea of superpositional strands streaming through "vacuum" space.

Emptiness describes the mode of the manifold in “vacuum” where strands stream through as if it weren’t even there. In this sense it is referred to as emptiness in the Buddhist tradition, or rather, as opacity, and the language of motion in both refers to (something like) fluid dynamics. The manifold only ‘takes form’ (or ‘takes mass’) in snapshots, as a sample of all values simultaneous in the manifold. These samples are properly understood as ‘dependently originating’ (‘arising’) as contingent moments and derived aspects of otherwise superpositional strands, like bosons, streaming in flux through the silent and tranquil sea of strands in the “vacuum”. The contingency of the open strings attached to a Dn-brane stems from the somewhat random simultaneity of strands in the frame for any given snapshot, plus the random configuration the strand itself happens to be in at that moment.

Given the suggestion that only a frozen moment of a strand yields an open string attached to a Dn-brane (a snapshot), we consider dependent origination to describe the nature of the beginning of the principal process; and specifically the nature of relation of the first-values, or initial conditions of the process, as 'dependently originated' from the (frozen moments of) strands streaming through "vacuum" space. This also tempers the multiverse theory, with snapshots as the preferred frames chosen for materialization.

The event-like property of ‘dependent origination’ is given credence by the preliminary conditions on a snapshot (Dn-brane). We therefore find reason for linking the central basis of Buddhist ontology to the same event-logic underwriting conscious and material systems, including string theory. String theory further describes the nature of dependent

origination, supplying a narrative satisfying these properties and relationships. This is shown to justify Sahakian's reservation about calling open-strings emergent; as we see here, the most accurate description comes from Buddhism: they are dependently originated. We could also refer to open strings as 'contingent' (derived) from strands. To these ends the Buddhist notion of Pratītyasamutpāda—that 'all things arise in dependence upon multiple causes and conditions'—is reinforced through considering the initial conditions of process as an event qua snapshot of stringy strands in the capacity of Whitehead's 'multiplicity of initial data' and Dn-branes in string theory. Given these snapshots depend on strands for their (derived) valuations, they fit the Buddhist understanding of origination in a foundational process to an impeccable degree.

Pratītyasamutpāda provides the basis for other key concepts in Buddhism such as karma, the arising of dukkha ("suffering"), and the possibility of liberation through realizing anātman ("no self"). In addition, the general principle of pratītyasamutpāda is complementary to the concept of śūnyatā ("emptiness"). Applying these ideas in the geomodal metric, 'no-self' is seen as referring to the 5d "vacuum" mode of a person occurring in a mode prior to sensory awareness. The "vacuum" is comprised of quantum fluctuations considered in geomodal as strands (see also Chew, 2004). Given they represent the natural conditions out of which sampled approximations are conglomerated in snapshots whose end products produce subcellular values underwriting consciousness, we have solid ground for recognizing strands and the "vacuum" connection space of each person with the concept of no-mind. It is the domain of the proto-self out of which proto-constituents are autopoietically fashioned in nature, underwriting each person. This is also shown to naturally accord with śūnyatā qua emptiness associated with the "vacuum," which turns out to be a plenum of values unbeknownst to us in a mode once-removed from awareness. 50

When we're reminded of the initial background of streaming strands out of which snapshots are taken—plus the fact that the strings witnessed in snapshots aren't actually real values, but only virtual approximations of a dynamical strand—we are led to the Buddhist notion of 'dependent origination' as a precise description for how open strings come to appear from out of the context of fluctuating strands. In string theory this would be like asking how open strings on a Dn-brane come to appear from out of the context of tachyonic (bosonic) strings in 25 dimensions. In Whitehead this refers to the context where 'continuous potentiality' becomes 'atomic actuality' qua 'atomic adjectives' of 'a multiplicity of initial data' (PRel 29). In addition, Whitehead corrects his 'categories of existence' to accommodate the position that the 'multiplicities' do not represent a fundamental value in themselves (see PR 111, 113).

50 The full geomodal narrative of the process leads into the subcellular domain, into MT values underwriting consciousness. Given that we can indirectly influence strand quality or creative dynamics based on volitions and heart, we witness an effective circuit from agency into strands that cycle down through snapshots into creative, collective values that then transform into the cytoskeletal sector and subcellular protofilaments and MT's that underwrite consciousness. Given the link to agency, this describes a karmic circuit. By the same principle, suffering arises from bad, or poorly managed, volitions and actions.

III. HOLOGRAPHIC DUAL of SNAPSHOT → chiral bag on QCD conformal boundary; AdS/QCD correspondence; large N QCD; 1/n expansion in YMT; initial data (frozen samples) → objective data (free-quarks in chiral bag) via open/closed string correspondence

Going from continuous potentialities to atomic actualities requires a transition and translation mechanism; this constitutes the measurement problem. The method provided here considers a sampling mechanism by the manifold generating snapshots of frozen strands: like open-strings on a Dn-brane, and an 'initial multiplicity of data.' Here, the frozen strands immediately coagulate to form a collective quasi-particle: a phonon. This logic is used to revive Whitehead's notion of "event-entities" (PRel 8, 10, 36) qua "the original datum theory" (see Ford, 1984; p.188). As Whitehead explains:

The primary stage in the concrescence of an actual entity is the way in which the antecedent universe enters into the constitution of the entity in question, so as to constitute the basis of its nascent individuality. (PR 177)

Now we will discover the snapshot in a holographic capacity such that it generates a dual-representation group in the general capacity of the AdS/QCD correspondence, or what Whitehead calls the "reenactment" of 'initial data' into 'objective data' (see Ford, 1984).

Simple physical feelings embody the reproductive character of nature, and also the objective immortality of the past. In virtue of these feelings time is the conformation of the immediate present to the past. Such feelings are conformal feelings. The conformal stage merely transforms the objective content into subjective feelings. (Sherburne, 1966; p.40-41)

Applying the holographic principle to the snapshot mechanism, we realize it provides the remarkable property of a holographic dual of the snapshot's 'initial data,' projected to the conformal boundary in one less dimension than the (AdS) bulk space. This describes the mathematical procedure for the holographic principle: the AdS/CFT correspondence developed by Juan Maldacena in 1997. In this study we will also consider AdS/CFT in light of AdS/QCD. We imagine the correspondence of the large-N QCD or 1/n expansion to a chiral bag of quark matter, as the holographic projection of the initial snapshot to its counterpart on the horizon. Chirality, after all, is one of the foundational properties of string theory (see Bachas, 1999).
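For orientation, the two standard relations being leaned on here can be stated schematically: the 't Hooft limit that defines "large-N," and the AdS/CFT identification of boundary and bulk partition functions (Maldacena, 1997); the chiral-bag reading is this study's interpretive overlay on these relations:

```latex
N \to \infty \ \ \text{with} \ \ \lambda \equiv g_{\mathrm{YM}}^{2} N \ \text{fixed},
\qquad
Z_{\mathrm{CFT}}\big[\phi_{0}\big] \;=\; Z_{\mathrm{AdS}}\big[\,\phi\big|_{\partial\mathrm{AdS}} = \phi_{0}\big].
```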

In Verlinde’s matrix model of Higgs and Coulomb branches, the fast (Higgs) system goes into the slow (Coulomb) system, and this is the open/closed string correspondence (see 2011). The closed string as phonon is the product of the fast system, and the open strings represent the individual values of the actual world of the AE that get prehended. It is slow because the prehension of each actual value (open string) in the snapshot qua coupling takes more time in principle, or is slower qua more numerous and complex in its number operations.

It is fast when the phonon bursts: after the open strings of the snapshot’s actual world are identified, they coalesce into a long-wavelength phonon, which bursts onto the scene with a gravitas of pressure and adiabatic heat signature multiplied by the tonic (the sound-wave qua experienced, felt, and heard on a sub-conscious, microscopic level) of the snapshot. This is the fast system whose phonon becomes the coupling that acts upon the holographic AdS/QCD projection of the snapshot in the capacity of prehending each individual element, or open string, making up the actual world of that snapshot (first phase product of values) underwriting the potentials and parameters of that AE.

In Whitehead we encounter this under the logic of initial data → objective data, prêt-à-prehension. The initial data represent the “antecedent experients” (PRel 27) whose antecedence is defined in relation to the subsequent objective data then made eligible for analysis (selection) into positive and negative prehensions.

At least three descriptive options emerge: the holographic-dual of the snapshot projects into 1) a chiral-bag model (QCD) on the horizon filled with quark-matter and a coupling; 2) a matrix theory and moduli space of off-diagonal open strings and eigenvalued VEV’s on the diagonal (Verlinde, 2011); or 3) the N = 4 super Yang–Mills representation, where “the open strings, which correspond to gluons, end on the infrared D3 brane. This (infrared) brane plays the role of the infrared cutoff in gauge theory” (Khoze, et al., 2008). As Kutasov explains, “light matter consists of the ground states of open strings stretched between different D-branes, giving rise to a gauge field for the group G, and scalars in the adjoint of G” (Kutasov, 1998). In all cases, as a dual-representation of the initial snapshot, this allows each original sample to be represented, or ‘reenacted,’ as a single item in one fewer dimension. The correspondence of values between modes also describes the open/closed string—as well as the AdS/CFT—correspondences, and the gauge-gravity correspondence in general.

Snapshots undergo a two-fold development: first, open strings on a Dn-brane → collective value: a long-wavelength phonon; second, a holographic-replica of the open-strings is projected onto the boundary space of the horizon 51 into a chiral-bag of quark matter appearing in a de-confined environment observing asymptotic freedom. Phases within the bag correspond with what Whitehead critically develops as the notion of “genetic time” (PR 52), and what Verlinde refers to as a foliative development into the emergent dimension of space during holographic renormalization, discussed in the next chapter.

51 Membrane Paradigm (Thorne, Macdonald, Price): the horizon is interpreted as a fluid membrane with certain dissipative properties (e.g., electrical conductivity, shear and bulk viscosity) on the spacetime boundary. In AdS/CFT, fluid dynamics describes the full spacetime, not just the horizon.

Through a two-fold application of concepts not available in Whitehead’s time—closed-strings and the holographic principle—we are able to lend additional support here via “phonon” dynamics for reviving Whitehead’s “original datum,” as well as for resolving the ‘data-datum dilemma’ of how to incorporate the feasibility of both versions into one account. By incorporating the holographic principle, we are able to recover Whitehead’s “original datum” and resolve it into the AE process in such a way that it does not ‘halt the process’ but instead translates it into the coupling, as a fine-tuning factor for the holographic projection-group of ‘initial data’ → ‘objective data,’ in closed-string mode via the open/closed string correspondence.

The dots represent Whitehead’s ‘reenacted,’ ‘conformal feelings’ qua QCD duals of open strings in the snapshot, like massless “free quarks” in a chiral bag obeying asymptotic freedom in a de-confined (decoupled) phase, making them prêt-à-prehension. Each dot represents a conformal feeling of one of the strings from the initial snapshot. As conformal reenactments of past occasions, they are distinct in form without a difference in value.

In Whitehead, “conformal feelings” refer to the “reenactment of the past occasions” (see Cobb, 2013). Essentially, this means the replication of past entities in a different mode of presentation than as the frozen strands of initial data. This provides a narrative to the open-strings → closed-strings correspondence as a reenactment of conformal feelings; as such, each objective datum is a conformal feeling as reenactment of an initial datum—though not the initial datum, which instead is referred to as Whitehead’s ‘original datum.’ In the event-logic, ‘reenactment’ refers to the holographic dual projection of the initial data onto a conformal boundary. This ‘reenactment’ of the past occasions of the ‘initial data’ into ‘objective data’ is for the sake of entering into a mode prêt-à-prehension, or into the mode of holographic renormalization (see Verlinde, 2010, 2011).

Open-Closed String Correspondence

The open/closed string correspondence provides a string-theoretic understanding of how open-strings naturally transform (holographically) into closed-string correspondence modes. Here, open-strings on D-2 give rise to closed-strings in D-4 (Verlinde, 2011). Though they appear in different representational forms, they bear the same informational content, while evolving differently qua the chiral asymmetry property.

More intriguingly, the correspondence is holographic: the two dual theories live in different number of dimensions. A useful conceptualization of the duality is to think of the gauge theory as ‘living on the boundary’ of AdS. We therefore refer to the gravity side as the “bulk” and the gauge side as the “boundary” theory. (Maldacena, 1997)

The equivalence between open and closed strings gives rise to details in AdS/CFT when you take multiple D-branes and decouple open strings from closed strings to get to the CFT at the low-energy limit (Verlinde, 2011). In some regions, the gauge theory of the D-branes is decoupled from gravity in the bulk, meaning that open-strings attached to D-branes do not interact with closed strings. This is termed a decoupling limit. In these cases the D-branes have two alternative descriptions: from the point of view of closed strings they are gravitational sources, and thus we have a gravitational theory on spacetime with some background fields. For open strings, the physics of the Dn-branes is described by the appropriate gauge theory (see Verlinde, 2011). As Gubser explains:

The gauge-string duality can be understood, at least in part, as the necessary equivalence between two ways of describing the same physics—namely, the low-energy dynamics of a large number of coincident branes. (Gubser, 2011)

In the event-cycle, the decoupling limit could be seen as the moment in the process when the dual of the initial snapshot is projected onto the holographic surface of the QCD-limit in a chiral bag, but before the phonon bursts. Given that the decoupling limit describes open strings on the gauge-theory limit of boundary space (horizon), we imagine each open-string also has a potential tonality qua a corresponding closed-string interpretation, to be realized after the phonon bursts from the initial snapshot. In this sense, the open/closed string duality is like the wave/particle duality and the correspondence between a note and a wave-packet of frequency (qua Fourier transform). As Verlinde notes:

The gravitational or closed string side of these dualities is by many still believed to be independently defined. But in our view these are macroscopic theories, which by chance we already knew about before we understood they were the dual of a microscopic theory without gravity (2010).

In Whitehead this represents the transformation of initial data into their objective-data mode, ready for prehension. As such, the closed-string sector of the open-strings represents “the potential for initial elements of data to be prehended” (Fortescue, 2001). Within this scenario, open-strings are “contrasted” (PR 50, 120) with closed-strings on the (QCD) horizon. We can apply Whitehead’s concept of “conformal feelings” (PR; 140, 190) directly here; thus, the objective data of initial data represent the conformal feelings, as individual candidates made eligible for prehension. As such, the open/closed string correspondence provides a mathematical mechanism for the initial data (of AE’s) to become eligible for prehension in the dual-mode of a snapshot as “objective data” (in a bag) on the conformal boundary.

The open/closed string correspondence also leads to the more specific AdS/CFT correspondence. In mathematical physics, AdS/CFT has several off-shoot varieties, including the gauge-gravity and gauge-string dualities; the fluid-gravity and acoustic-gravity correspondences—plus the open/closed string and AdS/QCD correspondences; and, more generally, UV/IR mixing. We can take this as a sign that the correspondence itself is a basic feature of the universe (and process). Within the principle process we focus initially on AdS/QCD and propose a method for modeling it by physical means reinforced by Whitehead’s AE’s.


In this process scenario we take phonon generation as referring to the dynamics of the initial snapshot, while AdS/QCD refers to the holographic-dual of the snapshot modeled as a chiral-bag on the QCD conformal boundary. Following the action principle, the phonon serves as ‘coupling’ for the AdS/QCD mode of ‘objective data,’ prêt-à-prehension. This also explains in a nutshell how we overcome Whitehead’s data-datum dilemma for beginning the phases of concrescence. Developing the model in the setting of a holographic scenario, we get the corresponding dual of the snapshot on the horizon for free. 52 Given that the phonon can also be described in the mode of a closed-string, converting each open-string into a closed-string sets the initial values into the same representational form as the phonon qua coupling: the unique measure of discretion for the elements selected/excluded for that phase of prehension and the “genetic phases” of concrescence, as we’ll see in the next chapter.

In this difference between the evolutions of the two snapshots we find a characteristic parallel with the chiral-symmetry-breaking property in the SM. This can be exemplified in Whitehead through recognition of the difference between the two terms “ingression” and “prehension.” Lango acknowledges that they are two distinct operations. In Whitehead we distinguish ‘ingression,’ as the “graded entry” of ‘initial data’ into the original datum, from ‘prehension,’ which refers to the selective coarse-graining process whose positive remainders further synthesize (foliate) into a collective value, and whose negative values coalesce into an emergent 53 gravitational force.

Altogether, the open/closed string correspondence describes transitions like that of the Dn-branes of open-strings into closed-strings in a chiral-bag (special phase/modulus-space), prêt-à-prehension. This can be stated even more strongly: the open/closed string correspondence represents a fundamental transition from the ‘initial data’ of a snapshot → 1) a phonon, and 2) ‘objective data,’ prêt-à-prehension. As such this would describe a naturally-occurring element of the fundamental process at the core of nature, lending itself to an enhanced physical description of Whitehead and Verlinde’s models.

IV. PHONON — Initial Data into Original Datum = closed string (as collective mode of degrees of freedom of open strings); collective boson, quasi-particle

Whitehead articulates the idea that “data,” as a ‘sheer multiplicity,’ cannot be felt within (his) “reformed subjectivist principle,” stating: “there can be no togetherness except togetherness in experience” (PR 189f). We take this to mean that the snapshot of frozen strands (initial data), as sheer data, are not ‘felt’ until they are felt “in togetherness.” They are therefore not a fundamental value but a derived one. Whitehead expresses similar reasoning when he removes the ‘multiplicities’ from his ‘categories of existence,’ saying they are not truly fundamental, even though they represent the initial conditions.

52 This refers to a chiral bag on the conformal horizon where the phonon serves as coupling (decision-maker) for membership in the concrescence phases via the selective process of prehension (tuning) to the coupling. 53 An emergent gravitational force, as an adiabatic reaction force (see Verlinde, 2011).

Three themes define this section: 1) the phonon is emergent from snapshot elements; 2) the phonon revives the “objective datum” of Whitehead’s Gifford draft; 3) the phonon is chosen over the graviton qua closed strings.

Phonon as Emergent

In physics, phonons are quasi-particles: long-wavelength collective modes. In string theory Verlinde relates phonons to emergent, closed strings. Here we portray the phonon as spontaneously bursting off a D0-brane as an emergent, collective value of the “initial data” of a snapshot. 54 Whitehead’s description of “the antecedent universe” refers here to the initial data of the snapshot that “enters into the constitution of the entity” (PR 174) by way of ingressing into a phonon, as a collective boson: a quasi-particle: a Nambu-Goldstone boson.

Whitehead heralds something similar: “Each actual entity is a throb of experience including the actual world within its scope” (PR 117). In another instance he explains how “there is a rhythm of process whereby creation produces natural pulsation, each pulsation forming a natural unit of historic fact. In this way amid the infinitude of the connected universe, we can discern vaguely finite units of fact” (MT 120). To these ends, Whitehead describes how “our sense awareness posits for immediate discernment a certain whole, here called a duration […] which is an essential factor disclosed in sense-awareness” (CN 53). As such the phonon represents the ‘essential factor disclosed in sense-awareness’ as the synthesis of the collective degrees of freedom into a feeling-tone, as a physical feeling which displays the real extensiveness of the contemporary world. “It involves the contemporary actualities but only objectifies them as conditioned by extensive relations” (PR 494).

The way in which the initial data coalesce into a collective phonon is also an important feature—here borrowed from Whitehead’s term “ingression” (PR 34) and used to describe the transformation of initial data into a phonon. In Whitehead’s model this signifies how the multiplicity of frozen strands ingress into a collective phonon. Specifically, ingression describes a “graded entry” (Ford, 1984) into the phonon. Ford explains how “the mode of ingression is the particular way in which the components of that relational essence have graded entry into that occasion” (1984, p.75). More formally, Whitehead explains how: “Ingression is used for the complex relationship of those abstract elements of the world, such as sense objects, which are devoid of becomingness and extension, to those other more concrete elements (events) which retain becomingness and extension” (PRel, 37). 55

54 Snapshots and phonons could be appended to instanton-soliton loops in 5d SYM, which as we learned in chapter two applies to the near-horizon setting of a black hole and the spontaneous emergence of virtual particles (see Hawking, Bekenstein). The instanton represents the snapshot (Dn-brane) of open strings; and the soliton represents the phonon synthesis of the instantonic open-strings into one large-wavelength phonon (closed string).

A ‘throat diagram’ in physics proves most useful for considering how initial components ingress into an emergent, collective phonon. 56

Here, ingression is specified to refer to the mode in which phonons synthesize from-out-of the initial data of snapshots. As such, this also distinguishes it from prehension, which occurs in a subsequent phase of the process as a “selection” whose ‘selector’ is none other than the product of ingression itself: the phonon, which determines what values go into the phases of concrescence for synthesis, in Whitehead.

This has not always been apparent within scholarship, however, leading Lango to note that the “common features of prehension and ingression mark an underlying metaphysical concept that is not expressed in a single category” (1972; p.10). Namely, ingression gives rise to an emergent phonon, as a collective particle synthesized out of the initial data of the snapshot.

Phonon > Graviton qua Closed Strings

Perhaps the precursor logic to this section can be found in Chew’s quote, citing how: “Whitehead subordinates energy to ‘impulse.’ Localized energy is not a priori; the occasions that build our universe are localized impulses” (Chew, 2004). Not only does this speak to the ‘emergent’ quality of phonons, but also to their spontaneous generation out of snapshots. As Maldacena explains, “the D-brane acts as a source for closed string modes. With a rolling tachyon the D-brane becomes a time-dependent source. Thus, generically there will be closed string creation from the rolling process” (2007). A rolling tachyon goes to a minimum state above zero where it exhibits a Nambu-Goldstone boson qua spontaneous symmetry breaking. An NGB is a quasi-particle that can also be considered like a phonon and associated with the “Mexican hat” model of SSB (see Goldstone, 1961), resembling the ingression (throat) model above.
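As a point of reference, the textbook form of this symmetry-breaking potential (a standard sketch in conventional notation, supplied here only for illustration) is

\[
V(\phi) = -\mu^{2}\,\phi^{\dagger}\phi + \lambda\,(\phi^{\dagger}\phi)^{2}, \qquad |\langle\phi\rangle| = v = \frac{\mu}{\sqrt{2\lambda}} ,
\]

whose minimum lies on the circular “brim” of the hat rather than at zero. Writing \(\phi = (v + \rho)\,e^{i\theta}\), the radial mode \(\rho\) is massive while the angular mode \(\theta\) is the massless Nambu–Goldstone mode, the analogue of the long-wavelength phonon invoked in the event-logic above.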

55 Eastman: “apart from their joint ingression into an actual entity, eternal objects are isolated from one another” (2004). 56 Phonons were recently detected in experimental conditions using a fluid black-hole setup (see e.g., Zapata, 2011).

Indeed, Verlinde links the closed string to a phonon rather than a graviton, and explains that we must regard this value as an emergent quantity, and therefore not a fundamental value. As he explains: “I am not sure that string theory will necessarily take us in the right direction if we keep regarding the definition in terms of closed strings as being microscopically defined… and keep our eyes closed for emergent phenomena” (Verlinde, 2011). Thus, gravitons are not fundamental particles able to be quantized, but rather are like phonons. Mincing no words, he explains: “Gravitons do not exist when gravity is emergent; gravitons are like phonons” (2010). Verlinde continues: “We know that phonons are quite useful concepts, which even themselves are often used to understand other emergent phenomena. Similarly, gravitons can be useful, and in that sense exist as effective ‘quasi’ particles. But they do not exist as fundamental particles” (2010). Frampton offers closely-shared sentiments, citing how:

For gravity, there is no longer necessity for a graviton. In the case of string theory, the principal motivation for the profound and historical suggestion by Scherk and Schwarz that string theory be reinterpreted, not as a theory of the strong interaction, but instead as a theory of the gravitational interaction, came from the natural appearance of a massless graviton in the closed string sector. This is not saying that string theory is dead. What it’s saying is that string theory cannot be a theory of the fundamental gravitational interaction, since there is no fundamental gravitational interaction. (Frampton, 2010)

This lends credence to the initial track of this study, which opts for frameworks predicated on the non-quantization of gravity. You cannot quantize something that is not fundamental to begin with. Why are phonons important in the infinitesimal quantum realm? “I feel they're important because of the notorious weakness of gravity at that scale” (Verlinde). If, as Verlinde speculates, the phonon effect goes away with a few particles, and if gravity is ruled by such effects, then the weakness of gravity is explained (2010). As such, gravity is not fundamentally quantized into gravitons, but is emergent from the lower-dimensional degrees of freedom frozen onto a Dn-brane as a snapshot sample of the underlying strands, whose collective value issues an emergent phonon.

We follow Frampton and Verlinde in considering the closed string as an emergent phonon: a quasi-particle taking its value from the selection—or “actual world” (PR 98, 118, 154)—of open-strings represented on that D-brane. Geomodally, the quantizations are of quasi-particles whose values are felt and negotiated physically as biological (affective and subcellular) qualia. In addition, we can also think of the phonon in terms of a D0-brane. As Kutasov explains, “the D0-brane corresponds to a Kaluza-Klein (KK) mode of the graviton carrying momentum along the compact direction. It is electrically charged” (1998). Converting the graviton into a phonon, we can then link the D0-brane to a phonon, whose bursting creates an electrically-charged item. 57 As Kutasov explains, “the momentum in the eleventh direction P10 is reinterpreted in ten dimensions as zero-brane charge” (1998). Thus, the snapshot, or derived hypersurface, is considered as a charged D0-brane, leading to the ingression of a bursting phonon from off the charged brane, like a graviton.

Revival of Whitehead’s Original/Primary Datum

The doctrine of a single objective datum from which concrescence begins may be quite alien to our ordinary notion of Whitehead’s philosophy, but it is quite appropriate to the Giffords draft (Ford, p.202). The doctrine that concrescence starts from a unified datum, and not from a multiplicity of initial data, permeates the Giffords draft. (Ford, 190)

The phonon can also be shown to revive Whitehead’s notion of the “original datum” as the beginning of the phases of concrescence, along with the “multiplicity of initial data.” This comes from the Gifford draft (II.7.4b), where Whitehead speaks of concrescence as flowing from an original datum—in fact the single most edited element between his first and final drafts of Process and Reality (see Ford, EWM 1999). We’ll discuss this more in chapter seven. As Ford explains: “the many cannot finally form a many unless somehow together, which is possible only in experience. Only that which has the minimal unity of a ‘single’ datum can be the object of a feeling” (1999, p.214). Lango adds, “The objective datum of an integral feeling is realized through the synthesis of the initial data of that feeling” (Lango, p.47). This can be taken to represent the synthesis of open strings (initial data) into a phonon (original datum). The ‘integral feeling’ represents the snapshot (actual world) of initial data themselves. The original datum is the “actual entity” and the collective value it fine-tunes is the “actual occasion.”

57 …with predicted responses on the lowest physiological and affective levels.

Notably, the final value, or occasion, is fine-tuned by the phonon (as coupling) during the renormalization phase of prehension where ‘objective data’ are each assessed individually. “Each entity arises from a primary phase of the concrescence of objectifications which are in some respects settled” (PR 83). This describes the phonon that arises out of the snapshot of initial data that are “in some respects settled” on account of being elements in the snapshot. As Ford explains, “what is here called concrescence, to indicate the unification involved in forming the original datum, is later called transition to contrast it with concrescence as a process within the occasion” (EWM 199). This reinforces the distinction between ingression qua transition, and prehension.

The first phase is the phase of pure reception of the actual world in its guise of objective datum for aesthetic synthesis. This datum, which is the primary phase in the process constituting an actual entity, is nothing else than the actual world itself in its character of a possibility for the process of being felt. This exemplifies the metaphysical principle that every being is a potential for a becoming . The actual world is the objective content of each new creation. (PR 90)

Lango states it precisely: “the ‘original’ datum of an integral feeling is realized through the synthesis of the initial data of that feeling” (Lango, p.47). This represents the synthesis of the initial data into a phonon. The ‘integral feeling’ represents the snapshot of open strings as synthesized into an emergent phonon. The immediate synthesis of kinematic elements transforms the initial data into an “original datum” like a collective quasi-particle. The ‘one integral feeling’ produced in the transition from ‘initial data’ to ‘objective datum’ is linked to the phonon created by the ingression of initial data. This enables the snapshot to designate the “original datum” not as a real value, but as a collective quasi-particle, like a phonon. The string-theoretic companion of a phonon is a closed string, and the companion of the initial multiplicity of data is a Dn-brane of open strings.

5.3 — DICTIONARY ITEMS and SUMMARY

Event-Logic | Whitehead | Verlinde
1. Sea of Streaming Strands | Kinematic elements; continuous potentialities of dJ2 | Microscopic data underwriting the emergent gravitational force
2. Snapshot of Frozen Strands → Strings | Kinematic elements; actual world; multiplicity of initial data | Open strings on Dn-brane; large-N QCD; 1/n expansion in YMT
3. Snapshot → Phonon | Original Datum, from the Gifford draft | Closed string as phonon; emergent
4. Holographic dual of snapshot | Initial Data → Objective Data | Open/closed string correspondence; AdS/CFT


This dictionary is represented in a pictorial-logic as a set of simple sequences describing the two initial operations of the principle process exemplified in AE’s and EG.

A.

The first pictorial operation shown above describes how potentiality → actuality (the Measurement Problem) by virtue of a snapshot mechanism whose contents “ingress” (are selectively graded) into a quasi-particle phonon, as an emergent and collective value. Dynamical strands ever in-flux—evolving in “continuous potentiality” in “vacuum”—are captured in a snapshot and made to resemble strings appearing at rest, as objects. These values resemble open-strings on a Dn-brane that “ingress” into an emergent phonon, bursting from-off the brane like Peirce’s “infinitesimal” (see Weiss, 1935) off of a continuum, or one of James’ “drops of pure experience”—like a quale of consciousness.

B.

The second mode describes the holographic operation that projects a ‘dual’ translation of the first (plate-like) snapshot → representational form in another mode of d ± 1. Moreover, the holographic principle can be mathematically substantiated by the AdS/CFT correspondence and all of its other permutations (AdS/QCD; gauge-gravity; open-closed string, etc.). This accounts for the topological difference between strings/strands in the snapshot and dots in the dual, like “free quarks” in a chiral bag. This is a natural result and property of the holographic principle. In Whitehead, the holographic dual represents the “reenactment” (Lango, 1972) from ‘initial data’ to ‘objective data,’ prêt-à-prehension.

In our event-narrative, a ‘sea of strands’ fills the background (microscopic) “vacuum”. The ‘causal diamond’ serves as the manifold out of which D-branes are generated as the holographic photographs (hypersurfaces) of snapshots. The first snapshot evolves into a collective quasi-particle while the dual is assessed part-per-part, generating a more selective (and synergetic) synthesis of chosen elements. The phonon is not a fundamental entity but a collective quasi-particle that, as Verlinde notes (2010, 2011), is emergent.

We credit Polchinski greatly for realizing the existence of D-branes, but as Verlinde (2011) remarks: “I don’t get the sense that anyone really knows where they come from and how they originate in the first place.” This is precisely what an event-logic speaks to. From a string-theoretic perspective, the collection of all frozen strands (“multiplicity of data”) within the causal frame at snapshot resembles an abundant sampling of open strings on a Dn-brane. Providing a coherent narrative for the origination of Dn-branes thereby affords an identical foundation for string theory. Building this narrative into AdS/QCD is the purpose of the next chapter and is shown to help clarify a proposed narrative for both string theory and gravity.

SUMMARY

Transforming Minkowski’s lightcone into an event-logic context involves two steps: the first is Lockwood’s geometrical inversion of the lightcones into a diamond with an elliptical field. In Whitehead’s context we can interpret this ‘emergent’ field as the ‘presentational immediacy’ of an experiential organism; the ‘now’ of sensory-awareness; or the field of consciousness.

Second, owing to the suggestion of an emergent XT, we reinterpret the ‘hypersurface of the present’ from a fundamental value into a ‘derived’ or ‘emergent’ value (or snapshot of a causal manifold) in a holographic 5d scenario (see Maldacena, 1997; Verlinde, 2010), and therefore not as referring to a fundamental metric of space and time. “Since both space and time are derived from processes and events, a geomodal analysis is perhaps the first-level of analysis we can bring to that which is produced by processes and events” (2015). Pictorially this conveys:

a) where is a derived hypersurface of as a causal manifold (like M-theory)

b) and where ‰ = a snapshot of initial conditions

In this event-logic the manifold, like Witten’s M-theory, is ‘fundamental’ in 5d, out of which snapshots — like Whitehead’s ‘durations’ or ‘events’ — are the emergent derivatives. Such a scenario qua 1/n expansion in YMT (open-string setting) is heralded as the pre-platform dynamics out of which space and time are derived (see ‘t Hooft, 1999).
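To keep the physics reference fixed, the standard form of the large-N (1/n) expansion—quoted here in conventional notation as an illustration, not as a result of this study—holds the combination

\[
\lambda \equiv g_{YM}^{2} N \quad \text{fixed as } N \to \infty, \qquad \text{with amplitudes organized as } \sum_{g \ge 0} N^{\,2-2g} f_{g}(\lambda),
\]

so that the gauge-theory diagrams arrange themselves by the genus \(g\) of the surfaces they tile, in just the way a string perturbation series is organized (following ’t Hooft).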

After introducing this model we set about correlating four connections between Verlinde’s and Whitehead’s models: (1) Whitehead’s “eternal objects” in flux with Verlinde’s ‘microscopic information’ qua ‘pre-event strands’ (Chew, 2008). This initial environment will be shown to yield a clarification of the ‘measurement problem’ in physics, not as a direct collapse of the wavefunction, but as a sampling of “vacuum” through a manifold. This gives rise to (2) a “multiplicity” of “initial data” qua “open strings” on a “D-Brane” and leads to the formation of (3) a “primary datum” correlating with a “closed string,” or a “phonon.” In (4), recognition of the “snapshot” as holographic leads us to model a dual-projection. This is shown to resolve what this study calls the “data/datum dilemma,” representing the historical tension in Whitehead over whether ‘concrescence’ begins with a ‘primary datum’ or a multiplicity of ‘data.’


As we discovered in comparing the two models, a derivation of EG from string theory allows us to provide a fuller and more adequate philosophical account of EG in terms of Whitehead’s AE’s: it provides a gravitational signature within the AE’s themselves, as a remarkable role for negative prehensions underwriting the emergent gravity model as an adiabatic reaction force (see Verlinde, 2011). As a physical signature, the initial conditions of concrescence, as ‘a multiplicity of initial data,’ are correlated into a string-theoretic account bearing the same ontological role in an event/process-logic underlying spacetime and gravity.

We pick up in the next chapter with the phonon prêt-à-prehension for the assessment of ‘objective data’ (closed strings; gluons; free quarks) in a D4 (chiral bag). This leads to further phases of holographic renormalization, and to prehension and the phases of concrescence → final satisfaction. Whitehead seeks “to describe how objective data pass into subjective satisfaction” (PR 88). We trace from the holographic duals of the frozen strands of the initial snapshot into free quarks in a chiral bag, as the ‘objective data’ wherein the phases of prehension and concrescence “genetically” (PR 198, 239) unfold.


Chapter 6 — Selection, Creativity, Final-Synthesis

In the last chapter we evaluated how a snapshot mechanism, representing transitional dynamics from potentiality to actuality, converts otherwise ‘continuous potentialities’ of ‘eternal objects’ (in the microscopic background of the physical metric) into a ‘multiplicity’ of ‘atomic actualities,’ ‘kinematic elements,’ or ‘initial data,’ like ‘open strings on a Dn-brane.’ In the simplest case, the data ‘ingress’ into a closed-string as a patently emergent, collective quasiparticle of the ‘initial data’—bursting like a phonon. Combining the logic of phonons and closed-strings into Whitehead’s texts, we were able to revive his retracted description of an ‘original datum’ as the initial basis for concrescence; furthermore, the holographic projection takes the initial data of the snapshot (open strings on a Dn-brane) → objective data, prêt-à-prehension, as closed strings in D4 or a chiral bag of QCD (quark) matter in D3. The transition from these open strings of the multiplicities to a closed-string can be seen as a natural step to account for the open/closed string correspondence associated with AdS/CFT (Maldacena, 1997) and AdS/QCD (Becciolini, Redi, and Wulzer, 2009; Brodsky, 2013; Batell and Gherghetta, 2008; Gherghetta and Kelley, 2009).

In this chapter, the ‘original datum,’ like a closed string, becomes the ‘coupling constant’ for the ensuing prehension and phases of concrescence wherein AE’s are the “only reasons” (PR 50). Values are selected-out as well as positively-prehended into “feelings” for further phases of concrescence (PR 111, 268). As such, the collective quasiparticle of the “multiplicity of initial data” → fine-tuning factor (coupling) in renormalization. This is explained by the simple axiom that, as a coupling factor, any value selected by the phonon for positive prehension must be to some extent “in sync” with that phonon.

If a “multiplicity of initial data” begins the phases of concrescence, how does the ‘original’ datum also serve in the same capacity? Whitehead could not render a resolution to this and thus, while the ‘original datum’ was ubiquitous throughout the Gifford draft as a ‘vital teaching’ (Ford, 1984), by the final draft of PR it was replaced subtly by ‘multiplicities of initial data.’ However, we here revive the ‘original datum’ in our geomodal framework in such a way as to show how both the multiplicities of the initial data, as well as the original datum, co-initiate the phases of concrescence. 58 Through this we also see how the holographic dual of snapshot goes to horizon value in AdS/CFT and AdS/QCD models.

The main thrust of this chapter is in drawing out the affinity between the renormalization phase in physics and Whitehead’s prehension and phases of concrescence in AE’s. As characterized in our review in chapter four, prehension refers to the analysis of components arising from data, and concrescence to the way in which positive prehensions creatively coalesce to form the satisfaction of an actual occasion.

58 To Whitehead’s credit, this is only realized on account of the holographic principle and AdS/CFT, plus the open/closed string correspondence—theories developed in the late 20th and early 21st centuries.


Two major procedures of renormalization theory used by Verlinde, coarse-graining and foliation, are shown to blend nicely with Whitehead’s definitions of prehension and concrescence, respectively. Specifically, prehension and concrescence correspond with the physical logic of the selection qua integrating-out of certain values (coarse-graining) in a matrix, plus the acquisition of expectation values by others, leading to the gravitational self-energy of the matrix via holographic renormalization (see 2011).

This chapter can be divided into four parts:

1. Phonon as Coupling for Prehension during Holographic Renormalization
   a. Coarse-graining and foliation = prehension and concrescence
2. Prehension = Coarse-Graining
   a. Selection process
   b. (−) Prehension linked to integration-out of open strings in a matrix
   c. (+) Prehension linked to off-diagonal open-string acquisition of expectation value
3. Concrescence = Foliation
   a. Concrescence of creative feelings → emergent dimension of space
   b. Foliation of “feelings” → genetic phases
4. End Products/Phase Accumulation Values
   a. Satisfaction = Gravitational Self-Energy
   b. Max. of Coarse-Graining = Thermalization (of Polymer) onto Horizon = Φ


6.1 — PHONON as COUPLING CONSTANT in RENORMALIZATION

Prehension refers to the dynamics between a ‘selection-criterion’ and ‘feelings’ that determine the internal constitution of that AE during the phases of concrescence. In Whitehead’s terms, prehension is a relation between an actual entity (the subject of the prehension) and other entities (the data of the prehension) (PR 35). This ‘subject of prehension’ is shown to correspond to the “original datum” of the Gifford draft. In the event-logic, this represents the phonon as coupling during the phases of concrescence (renormalization).

The closed string as coupling constant represents the ‘unity of a datum’ for concrescence, serving as the phonon that selects other ‘objective’ associates during prehension, in order to supply materials (positive prehensions, feelings) to the phases of concrescence. This creativity speaks to an autopoietic process once removed from first-person, direct influence. As such, the phonon serves as the fine-tuning, or coupling, factor in a bag of holographically projected replicas of the initial snapshot of strands—free-quark forms in a bag model—each analyzed individually by the coupling and selected (or not) for the further phases of concrescence. This describes a creative process demonstrating the affinity between open/closed strings and the initial starting points of the AE process (the multiplicities of initial data; the reenactment of the initial data into objective data; an original datum).

In a Whiteheadian sense, “in actualization the individual essence must be realized, but it can only be realized by means of its relational essence” (Ford, 1984). This is to say, the synthesis of the ‘multiplicity of initial data’ is only realized as a collective phonon, a quasiparticle that is then related to the objective data in prehension. We find further reason in Whitehead for linking the “ideal of itself” or “conceptual valuation” (see Ford, 1984, p.222) to the phonon, as coupling. As he states: “Conceptual valuation introduces creative purpose. The mental pole introduces the subject as a determinant of its own concrescence. The mental pole is the subject determining its own ideal of itself by reference to eternal principles of valuation autonomously modified in their application to its own physical objective datum” (PR 248).

In our geomodal, physical framework this refers to the phonon as a coupling: a relational essence; as Tanaka describes, “an actual occasion, having arisen from its actual world, always transcends it as a novel self-creating creature, and gives itself to its future actual occasions” (1987). This describes the original datum arising from a snapshot of data, and then transcending it as a coupling for the objective data of the occasion. Whitehead clarifies (PR 213):

The first phase is the phase of pure reception of the actual world in its guise of objective datum for aesthetic synthesis. This datum, which is the primary phase in the process constituting an actual entity, is nothing else than the actual world itself in its character of a possibility for the process of being felt. This exemplifies the metaphysical principle that every being is a potential for a becoming . The actual world is the objective content of each new creation.


We associate this with the phonon as ‘original’ or ‘objective’ datum for synthesis, as the coupling of prehension. The character of a possibility of being felt again represents the phonon as coupling, and leads to the property that it is a potential for becoming insofar as the coupling selects values to participate in concrescence while the remainder → gravity.

As we saw in the last chapter, the open/closed string correspondence is responsible for initially converting open-strings into closed-strings prêt-à-prehension in holographic renormalization. This explains how ‘initial data’ are “reenacted” into ‘objective data’ in Whitehead’s language; however, reenactment alone is not sufficient to initiate prehension without something to serve as a ‘selecting’ or ‘fine-tuning’ factor: a ‘coupling constant.’ To these ends we unearthed the single most significant revision between Whitehead’s early and final drafts of PR—the original datum—and reinterpreted it as a phonon. Verlinde also develops the phonon as a ‘coupling’ in holographic renormalization (2011).

The closed string, taken as a phonon, is shown to play a further role in the AE process as enabling the coupling for prehension and the phases of concrescence. This affords the description of a phonon as “setting the tone” for data selection during the mode of prehension. The phonon in the bag, as ‘coupling,’ is what prehends. In Whitehead’s terms, the ‘original datum’ serves as coupling for the objective data qua prehension, until every element has been individually considered.

In physics, a coupling constant is a number that determines the strength of the force exerted in an interaction. Usually, the Lagrangian or Hamiltonian of a system describing an interaction can be separated into a kinetic part and an interaction part. The coupling constant determines the strength of the interaction part with respect to the kinetic part. (Peskin and Schroeder, 1995)

The coupling is responsible for giving mass to the fermions it couples with. This is like open-strings receiving an expectation value in matrix theory (Verlinde, 2011). From a strong-force perspective we recognize the Yukawa coupling as giving mass to some of the free quarks (fermions) in a chiral bag via a “vacuum” expectation value (see e.g., Dorey, 1994; MacKenzie, Paranjape, and Zakrzewski, 2012). The phonon represents the non-zero minimum of “vacuum,” selectively giving mass to some of the massless quarks (in that matrix group) through prehension. Verlinde refers specifically to the phonon as a “coupling” during renormalization qua EG (see 2011). 59
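In the simplest textbook form (a schematic in standard notation, offered only to fix the terminology of the two preceding paragraphs), a real scalar φ Yukawa-coupled to a massless fermion ψ reads

\[
\mathcal{L} = \tfrac{1}{2}\,\partial_{\mu}\phi\,\partial^{\mu}\phi - V(\phi) + \bar{\psi}\,i\gamma^{\mu}\partial_{\mu}\psi - y\,\phi\,\bar{\psi}\psi ,
\]

where the kinetic terms and the interaction term \(-y\,\phi\,\bar{\psi}\psi\) illustrate the “kinetic part / interaction part” split described above; when the scalar sits at a non-zero minimum \(\langle\phi\rangle = v\), the otherwise massless fermion acquires a mass \(m_{\psi} = y\,v\) (the numerical factor depends on convention).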

“Virtual" particles going off the mass shell renormalize the coupling and make it dependent on the energy scale, at which one observes the coupling. The dependence of a coupling on the energy-scale is known as running of the coupling. The theory of the running of couplings is known as the renormalization group. (Peskin and Schroeder, 1995)

59 In QFT and the statistical mechanics of fields, renormalization is any of a collection of techniques used to treat infinities arising in calculated quantities (see e.g., Neumaier, 2011).

Whereas the coupling constant in gauge theory, the “fine-structure constant,” and the “gravitational coupling constant” are all fixed dimensionless numbers, in string theory the coupling is dynamical, and thus so is the scalar field that sets it. The coupling is therefore not a fixed, pre-determined number. As Blumenhagen, Cvetič, Langacker, and Shiu (2005) explain:

Upon quantization, logarithmic divergences in one-loop diagrams of perturbation theory imply that this "constant" actually depends on the typical energy scale of the processes under considerations, called the renormalization group (RG) scale. This "running" of the coupling is specified by the beta-function of the renormalization group. Consequently, the interaction may be characterized by a dimensional parameter at the QCD scale, Λ.

The holographic dual (objective) data are regarded as massless quarks selectively acquiring an expectation value (or else being eliminated) via the coupling. As such, the existence of infinities is to be expected here: the free quarks in the bag represent infinities (in potentiality space) until determined in prehension, either positively or negatively, by the phonon (coupling).

In Whitehead we link renormalization to the prehensive operation and subsequent phases of concrescence. Here, the phonon serves in a chiral-bag as ‘coupling constant’ and therefore the “specific and determinate measure” (PR 38, 76) of objective data to distinguish positive from negative elements to be included/excluded from the further phases of concrescence in “genetic time” (Whitehead) or holographic, emergent dimension of space (Verlinde). In string theory, values positively-prehended represent open strings (in the off-diagonal mode of a matrix) that acquire (or are given) an expectation value. Such values, in Verlinde’s model, are said to go from a Higgs branch into the Coulomb branch (see 2011).

The following chart is adapted from a blackboard drawing from Verlinde’s 2011 lecture delivered at the University of Jerusalem, whose labels distinguish the big, fast phase space of the Higgs branch from the small phase space of the Coulomb branch. In Whitehead’s model we recognize open strings as acquiring an expectation-value in the matrix under the two-fold heading of ‘conformal feelings’ (PR 189, 190) or ‘simple, physical feelings’ (PR 260, 261). These represent a melding of the coupling with the datum into a positively-combined product: a Whiteheadian “feeling.” For simplicity we can imagine this as the combination of two tones into an interval.

This image is reconstructed from chalkboard notes of Verlinde (2011) relating the ‘big and fast’ phase space of the Higgs branch to the phase space of all objective data waiting to be prehended by the coupling in the AE model. Those values that acquire a positive expectation value during prehension are then lifted into the smaller phase-space of the Coulomb branch. The coupling also relates to asymptotic freedom:

In quantum field theory, a beta function β(g) encodes the running of a coupling parameter, g. In non-Abelian gauge theories, the beta function can be negative […] An example of this is the beta function for QCD, and as a result QCD coupling decreases [logarithmically] at high energies [...] a phenomenon known as asymptotic freedom; conversely, the coupling increases with decreasing energy. (Peskin and Schroeder, 1995)
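For concreteness, the standard one-loop expressions behind this quotation (conventional results, stated here only for reference) are

\[
\beta(g) \;=\; \mu\,\frac{dg}{d\mu} \;=\; -\,\frac{g^{3}}{16\pi^{2}}\Big(11 - \tfrac{2}{3}\,n_{f}\Big) \quad \text{for SU(3)}, \qquad
\alpha_{s}(Q^{2}) \;\simeq\; \frac{12\pi}{(33 - 2 n_{f})\,\ln(Q^{2}/\Lambda^{2})},
\]

so that for \(n_{f} < 17\) quark flavors the beta function is negative and the coupling falls logarithmically with energy (asymptotic freedom), while it grows toward the QCD scale Λ at low energies.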

As well, the phonon of our event-logic functions like a pseudo-scalar of the Yukawa coupling in renormalization. As Itzykson and Zuber explain, the “Yukawa interaction is used in the Standard Model to describe the coupling between the Higgs field and massless quark fields (the fundamental fermion particles)” (1980). Goldstone bosons = pseudo-scalar mesons = “feelings.” In algebraic terms, a pseudo-scalar can be considered an invariant value with a sign-transpositional (parity-odd) property, such that it represents the otherwise antiquark of a quark/antiquark, or meson, pairing (see e.g., Creutz, 1978; Weinberg, 2008). This provides a physical counterpart to Whitehead’s prehension dynamics of ‘feelings,’ where the coupling surveys free quarks in bags to positively select values for the further phases of concrescence. In each assessment the connection between coupling and datum generates a “feeling” valued either positively or negatively. This describes the phonon as a tuning-factor for the ‘potential multiplicity of data’ eligible for positive prehension into further phases of concrescence.

In string theory each perturbative description depends on a string coupling constant. However, these coupling constants are not pre-determined, adjustable, or universal parameters; rather they are dynamical scalar fields that can depend on the position in space and time and whose values are determined dynamically. (Peskin and Schroeder, 1995)

If we eliminate “pre-determined, adjustable, and universal” parameters we are naturally left describing “emergent” ones. Recognizing the phonon as emergent (Verlinde, 2010; Easson and Frampton, 2010) naturally enables us to correlate it with the string coupling as described above. In the second clause above, the ‘dependence on the position’ in spacetime, “whose values are determined dynamically,” means that phonons emerge only from the actual world of their “initial data.” This allows us to regard the phonon as an emergent, “dynamical scalar field” per snapshot. Maintaining the phonon as an emergent value, the fact that the dynamical scalar fields depend on the position in XT (of certain values) is expected, since emergent phenomena are likewise context-dependent. This carries over to Whitehead’s recognition that positive prehensions create a unique “feeling” between each pair; seen as such, this correlates with the dynamical element of phonons as couplings in string theory.

This gives the original datum a specific role in the outcome of the multiplicity-set, thereby resolving Whitehead’s dilemma over whether concrescence begins with an ‘original datum’ or ‘a multiplicity of data.’ Realizing the holographic possibility for a replica projection of the first set into a dual-state affords a description following the open/closed string correspondence wherein the original datum acts as coupling for the multiplicity of data. Out of this prehensive analysis each value is realized either positively or negatively: further integrated into the phases of concrescence or else eliminated (excluded, integrated-out) from the group. Positive prehensions are called ‘feelings’ for inclusion and synthesis into ‘complex’ and ‘comparative’ feelings during the genetic phases of concrescence (see PR 52, 65, inter alia).


From a Whiteheadian perspective this makes intuitive sense: if the coupling is established by the phonon, then in order for the data of open strings to be compared (in prehension) they must first be transformed, or as Whitehead says, ‘conformed,’ into a mode in which they can be assessed by the phonon as coupling. Given the similarity between a closed string and a phonon, the conversion of open into closed strings, as a natural “re-enactment” of the initial entities, enables ‘antecedent’ values to ‘conform’ into a mode where they have been made eligible for prehension by the coupling, i.e., the phonon. Verlinde makes the case clear that, from a mathematical perspective, we should opt for describing gravitons not as fundamental values, but as phonons in an emergent depiction.

6.2 — COARSE-GRAINING = PREHENSION

As general procedures, both prehension and coarse-graining indicate characteristically autopoietic, selective operations (in concrescence and renormalization, respectively) wherein data are distinguished into positive and negative varieties and either included or excluded. In Whitehead, prehension is a selection process identifying elements through “feeling” for the further phases of concrescence: the phonon is what feels; the data are ‘felt.’ Intuitively we consider prehension like two notes autopoietically assessing wave-symmetry. Positive prehensions give rise to ‘feelings’ for the phases of concrescence. These phases are characteristically emergent; as Verlinde explains, “values on horizon grow into the emergent direction” (2011).

We find this logic in both Whitehead’s and Verlinde’s models. Generally defined, coarse-graining involves a procedure in renormalization whereby certain values are eliminated while others go on to combine in the phase. In Whitehead this describes the operation of prehension whereby some values are positively prehended while the others are negatively ‘eliminated’ from the further phases of concrescence. In fact, Verlinde uses the same language in speaking of values removed (eliminated) from the matrix:

In other words, just like in AdS/CFT, there is one emerging direction in space that corresponds to a "coarse graining" variable, something like the cut-off scale of the system on the screens […] The information that is removed by coarse graining is replaced by the emerged part of space between the two screens. In this way one gets a nested or foliated description of space by having surfaces contained within surfaces. (Verlinde, 2010)

In string theory, positive prehensions are linked to off-diagonal open strings acquiring expectation values in a matrix. The coupling is responsible for attributing a VEV to a subset of the open-strings in the matrix. This specifies a description for Verlinde’s gravitational self-energy; as he describes, “off-diagonal modes are the gravitational self-energy, and these are actually positive values” (Verlinde, 2011). Alternatively, negative prehensions represent the off-diagonal ‘open strings’ integrated out of the matrix, but still kept track of by nature via Newton’s potential for gravity (Verlinde, 2010, 2011). This gives negative prehension a role not realized in Whitehead, leading to gravity as an emergent force.
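A generic schematic of this diagonal/off-diagonal split (standard D-brane matrix-model lore, sketched here in conventional notation rather than reproduced from Verlinde’s lecture) writes the matrix coordinate as

\[
X \;=\; \mathrm{diag}(x_{1},\dots,x_{N}) \;+\; \big(W_{ij}\big)_{i\neq j}, \qquad m_{ij} \;\sim\; \frac{|x_{i}-x_{j}|}{\alpha'} ,
\]

where the diagonal eigenvalues \(x_{i}\) carry the (Coulomb-branch) expectation values and the off-diagonal entries \(W_{ij}\) are the open strings stretched between branes \(i\) and \(j\), with masses set by the eigenvalue separations; integrating out the heavy off-diagonal modes leaves an induced effective potential for the eigenvalues—the sense in which values removed by coarse-graining are still “kept track of” by what remains.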


Coarse-Graining

Coarse-graining is a procedure of renormalization in non-abelian quantum field theories in which certain bits are removed while others generally conglomerate and grow bigger through subsequent stages, or foliations. Coarse-graining thus implies a two-way process of removing certain values and keeping others. We link coarse-graining to Whitehead’s notion of ‘prehension’ and recognize immediately how closely their operations correspond; specifically, where coarse-graining removes certain values and keeps others, prehension elects between positive and negative values.

In Verlinde, the information on the screens is coarse-grained in the direction of decreasing values of the Newton potential. Maximal coarse-graining occurs at a horizon (Verlinde, 2010). Verlinde describes a finite entropy associated with each matter-distribution, measuring the amount of microscopic information invisible to the macroscopic observer (as in cosmic censorship). This means that:

Space cannot just emerge by itself. It has to be endowed by a book-keeping device that keeps track of the amount of information for a given energy distribution. It turns out that in a non-relativistic situation this device is provided by Newton's potential, ɸ, and the resulting entropic force is called gravity. (Verlinde, 2010)

The Newton potential (ɸ) keeps track of the depletion of the entropy per bit. It is therefore natural to identify it with a coarse-graining variable, like the (renormalization group) scale in AdS/CFT (Verlinde, 2010). Verlinde’s EG proposes a holographic scenario for the emergence of space in which the Newton potential precisely plays this role.
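In outline, the key relations of Verlinde (2010) that stand behind this book-keeping role (summarized here in standard notation) are

\[
\Delta S = 2\pi k_{B}\,\frac{mc}{\hbar}\,\Delta x, \qquad k_{B}T = \frac{\hbar a}{2\pi c}, \qquad F\,\Delta x = T\,\Delta S \;\Rightarrow\; F = ma,
\]

and, on a spherical holographic screen carrying \(N = A c^{3}/(G\hbar)\) bits with equipartitioned energy \(E = \tfrac{1}{2} N k_{B} T = M c^{2}\) and area \(A = 4\pi R^{2}\), the same entropic reasoning returns Newton’s law, \(F = G M m / R^{2}\).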

Prehension

“Whitehead’s basic hypothesis is that some type of low-level experiencing (prehension) is ubiquitous and a basic metaphysical principle” (Nobo, 2004). Cobb holds prehension in high esteem, emphasizing that it “may be the single most important and original concept in Whitehead’s philosophy” (WWB, 2009; p.31). As Whitehead describes, an actual entity “is the process of ‘feeling’ the many data, so as to absorb them into the unity of one individual satisfaction” (PR 117). In the event-logic, prehension initiates as soon as the phonon begins surveying the holographic re-enactment of ‘initial’ into ‘objective data.’ The phonon prehends individual items to +/- ends, endowing some with an expectation value and eliminating others. This operation is shown to characterize the creative stamp Whitehead affixes to prehension.

I have adopted the term ‘prehension’ to express the activity whereby an actual entity effects its own concretion of other things. In Cartesian language, the essence of an actual entity consists solely in the fact that it is a prehending thing (i.e. a substance whose whole essence or nature is to prehend). (PR 77)

Whitehead formalizes the process of prehension into three factors: “(a) the ‘subject’ which is prehending; (b) the ‘datum’ which is prehended; and (c) the ‘subjective form,’ which is how that subject prehends that datum” (PR, 35). Here, the “subject” is linked to a phonon qua coupling and “datum” refers to an “objective datum” (free quark) in the chiral bag. The “subjective form” refers to the way in which the subject prehends, as driven by (something like) a harmonic principle. As Sherburne (1966) explains, X prehends the initial data → objective data (Bn), and each X,B relation forms a unique prehension that is either positive or negative, depending on the initial aim of the datum. If it is a positive prehension then it is called a feeling; all other, negative prehensions are excluded from the phases of concrescence but kept track of by Newton’s constant in Verlinde’s model.

In Whitehead, ‘selection’ is linked to the role of ‘decision.’ While there are many decisions that ultimately go into each AO, the initial decisions are ones made by the coupling for each ‘objective datum’ making up that data-set (or group). Instead of the (1 or 0) of information theory, the measure of prehension is based on selections (+ or -). In this phase, objective data are each prehended individually and determined to be either positive or negative based on their relation with the “original datum” (phonon, coupling).

In their conceptual mode of prehension, actual entities (positively or negatively) prehend the objective data of eternal objects in order to provide an appropriate configuration for the actual entity to concresce. (Christian, 1959)

The objective data (of the initial data) are the only values enabled and eligible for prehension. All the objects prehended in the feeling phase, whose aim is to create a satisfaction, are graded in “relevance.” As Whitehead explains, “most of them will be felt only vaguely” (PR 66) and each will be prehended only in a certain perspective; but importantly, “each will be prehended in some determinate way” (Christian, 1959). Thus, each value in that actual world is distinguished into either a positive or a negative prehension. In this, prehension filters the contents so that only some are chosen for combinatory synthesis into the further phases. In order to be positively prehended, an objective datum must be in some degree of aesthetic symmetry with the phonon. Positive feelings are regarded as simple, conformal feelings (PR 189-190) granted positive inclusion into the phases of concrescence, while negative prehensions are excluded from the phases of concrescence.
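
A toy sketch may help fix the +/- selection logic described above. The names “coupling,” “datum,” and the harmonic-consonance criterion are hypothetical stand-ins for this chapter’s phonon/objective-data mapping; nothing here is Verlinde’s or Whitehead’s actual formalism.

```python
# Illustrative toy of prehension as binary selection against a coupling.
def prehend(coupling_freq, data_freqs, tolerance=0.05):
    feelings, negative_prehensions = [], []
    book_keeping = 0.0                     # crude analogue of tracking excluded values
    for f in data_freqs:
        ratio = f / coupling_freq
        # "positive prehension" if the datum is nearly harmonic with the coupling
        harmonic = abs(ratio - round(ratio)) < tolerance and round(ratio) >= 1
        if harmonic:
            feelings.append(f)             # admitted to further phases of concrescence
        else:
            negative_prehensions.append(f) # excluded, but kept track of
            book_keeping += 1.0
    return feelings, negative_prehensions, book_keeping

print(prehend(110.0, [220.0, 331.0, 447.0, 550.0]))
```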

Negative Prehension = Off-Diagonal Open Strings Integrated-Out of Matrix

There is a transition from the initial data to the objective datum effected by the elimination. (PR 221)

Some of the items “implicated” in the objective data (free quarks) are prehended negatively. The values not used are called negative prehensions and are withheld from the combinatorial phases of concrescence of that actual occasion. That is, some entities are not positively felt, but are “excluded from positive contribution to the subject’s own real internal constitution” (PR 66). “Feelings,” or positive prehensions, “contribute their subjective forms and their data to the formation of novel integral prehensions but negative prehensions contribute only their subjective forms” (PR 39). As Lango explains:


Each entity has the potentiality for being prehended by (or ingressing into) an actual entity. But some elements need not be elements in the internal process of concrescence of an actual entity… and are excluded. (Lango, 1972; p.22-23)

Per Verlinde, integrating-out certain open strings in the off-diagonal modes of the matrix (box) leads to the gravitational effect as an adiabatic reaction force. In Whitehead’s framework this is recognized as the exclusion of certain values, during prehension, from the further phases of concrescence—referred to as ‘negative prehension.’

A negative prehension is the definite exclusion of that item from positive contribution to the subject’s own real internal constitution. A positive prehension is the definite inclusion of that item into positive contribution to the subject’s own real internal constitution. This positive inclusion is called its ‘feeling’ of that item. All actual entities in the actual world, relatively to a given actual entity as ‘subject,’ are necessarily ‘felt’ by that subject, though in general vaguely. (PR 66)

While Whitehead only faintly considers a physical role for negative prehensions, in Verlinde’s model we encounter them as the reason for the gravitational effect. Here, the degrees of freedom in the UV are responsible for inducing long-range forces, given that phase space is influenced by the long-range description à la UV/IR mixing (see Verlinde, 2011). The exclusion of values in coarse graining is linked in Verlinde to Newton’s potential, ɸ; in the process, negative prehension gains a partial functionality and identity: it is shown to be conceptually linked with the gravitational potential in an EG scenario, serving as fodder for the gravitational acceleration force. Taking out negative prehensions propels the concrescence group closer to the horizon. Once a value thermalizes onto the screen it should also send an adiabatic, reactive ‘kick-back’ force qua acceleration.

The items prehended negatively in the satisfaction are nevertheless implicated in its objective datum. Thus a negative prehension of X would have as its datum not-X, and this would add to the determinateness of those items that are positively prehended as Y. While the data of negative prehensions may thus be said to be negatively implicated in the objective datum, the subjective forms of these prehensions make a positive contribution to the subjective form of the satisfaction. A negative prehension would contribute “aversion from X” as an element in the complex subjective form of the satisfaction. This element of subjective form would contribute to the way Y is positively felt by the subject. (Christian, 1959)

Here, the spring-back of ‘aversion’ resembles the adiabatic reaction force. It is fitting that Whitehead’s negative prehension—the part left underdeveloped in the AE description—is precisely the physical component of an emergent gravitational force. Like the stone that the builder refused, it becomes the head corner stone in an emergent gravitational interpretation. Whitehead’s gravity description was right there in AE’s the whole time! It’s truly an elegant resolution.


Positive Prehension = Off-Diagonal Open Strings Acquiring Expectation Values

Whitehead assigns ‘feeling’ only to the positive species of prehensions (see Ford, 1984, p.213). Each off-diagonal open string selected or integrated-out is chosen on the basis of its aesthetic appropriation with the phonon: its feeling. “A ‘feeling’ belongs to the positive species of prehensions. An actual entity has a perfectly definite bond with each item in the universe. This determinate bond is its prehension of that item” (PR 59). Thus, if positive, it becomes a feeling that goes on to prehend along with the others positively prehended, synthesizing in subsequent foliations of concrescence as collective values harmonizing towards the maximal intensity of that state value. As Whitehead explains, “the term feeling is a technical equivalent of ‘positive prehension’” (Lango, 1972, p. 20), “chosen to suggest that functioning through which the concrescent actuality appropriates the datum so as to make it its own” (PR 220).

In Verlinde’s model, each ‘feeling’ positively selected creates an expectation value in an eigenstate on the diagonal of the matrix. To these ends Whitehead describes how “simple physical feelings embody the reproductive character of nature” (PR 214). Such feelings, he says, are like “conformal feelings.” To be positively prehended is to become the feeling of a new subject for later phases of concrescence. As Whitehead explains, “a feeling refers to the integration of an actual entity into the internal constitution of a subject” (PR 298). To build off of this description we locate an earlier quote from Whitehead: “A feeling appropriates elements of the universe, which in themselves are other than the subject, and absorbs these elements into the real internal constitution of its subject by synthesizing them in the unity of an emotional pattern expressive of its own subjectivity” (KPR 8). This ‘emotional pattern’ is understood as the tonal and affective phonon, a collective boson serving as coupling for the phases of prehension and concrescence (or renormalization). In another sense, the notion of a ‘causal feeling’ in Whitehead is also defined such that “simple physical feelings will be called ‘causal’ feelings [or feelings of causal efficacy]. The ‘power’ of one actual entity on the other is simply how the former is objectified in the constitution of the other” (PR 363).

In addition, Whitehead introduces “comparative feelings” (PR 291-294) in a way we can describe here as the progressive coarse-graining of positively-prehended ‘conformal feelings’ over the phases of concrescence towards the maximal harmony/intensity of the unitary group of that occasion into a satisfaction. If ‘simple comparative feelings’ refer to one connection, then ‘complex comparative feelings’ refer to the network alignment of an entity within the larger framework of that generative satisfaction in flux of creation.

Whitehead also refers to feelings as vectors, saying “they feel what is there and transform it into what is here” (KPR, 8). “It is a feeling from the cause which acquires the subjectivity of the new effect without loss of its original subjectivity in the cause” (KPR, 11 ff. 12). As Whitehead says, “there is a vector transmission of emotional feeling of a sensum from A to B” (PR 479 ff. 480). Vectors are also used in musical set theory as “interval vectors” to determine “the intervallic content of a pitch-class set” (see Schuijer, 2008). In this case, the vector quality (veho) refers to the wave-pattern (frequency) of one tone as it reaches out to a subsequent one to form a uniquely-shared “feeling.” As Whitehead explains:
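
As a small illustration of the musical-set-theory notion cited from Schuijer (2008)—offered only as an aside, not as part of either model—the interval vector of a pitch-class set tallies how often each interval class (1–6) occurs between its pairs of tones:

```python
# Interval vector of a pitch-class set (standard musical set theory).
from itertools import combinations

def interval_vector(pitch_classes):
    vector = [0] * 6
    for a, b in combinations(sorted(set(pitch_classes)), 2):
        d = (b - a) % 12
        ic = min(d, 12 - d)          # interval class 1..6
        vector[ic - 1] += 1
    return vector

print(interval_vector([0, 4, 7]))    # C major triad -> [0, 0, 1, 1, 1, 0]
```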


A simple physical feeling has the dual character of being the cause’s feeling re-enacted for the effect as subject. But this transference of feeling effects a partial identification of cause with effect, and not a mere representation of the cause. It is the cumulation of the universe and not a stage-play about it. By reason of this duplicity in a simple feeling there is a vector character which transfers the cause into the effect. It is a feeling from the cause which acquires the subjectivity of the new effect without loss of its original subjectivity in the cause. This primary phase of simple physical feelings constitutes the machinery by reason of which the creativity transcends the world already actual, and yet remains conditioned by that actual world in its new impersonation. (PR 237)

Given the property of asymptotic freedom, whichever value the phonon positively prehends creates a disturbance as it is pulled out of position towards the phonon. When the coupling “pulls” a datum (free quark) towards it, it creates a spatial disturbance leading to a charge value, per asymptotic freedom. 60 Compare Whitehead’s explanation: “In this way B feels the sensum as derived from A and feels it with an emotional form also derived from A. This is the most primitive form of the feeling of causal efficacy. In physics it is the transmission of a form of energy” (KPR 13). Now compare this language to that of Verlinde in describing the one-loop amplitudes between two D-branes; we witness in both cases a distinct sense for the term ‘transmission of energy.’ Formally, Whitehead develops the notion of a feeling as “a transition effecting a concrescence analyzable into five factors that express what the transition consists of, and effects” (PR 221):

(i) the ‘subject’ which feels; (ii) the ‘initial data’ which are to be felt; (iii) the ‘elimination’ in virtue of negative prehensions; (iv) the ‘objective datum’ which is felt; (v) the ‘subjective form,’ which is how that subject feels that objective datum.

To sequence the chronological integrity of these categories within an event-logic, we swap the first two items such that the initial data represent the data “to be felt” (as objective data) by the “subject that feels” – that is, the phonon as coupling.

The holographic dual (or reenactment) of the initial data represents the objective data eligible for prehension, where each objective datum is “felt” by the “subject that feels.” As Whitehead explains, “It is in virtue of its subject that the feeling is one thing. If we abstract the subject from the feeling we are left with many things. Thus a feeling is one aspect of its own subject” (PR, 221). Initial data, which are to be felt as ‘objective data,’ can be modeled, in the AdS/QCD correspondence, as open strings on a Dn-brane mapped into a large-N QCD gauge theory of free quarks in a chiral bag (on a conformal horizon). This means the initial data are not actually felt in themselves, but only in the holographic dual; thus they are “to be felt.” This leads either to an “elimination in virtue of negative prehensions” or else to the positive inclusion of that value as a feeling in the further phases of concrescence.

60 This disturbance might quickly grow if it were not for the miniscule radius of a bag.


In his fifth point, the coupling (or subject) is what transforms positively-selected data into the elements of the ‘subjective form’ per the subject (as coupling). As Lango states, “during its process of concrescence, an actual entity positively prehends only some of the already created entities, while eliminating others from positive relevance ” (Lango, 1972, p.18). As we’ve considered it, the conformation of “subjective emotion” from occasion B to A could refer to the mixing of waveforms—between the coupling and each datum of the objective data set—where two notes feel each other in combining to form an interval-vector.
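
A minimal structural sketch (not from the source texts) can encode the five factors listed above, with the first two items swapped per the event-logic ordering adopted in this chapter. Field names are illustrative only.

```python
# Hypothetical container for the five factors of a Whiteheadian "feeling",
# ordered as in the event-logic reading (initial data first, then subject).
from dataclasses import dataclass, field
from typing import Any, List

@dataclass
class Feeling:
    initial_data: List[Any]                                    # data "to be felt"
    subject: Any                                                # prehending subject (phonon qua coupling)
    eliminated: List[Any] = field(default_factory=list)        # negative prehensions
    objective_datum: List[Any] = field(default_factory=list)   # what is positively felt
    subjective_form: Any = None                                 # how the subject feels that datum
```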

6.3 — CONCRESCENCE = FOLIATION

The open strings are not prêt-à-prehension until they have become closed strings. In the prehension phase the elements shift in modality from multiplicities of initial data into objective data. The process of the coupling “feeling other data” serves as the “subject” of the holographic renormalization phase. Closed strings (objective data) gaining expectation values are each lifted from the Coulomb branch into eigenstates of the Higgs branch (Verlinde, 2011)—or the phases of concrescence, for Whitehead.

In this scenario, values synthesize by foliating into the emergent dimension of space—the Higgs branch of the fast phase space, or the collective eigenvalue of the (on-diagonal) matrix states. Whitehead refers to phases of concrescence taking place in ‘genetic’ time (PR 305-311); this corresponds with foliation, during holographic renormalization, out of the small phase space of the Coulomb branch and into the emergent dimension.

For all intents and purposes, the phases of concrescence begin as soon as values positively prehended by the coupling begin taking on expectation values in the matrix. Each newly positively-prehended feeling makes a new equipotential surface of foliation, or phase of concrescence. The negative values, on the other hand, are shown in Verlinde to underwrite the gravitational force, which emerges from an adiabatic reaction force.

Phases of Concrescence

Actual entities emerge through the concrescence of relations of prehension. (Lango, 1972)

An actual occasion is a “concrescence of prehensions” (PR 35). Concrescence flows from the coupling and evolves as it takes on positively-prehended values obtained in prehension. Taking Whitehead’s language, ‘simple conformal feelings’ become ‘complex conformal feelings’ as they become progressively more coarse-grained into one dynamic ‘satisfaction’ over the course (or subjective aim) of the phases of concrescence, or layers of foliation. As Christian explains, it is “a process of activity in which many entities function as objects” (1959). Whitehead expresses similar sentiments: “The real concrescence of many entities into one actual entity is the internal process whereby those prehensions come together into a unity” (PR 48). The relation of the original datum to the many is what governs the discovery of unity in prehension, as a question posed to each: harmonic or not? Those values positively selected are what constitute the elect feelings “transformed into a unity of aesthetic appreciation” in the further phases of concrescence → satisfaction. “The responsive phase [which follows upon this initial phase] absorbs these data as material for a subjective unity of feeling” (PR 172). This describes positive feelings being absorbed into the phases of concrescence. As Whitehead states:

The mental pole introduces the subject as a determinant of its own concrescence. The mental pole is the subject determining its own ideal of itself by reference to eternal principles of valuation autonomously modified in their application to its own physical objective datum. (PR 248)

The phases of concrescence involve only those values of objective data positively prehended by the coupling → feelings available for the later phases (foliation and genetic phases). These positively prehended feelings enumerate the “live options” (James) of data for the later phases of concrescence that go into the formation of a maximally-harmonized amalgam of organized entities: the satisfaction of that occasion. As Sherburne explains, “The process of concrescence is divisible into an initial stage of many feelings, and a succession of subsequent phases of more-complex feelings integrating the earlier simpler feelings up to the satisfaction which is one complex unity of feeling” (KPR, 36).

Per Whitehead, positive prehensions in the synthetic phases of concrescence are such that selected values “grow together”; they do not grow apart: they grow for each other. Further, the positively-selected values foliate, over progressive phases of coarse-graining, into an emergent dimension of space (in the bag) during renormalization, until a final phase-product accrues after all ‘objective data’ have been exhausted in analysis by the coupling. As Whitehead explains:

The first analysis of an actual entity, into its most concrete elements, discloses it to be a concrescence of prehensions which have originated in its process of becoming. All further analysis is an analysis of prehensions. (PR 49)

Feelings link up with (follow sequentially from) other feelings in order to form “complex” and “comparative” feelings in the further phases of concrescence. As feelings take shape, the “subjective form” (PR 256-263) is said to emerge and help guide, or orchestrate, ‘feelings’ into a unified bundle of satisfaction. These further phases of concrescence (foliations) take place in an emergent, hidden phase space until all have been exhausted and a final harmonic packet of values resounds into one massive, complex tone marking the final, real value of the phase: the satisfaction, for Whitehead.


Foliation into Emergent Dimension of Space

During renormalization, Verlinde’s model demonstrates how space emerges at a macroscopic level only after coarse graining (2010). Here, as in AdS/CFT, there is one special direction corresponding to scale, or a coarse-graining variable of the microscopic theory, in which space is emergent. The screens that store the information are like stretched horizons: “on one side there is space, on the other side nothing yet” (Verlinde, 2010). Translated into Whitehead’s language, this means that space emerges at a macroscopic level only after prehension and concrescence. Holographic screens are located at (and correspond with) equipotential surfaces. In our reading, the genetic phases of concrescence correlate with Verlinde’s foliation, in holographic renormalization, into an emergent dimension.

Space emerges at a macroscopic level only after coarse graining. Hence, there will be a finite entropy associated with each matter configuration. This entropy measures the amount of microscopic information that is invisible to the macroscopic observer. In general, this amount will depend on the distribution of the matter. (Verlinde, 2010)

Positive prehensions are associated with foliation screens in the genetic phases of the occasion, each going deeper into the emergent dimension of the bag and thus subject to a natural damping effect, like foliations deeper in the emergent dimension of space in Verlinde. Each layer of foliation is like another phase of concrescence in the progressive coarse-graining of conformal feelings (of objective data) towards satisfaction.

The coarse-grained data live on smaller (holographic) screens obtained by moving screens further into the interior of the space. “Just like in AdS/CFT, there is one emerging direction in space that corresponds to a coarse graining variable, something like the cut-off scale of the system on the screens” (Verlinde, 2010). The information removed by coarse-graining is replaced by the emerged part of space between the two screens. In this way we arrive at a nested or foliated description of space by having surfaces contained within surfaces. Foliation presses deeper as coarse-graining continues into an emergent dimension of space in the bag, likened to Whitehead’s “genetic time.” Verlinde describes Newton’s potential as “the natural variable that measures the amount of coarse graining on the screens” (2010). Continuing:

The amount of coarse graining is measured by the ratio −φ/2c². This is a dimensionless number that is always between zero and one. It is only equal to one on the horizon of a black hole. We interpret this as the point where all bits have been maximally coarse grained. Thus the foliation naturally stops at black hole horizons. (Verlinde, 2010)
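
To indicate what screens at equipotential surfaces amount to quantitatively, the corresponding steps of Verlinde’s (2010) generalization to arbitrary matter distributions can be sketched in compressed form (conventions follow the cited paper as understood here):

```latex
% Sketch of screen thermodynamics on an equipotential surface S (after Verlinde, 2010).
k_B T = \frac{\hbar}{2\pi c}\,\nabla\Phi\cdot\hat n,
\qquad dN = \frac{c^{3}}{G\hbar}\,dA,
\qquad E = \tfrac{1}{2}\oint_{\mathcal S} k_B T\, dN
\\[4pt]
M c^{2} = E
\;\;\Longrightarrow\;\;
M = \frac{1}{4\pi G}\oint_{\mathcal S} \nabla\Phi\cdot d\mathbf A
\qquad\text{(Gauss's law; locally } \nabla^{2}\Phi = 4\pi G\,\rho\text{)}
```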

Verlinde explains that “the holographic screens correspond to equipotential surfaces. This leads to a well-defined foliation of space” (2010). Within the event-narrative constructed so far, we develop the nature of foliation and emergent space in Verlinde’s account in two particular ways: (1) the “enclosed surface on screen in not yet emerged part of space” (Verlinde, 2010) represents the holographic dual of the initial snapshot as a chiral bag of de-confined (free) quark matter residing on the horizon in AdS/QCD; and (2) the conditions inside the bag follow asymptotic freedom. “Physics away from thermodynamic equilibrium is asymptotic freedom in a bag model phase” (Hubeny, 2007). In the most common formulation, the chiral bag model replaces the interior of a skyrmion with a bag of quarks. Because quarks are treated as free inside the bag, radius-independence validates the idea of asymptotic freedom (see Gubser, 2009; Shuryak, 2008; Gürsoy, 2010; Khodadi, 2014).

“The genetic process that produces the satisfaction is not itself in physical time” (Christian, 1959). What if instead we follow Verlinde’s model to say not in physical (but emergent) space? Opting for the emergent dimension in Verlinde’s account, we can also translate it into the genetic-phase description in Whitehead. Positive prehensions pass on into values for the phases of concrescence qua genetic time. Indeed, for an event-framework, we can now envision that both time (via genetic time) and space (and metric) are emergent from the underlying process, which necessarily includes the succession of events. The Wheeler–DeWitt equation could also relate to “genetic time” in Whitehead: it speaks to how values in the bag, considered not statistically but kinetically, behave in principle.

6.4 — VERLINDE’S MATRIX THEORY

One motivation for string theory is to understand what gravity is. (Verlinde, 2011)

Matrix string theory describes a nonperturbative framework. In 2011, Verlinde utilized this to make the case for gravity emerging from the integration-out of certain off-diagonal degrees of freedom in a hidden phase-space group—which can be defined as a box in a matrix theory of strings. Certain off-diagonal degrees of freedom in a matrix are integrated out while the remaining values represent the gravitational self-energy. Included values (kept) in coarse-graining are progressively foliated, while excluded values are kept track of by Newton’s potential, ɸ. The off-diagonal degrees of freedom integrated out underwrite an emergent gravitational force, while eigenstates of positive energy reside on the diagonal and describe the gravitational self-energy.
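
A standard illustration of how integrating out off-diagonal modes produces an interaction between diagonal eigenvalues comes from BFSS-type matrix models; it is stated here only schematically, for orientation, and not as Verlinde’s own computation. The v⁴/r⁷ scaling quoted is the familiar D0-brane result.

```latex
% Schematic: two brane positions x_1, x_2 on the diagonal; off-diagonal
% open-string modes y have frequencies set by the separation r = |x_1 - x_2|.
X = \begin{pmatrix} x_1 & y \\ \bar y & x_2 \end{pmatrix},
\qquad \omega_{\text{off-diag}} \propto r
\\[4pt]
V_{\text{eff}}(r) \;=\; \sum \tfrac{1}{2}\hbar\,\omega_{B}(r)
  \;-\; \sum \tfrac{1}{2}\hbar\,\omega_{F}(r)
  \;\;\sim\; -\,\frac{v^{4}}{r^{7}} + \cdots
\qquad\text{(leading velocity-dependent attraction in the supersymmetric case)}
```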

In the geomodal model we consider a chiral bag of free quarks (QCD matter), like a matrix of open strings and eigenvalues in Verlinde’s model, where some strings can be integrated out while others go on to coarse-grain, in further foliations of comparative values, into one ultimate, synthetic value corresponding to the thermalization of that matrix onto the horizon as a real value added.

String (or M) theory has a large moduli space of vacua, M, parametrized by the size and shape of the compact manifold and the string coupling (as well as the values of other background fields). At generic points in M the theory is eleven dimensional and inherently quantum mechanical while at certain degenerations it has different weakly coupled string expansions. (Kutasov, 1998)


The moduli space refers, in our mapping, to the matrix, or chiral bag. “The term moduli space is sometimes used in physics to refer specifically to the moduli space of vacuum expectation values of a set of scalar fields, or to the moduli space of possible string backgrounds” (Viehweg, 1995; Simpson, 1994). As Kutasov explains, “the Higgs branch of the gauge theory, corresponding to non-zero expectation values of the fundamentals, can be thought of as the moduli space of instantons” (1998). Verlinde also describes how gravity is really about going from the Coulomb branch to the Higgs branch (see 2011). Kutasov explains that “the moduli space of instantons is the full Higgs branch of the theory” (1998).

Additional phase space arises from the introduced coordinates, which prove responsible for gravity (Verlinde, 2011u): integrating out open strings in the off-diagonal modes induces gravity. “We know this very well:” the off-diagonal modes carry a positive gravitational energy (Verlinde, 2011u). Eigenvalues are positions of D-branes, and the integration-out of open-string matrix elements leads to the gravitational effect. As the off-diagonal modes gain expectation values, the eigenvalues are lifted into the Higgs branch. This is linked to gravitational collapse; as Verlinde explains, “Gravitational collapse is about going from Coulomb branch to Higgs branch” (2011).

Polchinski also describes the closed-string exchange between two D-branes. As Verlinde specifies, “matrices aren’t fundamental fields but the expectation values of certain coordinate operators” (2011u). The positions of D-branes thus form a matrix-valued coordinate, and there are matrix elements of the coordinate operator between two D-brane states (2011). Verlinde demonstrates how the open/closed string correspondence can be realized by the equivalence between the one-loop amplitude of an open string stretched between two D-branes (D2 → D4) and the exchange of a closed string between those two branes (2011u). Verlinde says that these one-loop amplitudes, stretching between two D-branes, set up the logic for choice in renormalization, in which some values are chosen and others not, and this process leads to a gravitational effect qua adiabatic reaction force (2011).

Verlinde’s linking of the holographic screen to a Dn-brane helps when we consider the open/closed string correspondence between two D-branes. From this perspective it represents the correspondence between the one-loop amplitude channel of an open string in the matrix model and a closed-string exchange between two D-branes (see Verlinde, 2011). Each open-string element stretches between a D2/D4 brane. The fact that Verlinde says we can also consider the holographic screen as a D-brane means that we can think of it as a D4-brane qua chiral bag model of quarks and gluons. This also describes the “multiplicity of initial data,” holographically projected into their correspondence-modes as objective data, available for prehension, in Whitehead’s AE’s.
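
The open/closed correspondence invoked here can be written schematically (conventions and normalizations suppressed): the one-loop annulus amplitude of an open string stretched between two D-branes equals a tree-level closed-string exchange between the corresponding boundary states.

```latex
% Worldsheet (channel) duality of the annulus, stated schematically.
\mathcal A \;=\; \int_{0}^{\infty} \frac{dt}{2t}\;
   \mathrm{Tr}_{\text{open}}\, e^{-2\pi t\, H_{\text{open}}}
 \;=\; \int_{0}^{\infty} d\ell\;
   \langle B_1 |\, e^{-\ell\, H_{\text{closed}}} \,| B_2 \rangle ,
\qquad \ell \propto 1/t \ \text{(modular transformation; conventions vary)}
```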

Here, the diamond with open strings represents a D2-brane, and the egg with dots represents a D4-brane. Within the constructed event-narrative, we take the D4-brane to represent a chiral bag in QCD, or a matrix in string theory. We arrive at closed strings from open strings by the open/closed string correspondence. Once this occurs, the next echelon of gauge/gravity duality—the AdS/CFT and AdS/QCD correspondences—also takes effect.

In string theory the number of high-energy open string states is such that integrating them out indeed leads to long-range effects. Their one-loop amplitudes are equivalent to the tree level contributions due to the exchange of closed string states, which among other things are responsible for gravity. This interaction is, however, equivalently represented by the sum over all quantum contributions of the open string. In this sense the emergent nature of gravity is also supported by string theory (Verlinde, 2010).

In the microscopic universe there are many degrees of freedom that we generally ignore. This is because we do not think they influence the IR physics, based on a separation of scales, but this is not entirely true. UV/IR mixing does occur, and fast-to-slow couplings lead to an adiabatic reaction force and a book-keeping device to keep track of the fast degrees of freedom integrated out during prehension. Thus, gravity is not really a force: “gravity is a book-keeping device to keep track of microscopic phase space.” Instead, inertia is approximately the force; it is the leading-order adiabatic force (Verlinde, 2011u).
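
The “leading-order adiabatic force” alluded to here is, in generic terms, the familiar Born–Oppenheimer statement (given here as a general sketch, not as Verlinde’s specific expression): when fast degrees of freedom are integrated out while tracking a slow coordinate x, they react back with

```latex
% Generic adiabatic (Born-Oppenheimer) reaction force from integrating out
% fast degrees of freedom held in the instantaneous state n(x).
F_{\text{adiabatic}} \;=\; -\,\frac{\partial E_{n}(x)}{\partial x}
  \;+\; \text{(geometric/Berry corrections)},
\qquad
F_{\text{entropic}} \;=\; T\,\frac{\partial S}{\partial x}
  \quad\text{(thermodynamic analogue used in the entropic-gravity reading)}
```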

The off-diagonal open-strings integrated-out of the matrix component in Verlinde’s model are shown to accord with what Whitehead calls “negative prehensions.” These negative values, we learn from Verlinde’s model, are kept track of in nature, serving to underwrite and propel the emergent gravitational force. As such, the under-developed concept of Whitehead’s “negative prehension” is clarified in Verlinde’s account. Finally, we encounter an almost direct correspondence from Whitehead’s very early work, PRel, and consider the language similarities between this and Verlinde’s 2011 description of emergent gravity arising in matrix theory (see 2011u).

The transmission of gravitational forces is grounded on the transmission of prehensions (and the inheritance of eternal objects), from actual entity to actual entity. (Whitehead, PRel 76; PR 267)

For Whitehead to say that the process is “grounded on the transmission of prehensions” is to recognize in Verlinde the transmission of open strings on one-loop amplitudes into closed strings, some given an expectation value and others integrated out of the matrix. The values given a positive expectation value in the matrix go on to synthesize in the Coulomb branch of phase space to generate the gravitational self-energy of the matrix values (see Verlinde, 2011).

6.5 — SATISFACTION — synthesis in phases of concrescence until maximal prehension/coarse-graining → satisfaction; thermalization of polymer onto screen; gravitational effect

The prehension and concrescence phases cannot be expected to go on ad infinitum. After all, the initial snapshots are predicated on a finite number of EO’s in the causal frame. Owing to the existence of a threshold of perception, “no finite extension is capable of containing an infinite number of parts” (Hume, op. cit.; p. 30). A snapshot occupies the finite space of the hypersurface and strands therein; applying Hume, we are reminded that there can only be a finite number of strands in the snapshot of a hypersurface.

In this chapter we will complete the process begun by the last two chapters and bring the AE’s to a “final real value” as a satisfaction. This process bears a likeness to the quantum mechanical situation described by Avron and Elgart where:

At some initial time t₀ a quantum-mechanical system has an energy given by the Hamiltonian Ĥ(t₀); the system is in an eigenstate of Ĥ(t₀) labelled ψ(t₀). Changing conditions modify the Hamiltonian in a continuous manner, resulting in a final Hamiltonian Ĥ(t₁) at some later time t₁. The system will evolve according to the Schrödinger equation, to reach a final state ψ(t₁). (Avron and Elgart, 1999)

In addition, we will show how this notion aligns with Verlinde’s maximization of coarse-graining into a final real value on the horizon, and its analog representation as a polymer thermalizing onto the horizon. Finally, we will describe how the gravitational force emerges as a result of an adiabatic reaction force, as a product of entropy (information) dynamics during renormalization → final configuration, as the difference between the initial and final states of the mass distributions of the data (see Verlinde, 2010).
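
Purely as a numerical illustration of the adiabatic statement quoted above (a toy two-level system, unrelated to the specific Hamiltonians in Verlinde’s model; all parameters are arbitrary), one can check that a slowly deformed state tracks the instantaneous ground state:

```python
# Toy check of the adiabatic theorem for a slowly swept two-level Hamiltonian.
import numpy as np
from scipy.linalg import expm, eigh

def hamiltonian(t, sweep=0.02, gap=1.0):
    """Slowly varying two-level Hamiltonian H(t)."""
    return np.array([[-sweep * t, gap],
                     [gap, sweep * t]], dtype=complex)

t0, t1, steps = -200.0, 200.0, 20000
dt = (t1 - t0) / steps

_, vecs0 = eigh(hamiltonian(t0))
state = vecs0[:, 0].astype(complex)       # eigenstate of the initial Hamiltonian

for k in range(steps):
    t = t0 + k * dt
    state = expm(-1j * hamiltonian(t) * dt) @ state   # Schrodinger evolution step

_, vecs1 = eigh(hamiltonian(t1))
fidelity = abs(np.vdot(vecs1[:, 0], state)) ** 2
print(f"overlap with final ground state: {fidelity:.4f}")  # stays close to 1
```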

Satisfaction, Thermalization, Emergent Gravity

The maximization of coarse-graining in Verlinde’s model is conceptually inlaid over the final phase of concrescence to represent the precipitator of Whitehead’s satisfaction. Once the phase of objective data has been maximally coarse-grained, or prehended, it is said to have reached its final phase of concrescence, or layer of foliation, out of which arises the satisfaction as a final, real value onto the horizon. It is the final, real act, the product of concrescence, “as a real value added to a horizon” (Verlinde, 2010).

In a corresponding sense, the maximization of coarse-graining is also linked to Newton’s potential, φ, summed over the removed values, leading to the emergent gravitational effect. As such the maximization of coarse-graining is shown to produce two results: a gravitational self-energy qua satisfaction in the mode of positive prehensions, and an emergent gravitational force in the mode of negative prehensions. Pictorially, we represent this scenario in the accompanying geomodal figure.

Describing concrescence, Whitehead says: “the many feelings, derivatively felt as alien, are transformed into a unity of aesthetic appreciation immediately felt as private” (PR 212). This describes how the “initial data” and their holographic “objective data” are in essence ‘alien’ insofar as each is a moment from a different strand; however, through the renormalization-like operations of prehension and the phases of concrescence these values are creatively transformed into a “unity of aesthetic appreciation” qua satisfaction of maximal intensity, a value felt in that system privately. Positively-selected “feelings” undergo progressive concrescence until reaching the maximization of prehension → final phase of concrescence → satisfaction: the actual occasion.

Only after all the antecedent entities have been prehended does the satisfaction commence, as the final evaluation of the parts into an occasion. As Whitehead states, “The process itself resides with the two former phases,” prehension and concrescence (PR 212); thus, it is only natural that the completion of the process rests in the prehensive exhaustion of the eligible components (objective data) available in that set. Every feeling tone, or “original datum,” prehends until it reaches the maximum of coarse-graining and initiates a final concrescence (foliation) whose contents lead to the “final, real value” that Whitehead refers to as a satisfaction. “The satisfaction […] is the completion of that occasion” (Cobb; WWB, 2008).

Compare this to Verlinde’s example where, beginning from a particle one Compton wavelength away from the horizon of a black hole, Bekenstein showed how “by pulling out the particle a bit further, one changes its energy by a small amount equal to the work done by the gravitational force. If one then drops the particle in to the black hole, the mass M increases by this same additional amount 61” (Verlinde, 2010). Once a particle in the heat bath reaches the horizon it has become part of that thermal state. As a value on the screen, it represents an element added to it. As Verlinde explains:

In the holographic description, the particle can be thought of as immersed in the heat bath representing the black hole. This fact is particularly obvious in the context of AdS/CFT, in which a black hole is dual to a thermal state on the boundary, while the particle is represented as a delocalized operator that is gradually being thermalized. By the time that the particle reaches the horizon it has become part of the thermal state. (2010)

Thus, we do not arrive at a truly fundamental entity until we have reached the satisfaction. As Whitehead states, “The many entities of the universe, including those originating in the concrescence itself, find their respective roles in this final unity…called the satisfaction. This satisfaction is the culmination of the concrescence into a completely determinate matter of fact” (PR 212). In another sense, “The satisfaction is the attainment of the private ideal which is the final cause of the concrescence” (PR 212). Whitehead’s satisfaction represents the final concrescence of the feelings into a collective product. Prehension and concrescence take place therein until the exhaustion of the phase → satisfaction.

In Verlinde’s matrix model, the final phase of concrescence reflects the eigenstate configuration of the final Hamiltonian; in Whitehead it represents the maximized and most-refined satisfaction of the AE generative cycle. In physics, the Berry phase is a phase difference acquired over the course of a cycle when a system is subjected to cyclic adiabatic processes, a difference resulting from the geometrical properties of the parameter space of the Hamiltonian (Verlinde, 2011). In this model, open strings in the phase space of the Higgs field lie within the parameter space of the Hamiltonian; thus, in Whitehead we could imagine the Berry phase speaking to the difference between the objective data and the values acquired in prehension and concrescence leading to the satisfaction.

61 Continued: “Consistency of the laws of black hole thermodynamics implies that the additional change in the Bekenstein Hawking entropy, when multiplied with the Hawking temperature TH, must be precisely equal to the work done by gravity” (2010)

This accords with Verlinde’s model, where certain open strings acquire a positive expectation value and go on to influence an eigenstate on the diagonal of the matrix. These values go on to generate the gravitational self-energy of the matrix values. In a concurrent sense, the integration-out of certain off-diagonal strings in the matrix and the coarse-graining (concrescence) of the remaining values continues until the polymer (box) thermalizes onto the horizon, 62 becoming a real value added to it (Verlinde, 2010). 63 This “becoming a real value” on the screen is then shown to ignite a back-reaction (force) leading to the gravitational acceleration effect, spurred by the sum of all negatively-prehended coarse-graining values qua Newton’s potential, ɸ. Verlinde remarks:

Space cannot just emerge by itself. It has to be endowed by a book-keeping device that keeps track of the amount of information for a given energy distribution. It turns out, that in a non-relativistic situation this device is provided by Newton's potential, ɸ, and the resulting entropic force is called gravity. (Verlinde, 2010)

In Verlinde, the entropy gradient drives the acceleration that registers as a final real value: acceleration arises as a result of the entropy gradient. Thus, the formation of a satisfaction leads to an acceleration on the basis of an entropy gradient causing a change in inertia to yield a force; as Verlinde explains, inertia means no entropy gradient (2010). Mäkelä furthers the point:

A specific attention in the model was paid to the so called acceleration surfaces. In very broad terms, acceleration surface may be described as a space-like two-surface propagating in spacetime such that each point of the surface accelerates with the same constant proper acceleration a in a direction orthogonal to the surface. (Mäkelä, 2010)64

62 See also the “Membrane Paradigm” (Thorne, Macdonald, Price): Horizon interpreted as a fluid membrane with certain dissipative properties: (e.g. electrical conductivity, shear & bulk viscosity, etc.). On the spacetime boundary. (AdS/CFT). Fluid dynamics describes the full spacetime, not just horizon (Hubeny). 63 In the combined event-narrative, as soon as the occasion attains satisfaction it becomes an objective datum for successor occasions (Cobb, 2008). This is represented by the emergent, dimensionless number representing the total coarse graining → satisfaction as a critical string (strand) for successor occasions (contemporary strands in vacuum mode). This value gets atomized onto the extensive continuum of that system, becoming available for ingression into future EO’s → AE’s. Compare this to Whitehead’s description where: “In the conception of the actual occasion in its phase of satisfaction, the entity has attained its individual separation from other things; it has absorbed the datum and it has not yet lost itself in the swing back to the decision whereby its appetition becomes an element in the data of other entities superseding it” (PR 233). This describes how the satisfaction ‘becomes an element in the data of other entities superseding it.’ In this event logic, when decay crosses the causal nexus it deposits a new strand value.

In conjunction with the horizon, this sets an effective limit on the final phase of concrescence and the creative phase into emergent space. From a Whiteheadian perspective, the maximum of coarse graining at the horizon can be seen as providing a natural limit upon the extent of the creative phases of prehension and concrescence per a given AE cycle. The emergent identity of the synthesized feelings reaches maximal intensity and culminates in a satisfaction. “With the attainment of the satisfaction, the creative urge which has driven the process of concrescence from phase to phase is exhausted or contented so that the internal process terminates” (PR 335; AI 248). Compare this with the logic of the maximization of coarse graining and foliation on the equipotential surface associated with a real value on the horizon, as we saw in the last section. As Christian explains, “No phase of internal change succeeds the feeling of satisfaction. It is the final outcome of the concrescence” (Christian, 1959). This compares to the maximization of coarse-graining and foliation, where no further transformation takes place once the coarse-graining ratio reaches one on the horizon.

The satisfaction contains the whole of the temporal duration of the occasion. Feelings create a nexus reflecting the gravitational self-energy of that matrix; the satisfaction is the final nexus. In EG it is the set of values selected in the matrix. In AE’s it is the special set of positively prehended values (qua gravitational self-energy) that synthetically combine over the phases of concrescence until forming into one, final value. As Verlinde explains:

Consider a microscopic state, which after coarse graining corresponds to a given mass distribution in space. All microscopic states that lead to the same mass distribution belong to the same macroscopic state. The entropy for each of these states is defined as the number of microscopic states that owe to the same macroscopic state. (Verlinde, 2010) 65
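
For completeness, the standard statistical-mechanical definition behind this counting (in which the entropy is, strictly, the logarithm of the microstate count) reads:

```latex
S(\text{macrostate}) \;=\; k_B \ln \Omega(\text{macrostate}),
\qquad
\Omega \;=\; \#\{\text{microstates that coarse-grain to the given mass distribution}\}
```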

Similarly, Whitehead describes how the satisfaction “provides the individual element in the composition of the actual occasion” (PR 129). Here, the macroscopic state represents the actual occasion, which is the satisfaction. The microscopic state represents the objective data that, after coarse-graining, correspond to a given mass distribution of positive prehensions in (Coulomb) space. Gravity is clarified as an emergent force arising from changes in the matter distributions of matrices, leading to an entropic force qua adiabatic reaction force (see Verlinde, 2011).

In our geomodal model developed here, the final product (or actual occasion of the phases of concrescence) forms a pearl-like ‘satisfaction.’ The macroscopic state is like the satisfaction. The egg-like oval is like an oyster shell whose aim it is to grow a pearl. In string theory, following Verlinde, we consider the egg as the Higgs branch and the pearl as the Coulomb branch. Objective data that acquire an expectation value in Higgs phase space are lifted into the Coulomb phase space for the phases of concrescence. This foliates until forming a pearl. The pearl is the satisfaction.

64 “An example of an acceleration surface is provided by a t = constant slice of a time-like hypersurface r = constant of the Schwarzschild spacetime equipped with the Schwarzschild coordinates r and t. Indeed, each point of such a slice accelerates with the same proper acceleration. In the Newtonian limit this gives the acceleration of particles in a free fall in the gravitational field created by a point-like mass” (Mäkelä, 2010). 65 “This phenomenon is clearly entropic in nature, and the consequence of a statistical process driving the system to a state of maximal entropy” (Verlinde, 2010).

6.6 — SUMMARY

The purpose of these last two chapters has been to provide a comparative analysis of Whitehead’s and Verlinde’s models. The goal was to reconcile their dynamics within a conceptual framework grounded ultimately in processes and events. Within this, spacetime is also derived from the same ultimate source (an initial process).

We assumed at the outset that Whitehead’s and Verlinde’s models could be interrelated in a productive manner; that is, if they could be integrated, we would have a philosophical grounding for emergent gravity as well as a physical and gravitational grounding for actual entities. As we discovered in comparing the two models, a derivation of EG from string theory allows us to provide a fuller and more adequate philosophical account of EG in terms of Whitehead’s AE’s: it provides a gravitational signature within the AE’s themselves, with a remarkable role for negative prehensions underwriting the emergent gravity model as an adiabatic reaction force (see Verlinde, 2011). As a physical signature, the initial conditions of concrescence, as ‘a multiplicity of initial data,’ are correlated into a string-theoretic account bearing the same ontological role in an initial process underlying spacetime, gravity, and string theory.

The phonon (as a closed string) serves a two-fold purpose: for one, it generates an emergent quale with immediate physiological, affective, and other subtle effects. Secondly, in a rather elegant, organic feature of the dynamics, the phonon—as synthetic composite of the initial snapshot data (open strings) into one long-wavelength value—serves as the ‘coupling constant’ and measure of each open/closed-string holographic transform into ‘free quarks’ in a ‘chiral bag’ on the ‘QCD horizon’. 66 Closed strings—transformed through dimensional analysis into free quarks—do not burst but instead persist inertially under conditions of asymptotic freedom and prêt-à-prehension. As such the composite phonon created by the ingression of the left-hand copy of initial data becomes the measure against each individual transform-value as it appears in the right-hand copy and is selected for either a positive or negative role.

After the projection of initial data into objective data (AdS/QCD) the initial snapshot undergoes spontaneous ingression into a phonon. The phonon preserves form after resounding by serving as the “coupling constant” in the bag of dual elements for prehension. During this phase, each element is individually measured against “the coupling” to determine its fate as either “positively” or “negatively” valued. For rigor we would draw a topological ‘trace’ between each element in the diamond to one in the circle.

66 As Eastman notes in a personal correspondence, experimental results concerning quarks always point to “bound” quarks, and yet Verlinde’s model requires “free” quarks for the initial open/closed-string phase.

Through this it is shown that coarse-graining and prehension can be correlated, as can foliation and concrescence; for a method, we pursue Verlinde’s integration-out of certain open strings in a matrix model. This is also the option nearest in dynamic similarity to Whitehead’s program, where the phases of concrescence → satisfaction, after initial fine-tuning (selection) by an “original datum” during the phase-operation of prehension.

An “enclosed region on horizon” (Verlinde, 2010) of pre-emerged space is hypothesized to occur in a chiral bag of quark (QCD) matter, where the bag interior has the property of asymptotic freedom. Using asymptotic freedom, a physical mechanism is modeled according to Whitehead’s dynamics of prehension, showing how two values are said to prehend into each other. This creates zero-point energies as “feelings” describing further phases of concrescence → final phase of satisfaction.

Furthermore, Whitehead’s account adds theoretical insight into what becomes of the actual dynamics of values in the bag, held as inessential in a statistical mechanics regime; but as Jibu and Yasue note, in living systems the statistical mechanics paradigm breaks down and we must replace it with QFT (2011). As such, Whitehead’s descriptions of prehension and concrescence can also offer a physical/philosophical construct for the QFT-based description of living-systems dynamics. This provides a platform for dynamics otherwise valued statistically per equipotential surfaces. Specifically, “prehension” adds detail to the conceptual narrative on how positive elements are selected in the coarse-graining procedure and synthesized through the phases of concrescence qua layers of foliation.

Implementing the holographic principle we recover the objective datum of Whitehead’s Gifford lectures (1927) as the primary synthesis of the initial snapshot of data. This datum sets off the phases of concrescence via prehension of the dually-represented chiral bag on a conformal horizon. This overcomes the problem of how to still have something left to prehend if the initial multiplicities synthesize to form the original datum: namely, the primary datum prehends the holographic dual of the initial data, as a chiral bag of free quarks, or what Whitehead calls the ‘objective data’ eligible for prehension (PR 24, 66). This is shown to provide a concise and elegant resolution to the data/datum dilemma, and both of Whitehead’s proposed starting-points for the phases of concrescence are mutually acknowledged by virtue of the holographic principle. In a small sense this is said to have time-locked the full description of his theory.

In another sense, Whitehead also struggled to come to terms with the concept of coarse-graining under the guise of “negative prehensions.” He realized that some values get integrated out, but he could not say what their purpose was; this led Whitehead both to suggest a deeper role for negative prehension and to (sort of) sweep it under the carpet. Verlinde’s EG finds just such a reason “for nature to keep track of this information” (2010), accounting for the gravitational effect qua negative prehension, which is shown to be kept track of by nature and to correspond with Newton’s gravitational potential ɸ (2010). As such, negative prehension gains a purpose and function not realized in Whitehead’s account of AE’s, through a correlational identity and association with ɸ, and is linked to the gravitational effect as an adiabatic reaction force in Verlinde (2010, 2011).

In our approach we develop the selection criteria of prehension into a physical capacity by relating each objective datum to the phonon (little Higgs as coupling) with either a positive or negative outcome. Chiral symmetry breaking could be considered as the distinction of state-evolution dynamics of contents between the two snapshots: the initial snapshot and its holographic dual. Where the contents of the initial snapshot ingress into a phonon, to serve then as the coupling (like a little Higgs boson), the dual of the snapshot takes on a different role, where each individual value is compared with the coupling (as product of the first, left-hand copy) and given either a positive or negative outcome.

From this we consider a theory of emergent gravity hidden in Whitehead’s AE’s all along, as two elements representative of the same dynamical process: the maximization of coarse graining → final phase of concrescence and satisfaction; plus a gravitational back-reaction (adiabatic), as an acceleration effect due to entropic dynamics of information linked to the collection of removed, negative prehensions kept track of by Newton’s potential, ɸ. To get here, a renormalization and coarse-graining/foliation procedure (prehension and concrescence) must proceed. Out of this maximally coarse-grained product, gravity emerges as an adiabatic reaction force due to information-changes in entropy distributions (Verlinde, 2011).

6.7 — DICTIONARY

Pictorial Events | Whitehead | Verlinde
Sea of Strands | Continuous Potentialities; EO’s; dG2 | Microscopic Data
Snapshot of Strings | Initial Multiplicity of data; event; duration | Dn-brane of Open Strings; large-N QCD (1/N YMT)
Phonon | Original/Primary Datum | Emergent Graviton; Closed String; Coupling
Holographic Dual | Objective data of Initial data; Reenactment | AdS/CFT(QCD); open/closed string; gauge/gravity
Selection | Prehension: positive and negative | Coarse Graining (+/-)
Combination | Concrescence of feelings; genetic phases | Foliation into emergent dimension of space
Collective Synthesis | Satisfaction | Max coarse graining → Φ; gravitational self-energy

Emergent Gravity/Matrix Theory | Actual Entities
Integration-out of off-diagonal strings in matrix | Negative Prehensions
Expectation value acquired by off-diagonal string in matrix | Positive Prehension
Coarse-Graining into Emergent Foliation Space | Phases of Concrescence in Genetic Time
Maximum of Coarse-Graining; Box → Horizon | Final Phase of Concrescence → Satisfaction
Box Thermalizes onto Horizon | Satisfaction becomes a Final, Real Value-Attainment
Off-Diagonal Degrees of Freedom in phase space | Objective Data Prêt-à-Prehension
Gravitational Self-Energy (Positive-valued) | Positive Prehension (Feelings → Satisfaction)


Chapter 7 — Discussion

The goal of this chapter is to introduce Whitehead’s alternative theory of relativity along with a set of philosophical and conceptual distinctions between his and Einstein’s theories of relativity, and to evaluate them in light of Verlinde’s emergent gravity. This is predicated on the basis that neither Einstein’s nor Whitehead’s principle of relativity represents an ultimate theory of gravity, but that, of the two, Whitehead’s demonstrates the greater affinity to the emergent narrative propounded recently by Verlinde and others, reaching back to Sakharov’s initial model (1967).

7.1 — The Principle of Relativity

“The doctrine of relativity affects every branch of natural science, not excluding the biological sciences.” (PRel, 1922)

The nascent years of the twentieth century were marked by the paradigmatic shift of the scientific and public worldview to that of relativity, via Einstein and (the select influence of) Minkowski. As Whitehead explains, “Relativity, in the form of a novel formula relating time and space, first developed in connection with electromagnetism, including light phenomena. Einstein proceeded to show its bearing on the formulae for gravitation” (PRel 12). Indeed, Whitehead was also thinking about relativity while forming his categoreal scheme of the actual entities, even seeking out Einstein’s counsel on one occasion (see Desmet, 2007, 2010). 67 As Desmet explains, “in Whitehead’s interpretation of the physics of relativity, the primary meaning of the word ‘relativity’ is ‘relatedness,’ and the theory of relativity is above all a theory of the interrelatedness of events” (2007). 68 We take events as basic predicates.

In 1922 Whitehead wrote The Principle of Relativity with Applications to Physical Science “with the aim of reformulating Einstein’s theory of gravity in such a way that gravity would no longer be identified with the alleged, variably curved space-time, but with a physical interaction (Whitehead’s gravitational impetus) that can be defined against the uniform background of Minkowski’s space-time ” (Desmet, 2007). This begins with the invocation:

The present work is an exposition of an alternative rendering of the theory of relativity. It takes its rise from that “awakening from dogmatic slumber”—to use Kant’s phrase—which we owe to Einstein and Minkowski. But it is not an attempt to expound either Einstein’s earlier or his later theory. The metrical formulae finally arrived at are those of the earlier theory, but the meanings ascribed to the algebraic symbols are entirely different. (PRel, v)

67 Whitehead credits Minkowski for much of his own model. As he states: “a tribute should be paid to the genius of Minkowski. It was he who stated in its full generality the conception of a four-dimensional world embracing space and time, in which the ultimate elements, or points, are the infinitesimal occurrences in the life of each particle. He built on Einstein’s foundations and his work forms an essential factor in the evolution of relativistic theory” (Whitehead, February 12, 1920; qtd. in Johnson, 1961). 68 The first of Whitehead’s writings explicitly discussing the mathematical physics of relativity actually appears in “Space, Time, and Relativity,” a paper read to the Manchester Meeting of the British Association for the Advancement of Science in 1915, and later, before the London Aristotelian Society (Desmet, 2007).

Whitehead readily acknowledged the genius of Einstein’s formulation, but not without cautioning against carte blanche acceptance before due diligence is paid.

I think that no one can study the evidence in its detail without becoming convinced that we are in the presence of one of the most profound reorganizations of scientific and philosophic thought. But so many considerations are raised, so diverse the character, that we are not justified in accepting blindfold the formulation of principles which guided Einstein to his formulae. (PRel 67)

Whitehead’s theory “ holds a different paradigm from Einstein's, elegant and simple in mathematical formulation with its own philosophical background. It has been called as ‘a thorn in Einstein's side’, because it agrees with Einstein in its prediction for all the classical tests ” (Tanaka, 1987). Indeed, Fowler's famous interpretation of Whitehead's theory makes it an alternate, mathematically equivalent, presentation of GR. As he states: “both theories reduce to the Schwarzschild metric, and therefore are equivalent in their predications of the four classical tests of relativity ” (Fowler, 1975; p.59). As Tanaka and Eastman explain:

It is a well-known fact since Eddington (1924) that Whitehead’s gravitational theory gives equivalent results to Einstein’s concerning the one-body problem. As the corresponding solution of general relativity is given as Schwarzschild’s space-time, Whitehead’s theory of gravitation can pass all classical tests of general relativity (Tanaka, 1987).

In 1924, Eddington pointed out a remarkable formal equivalence between Whitehead’s theory and Einstein’s general relativity: for the simple case of the gravitational field due to a single particle at rest Whitehead’s theory leads to a metric which is algebraically equivalent to the Schwarzschild solution of Einstein’s field equations (Eddington, 1924). […] The implication of this equivalence was that the predictive power of Einstein’s and Whitehead’s theories would be identical with regard to standard tests of Einstein’s theory. […] In addition, Whitehead’s theory is shown to predict delays in radar ranging, the perihelion precession of Mercury, and the gravitational red-shift, when electromagnetism was included in the theory in a natural way (Synge, 1951). Most other theories of gravity except Einstein’s general theory of relativity fail these standards tests. (Eastman, 2009)
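For orientation, the one-body solution to which both theories reduce can be written out; this is the standard Schwarzschild line element in conventional notation (the units and symbols here are our own choices for illustration, not Whitehead’s or Eddington’s):

\[
ds^2 = \left(1 - \frac{2GM}{c^2 r}\right) c^2\,dt^2 \;-\; \left(1 - \frac{2GM}{c^2 r}\right)^{-1} dr^2 \;-\; r^2\left(d\theta^2 + \sin^2\theta\, d\varphi^2\right).
\]

Because the classical tests (perihelion precession, light deflection, red-shift, radar delay) are computed from this one metric, the observational equivalence of the two theories for the one-body problem follows directly.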

Whitehead develops this under the notion of the universal principle of relativity wherein “the geometric structure of nature grows out of the relations among actual occasions ” (Fowler, 1975). Lango follows: “ the all-pervasiveness of gravitational fields not only resembles but is also explained by (and therefore is evidence to) the universal relativity of actual entities ” (Lango, 1972, p.20). Tanaka expresses similar beliefs:

The principle of universal relativity does not remain a physical principle, but it is generalized as the most universal metaphysical principle in the philosophy of organism. The principle of relativity in Whitehead’s metaphysics is not, as Einstein’s, a physical principle concerning the choice of spatiotemporal coordinates system, but the ontological principle of metaphysics which stipulates essential relatedness of actual entities with the distinction between actuality and potentiality, subjectivity and objectivity: actuality as subjectivity implies becoming, and being as objectivity implies mere potentiality. (Tanaka, 1975)

From these quotes we recognize the central role of relativity in a philosophy of organism.


7.2 — Philosophical Distinctions Between Einstein and Whitehead

To expect to reorganize our ideas of time, space, and measurement without some discussion which must be ranked as philosophical 69 is to neglect the teaching of history and the inherent probabilities of the subject. On the other hand, no reorganization of these ideas can command confidence unless it supplies science with added power in the analysis of phenomena. (PRel 4)

The major distinctions between Einstein and Whitehead’s models aren’t physical but philosophical and conceptual; as Eastman explains, Whitehead offers a “theory of gravity which differed significantly, both in its philosophy of nature and in its mathematical construction, from the general theory of relativity given by Albert Einstein in 1915” (2009). Fowler furthered this point, determining that “no empirical test can decide the issue of the adequacy of Whitehead's basic theory of relativity. This issue must be settled on other grounds” (Fowler, 1974). As Tanaka points out, “the physical content of Einstein's theory can be deduced without relying on Einstein's principles” (1987). Tanaka also notes how the principle of relativity “plays the central role not only in [Whitehead’s] physics, but also in his metaphysics. The physical principle of relativity is generalized to the metaphysical one. The more we understand his metaphysics, the more we comprehend his physics” (1987).

Here we’ll consider Whitehead’s alternative to Einstein’s relativity on the basis of philosophical distinctions. We’ll offer evidence that the differences between the two show Whitehead’s theory of relativity to be more characteristic of Verlinde’s EG than Einstein’s, even though both Whitehead’s and Einstein’s models are subject to reformation in Verlinde’s account. In the process, Whitehead’s principle of relativity is shown to house Verlinde’s EG within an AE cycle. In addition, Whitehead offers an entire process philosophy to accompany his model, whereas Einstein’s proves less developed.

The difference between Einstein’s and Whitehead’s approaches to science may be characterized as a difference between a deductivist approach by Einstein and an inductivist approach by Whitehead. For an added discussion of induction in Whitehead see Plamondon, “Metaphysics and Valid Inductions” (1973). The central idea in Einstein's epistemology is the distinction between freely-created concepts and sense experience; however, as Fowler explains, “ mathematics can be devised without any clear correlation with sense-experience ” (1975, p.44). Indeed, this is the case with Riemann’s n-dimensional geometry. As Schild explains:

Einstein’s general theory of relativity is a field theory in a Riemannian spacetime and the gravitational potentials gμν are intimately linked with local space and time measurements. As a direct consequence of this, the two theories differ slightly in their prediction of a gravitational red shift. On the other hand, both theories give the same effects for the advance of perihelion of

69 Whitehead qualifies the use of the term, ‘philosophical’ for these purposes: “it has nothing to do with ethics or theology or aesthetics. It is solely engaged in determining the most general conceptions which apply to things observed by the senses. Accordingly it is not even metaphysics: it should be called pan-physics. Its task is to formulate those principles of science which are employed equally in every branch of natural science” (PRel 4ff.).

planetary orbits and for the deflection of a light ray passing near a heavy sun at rest. Since gravitational red-shift measurements are barely within the limit of experimental accuracy, both theories are in good agreement with the meagre experimental facts known today. (Schild, 1956)

Einstein himself admits this when he says that the initial hypotheses of his theory of relativity “ become steadily more abstract and remote from experience ” (Einstein, Ideas, p.275). “Since Einstein’s law of gravitation does not incorporate electromagnetic fields, and since gravitation is reduced to geometry, the metaphysical understanding of Einstein’s GTR requires consideration only of matter and geometry ” (Fowler, 1975, p.53). In his defense, however, “Einstein’s genius in GTR is his ability to correlate abstract mathematical concepts with the physical world. As his thought matured he was taken farther and farther away from experience and nearer to the abstract nature of the fundamental postulates and principles of physics ” (Fowler, 1975, p.44).

Einstein’s Special Theory of Relativity is based on two foundational principles: 1) the constancy of the velocity of light in a vacuum in all inertial frames of reference, independent of the velocity of the emitting source; and 2) the independence of the laws of physics from the choice of the inertial system (see Fowler, 1975). The foundational principles of Whitehead’s theory of relativity, by comparison, are marked by: 1) an appeal to direct experience; and 2) recognition of the observable properties of relative motions between alternate time-systems, “demanding an awareness of simultaneity for its basis” (Fowler, 1975).

In Einstein’s GTR the concept of a gravitational force is given up. Gravity is not a force, but a pseudo-force…there is no gravitational force, but only the interaction of geometry and matter. It would be better to say that the metric itself determines motion, and drop the term ‘gravitational force’ completely […] In both the epistemological and metaphysical foundations of [Einstein’s] theory, physical reality is equated with pure mathematics. (Fowler, 1975, p.55-56)

Contrary to this, Whitehead establishes the scenario where space-time doesn’t depend on local, physical values; otherwise, in order to know any part of the world you’d have to know the whole world. To get here we must first undo Einstein’s simplification of two tensors into one, untangling them back into their original constituents: one for gravity and the other for geometry. As Whitehead explains: “It is inherent in my theory to maintain the old division between physics and geometry. Physics is the science of the contingent relations of nature and geometry expresses its uniform relatedness” (PRel, xv). Desmet clarifies:

For Whitehead, Einstein’s theory of relativity represents an opportunity to transform classical physics into a new science: the science of the internal togetherness of natural entities. […] In the physics Whitehead aimed at, the geometry of space-time is an expression of the uniform, internal relatedness that binds together all elementary events in one world, and the laws of physics express contingent, internal relations that bind together elementary events in social networks within the worldwide network of all elementary events. (Desmet, 2011)

Given that the differences are essentially philosophical, the conceptual models differ between approaches. Whitehead’s theory of gravitation comes complete with a comprehensive philosophy of nature. Einstein’s, however, offers little that approaches experience.


Next we will identify and discuss six properties compared between both Whitehead and Einstein’s models of relativity and relatedness. These include: 1) Experience; 2) Two Metrics, not One; 3) Space and Time; 4) Measurement; 5) Uniformity; and 6) Simultaneity. From here we will then consider how Whitehead’s treatment of these properties draws it closer in semblance with Verlinde’s emergent model of gravity.

7.2.1 — Experience

"Experience" is the watchword of Whitehead’s approach to relativity theory. His description of simultaneity, his doctrine of the uniformity of nature, his doctrine of alternate time-systems, and his principle of kinematic symmetry all exemplify his appeal to our direct, immediate experience of nature. Experience is also the central method of his approach to problems in metaphysics. (Fowler, 1975)

Whitehead distinguishes his position from Einstein’s in holding that “scientific concepts are abstractions from the world, and not the reality itself; therefore, he opposes the equivocation of mathematics and physical reality” (Fowler, 1975, p.63). Following his method of abstraction, “Whitehead’s theory of relativity is guided by the question: Is relativity theory consonant with our experience of nature?” (Fowler, 1975, p.63). Whitehead “proceeds by examining our immediate experience in the world” (ibid, p.60). In the preface to PNK, Whitehead “stresses that the modern theory of relativity, because of its union of space and time, has opened the possibility of a new answer to the question of how the space of physical geometry can be conceived as the logical outcome of generalisations from experience” (Desmet, 2007). This point is crucial to grasping the crux of Whitehead’s process-ontology and experiential metaphysics.

To these ends, “ Whitehead’s theory of relativity involves two steps: 1) an interpretation of our immediate experience of the world, and 2) the formulation of the laws of physics as abstractions from this immediate experience ” (Fowler, 1975; p.60). We show that the laws of physics are abstracted from out of the principle process defined in an event-logic. Whitehead’s AE’s, suitably rendered, describe this same process.

The first step involves formalization of something like Kant’s “space and time of the mind.” Lockwood defines Kant’s space and time of mind more exactly in his initial inversion of Minkowski’s spacetime lightcone (1999). An event-logic grows out of Lockwood’s insight as a generalization into space and time of consciousness. From this we next specify how our ontologically-immediate experience of the world takes place one level below experience and prior to conscious, sensory awareness. This describes the snapshot mechanism that Whitehead speaks to as durational events.

Our datum is the actual world, including ourselves; and this actual world spreads itself for observation in the guise of the topic of our immediate experience. The elucidation of immediate experience is the sole justification for any thought; and the starting point for thought is the analytic observation of components of this experience. (PR 6)

The clarification we offer is that “immediate experience” occurs and configures one level below (or before) human sensory-awareness. This takes it out of the mode of sensory-awareness and precisely into the domain of a principle process underwriting experience


(plus science and nature) qua “initial conditions.” The initial conditions are represented by a snapshot of strands appearing as strings.

By contrast, the key foundational principles of Einstein’s theory—the constancy of the velocity of light and the equivalence principle—“are postulates which are the free creations of the mind and not open to immediate experience” (Fowler, 1976). As he explains, the basic creed of Einstein’s epistemology revolves around “the free invention of concepts which are correlated with sense-experience. Einstein bifurcates nature into the conceptual, which is freely constructed by thoughts, and the empirical. There is no logical connection between the two realms” (1976). This renders Einstein’s mental-construct model out of touch with cellular and conscious facets of experience, thereby also making it less amenable to modern physics.

In Principles of Theoretical Physics, Einstein offers that the free invention of concepts leads to the formulation of simple postulates upon which the conclusions of the theory are based. As Fowler explains, “The postulate of the constancy of the velocity of light in the STR and the equivalence principle in the GTR are excellent examples of postulates. These are arrived at intuitively and then their implications are sought out in the empirical world” (1975). Fowler concludes that these fundamental postulates are thus heuristic given “they are not a result of direct experiences, since no gravitation-free space, or empty-space exists” (Fowler, 1975. p.31). Einstein’s notion of freely-formed concepts can be seen as a reaction to Minkowski’s development of imaginary time in four dimensions.

I see on the one side the totality of sense experiences, and, on the other, the totality of the concepts and propositions which are laid in books. […] the concepts and propositions get meaning viz. content only through their connection with sense-experiences. The connection of the latter with the former is purely intuitive, not itself of a logical nature…the system of concepts is a creation of man together with the rules of syntax, which constitute the structure of the conceptual system. (Einstein, 1934; p.13)

Minkowski’s influence caused Einstein to convert his theory from a physical one to a purely geometric one—an often overlooked detail. While in his earliest paper Einstein held to a sensationalist view, by 1908 he had sided with Minkowski’s choice of a geometrized space-time. Fowler adds that Einstein worked to formulate a well-developed epistemology but never developed a corresponding metaphysics. As Einstein explains:

I am well aware that no causality exists in relation to the observable; I consider this realization to be conclusive. But in my opinion one should not conclude from this that the theory, too, has to be based on fundamental laws of statistics (Einstein, 1934; p.163).

Fowler states, “ the role of the scientists is envisioned as one of intervention between the multiplicity of sense-experiences and the dynamics of constructive thought ” (1975). This proves to be profound for an event-logic, described as a modal predication for the dynamics of constructive thought. The initial difficulty is in the fact that Einstein sees no connection between these two realms, but taken in an emergent paradigm we might resolve this as


an early sign of the statistical emergence of the gravitational force.

This also provides an answer to Fowler’s question: “if the conceptual realm which is freely created is not logically connected with the empirical realm, how are the concepts related to the external world?” (1975). This is a profound question that cuts to the heart of metaphysics and philosophy qua the relation of Plato’s forms to reality and the relationship between Kant’s ‘noumenal’ and ‘phenomenal’ modes at the same time. It seems Einstein is asking the same questions as Kant and Plato: as Fowler explains, “Einstein recognized this problem, but was convinced that there was some ultimate way in which the world was structured” (Fowler, 1975, p.30). He continues, “Einstein was convinced that there was an objective world which could be discovered by a scientist.” Einstein himself is quoted in an essay “On the Method of Theoretical Physics” (1934) asking:

If it is true that the axiomatic basis of theoretical physics cannot be extracted from experience but must be freely invented, can we ever hope to find the right way? Nay, more, has this right way any existence outside our illusions? Can we hope to be guided safely by experience at all when there exist theories (such as classical mechanics) which to a large extent do justice to experience, without getting to the root of the matter? I answer without hesitation that there is, in my opinion, a right way, and that we are capable of finding it.

While Einstein maintains there is no logical path from sense-experience to concepts, he still believes concepts are derived in some way from the real world. As Kiley explains:

Einstein insists on sensible beginnings for scientific investigations. There is no logical path, he contends, from sensible things to the first concepts and axioms of a system of scientific deductive thought, and experiences can do no more than suggest, but the origins are firmly rooted to the real world of sensible realities. There is no stronger credo in the whole of Einsteinian epistemology. (Kiley, 1961; p.xiii)

By contrast, Whitehead’s theory is predicated on the method of abstraction, wherein mathematical expressions and scientific objects are derived from the “elucidation of our immediate experience of the world” (PR 30). To these ends, in this study and its subsequent event-logic, we provide a reinterpretation of our ‘immediate experience of the world’ to reflect a pre-conscious operation.

7.2.2 — Two Metrics, Not One

It is inherent in my theory to maintain the old division between physics and geometry. Physics is the science of the contingent relations of nature and geometry expresses its uniform relatedness. (PRel, vff.)

If gravity is emergent, so is space time geometry. Einstein tied these two concepts together, and both have to be given up if we want to understand one or the other at a more fundamental level. (Verlinde, 2010)

Einstein is often quoted as saying “ we should take things to be as simple as possible, but no simpler ” (1934). As it turns out, we might find his (own) model subject to critique by this

very same decree. In the choice of gravitational metrics, where Einstein chose only one, compounding geometry and physics, Whitehead, like Minkowski, held fast to the conventional notion of two separate metrics, one for geometry and one for physics. By the time we acquire the holographic principle (at the end of the 20th century) we see that to take only one metric instead of maintaining two is to make things simpler than necessary. 70

Contrary to Einstein’s analysis, Whitehead’s metric contains both a non-dynamical flat background metric by which measurement is interpreted unambiguously and a physical term in which the gravitational action is evaluated along the null cone. (Eastman, 2009)

The differences between Whitehead’s theory and Einstein’s theory have been examined by Palter (1960) and Llewellyn (1973). Both authors approach Whitehead’s theory from its mathematical nature, focusing particularly on its uniform metric structure. A mathematically equivalent expression is used by Einstein, except the metric dG² is defined as ds². Although mathematically equivalent, the metrics dG² and ds² embody different explanatory content. As Fowler explains:

For Einstein, ds² is associated with the "proper time" of a particle, describing geodesics, and consequently reflecting his theory of measurement based on the contingent facts of nature. For Whitehead, dG² represents the uniform structure of a background Minkowski spacetime which describes the congruence properties between alternate time-systems. Thus, for Einstein, the metric embodies a physical content, while, for Whitehead, it embodies a geometrical content which is independent of the contingent physical world. (Fowler, 1975)

Unlike Einstein’s theory, “ Whitehead’s law of gravitation is expressed within the framework of the uniformity of Minkowski space-time, and it’s called a two-metric theory of gravitation ” (Fowler, 1975, p.58). For Whitehead, gravity isn’t an effect of geometry but rather an expression of real, causal relationships in the physically contingent world. The formula Whitehead adopted for the gravitational field involves both the flat metric of Minkowski spacetime and a dynamical metric dependent on the presence of source masses.

The metric (dG²) represents the uniform structure of geometric, Minkowski spacetime (see Bain, 1998) and is therefore identical in predictive power to Einstein’s STR. To find a mathematical expression for the law of gravity, Whitehead introduces a second metric expressing physical content. The metric, dJ², represents the gravitational field of a particle and describes the way a particle “pervades its future.” As Fowler explains, “Whitehead’s law of gravitation requires both metrics for its expression” (1975; p.59). This “two-metric” foundation “rests on the division of geometry and physics [and] represents a third alternative to understanding relativity physics” (ibid, p.61). This predicts the same results as does Einstein, but by preserving the notion of two distinct (though not-wholly unconnected) metrics, Whitehead’s model proves more closely related to Newton’s law of

70 This intuition is verified ex post facto, after we came into possession of the holographic principle, AdS/CFT, and emergent gravity hypotheses at the end of the 20th century and ten years into the 21st. gravitation and Maxwell’s electromagnetic phenomena; thus, compared to Einstein’s move, Whitehead’s approach is seen as a conservative one (see Eastman, 2010).
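To make the two-metric structure concrete, a schematic sketch may help (following Eddington’s 1924 one-body comparison as summarized in the secondary literature; the notation is ours and is offered tentatively). In spherical coordinates about a single particle at rest, the first metric is the flat Minkowski element and the second adds a retarded gravitational term to it:

\[
dG^2 = c^2\,dt^2 - dr^2 - r^2\left(d\theta^2 + \sin^2\theta\,d\varphi^2\right),
\qquad
dJ^2 = dG^2 - \frac{2Gm}{c^2 r}\,\bigl(c\,dt - dr\bigr)^2 .
\]

Expanding dJ² reproduces the Schwarzschild solution in retarded coordinates, which is the content of Eddington’s equivalence result, while dG² remains untouched by the presence of the mass m: the geometry stays uniform and the gravitational impetus is carried entirely by the second, physical metric.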

Whitehead’s explanation of gravitational forces as real forces maintains the distinction between geometry and physics; as Desmet explains (recent correspondence), Whitehead wants “ to create a gravitational theory that respects the distinction between geometry and physics as all classical theories do, and which is similar to Maxwell's electromagnetic theory ” (2014).

In Einstein’s interpretation of the physics of relativity the bifurcation of classical physics is reconfirmed: nature is split in two worlds; on the one hand, the world of so-called objective science, a postulate of our theoretical thought; and on the other hand, the world of so-called subjective perception and common sense, the basis of our praxis. (Desmet, 2007)

Whitehead “ replaced Einstein’s interpretation with one that is coherent with the presupposition of common sense that the geometry of space-time is uniform, and independent of the physics of gravitation ” (Desmet, 2007).

Whitehead was convinced that geometry should be distinguished from physics. Geometry represents the uniform relatedness of nature, especially of spatiotemporal relations. Physics treats the contingent properties of nature. These convictions were related to his rejection of scientific materialism and of the bifurcation of nature. The theme of physics, according to him, is not the material things themselves cut off from the perceptual data but the perceived phenomena which show themselves "contingently" in the uniform framework of space-time. (Tanaka, 1987)

Within Whitehead’s works, the first metric, defining geometrical relations, is attended to in PNK and CN; in PRel, Whitehead covers the second metric, defining physical relations.

The First Metric

As Fowler explains: “the first metric represents Whitehead’s analysis of the geometrical (spatial) relations existing between events” (1975; p.59). Event-particles, for Whitehead, are identical with Minkowski’s world-points (see Fowler, 1975, p.90). Whitehead correlates this metric with kinematic elements, which he defines as “a stretch of an historical route, as thus employed in proceeding to a limit” (PRel 74). On this basis, kinematic elements aren’t reducible to event-particles, nor are they identical with events, which are represented by the entire historical route. This speaks to the qualification of supersymmetry in chapter 5.

The regulation of future adjectives of appearance by past adjectives of appearance is expressed by this intermediate distribution of character, indicated by the past and indicating the future. I call this intermediate distribution of character the "physical field." (PRel 71)

He provides a link to the “inessential” microscopic data in Verlinde’s model, explaining how the physical field “ expresses the unessential uniformities regulating the contingency of appearance ” (PRel 8). Whitehead describes the physical field as the " interweaving of the individual peculiarities of actual occasions on the background of systematic geometry " (PR 507). The physical field is atomic (PRel 72). In an event-narrative, “unessential uniformities”

refer to the happenstance collection of eternal objects in the causal frame that make up a snapshot: they’re the open strings attached to a D-brane.

The first metric of Whitehead’s theory of relativity concerns the derivation of the Lorentz-Einstein equations, and therefore corresponds with Einstein’s STR. It is recalled that Einstein’s original formulation of STR involved the relative motion of electrically charged bodies until Minkowski persuaded him otherwise in 1908 with his 4d spacetime (see Fowler, 1975; p.60).

Thus, the "individual peculiarities of actual occasions" represent the properties of the physically contingent world (dJ²), while the "background of systematic geometry" represents the first metric of uniform, background spacetime (dG²). The second metric captures the “physical (gravitational and electromagnetic) relations existing between events” (Fowler, 1975; p.59) and describes the “contingent relationships between events in the world,” representing Whitehead’s law of gravitation in the context of foundational physics (Fowler, 1975, p.62).

The Second Metric

In order to find a mathematical expression for the law of gravitation, Whitehead introduces a second metric embodying the physical content. This metric, dJ², represents the gravitational field of a particle and describes the way a particle “pervades its future.”

The first metric of Whitehead’s theory has to do with geometry, which is uniform. The second metric, dJ², deals with the measure of the physical character of the kinematic element which shares in the contingency of nature (see Whitehead, PRel). The physical character of a kinematic element certainly belongs to the second metric, as a snapshot of “continuous potentiality” into “atomic actuality.” Fowler and Whitehead are therefore not wrong to say that the “continuous potentiality” of kinematic elements is found in concert with the uniform geometry of inverted lightcones. These kinematic elements traverse the background field in which the inverted lightcone represents a frame, or manifold within the field. Upon snapshots, the continuous potentialities in frame are frozen into atomic actualities, and these are what represent the physical character of kinematic elements and the starting-basis of the AE process. As Fowler explains:

The inertial physical field modifies this abstract measure of process into the more concrete potential impetus dJ², and full concreteness, so far as it is ascribable to nature, is obtained in the realized impetus dJ². (Fowler, 1975, p.97)

Keeping the two tensors separate, Whitehead allows for Verlinde’s move of then recognizing one of them at a more fundamental level. As Wheeler (et al.) explain, “many authors argue that the first metric, dG², defines a prior geometry. If "prior" merely means that geometry and physics are separate in Whitehead’s theory of relativity, then their interpretation is accurate” (1973; p. 430). Whitehead establishes: "If space-time be a relatedness between objects, it shares in the contingency of objects, and may be expected to acquire a heterogeneity from the contingent character of objects" (PRel 58). As Fowler instructs, “notice that the physical relations between events create physical objects which endure through time” (1975). He goes on:


Whitehead correlates the metric dG² with kinematic elements rather than a background geometry (see PRel 78, 81, and 87). I believe that the distinction between geometrical and physical relations and the correlation of the first metric with the "abstract measures of spatiotemporal process" (PRel 87) indicate that in PRel Whitehead already has made the distinction between physical space and the extensive continuum which characterizes his treatment of these issues in PR. (Fowler, 1975)

The gravitational effect is real, it just doesn’t derive from a fundamental force; instead, it is shown to emerge from an underlying substratum of kinematic elements, eternal objects, adjectives of events, and other factors: what Whitehead calls “the physical field,” labeled by the metric dJ². The physical field can thus be seen as the underlying substratum out of which emerge the values underwriting the gravitational effect. Whitehead defines the physical field as the "interweaving of the individual peculiarities of actual occasions on the background of systematic geometry" (PR 507). The "individual peculiarities of actual occasions" represent the properties of the physical contingent world (dJ²) while the "background of systematic geometry" represents the metric of uniform background spacetime (dG²).

Whitehead insists that space-time geometry should be distinguished from contingent influences of gravitational field. Gravitation would cause physical singularities but not space-time singularities in Whitehead’s theory. On the other hand, Einstein’s theory does not distinguish between the space-time metric and gravitation. (Tanaka, 1987)

Geomodally we represent the first metric and second metrics as follows:

[Geomodal figure: Second Metric, dJ², gravitational; First Metric, dG², qua special uniformity of XT]

The second metric represents a duration (event) where physical field components are simultaneously captured in the manifold in a moment. The hypersurface exhibits the property of uniformity; only the contents are unique. This accords with Whitehead’s understanding of the physical field. The first metric doesn’t come into play in the mainstay of this study, but for background purposes it is recognized as the uniform space and time description per person. This is achieved on the basis of equivalence between the first metric and Einstein’s STR, and the predication of both on Minkowski’s spacetime lightcone. Chiefly, this approach operates on Lockwood’s accord and inverts the lightcone to reveal the space and time of the mind.


7.2.3 — Space and Time

The scientific concepts of space and time are the first outcome of the simplest generalizations from experience, and that they are not to be looked for at the tail end of a welter of differential equations . (PNK, vi)

For Einstein, " space and time are modes in which we think, not conditions in which we live. " As Ushenko explains: “ Einstein rejects the 'idea that the fundamental concepts and postulates of physics were not in the logical sense free inventions of the human mind but could be deduced from experience by ‘abstraction’- that is by logical means ” (Ushenko, p.632-633).

In Riemannian geometry the path of shortest distance between two points is called a geodesic. In Einstein’s "pseudo-force" explanation objects move along geodesics (i.e., straight lines in a curved space-time). Since they follow straight lines (not curved lines) no forces are required; however, this does not explain why the objects move rather than remain at rest. As Tanaka explains:

The mathematical formulation of Whitehead's theory is, as in Einstein's case, supplied with tensor-analysis. But it is to the physical structure of gravitational field that the Riemannian theory of differentiable manifold with variable curvature is applied in Whitehead's theory […] According to Einstein's theory of general relativity, the metric properties are decided completely by matter. Space-time is said to be "warped" by matter: The "curvature" of space-time is variable, and it may be said "flat" only when the gravitational field caused by matter is negligible. (1987)
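To make the "pseudo-force" reading concrete, the standard geodesic equation can be stated (textbook notation, included here only for reference; it is not Whitehead’s or Tanaka’s own formulation):

\[
\frac{d^2 x^{\mu}}{d\tau^2} + \Gamma^{\mu}_{\;\alpha\beta}\,\frac{dx^{\alpha}}{d\tau}\,\frac{dx^{\beta}}{d\tau} = 0,
\qquad
\Gamma^{\mu}_{\;\alpha\beta} = \tfrac{1}{2}\,g^{\mu\nu}\left(\partial_{\alpha} g_{\nu\beta} + \partial_{\beta} g_{\nu\alpha} - \partial_{\nu} g_{\alpha\beta}\right).
\]

No force term appears on the right-hand side: what Newton treated as gravitational force is absorbed into the connection coefficients of the variable metric, and this is precisely the geometrization Whitehead resists.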

Whitehead’s use of the term, “adjectives of events” fits perfectly into the event-logic of a “frozen string of a strand”. What gives the strand its depth are the internal degrees of freedom coded into its oscillating and dynamical value, with a dynamical tail.

Whitehead’s philosophical objections to Einstein’s variable spatiotemporal relations anchor in part on the premise that “ Einstein’s notion of a variable spacetime geometry contingent on the presence of matter (a) confounds theories of measurement, and, more generally (b) is unacceptable within the bounds of a reasonable epistemology ” (Bain, 1998). Whitehead distinguished his principle of relativity such that:

The physical field is merely that character of nature which expresses the relatedness between the apparent adjectives of the past and the apparent adjectives of the future. It therefore shares in the contingency of appearance, and accordingly cannot affect spatiotemporal relations. (PRel 72)

Whereas Einstein’s theory identifies gravitation with space-time curvature, Whitehead’s background space-time is the “flat” Minkowski space-time. Desmet spells it plainly:

Under Will's presentation (inspired by John Lighton Synge's interpretation), Whitehead's theory has the curious feature that electromagnetic waves propagate along null geodesics of the physical spacetime [...] while gravitational waves propagate along null geodesics of a flat background represented by the metric tensor of Minkowski spacetime. This means that curvature isn’t required in Whitehead’s theory. The gravitational potential can be


expressed entirely in terms of waves retarded along the background metric, like the Liénard–Wiechert potential in electromagnetic theory.

Instead, Whitehead develops a causal theory of motion whereby the kinematic elements (elements making up a world-line) do not move. Rather, motion is derived from the transference of common “adjectives” along the path of motion (an historical route or world line). The character of this transference depends on the propagation of gravitational forces.

The special theory of relativity correlates space to time through the Lorentz Transformation, which Einstein deduced from the combination of the special principle and the principle of the constant velocity of light. Whitehead, on the other hand, deduced the same transformation from the weaker principles of kinematics and geometry, i.e. (1) the uniformity and symmetry of space-time, (2) the symmetry and transitiveness of transformation, etc. (Tanaka, 1987)
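The "same transformation" Tanaka refers to is the Lorentz transformation; writing it out in its standard form (our notation, for orientation only) underscores that the formulae are shared while the derivations differ:

\[
x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{v x}{c^{2}}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}.
\]

Einstein reaches these equations from the light postulate; on Tanaka’s reading, Whitehead reaches them from the weaker kinematic requirements of uniformity, symmetry, and transitivity among alternate time-systems.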

Whitehead’s theory is fully consistent with special relativity, using the space-time (or the “Galilean”) metric in its tensor algebra (Eastman, 2009). As Tanaka explains (1987):

Whitehead rejected the very idea of the priority of matter over space-time. As was stated before, matter was considered by him as an "adjective" of events, and it cannot exert any influence on the essential characteristic of space-time, which should be determined only on the level of events. The existence of matter only concerns accidental qualities of space-time.

Gibbons adds that “ from today’s perspective we can recognize the principle difference between Einstein and Whitehead is the latter’s insistence on fixed a priori spatiotemporal relations, which in practice meant the adoption of a fixed, and in particular unobservable, background Minkowski spacetime ” (2006). The fact that Whitehead’s background Minkowski spacetime is unobservable puts it directly into alignment with the representation in this study of “vacuum” fluctuations as microscopic data patently unobservable in consciousness. 71

Considering the contingency involved in the configuration of matter, Whitehead rejected the effect of matter on space-time metric: the very idea that the curvature of space-time is variable should be irrelevant in Whitehead's theory. (Tanaka, 1975)

Whitehead recognizes the properties of space and time as expressing “ the basic uniformity in nature, which is essential for our knowledge of nature as a coherent system ” (PRel, 1922). This position is also reiterated by Tanaka: the fact that “actual occasions do not happen in space-time is the fundamental stand-point of process metaphysics. Space-time is an abstraction from the interrelations of actual occasions, and not the absolute framework in which actual occasions happen ” (1987). Whitehead’s theory describes how “ space-time itself is an abstraction from the concrete relatedness of events; ” as he explains, “ space and time are relations between the material objects implicated in events ” (PRel 58) . In his later work, Whitehead further notes how “ our lowest, most concrete, type of abstractions whereby we express the diversification of fact must be regarded as events, meaning thereby a partial factor of fact which retains process ” (PR 197 ).

71 There seems to be a good evolutionary reason for this: if the underlying physical values were apparent then it would overwhelm sense-data and we’d not have coherent consciousness of the extended world.

This represents a predication of space and time of events as relations. Space and time are abstractions from events, and as such are also shown to be emergent. We encounter this in physics via open strings on a D-brane and large-N gauge theory correspondence via AdS/CFT. Thus, as Dijkgraaf et al. explain, at the start of spacetime, and even before it, these D-branes with open strings are what predicate space and time. The language and ideas resonate clearly. Explaining more in depth, Whitehead continues:

I give the name 'event' to a spatiotemporal happening [...] by this I do not mean a bare portion of space-time. Such a concept is a further abstraction. I mean a part of the becomingness of nature, coloured with all the hues of its content. Thus nature is a becomingness of events in terms of space and time. Thus space and time are abstractions from this structure. (PRel 22)

An event-logic helps bring such an explanation, and its correlation within science, one step closer to coherence and clarity. Specifically, the initial events corresponding to the “diversification of fact” represent “partial factors of fact which retain process.” This same description applies to the “frozen moments of strands” on a derived hypersurface of the manifold. As such they represent the “partial factors of fact” qua partial snapshots (strings) of much more complex and dynamic (strand) values.

This speaks to the emergence of space-time via the pre-existing large-N values in the SYM environment; thus, we move to define large-N groups in SYM as Whiteheadian “events.” Drawing from chapter two we recall that space-time is modeled as emergent from large-N distributions in a super Yang-Mills environment of a black hole horizon. We meld the two into one explanatory mold such that the abstraction of space-time is reinterpreted as a characteristic sign of an emergent nature. In addition, an event is defined as a large-N group in the SYM environment of the near-horizon limit of a black hole, as the pre-space-time elements out of which it is abstracted.
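For readers tracking the physics, the correspondence being invoked can be stated compactly (a standard summary of the AdS/CFT conjecture in our own notation, assumed from chapter two rather than quoted from any source used here): the near-horizon limit of a stack of N D3-branes relates maximally supersymmetric SU(N) Yang-Mills theory at large N to string theory on an anti-de Sitter background,

\[
\mathcal{N}=4\ \mathrm{SU}(N)\ \text{SYM} \;\;\longleftrightarrow\;\; \text{type IIB strings on } AdS_{5}\times S^{5},
\qquad
\lambda \equiv g_{\mathrm{YM}}^{2} N = \frac{R^{4}}{\alpha'^{2}},
\]

with corrections organized in powers of 1/N. In this sense the large-N "events" of the gauge theory are prior to, and generative of, the emergent space-time geometry on the other side of the duality.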

Whitehead, adopting Minkowski's idea that four-dimensional manifold should give the framework of relativity theory, tried to deduce [Minkowski's four-dimensional manifold] from the interrelated structures of events. This procedure was called by him "the method of extensive abstraction", according to which the elements of Minkowski's manifold, event-particles without extension, were mathematically re-constructed from becoming events with spatiotemporal extension. (Tanaka, 1975)

Here we promote the view that space-time exhibits relations between objects in events. Relations between events take place prior to the emergence of space-time (sub-consciously) at the vacuum, Planck scale. As such, space-time emerges from the uniform instants of the physical metric contingencies of data, in Whitehead’s model. Prior to this, relatedness is between events, as strands in vacuum scale and as the dynamical components of quantum fluctuations. As Whitehead explains, relatedness between events has “the character of a systematic uniform relatedness between events which is independent of the contingent adjectives of events. In this case we must reject Einstein’s view of a heterogeneity from the contingent character of objects” (PRel 58).


We can take this to indicate the role of statistical mechanics in emergent systems qua Verlinde’s resolution that the microscopic details of the theory don’t matter for gravity itself, as an effective property of entropy and inertia. Whitehead’s above quote also indicates the effective nature of gravity insofar as he defines the microscopic details as “independent of the contingent adjectives of events.” We can therefore see a strong affinity with the representation of emergent space-time in modern physics and especially Verlinde’s model of emergent gravity and string theory.

The divergence between the two points of view as to space-time, that is to say, as to whether it exhibits relations between events or relations between objects in events is really of the utmost importance in the stage of physical science. If it be a relatedness between events, it has the character of a systematic uniform relatedness between events which is independent of the contingent adjectives of events. In this case we must reject Einstein’s view of a heterogeneity from the contingent character of objects. (PRel 58)

In this study we consider snapshots to signify Whitehead’s ‘relatedness between events’ as a “systematic” and “uniform relatedness,” “independent of the contingent adjectives” (or virtual objects) of these snapshots. Given what we now understand about pre-space-time models qua large-N gauge theories and the 1/N expansion in YMT (from chapter two), we are able to model these snapshots of ‘adjectives’ (open strings) as the underlying basis and initial conditions from which space-time conforms into value as an emergent abstraction out of uniform snapshots with differential adjectives. To these ends Whitehead describes:

The constitutive character of nature is expressed by the ‘contingency of appearance’ and the ‘uniform significance of events.’ These laws express characters of nature disclosed respectively in cognizance by adjective and cognizance by relatedness. This doctrine leads to the rejection of Einstein’s interpretation of his formulae, as expressing a causal heterogeneity of spatiotemporal warping, dependent upon contingent adjectives. (PRel 64 ff. 65)

Whitehead’s metaphysical anchoring of gravitation states that the transmission of gravitational forces is grounded on the transmission of prehensions from actual entity to actual entity (PR 258). Here, the language “from actual entity to actual entity” can be linked to Verlinde’s description of one-loop amplitude channels of open strings between two D-branes. We give Whitehead the last two quotes:

I call this intermediate distribution of character the “physical field” (PRel 71 ff.) […] The physical field expresses the unessential uniformities regulating the contingency of appearance. In a fuller consideration of experience they may exhibit themselves as essential; but if we limit ourselves to nature there is no essential reason for the particular nexus of appearance. Thus times and spaces are uniform. (PRel 8)

We can only know that distant events are spatiotemporally connected with the events immediately perceived by knowing what the relations are. In other words, these relations must possess a systematic uniformity in order that we may know of nature as extending beyond isolated cases subjected to the direct examination of individual perception. I will refer to this fact as the uniform significance of events. (PRel, 64ff.)


7.2.4 — Uniformity

We can discern in nature a ground of uniformity, of which the more far-reaching example is the uniformity of space-time and the more limited example is what is usually known under the title, the Uniformity of Nature. (PRel 14)

The properties of time and space express the basis of uniformity in nature which is essential for our knowledge of nature as a coherent system. (PRel 8)

As we saw in the last section, Einstein explicitly interprets gravity as identifiable with the variable curvature of the structure of space-time, whereas Whitehead shows that measurement tacitly presupposes a structure of space-time that is uniform and independent of the structure of the gravitational field.

“The basic theme of Whitehead’s approach is recognition that for knowledge to be possible, nature must disclose a basis of uniformity ” (PRel 8). The geomodal model provides a uniform heuristic-basis for agency from a first-person perspective predicated on Lockwood’s inversion of Minkowski’s 4dXT lightcone into XT of mental events (see Jammer, 1999). As Fowler explains, “ through an examination of our experience Whitehead deduces that there is a uniformity of spatiotemporal relations ” (1975, p.61). Here, the uniformity of XT relations is on account of a uniform construct (manifold) and vacuum space out of which snapshots of unique data emerge as the events underwriting XT.

Whitehead’s strongest criticism of general relativity involved the interpretation of gravity in terms of the non-uniformity of spacetime. This criticism was the fruit of his detailed analysis of perception as he endeavored to uncover the essential concepts in the foundations of physical theory. In particular he distinguished between the uniform significance of events, which according to his analysis is reflected in the stratification of nature into time systems and their relation to each other by a flat Galilean metric tensor, and contingent objects which go to make up the basis for masses, forces and fields in nature. (Eastman, 2009)

The significance of Whitehead’s definition of simultaneity and his appeal to immediate experience is not confined to the first metric of his theory, but is equally important for the law of gravitation. In opposition to Einstein, Whitehead constructs his law upon the principle of uniformity embodied in the metric dG² (defining Minkowski spacetime). For Einstein, uniform (Minkowski) space-time is curved in the presence of matter; for Whitehead, however, the uniformity of space-time is not contingently warped and as a result, “physical contingencies do not affect geometrical relationships of extension in the contemporary world” (Fowler, 1975). The implications are that, in Einstein’s theory, gravity isn’t actually a "real force" but instead an expression for space-time curvature.

Objects "falling in a gravitational field" are not being pulled by a force; instead, they are following geodesic grooves in space-time. By contrast, Whitehead maintains a physical rather than a geometrical interpretation of gravity. The difference is that gravity is taken here as a real force propagated with a finite velocity: an impetus (see Fowler, 1975). So too in Verlinde’s account we take gravity not as a fundamental force but as stemming from

an (adiabatic) reaction-force underwriting inertia and entropy. As Whitehead explains, “The structure is uniform because of the necessity for knowledge that there be a system of uniform relatedness, in terms of which the contingent relations of natural factors can be expressed. Otherwise we can know nothing until we know everything” (PRel 29ff.). He continues:

We can only know that distant events are spatiotemporally connected with the events immediately perceived by knowing what the relations are. In other words, these relations must possess a systematic uniformity in order that we may know of nature as extending beyond isolated cases subjected to the direct examination of individual perception. I will refer to this fact as the uniform significance of events. Thus, the constitutive character of nature is expressed by the ‘contingency of appearance’ and the ‘uniform significance of events.’ This doctrine leads to the rejection of Einstein’s interpretation of his formulae, as expressing a causal heterogeneity of spatiotemporal warping, dependent upon contingent adjectives. (PRel 64ff.)

Accordingly, he believes that Einstein’s theory (and theoretical physics in general) is doomed to failure if it demands (in principle) that in order to know anything we must know the state of the entire contingent universe.

The differentiation between uniformity (geometry) and contingency (physics) parallels the shift of emphasis between PNK/CN and R. In the first two works Whitehead was primarily concerned with the nature of uniformity as expressed in the first metric, dG². In PRel, Whitehead moves from considerations of geometry to physics; consequently, PRel is an analysis of the physically contingent relations in nature expressed in terms of the second metric, dJ². (Fowler, 1975)

Fowler instructs us to notice that “the movement from uniformity to contingency corresponds with a movement from the analysis of the contemporary region to an analysis of the relationship between the past and future” (1975). In this study we consider it in terms of the “reenactment,” or holographic dual representation, of the snapshot onto the horizon.

When we follow Dijkgraaf’s line of reasoning we’re led to similar ground: the geometry ultimately filters down to a quantum description. This implies, along with Verlinde, that there is no curvature to speak of. Whitehead’s principle of relativity makes this plain from the start, since 1922. For Whitehead, gravitational effects share in the contingency of nature. The uniformity of nature expressed in the geometry of Minkowski space-time applies only to "cognizance by relatedness" (presentational immediacy in PR). “Cognizance by adjective” (causal efficacy in PR) does not demand uniformity (PRel, xiv).

Consequently, Whitehead’s philosophy of nature does not demand that gravity is propagated along the straight lines of a prior geometry, and hence the value of the gravitational constant is not a function of the prior geometry, as Will and Ariel claim. This fact should be obvious from Whitehead’s separation of geometry and physics (Fowler, 1974): thus, gravity is a real force propagated with a finite velocity.


7.2.5 — Measurement

Another facet of Whitehead’s criticism of Einstein’s theory can be uncovered through a contrast of his theory of measurement with Einstein’s, whose theory is based on operational procedures involving the transmission of light signals. “ The arguments are (1) Einstein gives light signals too prominent a place in our lives; (2) there are other means of sending messages; (3) Einstein does not take account of the agreement within one time-system of the meaning of simultaneity ” (see PNK 53ff.).

Instead, Whitehead’s theory of measurement highlights the distinction of time and space. Measurements are made in the contemporary region. The relations that exist between events are spatial relations, not temporal ones; consequently, measurement in Whitehead’s theory is a measurement of spatial relations, and geometry is the science of space. In contrast, Einstein’s measurements are made by calculating time delays in signal transmission (Fowler, 1975).

The "bending of space" was and is a favorite phrase used by many physicists to explain the meaning of the verification of Einstein's gravitational theory at the time of eclipse. It can be paraphrased more exactly by saying that non-Euclidian geometry holds in the neighborhood of the sun. It was this thesis that Whitehead wanted to replace by his own theory of measurement. (Tanaka, 1975)

For Einstein, simultaneity has a calculative meaning. Measurements of time and space are calculated in the past world along the past-facing light cone, and then projected into the contemporary world. For Whitehead, measurements are made in the contemporary world, or as he says, “all measurements are made in the mode of presentational immediacy” (PR 257). We take this in our event-logic to represent how all initial conditions qua samples (rather than measurements) are made in the mode of presentational immediacy of the snapshot of frozen strands → open strings on a D-brane.

Whitehead believed that the ability to interpret local measurement without appeal to a global result was a major advantage of his theory over Einstein’s theory. (Eastman, 2009) 72

In Whiteheadian terms, the weakness of Einstein’s theory of measurement is that it limits perception to one mode: causal efficacy. For Einstein, the only knowledge we have is confined to our awareness of the world as causally past (see Fowler, 1975).

Whitehead deduced the constant c of the Lorentz transformation from the purely formal postulates representing uniformity of nature independently of the light signals in his Principles of Natural Knowledge. That the speed of light always equals c is a contingent fact of nature, and strictly speaking, it does not hold in the presence of matter. (Tanaka, 1987)

72 In addition, since his theory is essentially a Lorentz-invariant quasilinear action-at-a-distance theory, it is in principle easier to solve. This means that the speed of gravity is equal to that of light (see Poincaré, 1906; Minkowski, 1908; Sommerfeld, 1910).

For Einstein, measurements are made along the past-facing light-cone. Accordingly, measurement is made in the past world. Distances are calculated by multiplying the velocity of light by the interval of proper time. However, in the real world of particles (the General Theory of Relativity) the velocity of light is not a constant, but is affected by the presence of the gravitational field (matter); therefore, the measurements involving the transmission of light depend on the contingencies of the physical field (PRel 36, 65). This was borne out in the eclipse observations of the bending of light around the sun, as Tanaka explains:

Whitehead set about constructing a gravitational theory according to which rays of light are bent through the physical (contingent) effects of the gravitational field. Whereas Einstein's theory states that rays of light pass straightly (along a geodesic line) in the "warped" space, Whitehead's theory states that they pass literally along a crooked curve in the "flat" space. The speed of light cannot have the constant value c in the gravitational field, but varies as if the space is filled with a medium whose refractive index changes with the gravitational potential, 1 + 2γm/(c²R), where γ is Newton’s gravitational constant, m is the mass of the sun, and R is the polar coordinate from the sun. This leads to the same result as Einstein’s theory concerning the angle of the bending of light from distant stars around the sun. (Tanaka, 1975)
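Taking Tanaka’s effective-medium reading at face value, a standard optics estimate (our illustration, not Tanaka’s own computation) shows why the two predictions coincide: a refractive index of

\[
n(R) \;\approx\; 1 + \frac{2\gamma m}{c^{2} R}
\]

bends a ray grazing the sun at radius R₀ through an angle of roughly 4γm/(c²R₀), which is numerically the same deflection Einstein obtains from the curvature of space-time near the sun.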

Whitehead voices how:

By identifying the potential mass impetus of a kinematic element with a spatiotemporal measurement, Einstein, in my opinion, leaves the whole antecedent theory of measurement in confusion, when it is confronted with the actual conditions of our perceptual knowledge . . . measurement on his theory lacks systematic uniformity and requires a knowledge of the actual contingent field before it is possible. (PRel 83)

For Whitehead, all measurement is made in the contemporary world in the mode of presentational immediacy. As such he points to the fact that nature is given to us in sense awareness as now present. Simply put, presentational immediacy represents " our perception of the contemporary world by means of the senses " (PR 474). Presentational immediacy gives " no information as to the past or the future. It merely presents an illustrated portion of the presented duration " (PR 255).

This is what D-branes of open strings do in our event-logic. It is a physical feeling which displays the real extensiveness of the contemporary world. " It involves the contemporary actualities but only objectifies them as conditioned by extensive relations " (PR 494). Presentational immediacy has an eternal object in the contemporary world as its datum. This complex eternal object " is analyzable into a sense-datum and a geometrical pattern " (PR 475) which illustrates the presented duration.

In STR, light is transmitted with a constant velocity. However, in the real world of particles (the GTR) the velocity of light is not a constant but is affected by the presence of matter, which causes a warping of the geometry of spacetime. Consequently, in GTR light is not transmitted in straight lines, and its velocity is affected by the contingent characteristics of the world. Therefore, measurement in the GTR is only possible if the entire conditions of the contingent world are known. Whitehead argues that it is impossible for a finite being to know the total state of the physical world, and therefore, measurement is impossible in Einstein’s theory. (Fowler, 1976)


Whitehead speaks of this mode of awareness as "presentational immediacy," which "is our perception of the contemporary world by means of the senses" (PR 474) and which gives "no information as to the past or the future. It merely presents an illustrated portion of the presented duration" (PR 255). "Since the presented duration defines the world as simultaneous, presentational immediacy functions to define a preferred meaning of simultaneity within one time system" (Fowler, 1976); thus, it represents a snapshot.

Hence, measurement in Whitehead’s theory is not affected by the contingencies of the physical world, as it is in Einstein’s theory. Triangulating this with the insight from Verlinde’s emergent gravity hypothesis that the details of the microscopic theory aren’t necessary for understanding how gravity arises from an entropic force, we find that, while the connection isn’t direct, both indicate that it isn’t necessary to know everything about the microscopic states of the contingent universe.

Rather than locating this in “immediate experience,” however, we distinguish between the sensory now and ontological immediacy. To the latter we link the “immediacy” of snapshots, to the degree that they are in advance of the signal-processing times underwriting sensory systems. Thus, snapshots occur in a fifth dimension and in a mode once removed from (or prior to/below) our common threshold of awareness and observation.

7.2.6 — Simultaneity

For Einstein, the Lorentz transformations and the definition of simultaneity depend on the postulate of the constancy of the velocity of light as an ultimate feature of physical reality. Einstein, accepting this counter-intuitive postulate, rejects the primacy of our felt experience of simultaneity. For Einstein, simultaneity has only a calculative meaning: “measurements of time and space are calculated in the past world along the past-facing light cone and are then projected into the contemporary world” (PRel 37). Milič Čapek voices his discontent with the calculative definition of simultaneity and its semantic obscurity:

Relativists continue to speak about the simultaneity of distant events, although such simultaneity is a mere conceptual entity, created by definition, intrinsically unobservable, and when computed, different in different systems. It is questionable whether the continued use of such a ghostly and fictitious term is fruitful or even meaningful. It appears to be an effect of sheer semantic inertia, a simple concession made to our traditional and outdated linguistic habits. (1961)

By contrast, Whitehead defends the immediate experience of simultaneity and uses this as the foundation of relativity (Fowler). For Whitehead, simultaneity is defined in terms of sensory experience given directly in immediate awareness of the contemporary world:

Our sense-awareness posits for immediate discernment a certain whole, here called a ‘duration’ ... a duration is discriminated as a complex of partial events... a duration is a concrete slab of nature limited by simultaneity which is an essential factor disclosed in sense-awareness. (CN 53)

Wilcox introduces the issue by reference to the problem which grows out of "Einstein’s theory of relativity . . . that under certain conditions there is no unique physical meaning of ‘simultaneous’" (1961). Ford elaborates the problem in Einsteinian terms in an even more compelling case, explaining how "the whole thrust of relativistic physics renders the notion of an absolute inertial system no more meaningful than the notion of an absolute center to space-time" (1968). Similarly, Fitzgerald states that "special relativity modifies our concepts of space and time; it implies the relativity of simultaneity" (1972). As Fowler explains:

For Einstein, simultaneity is defined using a signal theory of light transmission. But […] Whitehead’s emphasis on the now-ness of experience highlights the phenomenological nature of our feeling of simultaneity. By appealing to our immediate experience, Whitehead argues that Einstein’s definition of simultaneity is faulty since the very definition is made to depend on light signals. (1976)

Whitehead’s definition of simultaneity, based on his inductive approach, gives importance to our immediate sense awareness and avoids the obscurity and ambiguity of Einstein’s definition: simultaneity is in the context of data in events. In our event-narrative we recognize this as the snapshot of strands into open strings on a Dn-brane, or as an initial multiplicity of data. Here, simultaneity is a result of the mutual membership of strands in the manifold frame at the moment of a snapshot.

Having considered these six properties we will now turn to a conclusion discussing their comparative significance in light of Verlinde’s emergent gravity model. From this we should recognize a clear conceptual-compatriotism between Whitehead and Verlinde.

7.3 — Comparing Whitehead to Verlinde

Whitehead’s critique of Einstein is above all philosophical, not scientific. As Desmet explains, “Whitehead did not want to surpass Einstein’s general theory of relativity with respect to mathematical formulae and empirical success; his aim was to harmonize its interpretation with common sense” (Desmet, 2007). Whitehead's theory of relativity is closely connected with his philosophy of nature and speculative metaphysics, such that we cannot attempt to understand it without paying due attention to his philosophy. For instance, as Bain suggests, the ontological relationship between the two approaches must be fleshed out in the context of Whitehead’s philosophy of nature (1998).73 In this, Whitehead is trying to fine-tune gravitation to more closely suit his version of relativity theory and a common-sense philosophy.

73 The relationship is of importance, not only in casting Whitehead's theory within its proper metaphysical context vis-à-vis Einstein, but also in judging how the theory fares empirically with respect to general relativity. It makes the same predictions as general relativity with respect to the perihelion advance, the deflection of light rays, and the gravitational red-shift; Eddington (1924) showed that it is equivalent to the Schwarzschild solution of Einstein’s field equations for the one-body problem.

Despite the updates to Einstein’s model, Whitehead’s Principle of Relativity proves not to be the final word in gravitation and cosmology; however, the categorical design laid out in his “essay on cosmology,” Process and Reality, turns out to foretell many of the same dynamics presented in Verlinde’s emergent gravity model, eighty-two years later. Verlinde’s stated goal is to dispense, finally, with the idea that gravity is a fundamental force (2010). Doing so will require an accompanying philosophical theory. We propose this can be found in Whitehead’s process philosophy of the AE’s in an event-logic.

Whitehead’s chief critique of Einstein in each case can be meta-posed as a deficiency in the link to experience. This is predicated in the first part on Whitehead’s choice of two tensors versus Einstein’s one, retaining the flat Minkowski space-time alongside the contingent physical field. The demonstration that it is possible to write a law of gravitation in flat Minkowski space-time provides, as Fowler argues, “additional justification for characterizing aspects of Einstein’s theory (such as the description of gravity in terms of a curved geometry) as explanatory, rather than as empirical. It also demonstrates that the explanatory content of a scientific theory pervades the very meaning of the mathematical formulae” (Fowler, 1975, p.59). Whitehead recognizes gravity as the result of a real force, not an apparent one. Einstein’s rendering of gravity as a pseudo-force, due to curvature and particles following geodesics, represents a mental construct, not a real force.

Maintaining a real, physical interpretation of gravity can ultimately be seen as coming closer to Verlinde’s development, in which gravity is also taken as the result of a real force, though one that arises entropically, rather than to Einstein’s notion of gravity as solely a mental construct of geodesics.
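For readers who want the quantitative content of "a real force that arises entropically," the following compresses the two steps by which Verlinde’s 2010 argument is usually summarized (standard notation; his holographic-screen setup is suppressed). First, a change of entropy with position does work like a force, and with the entropy gradient he posits and the Unruh temperature this yields inertia:

\[ F\,\Delta x = T\,\Delta S, \qquad \Delta S = 2\pi k_{B}\,\frac{mc}{\hbar}\,\Delta x, \qquad k_{B}T = \frac{\hbar a}{2\pi c} \;\Longrightarrow\; F = ma. \]

Second, taking a spherical holographic screen of area A = 4πR² carrying N = Ac³/Għ bits, with equipartition E = ½Nk_BT and E = Mc², the same entropic force reduces to Newton’s law:

\[ F = \frac{GMm}{R^{2}}. \]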

While Whitehead’s gravitational effect is real (like the entropic force), in Verlinde’s model it doesn’t derive from a fundamental force; instead, it is shown to emerge from an underlying substratum of EO’s and other elements: what Whitehead calls “the physical field,” labeled by the metric dJ² (PRel 81, 83). The physical field can thus be seen as the underlying substratum out of which emerge the values underwriting the gravitational effect. The physical field represents Whitehead’s second metric. This stands in distinction from the first metric. In keeping these two metrics separate Whitehead also brings them into coherence on the holographic order. The only adjustment to Whitehead’s model is to consider the gravitational effect, while real, not as the result of a fundamental force, but instead as an emergent one. Verlinde makes the case for this move:

Of course, Einstein's geometric description of gravity is beautiful, and in a certain way compelling. Geometry appeals to the visual part of our minds, and is amazingly powerful in summarizing many aspects of a physical problem. Presumably this explains why we, as a community, have been so reluctant to give up the geometric formulation of gravity as being fundamental. But it is inevitable we do so. If gravity is emergent, so is space-time geometry. Einstein tied these two concepts together, and both have to be given up if we want to understand one or the other at a more fundamental level. (Verlinde, 2010)

Secondly, with regard to the properties of measurement, uniformity, and simultaneity, Whitehead’s approach is crafted on a logic shown to correspond with Dn-branes of open strings, and with large-N QCD gauge theory, as explanations for a value underwriting XT and matter (see chapter two). We recognize this under the logic of snapshots and their contents, postulated in our event-logic constructed in chapters five and six. In Verlinde’s approach, given Minkowski’s flat space-time we can then reduce it at the quantum scale into large-N gauge theory as a sub-QM-level theory from which XT emerges. Thus, we can maintain XT as emergent while also accounting for dynamical source masses precisely of the type we’d expect from open strings and large-N SYM (see Verlinde, 2011).

Thirdly, as Schild explains, classical electrodynamics can either be formulated as a field theory (Maxwell’s equations), which is analogous to the general theory of relativity for gravitation, or it can be formulated as an action-at-a-distance theory (Liénard-Wiechert potentials) analogous to Whitehead’s theory of gravitation (Schild, 1956). Tanaka adds:

Whitehead’s theory of gravitation in the narrow sense is sometimes referred as "a theory involving action at a distance with the critical velocity c". This characterization of Whitehead's theory is due to Synge, who located Whitehead's theory between the two extremes of Newtonian theory on the one hand and the general theory of relativity on the other. Such a middle-way character comes from the peculiar definition of the physical field in Whitehead's theory. (Tanaka, 1975)
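The electrodynamic analogue Schild invokes can be stated compactly. In the action-at-a-distance formulation, the potentials at a field point are fixed by the source’s state at the retarded time, that is, by what propagates along the backward light cone at the critical velocity c, which is structurally the role Synge assigns to Whitehead’s gravitational potentials. These are the standard Liénard-Wiechert expressions for a point charge q; the correspondence with Whitehead’s tensor potentials is an analogy, not an identity:

\[ \phi(\mathbf{r},t) = \frac{q}{4\pi\varepsilon_{0}}\left[\frac{1}{(1-\hat{\mathbf{n}}\cdot\boldsymbol{\beta})\,|\mathbf{r}-\mathbf{r}_{s}|}\right]_{t_{r}}, \qquad \mathbf{A}(\mathbf{r},t) = \frac{\boldsymbol{\beta}}{c}\,\phi(\mathbf{r},t), \qquad t_{r} = t - \frac{|\mathbf{r}-\mathbf{r}_{s}(t_{r})|}{c}. \]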

Schild takes Whitehead’s theory as an ‘action at a distance’ theory in flat Minkowski spacetime where particles interact through the entropic gravitational tensor potentials, Gμν (1956). Verlinde’s theory is also non-local. As he explains (2010):

The microscopic degrees of freedom on the holographic screens should not be seen as being associated with local degrees of freedom in actual space. They are very non local states. This is what holography tells us. In fact, they can also not be only related to the part of space contained in the screen, because this would mean we can count micro states independently for every part of space, and in this way we would violate the holographic principle. There is non-locality in the microstates.

As Einstein distinguished, there is gravity in local, relativistic frames as well as in the general, cosmic frame. SR-local is in the context of weak gravity; GR is in the context of strong gravity. This is a good distinction because it sets aside the opportune, experientially-negotiated description of gravity in SR qua the experiential domain of quantum gravity. Gravity in SR is weak and short-ranged. Phonons explain this in SR and offer a reason why they are short-ranged. The claim is that quantum gravity is contextualized in terms of SR experiential negotiation, or gravitational interface dynamics in local (vacuum to physiological) space: that is, it adds a personal element to gravity in SR. As such it comes from proto-experience in some sense, or at least is negotiated through it.

The critical insight is that you cannot arrive at an emergent paradigm unless you first decouple Einstein’s notion of spacetime geometry and gravity. As Verlinde explains, “If gravity is emergent, so is space time geometry. Einstein tied these two concepts together, and both have to be given up if we want to understand one or the other at a more fundamental level” (2010). Whitehead’s move to maintain the old distinction between metrics is therefore shown in this light to maintain the conditions for ultimately converting to an emergent paradigm, in Verlinde. Einstein’s basic approach must be undone in order to arrive at a successful, emergent model. Whitehead’s model proves to be the right format, providing an example demonstrating its closer alignment with Verlinde’s model, rather than Einstein’s. We then propose that Whitehead’s process metaphysics provides the proper philosophical context and backdrop for Verlinde’s EG. Thus, Verlinde’s EG speaks to the future paradigm of process metaphysics first described by Whitehead in 1922 and throughout his later works.

The most measured scenario reflects the fact that both Whitehead’s and Einstein’s theories ultimately assimilate into Verlinde’s emergent model; however, retroactively we’ll recognize Whitehead’s model as closer in conceptual similitude to Verlinde’s approach. Neither Whitehead’s nor Einstein’s model will be the final version, but between the two we’ll recognize Whitehead’s as having been the more accurate harbinger of the principle of relativity and of the nature of gravity as the result of an emergent force.

7.4 — Summary

In this chapter we introduced Whitehead’s alternative theory of relativity and traced the philosophical distinctions between his and Einstein’s theories of relativity. Verlinde’s theory of emergent gravity served as a sort of guide and arbiter of the discussion.

What we discovered, simply put, was that Whitehead’s theory fits better into the paradigm of an emergent ontology than Einstein’s. We consider this in light of three points. Firstly, the choice of two metrics v. one makes the approach amenable to the holographic principle and to the emergent approach built up by Sakharov (1967), Jacobson (1997), Padmanabhan (2010), Mäkelä (2012), and chiefly, by Verlinde (2010). Namely, they all endorse the same conclusion that the metric should be recognized as dynamical, not static. Static metrics are associated with QFT, but in string theory we find the dynamical variant.

Secondly, we recognize Whitehead’s development of the notion of space and time as derived from events. As such he serves as a harbinger, by almost a century, of the emergent view of space and time advocated by Dijkgraaf, Verlinde, and others. Thirdly, the theory of measurement Whitehead develops in PRel is shown to be compatible with, and to correspond to, the geomodal presentation of a snapshot logic meant to portray the event-logic of PR. In addition we recognize this same logic in string theory and gauge theories of physics. Simultaneity is recognized on account of the elements of a snapshot. To these ends, even simultaneity is derived from these snapshots. The whole picture and story is emergent. This is truly the making of an emergent paradigm. To these ends we require a background model and narrative in which to house all these components; this is the purpose of the geomodal model (and method). Uniformity, as such, relates to the identity of a soular background underwriting experience, and from out of which this event-logic and process-ontology arise. Each element can be correlated to another one in modern physics. The dictionaries are provided to make these connections plain.


Chapter 8 — Conclusion: Précis; Denouement

Philosophy appears as a criticism and a corrective, and […] as an additional source of evidence in times of fundamental reorganization. (Whitehead, Principle of Relativity, 1922; p.5)

In this study we have carried out a detailed, philosophically-founded study of Verlinde’s emergent gravity model in the context of Whitehead’s process/event-ontology of Actual Entities (AE’s), bringing them together under the logic of a foundational process underwriting both material/physical logic and proto-experiential dynamics. To these ends a geomodal convention has been applied in the capacity of an exegetical standard capable of identifying the structure of a process and its components with a parallel narrative drawn out of both Verlinde and Whitehead’s models.

Underwriting the epistemological paradigm of physics resides a set of ontological assumptions about which most know very little; indeed, we forget that there are specific philosophical and conceptual views underwriting each paradigm, and that often an update and development of ontological assumptions is what spurs new paradigm shifts. In our present epoch of science we are transitioning out of a substance metaphysic into one of event and process logic.

This approach yields a framework for simplifying Whitehead’s AE’s and string theory (qua emergent gravity) into two branches of one process, much as Whitehead first heralds in calling for “the doctrine…that neither physical nature nor life can be understood unless we fuse them together as essential factors in the composition of ‘really real’ things whose inter-connections and individual characters constitute the universe” (MT 205). If we are to take Whitehead and others seriously in describing the speculative nature of experience-based metaphysics, then we must be prepared to recognize a theory of experience suitably generalized within an event-ontology as most-primitive; as Nobo explains, “Humans are part of this universe, and in no way can we experience except as part of the universe” (2004). He continues:

Every act of theorizing begins with human experience [and] addresses some problem, puzzle, or difficulty encountered in human experience….this fact requires the recognition that human experience has primacy for the construction of any theory whatsoever. But truly to recognize the primacy of human experience is to recognize that human experience must itself be adequately conceptualized and elucidated, and thus must be the subject of a theory more fundamental and more comprehensive than any other. (Nobo, 2004)

We find initial precedent for such a theory in comparing Verlinde’s physics with Whitehead’s philosophy given that Verlinde's model is predicated on general ideas, first principles, and universal phenomena; in fact, this sets it precisely into the scope of a speculative venture enabling a fertile comparison to Whitehead’s AE’s qua general concepts and (first) principles underwriting both narratives. Whitehead further describes how “ a philosophical system should present an elucidation of concrete fact from which the sciences abstract. Also the sciences should find their principles in concrete facts which a philosophic system presents ” (AI, 1959, p.150). Whitehead’s AE’s, as a philosophical system, are shown to present this ‘ elucidation of fact’ lock-step in conceptual narrative with string theory and


Verlinde’s EG. In this move we offer a (re)union of philosophy and physics within the proposed event-ontology encompassing AE’s, space-time, string theory, emergent gravity, and possibly quantum mechanics and the Standard Model—and then ask whether the combined effect hastens a paradigmatic shift from a substance and material ontology into emergent models within a process and event-ontology.

8.1 — Science in an Emergent Paradigm

In the process of tracing the conceptual foundations of emergent gravity for linking Verlinde and Whitehead’s models, we discover a host of phenomena all considered under the heading of emergent values. If gravity, string theory, and space-time (plus AE’s) are all shown to follow an emergent ontology, it seems at some point we would venture the postulate that perhaps the concept of emergence itself represents a fundamental property (or principle) of a process ontology. Bringing these together, an extension of this study has been to identify a critical-mass of foundational parts of physics leveraged to shift paradigms into an emergent, process ontology. We explore this briefly now. Beginning with space as emergent, we recall Moskowitz’s quote from chapter two:

We often picture space and time as fundamental backdrops to the universe. But what if they are not fundamental, and built instead of smaller ingredients that exist on a deeper layer of reality that we cannot sense? If that were the case, space-time’s properties would “emerge” from the underlying physics of its constituents, just as water’s properties emerge from the particles that comprise it. (2014)

The notion that the fundamental structure of space-time might be something other than a continuum has been around for many decades. One indication in quantum theory recognizes the Riemannian smooth manifold as an inaccurate depiction of XT according to quantum mechanics at the smallest scale, which resembles more of a bubbling chamber of virtual particles and vacuum quantum fluctuations (see Gross, 2014; Dijkgraaf, 2012). This means that space-time at smallest scales isn’t smooth anymore but resembles more an ocean of activity. We consider this in light of the vacuum-level sea of strands.

Gravity in an emergent setting arises in the mode of an event-ontology (process) and experiential-metaphysics underwritten by a sea of streaming strands (microscopic data) in a fifth-dimension. Seen in this light, Verlinde’s model is shown to rely on an ontological reinterpretation according to a process and event logic depicting emergent dynamics. Patently, this is correlated to Verlinde’s estimation of gravity, not as a fundamental force, but an emergent one. In addition, he reinterprets the graviton as a phonon: not a fundamental particle belonging to a fundamental force, but rather as a collective excitation, or quasi-particle. Dijkgraaf offers that space-time is also emergent (2012). ‘Emergence’ is also vividly characteristic of Whitehead’s AE’s/process philosophy. All together this provides a compelling rationale contrary to substance ontology.


In light of gravity we recognize the emergent status Verlinde gives it in his model. For this, the holographic principle and information theory have to be assumed (Verlinde, 2010) and information is represented as bits on a screen, or in a container, like a bag or matrix. In this scenario fermionic open strings are dependently-originated from bosonic strands and closed strings are patently emergent like phonons, not fundamental like gravitons. This also has decided implications for the status of string theory as a fundamental theory of everything; as Verlinde explains: “ gravity is seen as an integral part of string theory that cannot be taken out just like that ” (2010). So if gravity is reinterpreted into an emergent force then what does this say about string theory? Verlinde replies that:

It should also be emergent, and it is nothing but a framework like quantum field theory. In fact, I think of string theory as the way to make QFT into a UV complete but still effective framework. It is based on universality. Many microscopic systems can lead to the same string theory. The string theory landscape is just the space of all universality classes of this framework. (Verlinde, 2011)

This means that gravitons cannot exist as fundamental entities in a theory where space-time and gravity are emergent: string theory must also be recognized as emergent. As Verlinde adds, “if gravity is emergent, so is space time geometry. Einstein tied these two concepts together, and both have to be given up if we want to understand one or the other at a more fundamental level. […] The description of gravity as being due to the exchange of closed strings can no longer be valid. In fact, it appears that strings have to be emergent too” (2010). To these ends, Verlinde identifies string theory as useful, much like QFT, but not fundamental (2011). As Liu explains, string theory could still “unify” the four forces, three of which are fundamental, but it wouldn’t be fundamental itself; rather, emergent (Liu, 2010). In Verlinde’s setting, space also proves to be emergent in a holographic renormalization. Renormalization in the mode of coarse-graining leads to foliation of an equipotential surface into a creative, emergent dimension of space qua anti-de-Sitter space.

The emergent identity of string theory is also demonstrated in the event-logic narrative of EG and AE’s. Here, it is proposed that nature uses a snapshot mechanism of a causal manifold to create sampled approximations of open strings on a D-Brane as a by-product of frozen moments of strands. This speaks to the supersymmetry transformation from bosons into fermions. Additionally, snapshots offer a clear explanation for initial data as “dependently originated” from frozen moments of strands in the causal manifold, as a set of incidental “adjectives” or “factors of a fact” (see Whitehead, PRel, 1922). This precedent explains how the open strings of string theory lead to approximations of sampled moments: the “initial data” of an AE, also described by Whitehead as “durations” and “events,” where string theory is seen as an emergent phenomenon stemming from snapshots. The snapshot mechanism offers a reason for open and closed strings as approximative, emergent values derived from samples of universal microscopic values.

A causal manifold serves as a frame out of which uniform, derivative hypersurfaces are generated, like durations, events, or D-branes. Whenever a strand is within the frame of a manifold and a snapshot occurs, the frozen strand resembles a string that can be considered as an “open string whose…end-points are attached to a single brane” (D’Hoker and Freedman, 2002). Snapshots of eternal objects generate the illusion of enduring values qua frozen moments of otherwise dynamic and evolving superpositional strands resembling open strings on a Dn-Brane. In this sense, an event-logic offers insight into the ontological nature of open strings: not as fundamental values, insofar as representing the entirety of a kinematic element, but instead as derived ‘moments’ (or samples) of a kinematic element generating virtual samples of strings. To call these fundamental would be to commit Whitehead’s ‘fallacy of misplaced concreteness’ (PR 44, 118). Instead, strings identify in Aristotelian language as “accidental” features of snapshots (frozen moments of a causal manifold upon a derived hypersurface) and are recognized as ‘dependently originated’ values.

These multiplicities of initial data immediately ‘ingress’ to form an emergent quasi- particle: a phonon that bursts from off a stretched horizon (like a graviton radiating off a D0-brane) with a resounding gravitas. Verlinde links the closed string to a phonon rather than a graviton. Considered as spin-2 gravitons, they aren’t fundamental values able to be quantized, but emergent values with no fundamental (classical) fields. Rather, they emerge from the generative collection of contents in the snapshot as intermediating values between the pre-material and material, between UV and IR, between Planck scale dynamics and the mesoscopic scale of organic systems.

While the phonons (closed strings) represent a patently emergent value, the open strings, however, aren’t explicitly emergent, but instead, virtual samples of pure strands, like ‘adjectives of events,’ as Whitehead describes (SMW, 1925). In the closed-string sector the ontological interpretation of fundamental gravitons is substituted for emergent phonons. In AE’s, the ingression of a ‘multiplicity of data’ into an ‘original datum’ represents the generation of a prehensive factor for the determination of ‘objective data’ that will pass into the phases of concrescence. In the pictorial description of an event-narrative this represents the open-strings of snapshot coalescing into a closed-string/phonon.

In non-perturbative string theory we encounter this in Sen’s second interpretation of tachyon condensation where a radiating graviton is associated with a D0-brane (1998). Like with the original datum, in all three cases we ascribe this value the role of ‘coupling constant’ during renormalization, or in Whitehead’s model: the phases of concrescence. The job of the coupling is to fine-tune, or distinguish which values will acquire a small mass (expectation-value) and which ones will be eliminated. 74

While the emergent status of string theory is a major step in the ongoing conceptual revolution in physics, the ability to demonstrate that Whitehead also predicted this same line of reasoning in his AE’s delivers an additional boost for speculative philosophy. Indeed, the same basic logic that leads Verlinde to his claim that string theory is emergent can also be found to bear in Whitehead’s Gifford-to-PR drafts between 1927 and 1929 when he corrects the “categories of existence” to reflect that ‘objects’ and

74 For all intents we consider the attrition rate during this phase to be very high.


‘multiplicities’ like open/closed strings, aren’t fundamental, but derived values. This led from a list of eight to a corrected list of six: actual entities, prehensions, nexus, subjective forms, eternal objects, and contrasts. Given that AE’s are the ‘final realities,’ “each proper entity in the universe is an instance of one and only one of these basic types” (Lango, 1972, p.15). A ‘proposition’, because it is the most important kind of contrast, is not really basic (PR 36), nor is a ‘multiplicity of entities’, because it is not a ‘proper’ entity (a single entity) but rather many entities (PR 44-45); as such both were taken out of the original set. This serves to bolster the case for the emergence of string theory as here indicated and developed in the context of AE’s.

Kallfelz’s paper, “Physical Emergence and Process Ontology” (2008), provides a rigorous little portrait of emergence in the context of quantum systems. Kallfelz spends the balance of the paper traversing the advantages of emergent theory over classical logic, citing as cases: mereological over supervenient logic (Bishop, 2004); emergence in the context of phase transitions and non-linear optics (Batterman, 2003, 2005); à la property fusion (Humphreys, 1996); diachronic vs. synchronic emergence (Hüttemann, 2005); dynamical vs. radical and programmatic emergence (Kronz and Tiehen, 2002); and epistemic vs. ontological emergence (Silberstein and McGeever, 1999).

Focusing his trajectory on emergent phenomena within quantum mechanics and “ the role played by operations of the different senses of product and sum characterized by quantum theory’s mathematical formalism ,” Kallfelz’s letter takes a tack from Finkelstein (1996, 2001) in advancing the case that “ process thought can provide a more comprehensive framework for characterizing emergent quantum phenomena in comparison with what is typically presented in contemporary philosophical and scientific literature ” (2008).

Turning to the mathematical structure of field theories, Kallfelz employs finite-dimensional Clifford algebras (à la Baugh and Finkelstein, 2001, 2003) as an update to Grassmann algebra (itself an update to set theory), providing a more versatile formulation of particle vs. space-time structure through “an action-based formulation of quantum physics” (Kallfelz, 2006). Such an update yields the framework whereby space-time structure emerges from a discrete system of elementary quantum processes (Kallfelz, 2006), echoing also Whitehead’s instincts for understanding how a continuous topology can arise with discrete tenants (Kallfelz, 1997). In addition, Whitehead’s Ontological principle and Relativity principle are also highlighted as lending themselves to the dynamical elements and explanation of emergence (PRel, 1922; PR, 1929).
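The algebraic contrast Kallfelz leans on can be put in one line. A Grassmann (exterior) algebra has generators that purely anticommute and so square to zero, whereas a finite-dimensional Clifford algebra fixes the anticommutator by a metric, which is what allows it to encode particle-like and space-time-like structure together (standard definitions, not Kallfelz’s notation):

\[ \theta_{i}\theta_{j} + \theta_{j}\theta_{i} = 0 \;\Rightarrow\; \theta_{i}^{2} = 0 \quad \text{(Grassmann)}, \qquad e_{i}e_{j} + e_{j}e_{i} = 2\,g_{ij}\mathbf{1} \;\Rightarrow\; e_{i}^{2} = g_{ii}\mathbf{1} \quad \text{(Clifford)}. \]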

Given that all these descriptions generally bear in the same arena, we consider emergence as a general property of a process and event ontology in nature. Geomodally we recognize a process underwriting nature at the level where the smallest scale and larger scales come together (e.g. UV-IR mixing qua black hole logic). This process persists prior to both space-time and matter; out of this process, space-time and matter are derived abstractions. Within this, string values in a snapshot are effectively derived values as frozen moments of superpositional, bosonic strands streaming through local vacuum; phonons emerge from a collection of open strings on a Dn-brane.

The geomodal sequencing of this cycle equips us to recognize the same dynamics at work in Whitehead’s AE’s and Verlinde’s EG models. These sequences are mutually aligned via dictionaries of terms as well as in a pictorial narrative. The gamut of these topics generates a solid narrative for the foundation of material systems emerging from the collective result of a fundamental quantitative process. This line of reasoning also lends credence to the initial track of this study in opting for frameworks predicated on the non-quantization of gravity at the quantum level. Generally, you cannot quantize something that isn’t fundamental to begin with; plus, it is unnatural to expect classical fields to quantize like quantum ones (see also Wüthrich, 2006).

Finally, we can use all this to make a prediction about the Higgs boson, updating it from a fundamental particle, like a graviton, into a pseudo-Nambu-Goldstone boson resulting from the fine-tuning of a little Higgs boson (like coupling of prehension) into a unified, collective result; as Schmaltz and Smith explain, “Little Higgs theories are realizations of an old idea to stabilize the Higgs mass by making the Higgs a pseudo-Goldstone boson resulting from a spontaneously broken approximate symmetry ” (2005). As such, the ostensible “God particle” (first value) of the SM can be identified, not as a fundamental value, but the result of an initial process leading to a collective-value.
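The physics behind Schmaltz and Smith’s description can be indicated schematically. A pseudo-Nambu-Goldstone Higgs acquires mass only from terms that explicitly (and weakly) break the global symmetry, and in little Higgs constructions that breaking is "collective": no single coupling by itself generates a Higgs mass. A generic sketch, not any particular model:

\[ m_{h}^{2} \sim \frac{g_{1}^{2}\,g_{2}^{2}}{16\pi^{2}}\, f^{2}, \qquad m_{h} \to 0 \ \text{if either } g_{1} \to 0 \ \text{or } g_{2} \to 0, \]

so the Higgs mass arises only as the joint, radiative result of several couplings and remains parametrically light compared with the symmetry-breaking scale f and the cutoff Λ ~ 4πf. It is this "collective result" structure that we are mapping onto the satisfaction.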

We make this claim based on the symmetrical logic demonstrated between a little Higgs model and the AE/EG models in a geomodal model. Specifically, we link the little Higgs as “fine-tuning” operator that orchestrates the Higgs as a collective value, to the phonon as coupling for prehension of the “objective data” into positive and negative values where positive values further synthesize into a final satisfaction. Here, the satisfaction is none other than the Higgs boson as fine-tuned by the little Higgs, or phonon as coupling. As such we quip, there is no God…particle; the Standard Model, like nature, begins with a process, not a substance. The Higgs boson, as the first value of the Standard Model, is also the final result of the fine-tuning by a little Higgs. Making this one change, the rest follows naturally. This gives precedent for provisionally shifting the logic of paradigms under this one move. Resolving the Higgs as pNGB and final result of a process shifts the paradigm without having to change anything else. The rest of the SM remains the same except for the narrative of the Higgs boson, adapted into a process logic.

8.2 — Précis

In order to acquire a physical address and foundation for this study, we began in chapter two with an introductory exposition narrating the physico-conceptual foundations of emergent gravity. Here we identify with the choice for non-quantization models of quantum gravity approached from the perspective of an emergent rendition. In this, gravity emerges from changes of entropy (information) distributions at the microscopic scale. The choice of selecting non-quantization methods over quantization approaches in quantum gravity represents the first major step of this study. As Wüthrich asks, “Does the need to find a quantum theory of gravity imply that the gravitational field must be quantized? Physicists working in quantum gravity routinely assume an affirmative answer, often without being aware of the metaphysical commitments that tend to underlie this assumption” (2006).

This study identifies the underlying ontological and metaphysical assumptions hailing from Whitehead’s process paradigm. In following the road less travelled, our selection of non-quantizational methods positions us into the ontological domain of dependent origination, process, and emergence. This stands in contrast with the otherwise popular quantization method whose philosophical assumptions draw from a substance approach. The difference between these two alternatives can be stated from the non-quantizational method in a positive case: nature begins most fundamentally from a process, not a substance. Therefore, Wüthrich (2006) is right, and from our perspective, non-quantizational approaches are precisely the earmark of a process (v. substance) paradigm.

With this in place we then turned to uncovering a series of mathematico-physical elements of cosmology and string theory, each of whose components are able to be developed within a process and geomodal narrative. As Fowler explains: “ scientific concepts are abstractions derived from experience ” (1975, p.63). In this study, for example, we consider black holes and horizons as local, relativistic frames and hypersurfaces (of a manifold) in 5d linking each individual to a local vacuum (connection) space, in a mode unseen in consciousness and prior to sensory systems.

In string theory we survey the basis out of which the theory originally arose, as well as what major developments led to the state of the theory as it is currently. In this we recognize a set of major elements and properties of string theory, but as both Verlinde and Gross explain, without any sense for how they all go together. To these ends, string theory is a field that lacks a narrative and dynamical identity. We developed these here for the sake of using them to organize into an epical narrative in chapters five and six. From here we naturally transitioned into a survey of Verlinde’s theory itself, before then turning to a reciprocal survey of Whitehead’s AE’s and the philosophical climate they distinguish themselves within. As such, chapters two, three, and four follow the same structure: conceptual foundations and the model itself. In Verlinde’s case we build up to an examination of emergent gravity and in Whitehead’s case we build up to the AE’s.

After establishing both theories, we focused the next two chapters on enunciating the conceptual concomitance that practically falls into place within the geomodal narrative. The principal claim is that Whitehead’s AE’s, in the context of a process philosophy, provide just the right type of philosophical basis for a theory of emergent gravity. When we apply Whitehead’s paradigmatic claims to gravity we end up with the same results as Verlinde concludes in his approach to gravity as an entropic force; thus, not as a substance, but as the result of an event plenum. This lends additional credence to the choice of non-quantization approaches as selected in chapters two and three.


Over the course of this study the mainstay has been to depict both narratives of Verlinde and Whitehead’s models, showing how they are related to each other to the extent that we consider whether they are in fact two aspects of the same process. We spent chapters two through four developing the narrative bases for both theories, and the next two chapters showing step by step how they both align with each other under a geomodal narrative. These prove to not only be intertwined but also co-informative of the other.

Beginning from a mutual location in strands and microscopic data, two roads diverge in coarse-graining and prehension, one following a development of the “positive species” and the other following that of negative prehensions. As such, Verlinde’s string theory qua EG defines the dynamics of negative prehension, while Whitehead’s AE’s provide the generative dynamics of the positive gravitational self-energy of a matrix group.

In Whitehead’s case, after naming them, negative prehensions are largely left undeveloped and the AE’s (almost) categorically follow the narrative trajectory of positive prehensions into ‘simple physical feelings’ during the phases of concrescence (renormalization) into more complex syntheses of feelings, finally culminating into what he calls a ‘satisfaction,’ as a final value.

Applying Verlinde’s model, negative prehension is given a major role not realized in Whitehead and is linked to the emergence of gravity qua a coarse-graining procedure kept track of by Newton’s potential, ɸ (the gravitational potential). This provides a synthesis of PRel and PR—as well as Verlinde’s EG with the AE’s—into one descriptive account for acquiring gravity (at this level) as the natural result of a reaction force generated from the accumulated coarse-graining of renormalization values qua negative prehensions integrated out of a matrix; or, with Whitehead, as excluded from the phases of concrescence. Feelings combine to generate the gravitational self-energy of a string matrix during the phases of concrescence, while EG describes negative prehensions → the gravitational effect emerging from an adiabatic reaction-force arising from values integrated out of a matrix model.
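The quantitative handle Verlinde provides for "keeping track" of what has been integrated out is worth recording, since it is what we are mapping onto negative prehension. In his 2010 discussion the holographic screens are equipotential surfaces of the Newton potential Φ, and (in our paraphrase of that discussion) the dimensionless combination below measures how far the coarse-graining has progressed:

\[ 0 \;\le\; -\frac{2\Phi}{c^{2}} \;\le\; 1, \]

with the lower bound holding far from matter, where little has been coarse-grained, and the upper bound saturated on a black-hole horizon, where the coarse-graining is maximal.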

In so doing we also find that Whitehead’s categorical scheme, as a speculative philosophy, is capable of portraying gravity as an entropic force in the dual context of the AE’s as a creative force. In addition, it articulates a processual setting wherein the primary elements of string theory are ‘strung’ together under one narrative cycle of Whitehead’s foundational process. To these ends, six essential tokens represent the categoreal series of a process cycle in the geomodal narrative. These have been evenly distributed over chapters five and six. Chapter five covers origination, emergence, and reenactment, while chapter six covers selection, creativity, and synthesis (renormalization). All together they make a cycle.


Chapter Five

“Insofar as philosophers have succeeded, to that extent scientists can attain an understanding of science.” (FR 58-59)

In chapter five we acquainted the reader with the fact that Verlinde’s model leaves out (or makes space for) the philosophical roots locating the nature of a microscopic theory. We see this take shape in Verlinde’s account under the notion that specific knowledge of the states of microscopic data isn’t explicitly necessary for apprehending the associated, emergent gravitational values. This move also locates the physically-contingent (dependently-originated) values in the complex string-like strands described by Chew75 (2004) and in the geomodal model, understood as not needing to be known in detail in order to still know the resulting gravitational effect as a synthetic product.

Our ‘mode of being’ in relation to (encounter with) the wave-function is indirect, as if through the lens of a camera. This lens creates a 4d frame we can refer to as a ‘soular system.’ This is the mind’s eye, the third eye, the all-seeing eye, from the local frame of each person. It is the primordial and responsive element through which ongoing cycles of a process are generated, underwriting organisms as a seed of connection to cosmos, vacuum, et al. The causal manifold possibly operates at the Planck scale, taking pictures of strands (Plato’s ‘pure forms,’ Jung’s ‘archetypes,’ etc.) as they evolve in a flat 5dXT through the frame of the manifold. This is like the “sea of relations” described in Whitehead (PRel, 1922) and as a ‘sea of tranquility’ in Buddhism and Taoism.

In the geomodal logic, strands cannot come into existence in experiential mode of universe except as sampled approximations made available for concrescence and prehension into definite form—crafted and drafted out of an initial multiplicity of sampled approximation values; therefore, we only ‘become informed’ of eternal objects from their approximations into initial data, via snapshots. This clues us into their nature as ‘dependently-originated’ and derived values.

This begins with the recognition of an initial ‘measurement’ or ‘sampling’ process transforming potential values into actual values. To consider a potential value think of quantum fluctuations, tachyonic bosons, and zero-point energies (in physics) at sub-atomic and Planck scales. This is a step towards aligning Whitehead’s AE’s with the predicating quantum logic upon which he endeavored to conform in its earliest stages. The geomodal model portrays a string-theoretic scenario in agreement with Whitehead’s description of an initial multiplicity of data in the context of a snapshot of frozen strands resembling open strings on a D-brane. Snapshots cause strands to appear in frozen form as if nearly-exact string values: the approximate, virtual elements of strands.

75 Chew’s approach is important because of his own contributions to physics by way of introducing the “bootstrap method” to America. As Freund explains, the bootstrap method takes the position that everything is emergent (1995). Bootstrap theory was the original entry into string theory; now perhaps it has made its way back around (Freund). This is another nod to emergent phenomena of the type we expect to find throughout these next three chapters. In yet another vote of confidence, Verlinde’s model has also been applied to QM to yield something like the bootstrap approach (see Freund, 1995).

Measurement is through snapshots of the wave-function, per hypersurfaces of a local, manifold system at cosmos/vacuum level. The pre-material manifold is in part like a camera: a paparazzo of the pre-material ocean in local sphere. All encounters with the global wave-function are local, per relativistic frame of a person’s autonomic dynamics. 76

Pictures, or ‘snapshots’ render these instants (instantiations) of time and their actual world of contents as if they were real forms; as such they are the initial data (a multiplicity) of EO’s that ingress into data forms. To put it in physics language clarifies what was just said: here, the EO’s are like strands of digits whose tails are always in flux (changing numbers) to reflect evolution over space-time and other contents. The important feature to grasp is that the wave-function never collapses because it is never directly “touched” or “accessed.” All valuation based on the wave-function is obtained through a snapshot and therefore locally, as a sampling of the wave-function through the medium of a picture. A picture tells a thousand words, after all.

In the geomodal narrative, local manifolds in 5d produce derived hypersurfaces (snapshots) as durations, or events representing the initial conditions of a process. This serves to relativize the global wave-function into “apparent” or “effective” perspectival (relativistic) events. The wave-function doesn’t (itself) collapse, but local hypersurfaces are instead generated out of a manifold that effectively produces (something like) ‘local collapses’. As such it’s more like the freezing of a moment (as an instant of time), or what we might call a sampling of the wave-function into a holographic replica of a local frame of the wave-function. Local frames are generated, or derived, out of frozen moments of the global wave-function ala snapshots of a manifold. These are the basis of the AE process and what Whitehead refers to as “instants of time” (PRel, 1922).
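Since the snapshot mechanism is carrying real explanatory weight here, a toy model may help fix the intuition. The sketch below is purely illustrative Python; every name in it is hypothetical and nothing in it models actual physics. It treats the "global wave-function" as a continuously evolving set of strands that is only ever read, never written: a local "manifold frame" reads off frozen, rounded samples of the strands inside its window, so repeated snapshots neither collapse nor perturb the global state.

import numpy as np

rng = np.random.default_rng(seed=7)

# Toy "global wave-function": a set of continuously evolving strands (complex amplitudes).
n_strands = 12
phases = rng.uniform(0, 2 * np.pi, size=n_strands)
freqs = rng.uniform(0.5, 2.0, size=n_strands)

def global_state(t):
    """Evolving strands at time t (a hypothetical stand-in for the uncollapsed wave-function)."""
    return np.exp(1j * (freqs * t + phases))

def snapshot(t, window):
    """A local 'manifold frame' samples only the strands inside its window, at a frozen instant.

    The returned values are derived, frozen approximations (the 'multiplicity of initial data');
    the global state itself is untouched: read-only, no collapse, no back-action.
    """
    frozen = global_state(t)[window]
    return np.round(frozen, decimals=3)   # coarse, 'adjectival' approximation of each strand

# A local frame repeatedly samples a subset of strands as the whole continues to evolve.
window = slice(0, 5)
for i, t in enumerate(np.linspace(0.0, 10.0, 6)):
    print(f"snapshot {i}: {snapshot(t, window)}")

The point of the toy is only structural: sampling is local and derivative, and no amount of snapshotting alters what is sampled.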

The snapshot mechanism of manifold explains how certain collections of ‘eternal objects’ suddenly take the approximation of a large collection of open strings on a D-Brane. The snapshot imposes instantaneous limitations of strands, and causes each to give up some adjective, or factor of a fact, that together form into one multiplicity of values of initial data, per snapshot. Thus, these values are like samples and approximations of the eternal objects and an adjectival array of values. This sampling combines into one synthetic phonon value, as closed string, and ‘sets the tone’ as the coupling for prehension of data.

This reinforces the old wisdom: no being but in a mode of being; thus, no direct access to the global wave-function but through the (derived) local frames of a manifold: the best we do is take snapshots of the wave-function. The snapshot is holographic: the manifold generates holographic snapshots; as such the snapshots are recognized as derived values.

76 Snapshots could be based on some minimal feature of quantum gravity qua OOR in Penrose and Hameroff. Measurements, or samples, produce snapshots.

The wave-function does not actually collapse; instead, local systems take pictures of it (in local frames): thus, local sampling of the global wave-function.

The starting points of the AE and geomodal models are shown to coincide with the description of “kinematic elements” (qua eternal objects) in the mode of “continuous potentials” (Whitehead), and in the geomodal narrative as “dynamical numeric-strands,” or strands of dynamical digits adumbrating like adjectives around a central core, or code, as they stream through background vacuum. These are likened also to Chew’s “strands” and, in Verlinde, to the microscopic data predicating the information distributions.

The aim of this chapter was to correlate: (1) the eternal objects of Whitehead’s theory with pre-space-time, pre-material, pre-quantum strands that through a snapshot device render emergent approximations of frozen strands into the semblance of exact strings, or open strings on a D-Brane—or (2) described in Whitehead’s scenario, as a multiplicity of initial data as a result of the initial ingression of EO’s via the (appended) snapshot mechanism of the geomodal model.

In doing so we demonstrate how Whitehead’s logic can be inserted into the microscopic degrees of freedom that Verlinde’s theory ‘ignores,’ and shown to smoothly coincide with the logic of string theory, UV/IR mixing, and the open/closed string correspondence. This leads to the emergence of closed strings qua phonons, in the physical theory, and to event-particles, or an ‘original datum,’ in Whitehead’s framework. The emergence of these initial values is shown to stem from the same generative process in the logic employed by both Verlinde and Whitehead’s frameworks in a geomodal narrative.

This progression also clears up the quandary of the two solutions to the initial dynamics, of which Whitehead ultimately opted for the multiplicities, as we see in PR. For a summary of this thought process and its changes, see (Ford, 1984). The two solutions Whitehead envisioned were the ‘primary datum’ and the ‘multiplicity of initial data’. He reasoned that to uphold the ingression into an event entity would necessarily force the creation of an adjoint process. To avoid this he just kept to the initial multiplicity of data (see Ford, 1984).

The major ideas elucidated in this chapter are: 1) the open/closed string correspondence; 2) the snapshot mechanism as “reason for” or “explanation for” the measurement problem as a no-collapse theory; 3) closed strings are emergent, and Whitehead’s correction of the categories of existence also makes the same point; and 4) the revival of Whitehead’s original-datum and its clarification in the context of process by way of the closed string, which, for the sake of space, is more fully developed in the appendix.

A final point can be developed in the context of the transition from potential to actual represented by the measurement problem in physics, explicitly modeled by a “snapshot mechanism” in H20, and predicated in mathematical logic by the “instantaneous moments of time” described in Whitehead’s earlier writings (PRel, CN, PNK). This is understood as the mechanism that creates the initial “samples” of EO’s as a multiplicity of data: frozen strands, or open strings on a D-Brane. These samples, or snapshots of EO’s in the form of frozen strands/open strings on a D-Brane, are next shown to ingress into a closed-string value understood much like a collective boson, or phonon. By virtue of participating in a holographic universe, the initial snapshot projects onto the horizon in corresponding dual form (in one less dimension) as a chiral bag of QCD/quark matter qua “free quarks” observing asymptotic freedom. 77 In string theory this is simplified to a matrix box of off-diagonal open strings and eigenvalues on the diagonal axis.

While tracing an initial trajectory from the underlying, continuous potentialities of eternal objects to the closed string counterparts of the open/closed string correspondence, we encounter two important findings. The first of these involves what can be called the “data/datum dilemma” between accounts of the beginning of concrescence process from Whitehead’s Gifford lectures to the final PR drafts. Through the applied properties of the holographic principle (a predicate and starting point assumption of the EG model) we are able to resolve the ‘data-datum dilemma’ and model how both starting points can be retained for the most part in descriptions and in roles.

Retroactively, we consider that the worry that the holographic principle and open/closed string correspondence was something final and complete in itself is what kept Whitehead from including his first instinct of the “event entity” (or “primary/original datum”) in the overall description, opting instead to dissolve the event entity into the multiplicity of initial data from out of which it emerged. In the geomodal model we are able to revive the event-entities and redefine them as the phonons, or closed strings, of the snapshots of open strings, à la the open/closed string correspondence. The geomodal framework describes how the snapshot coalesces into a phonon; the phonon then interacts with the holographic dual of the snapshot, projecting onto the boundary screen as a bag or box per AdS/QCD. This represents the ‘creative phase’ where space and time are both emergent and fashioned during prehension and concrescence → satisfaction: the foliation and coarse-graining phases of renormalization in Verlinde → creativity and prehension in Whitehead.

Chapter Six

In chapter five we are left with the first snapshot forming into a collective boson of the initial data: a phonon ready for prehension. In chapter six the holographic dual of the snapshot forms into a superconductive hot drop of plasma composed of quark matter in asymptotic freedom, in a bag residing on the underbelly of the horizon, facing (and growing) into the side. This accounts for the capacity for, and the environment of, creativity and the emergence of space prior to the gravitational effect qua acceleration.

77 This provides an insight into YMT insofar as it addresses an exception to the condition of quark confinement during renormalization qua the chiral bag of free quarks. In the bag model, free quarks exhibit asymptotic freedom and are only excited when selected by the phonon during the interval of prehension.


It is this second set of the initial data that we attribute to Whitehead's objective data of the initial data, which is now subsequently eligible for, and waiting in the wings of, prehension by the product of the first snapshot: the phonon qua closed string, as coupling constant for the prehension phase of free quarks in a chiral bag. Given the environmental property of a chiral bag with asymptotic freedom, whichever values the phonon positively prehends create a bulging effect on the screen as they are pulled inwards towards the phonon.

Prehension and Concrescence = Coarse-Graining and Foliation

Positive prehension dynamics in Whitehead speak to the evolution of values in the foliation screens of the phases of concrescence, in genetic time. Genetic time, in Whitehead's usage, is the internal ordering of the phases within a single concrescence (his 'genetic division'), as distinct from the coordinate time of the extensive continuum; it is precisely this phase-by-phase ordering that suits the foliation-screen scenario in Verlinde, where each layer of foliation marks a stage of coarse-graining rather than an instant of physical time. It speaks to how the values in the bag, considered not statistically but kinetically, behave in principle.

Included values in coarse-graining are progressively foliated, while excluded values are kept track of by Newton's potential, φ. In the AE's we see the same logic at work in the prehension and concrescence phases, where positively prehended values give rise to enumerative phases of concrescence while negatively prehended values are excluded. During the process of concrescence "an actual entity positively prehends only some of the already created entities while eliminating others from positive relevance" (Lango, 1972, p. 18).

Given asymptotic freedom, moving any one of the free quarks creates a disturbance that would quickly grow were the quarks not minuscule relative to the radius of the chiral bag (and merely samplings by the coupling). These prehensive disturbances generate bulging on the screen and can in theory be regarded as creating spectral-line disturbances in a Reggeonic sector, leading to a phase-completed pomeron value as a collective total value (not yet taking into consideration how each of these interacts and synthesizes with the others to form a collective representation of maximal order: the satisfaction).

The positive component of the chiral bag elements constitutes the creatively evolving AE through the phases of concrescence into a satisfaction. The negatively prehended components accumulate into a gravitational force (acceleration) as an emergent effect, proportional to the fraction of the "actual world" negatively prehended. In addition to showing how the maximum of coarse-graining precipitates the final phase of concrescence qua satisfaction, this satisfaction, as a "final real value," is also likened to the dynamics of polymer thermalization onto the screen, as described in Verlinde.

The original elements of the atonal value are selected according to an internal harmonic consistency such that out of an atonal value there emerges, through foliation and coarse-graining qua renormalization, a harmonic satisfaction. Out of chaos comes order: out of atonality are prehended packets of harmony. Creativity is in the mode of feeling; negatively prehended elements give rise to the emergent gravitational effect, whereas those positively prehended progressively synthesize into final, real values, as we see next.


Satisfaction = Max of Coarse-Graining; Thermalization of Polymer on Horizon

In this last phase of both processes the two theories are shown to interlace, weaving into one coherent sequence and forming a more elaborate description leading up to (and including) this final phase of the AE's and EG.

The maximization of coarse-graining in Verlinde's model is conceptually inlaid over the final phase of concrescence to represent the precipitator of Whitehead's satisfaction. Positively prehended values → feelings → further phases of concrescence. Negatively prehended values are integrated out, or excluded, but serve to propel the bag closer towards the horizon qua adiabatic reaction force. Once all 'objective data' have been prehended by the closed string/phonon (as coupling), concrescence comes to a final phase and the bag thermalizes onto the horizon, becoming a final, real value added to it.

Once the phase of objective data has been maximally coarse-grained (prehended by the coupling), the cycle reaches a final phase of concrescence, or layer of foliation, out of which arises the satisfaction as a final, real value. It is the final, real act, the end-product of concrescence, and a real value added to a horizon (Verlinde, 2010). As Whitehead describes, an actual entity "is the process of 'feeling' the many data, so as to absorb them into the unity of one individual satisfaction" (PR 117). The satisfaction can also be visualized in Verlinde's example of a polymer (or mass particle) approaching the thermal screen of a black hole horizon.
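Verlinde's polymer analogy can be given its standard statistical-mechanical form; the expression below is the textbook entropic force of an ideal chain (a generic illustration, not a formula unique to Verlinde's paper), included only to fix intuition for what 'thermalizing onto the screen' involves:

\[ F \;=\; T\,\frac{\partial S}{\partial x}, \qquad S(x) \;\approx\; S_{0} - \frac{3\,k_B\,x^{2}}{2\,n\,b^{2}} \;\;\Longrightarrow\;\; F \;\approx\; -\,\frac{3\,k_B T}{n\,b^{2}}\,x, \]

where x is the end-to-end extension of a chain of n links of length b held in a heat bath at temperature T. No microscopic force acts along the chain; the restoring pull arises purely from the statistical tendency toward higher-entropy configurations, which is the sense in which the 'final, real value' reaching the screen is an entropic, rather than mechanical, affair.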

In addition to representing the natural precursor leading to Whitehead's satisfaction, the maximum of coarse-graining is also understood in Verlinde's model to kick off an emergent gravitational reaction as an acceleration force. This happens on account of the fact that nature is shown in Verlinde's model to keep track of the negatively prehended, or coarse-grained, values, which are summed at the end of each phase. Taken all together, negative prehension leads to a gravitational effect at the end of the phase. As Verlinde explains (2010):

Other authors have proposed that gravity has an entropic or thermodynamic origin […] but we have added an important element that is new. Instead of only focusing on the equations that govern the gravitational field, we uncovered what is the origin of force and inertia in a context in which space is emerging. We identified a cause, a mechanism, for gravity. It is driven by differences in entropy […] and a consequence of the statistical averaged random dynamics at the microscopic level. The reason why gravity has to keep track of energies as well as entropy differences is now clear. It has to, because this is what causes motion!
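The passage just quoted can be compressed into the short chain of relations Verlinde actually uses in the 2010 paper; the sketch below paraphrases his derivation of Newton's law and is included as a reminder of the physics the geomodal reading tracks:

\[ \Delta S \;=\; 2\pi k_B\,\frac{m c}{\hbar}\,\Delta x, \qquad F\,\Delta x \;=\; T\,\Delta S, \]
\[ E \;=\; \tfrac{1}{2}\,N k_B T, \qquad N \;=\; \frac{A c^{3}}{G\hbar}, \qquad E \;=\; M c^{2}, \qquad A \;=\; 4\pi R^{2} \;\;\Longrightarrow\;\; F \;=\; \frac{G M m}{R^{2}}. \]

The first line gives the entropy change when a particle of mass m is displaced by Δx relative to the screen, together with the definition of an entropic force; the second applies equipartition over the screen's N bits and the holographic bit-count to recover Newton's inverse-square law. Entropy differences, in other words, do the work that a fundamental gravitational force was previously assumed to do, which is the claim the geomodal narrative maps onto negative prehension.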

In the geomodal method we propose the gravitational acceleration effect as the thick line running to the causal nexus (horizon).

Putting the two together, we could suppose that the maximization of coarse-graining (prehension) leads to the final phase of concrescence (renormalization), wherein the gravitational self-energy acquires a value that is then propelled by the phase-accumulation discharge (of all negative prehensions) to the horizon, where it acquires a real status thanks to an adiabatic reaction force qua emergent gravitational effect. In this scenario gravity works in concert with the gravitational self-energy in order to launch it to the horizon, where it becomes a formally real value added to it (see Verlinde, 2010, 2011). This leads to the possible synthesis of both models into one account.

Here we consider an order of operations where the positive, or gravitational, self-energy is summed first, and this value represents the sum used against (φ) to form the final, adiabatic reaction force, whose emergent acceleration is shown to transport, or act as a transport for, the gravitational self-energy, carried to the horizon where it becomes a real value added (Verlinde, 2010) and a final, real value for future inclusion in new (snapshots and cycles of) AE's. Within the geomodal model, the reason for the gravitational effect as an acceleration is to transport the GSE (in whatever capacity 78 ) to the horizon, where it becomes a real value added back into the strands: a new (coefficient) strand. We can thus reinterpret Whitehead's famous precept, "the many become one and are increased by one" (PR 47), into its gravitational signature to denote that 'the many' are coarse-grained and foliated into a maximal value/satisfaction (the one) that then thermalizes onto the holographic screen, where it takes the form of a real value (the plus-one). 79

And that's it. That's as far as we need to go to describe one generative cycle of the core process of both Verlinde and Whitehead's models. While this doesn't exhaust the full extent of the geomodal process, it covers the ground of both Whitehead and Verlinde's theories and thus serves as the effective culmination of the process. In Verlinde's case we describe in a string-theoretic sense how D-branes of open strings originate as the first values of the generative cycle of an AE.

This describes where the information distributions (Verlinde, 2010) come from: from durations, or events qua snapshots. It also describes where the matrix group comes from: from the holographic reenactment of initial data (open strings on a Dn-brane) into objective data. There are several options for modeling this in mathematical physics: as a large-N conformal group in a CFT, as free quarks in a chiral bag (D3-brane), or as closed strings.

Having a narrative basis leading up to the matrix group in renormalization leads to coarse-graining (prehension), upon which two scenarios reveal themselves: one for positive prehensions and another for negative prehensions. Following the first leads to a description of feelings concrescing until the maximization of prehension yields the satisfaction, in Whitehead, or the gravitational self-energy, in Verlinde. Alternatively, following the second leads to a description of values kept track of by Newton's potential, φ, whose sum at the maximization of coarse-graining leads to an adiabatic reaction force qua emergent gravity.

78 If the satisfaction (GSE) is a "collective" Higgs qua pNGB, then the "Ziggs" (see Susskind) of the Higgs in decay mode is what gets transported to the horizon by the gravitational acceleration (in EWSB conditions). 79 Whitehead's quote could also be taken as: "the many" = initial data; the "one" = phonon; "increased by one" = the fine-tuning by the phonon of objective data into a final satisfaction (the plus-one).
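Because the negative-prehension branch turns on the bookkeeping role of Newton's potential, it is worth recording the generalization Verlinde uses for arbitrary (non-spherical) screens; this is again a paraphrase of relations in Verlinde (2010), not a new derivation:

\[ a \;=\; |\nabla\Phi|, \qquad k_B T \;=\; \frac{\hbar}{2\pi c}\,|\nabla\Phi|, \qquad M c^{2} \;=\; \tfrac{1}{2}\,k_B \int_{\mathcal{S}} T\,dN \;\;\Longrightarrow\;\; \nabla^{2}\Phi \;=\; 4\pi G \rho, \]

where the holographic screens are identified with equipotential surfaces of Φ, the local (Unruh-type) temperature on the screen is set by the gradient of Φ, and requiring the equipartitioned energy on the screen to equal the enclosed mass-energy reproduces the Poisson equation. On this reading Φ is precisely the device that keeps track of how much information has been coarse-grained, which is the role the present narrative assigns to the summed negative prehensions.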

After characterizing the parallel development of both logical theories into one narrative, including a set of dictionaries, we spent a chapter discussing the philosophical differences between Einstein's and Whitehead's relativity theories, with the aim of showing the proclivity for Whitehead's theory to be re-developed into Verlinde's model. To end, we propose narratives for string theory and the AE's.

8.3 — DENOUEMENT

“The human capacity itself is an extension of nature and the cosmos.” (Yang, 2011)

We began with processes and events. On this view, space-time and gravity emerge from this fundamental reality. This particular derivation has formed the primary focus of our entire argument. This entire project made three central assumptions from the beginning. In a sense, much of what we did was to spell out the implications of these assumptions.

First, we assumed that there was a deeper connection to be found in the comparison between Whitehead's AE's and Verlinde's EG. In order to demonstrate this we called upon a tertiary, geomodal logic (detailing a foundational process) to draw out the alignment between both theories. Second, we assumed that closed strings, the AE's, gravity, and space-time are all emergent, and we showed what happens when one treats them in this way. Third, we assumed a cyclic process taking place at the seat of nature and experience, out of which space-time, gravity, and strings are all derived. With these assumptions we were able to generate a narrative-cycle for string theory and the AE's.

Over the course of this study we've set out to paint a landscape around the central push for a process (over substance) paradigm in science (see Eastman, 2008, 2009). Given the findings just illustrated, the fact that string theory, gravity, space-time, and the AE's can all be developed in emergent models beckons consideration of a process ontology.

The geomodal convention, as an exegetical technique, demonstrates a process in experience underwriting both Verlinde's EG and Whitehead's AE's in such a way that we can learn about each from the other, supplying a cornerstone logic for an experiential ontology, plus a platform for the physical world and our relationship within and to it. As Nobo explains, experience is a foundational predicate describable in nature (2004). To these ends we seek to describe it.

When we look back from the ending to the opening quote of this study, we cannot help but consider Verlinde and his predecessors as the ones whom Emerson predicted (1838) would come "to see gravity with a purity of heart." 80 Coming to terms with gravity in an emergent paradigm, the purpose of this study has been to take Emerson's quote the next step and "show how it is one thing with science, with beauty, and with Joy." In this study we began with the beauty of 'correlation,' and after developing the two programmes, showed how Verlinde's and Whitehead's models align conjointly in a geomodal narrative. Specifically, we recognize the generative process of AE's under the same description as the elements of string theory, incorporated into a narrative order.

80 Putting on my psychologist's hat for a moment: the connection is affectively amplified when we realize that this insight came to Verlinde only after he had been robbed while on vacation, his laptop stolen, at the end of the vacation, causing him to extend it for a few more days, during which time he had the insight. This adds weight to the claim that Verlinde is the one whom Emerson predicted would penetrate the observation of gravity.

Three main points follow: the first two involve the ability to locate string theory (qua EG) and the AE's within epical narratives describing one cycle of a process held to reside at the seat of experience and nature. Third, we consider whether both models are actually branches of one and the same process. From this we sequence string theory and the AE's into epical narratives describing the cycle of a process. In this we seek to provide both models with the narrative order they require in order to provide an ontological basis for their foundations. This cycle can be sequenced into six diagrammatic steps, described here:

The first step refers to the origination of the initial conditions in a snapshot from a sea of strands in vacuum. Next, the snapshot projects a holographic dual while the initial contents ingress into a phonon that bursts off the brane. The phonon then serves as the coupling in the dual representation group. Whitehead refers to this as a selection process qua 'prehension.' Positively selected values are added to the phases of concrescence, where they foliate into emergent space until forming a final value, once all values have been prehended. This generates a reaction force propelling the value to a local horizon. We retrace the steps of this cycle within both models now, beginning with string theory.

8.3.1 — String Theory Epical Narrative

String theory is understood as having many useful components; however, to borrow Emerson's phrase, there is no 'epical integrity' to it, and it is not seen in any original order or arrangement within a logical structure. As Verlinde states, "we need a new paradigm within string theory in which to think about gravity and cosmology" (2011u). We don't need all the details of string theory, he explains, only the general principles that apply to the real world. By going through the general properties of string theory we acquire a landscape of process and generative events, as interpreted through the geomodal model.

The construction of an event-cycle (logic) offers an epical integrity and logical ordering of the phenomena underwriting string theory into a cohesive, coherent, and applicable dynamic. Applying string theory within a geomodal convention, we are able to locate it within a foundational process fulfilling Whitehead's speculative philosophy. As Leclerc explains, "Whitehead's fundamental divergence from the philosophical tradition lies in his thoroughgoing acceptance of process as a basic metaphysical feature of actuality" (Leclerc, 1958). Rather than as a fundamental theory of the strong force, or of gravity, we locate string theory at the most fundamental of all dynamics: those regarding a process underwriting experiential and material systems, the summum bonum or primary framework sought in both Whitehead's speculative philosophy and Aristotle's first philosophy. This discloses an intermediary process. Whitehead refers to the associated contents as AE's; Verlinde realizes gravity as an adiabatic reaction force. We situate the basic order of the narrative here:

A. Strands qua Chew; pre-XT qua 1/N YMT, large-N QCD, quantum fluctuations in vacuum, dark energy; like a fluid vacuum qua tachyonic bosons in 25d – [See 1]
B. Measurement problem: how to go from bosonic string theory to fermionic string theory (adapted supersymmetry) via a snapshot mechanism in the geomodal metric – [See 2]
C. Snapshots yield the origin of Dn-branes of open strings as precursor elements to XT, just like Whitehead's second metric of durations and events as the abstracted underliers of XT
D. First snapshot → phonon (closed string), which represents the coupling for renormalization – [See 3]
E. AdS/CFT, the open/closed string correspondence, the gauge/gravity correspondence, and AdS/QCD = mathematico-physical realizations of the holographic principle and the property of snapshots such that a dual representation is projected onto the horizon – [See 4]
   a. The geomodal model renders this as a chiral bag of large-N quark matter. Here, quarks appear in a special, deconfined state in the bag, as free quarks. This allows us to distinguish the chiral-bag mode from the conventional dynamics of quark confinement in QCD (and YMT).
F. Holographic renormalization involves the procedures of coarse-graining and foliation attended to in Verlinde's model. Foliation builds into an emergent dimension of space and ends at the maximum of coarse-graining with two results:
   a. Gravitational Self-Energy (GSE) – [See 5]
   b. φ qua Newton's potential, leading to an adiabatic, gravitational reaction – [See 6]

The two-dimensional objects of M-theory = snapshots. M-theory itself, as an object, takes place in 5d. This satisfies the two objects associated with M-theory, translated into components within an underlying process.


String theory is a framework, like QG, that also attempts to reconcile gravity with quantum mechanics. Translated into a geomodal setting, the fundamental instability of bosonic strings is a sign of the dynamical tail of each strand as it streams through the vacuum. Space-time emerges from snapshots of bosonic strings in a causal manifold, converting them into fermionic, open strings in type I string theory. In addition, the 'background dependence' of string theory is often seen as a difficulty to be overcome, but here we recognize it as indicating the snapshot basis of initial conditions, per each cycle of a process. This also indicates the perturbative basis of string theory in snapshots as a divergent series of approximations; here, we take the divergent series to represent the multiple, frozen strands resembling exact strings, as approximations (or moments of a strand). This also relates to Whitehead's 'multiplicity of initial data.'

Fermionic string theories appear in five different types, plus one bosonic string theory. These five also demonstrate symmetries between them, as shown in a graph adapted from Becker, Becker, and Schwarz (2007). Implementing a geomodal narrative, we're able to translate and consider this graph as a blueprint for the symmetries between strings, indicating the blueprint of a process. We use the geomodal model to align them.
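Since the graph itself is not reproduced here, it may help to recall the standard duality web as it is usually summarized in that textbook; the following is a recollection of the conventional relations, supplied for orientation, while the mapping onto geomodal categories in the next paragraph remains the author's own:

– Type IIA ↔ Type IIB: related by T-duality (compactification on circles of radius R and α′/R).
– Heterotic SO(32) ↔ Heterotic E8×E8: related by T-duality.
– Type I ↔ Heterotic SO(32): related by S-duality (weak ↔ strong coupling).
– Type IIB: S-dual to itself.
– Type IIA and Heterotic E8×E8: develop an eleventh dimension at strong coupling, connecting both to M-theory.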

Here, the T-duality between type I and SO(32) indicates the large distance scale of type I, while SO(32) indicates the small-distance scale of a chiral bag radius. In the geomodal model this describes the snapshot and its dual, for prehension. Secondly, the S-duality between SO(32) and E8×E8 indicates the weak coupling in the Higgs branch (large) qua matrix group of open strings, and the strong coupling in the Coulomb branch (small) qua positive prehensions. In the S-duality between the type IIA/B string theories, type IIA represents the set of objective data being actively prehended by the coupling of type IIB string theory, which is dual to itself, and also indicates the weakly coupled phonon becoming strongly coupled in prehension. The link of both branches to M-theory indicates the overall setting within the 1/0 dynamics predicating the geomodal model.

First materials obtain, in approximation, from the sampling of pre-material/pre-event strands. The general principle depicting the emergence of string theory is predicated on a snapshot with contents, causing them to appear as approximate open strings on a D-brane. As such, open strings emerge from the virtual approximations of frozen moments of dynamic, superpositional strands. If there aren't snapshot samples of strands, then there aren't any open strings in snapshots. The first snapshot of strands gives rise to open strings on a D-brane, immediately concrescing into one long-wavelength value: a phonon with gravitas. Secondarily, closed strings emerge from open strings (the snapshot of strands).

Strands (in pre-event space) are frozen in snapshots and take the approximate forms of open strings on a D-brane. This makes open strings not fundamental but dependently originating values that arise from the virtual approximation of a moment of a dynamical superposition of an evolving strand. Thus, out of a sampling of eternal objects (strands), there occur feeling tones giving rise to actual occasions. This sampled snapshot of strands gives rise to virtual approximations of open strings on a Dn-brane that collectively coagulate into a Nambu-Goldstone boson, like a phonon, little Higgs boson, or closed string.

We arrive at this by considering the ontological role and identity of the first value that 'dependently originates' from strands into an initial multiplicity of data. Developing it as a quasiparticle (phonon), in the context of string theory, grants two special properties: 1) it allows the phonon to be recognized as an emergent, not fundamental, value; and 2) it identifies the phonon as a coupling constant for the phase of prehension (renormalization). From this we realize that the reason the emergent phonon doesn't halt the process (in Whitehead's DDD 81 ) is because a) it's only a quasiparticle, and b) it serves as the coupling for the holographic correspondence group of the initial data. This provides an elegant explanation: the product of the initial data doesn't halt the process because it represents the fine-tuning factor, or coupling, for the holographic correspondence values projected onto the horizon as objective data prêt-à-prehension.

The holographic correspondence can also be given a mathematico-physical description on the order of the open/closed string correspondence. Here, the open string one-loop amplitudes are converted into closed string exchanges in the holographic replica group on the horizon (in one less dimension and a distinct representational form). The closed strings appear like “free quarks” in a bag-model on the horizon. This move occurs on the basis of the AdS/QCD correspondence and also exhibits the property of asymptotic freedom, where the free quarks act casually unless disturbed (by coupling) in the bag.
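The statement that open-string one-loop amplitudes 'convert into' closed-string exchanges is the worldsheet channel duality of the annulus diagram; schematically, and suppressing conventions for the modular parameter, it can be written as:

\[ \mathcal{A}_{\text{annulus}} \;=\; \int_{0}^{\infty}\frac{dt}{2t}\,\mathrm{Tr}_{\text{open}}\,e^{-2\pi t H_{o}} \;=\; \int_{0}^{\infty} d\ell\,\langle B|\,e^{-\ell H_{c}}\,|B\rangle, \qquad \ell \propto 1/t, \]

where the same cylinder worldsheet is read either as an open string running in a loop between two D-branes or as a closed string propagating at tree level between boundary states |B⟩. This is the precise sense in which closed-string (and hence gravitational) data are already implicit in the open-string description, which is what the passage above leans on.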

If we were to create an epical ordering out of certain properties in string theory, we would sequence 1) the Dn-brane of open strings first, then 2) the open/closed string correspondence, and then 3) the AdS/QCD correspondence. What these three properties and elements in string theory demonstrate is the spontaneous generation of an instantaneous spread of actual entities qua open strings on a Dn-brane, or what the geomodal model refers to as a "snapshot" of frozen strand elements in the virtual appearance of approximate strings. Via the open/closed string correspondence these open strings are converted into closed strings, specifically in a bag model on the horizon qua D4-brane, where they appear as free quarks at the QCD scale of the bag with asymptotic freedom. What allows the closed strings in the 5d AdS bulk to appear like free quarks in the 4d bag on the QCD horizon is the AdS/QCD correspondence.

There are essentially two values in this process: the phonon and the satisfaction. The critical difference between them is that the phonon is a quasiparticle whereas the satisfaction is Higgs-like, in the context of a pseudo-Nambu-Goldstone boson. In this sense the Higgs is the physical product of a quasi-particle (or virtual, emergent) process.

81 See Appendix


Making this one adjustment to the Standard Model, we encounter a promising case for redefining the Higgs boson, along with phonons, the original datum, and the satisfaction, as an emergent, collective value arising as one end-product of a dual-product process (whose other value represents gravity as an adiabatic reaction force), both of which are shown to arise naturally in the geomodal model.

Dictionary

String Theories | Geomodal qua AE's
Bosonic Strings | Dynamical Strands in 5d vacuum; 'Eternal Objects'
Instability of Tachyon | Dynamical Tail of Strand
Type I – open and closed strings | Snapshot of EO's → Multiplicity (x) Reenactment
Type IIa – closed strings, non-chiral | Non-chiral = phonon, 'original datum'
Type IIb – closed strings, chiral | Reenacted AdS/QCD group of free-quarks
HO32 – heterotic has left/right moving strings | Prehension: (L) = negative; (R) = positive (Higgs)
HE(8x8) | Genetic Phases of Concrescence (Coulomb)
T-Duality – large and small distance-scale mixing | Snapshot = large-scale; Chiral Bag = small-scale
S-Duality – weak and strong coupling dualities | Bursting Phonon = weak; Phonon Coupling = strong
Brane-World Scenario | 1/0 configuration dynamics of soular system
Background Dependence, Perturbative | Snapshots of Manifold = pre-selected backgrounds
Divergent Series of Approximatives | Multiplicity of initial data; frozen Strands → strings
Non-perturbative | Phonons bursting off charged D0-brane
Supersymmetry between fermions/bosons | Bosonic Strands → Fermionic Strings (per Snapshots)
Open Strings | Frozen Strands Captured in a Snapshot Sample
Dn-Brane | Snapshot's Instant Imprint onto Holographic Plate
Closed Strings | Phonons qua Emergent Gravitons, Closed Strings
Open/Closed String Correspondence | Snapshot strings into Chiral Bag (free) quarks
AdS/QCD Correspondence | Snapshot of Strands = AdS; Chiral bag = QCD quarks
Regge Trajectory & Reggeon | Satisfaction deposits one Strand back into vacuum

8.3.2 — AE’s Epical Narrative

Having a prior acquaintance with Whitehead's process philosophy, principle of relativity, and experiential metaphysics predisposes one to the natural recognition of this underlying process and event ontology. The fact that Whitehead defines the AE's as the least of all values, "behind which there is nothing, nothing, bare nothingness" (PR 192), represents his poetic way of situating the AE's at the level of the Planck scale, quantum gravity, and string theory, even before his time. When we take them to this level we encounter precisely the pieces we require to describe this primitive, underwriting process.

A. Sea of continuous potentiality; strands qua Chew – [See 1]
B. Continuous Potentiality → Atomic Actuality via Durations/Events – [See 2]
   a. Each event yields a multiplicity of data as initial conditions for the phases of concrescence
C. The first snapshot's contents ingress into an 'original datum' – [See 3]
D. Via the holographic principle (AdS/CFT), the projection dual of the snapshot on the horizon = the reenactment of initial data into objective data, prêt-à-prehension by the 'original datum' as coupling – [See 4]
   a. This revives the concept of the 'original datum' and resolves the DDD, with the datum as coupling for prehension
   b. The original datum as the subject that prehends the objective data


E. Prehension by the original datum of the objective data yields either a positive or a negative consequent for each relation during prehension. Each value is individually related to the coupling until all have been prehended.
F. Positive values go into the phases of concrescence to synthesize and transform with other values and create a "collective value" at the end of the phase, maximized in harmony and intensity (Whitehead) – [See 5]
   a. Alternatively, the negative prehensions are eliminated from the phases of concrescence but kept track of (Verlinde, 2010) and summed at the end qua φ to give rise to the emergent gravitational force that, in the geomodal narrative, propels the GSE value to the horizon to become a real value: a satisfaction, a collective value as a final real value added to nature – [See 6]

To begin this cycle, an initial multiplicity of data is formed from a snapshot of elements in the derived frame (hypersurface) of a unique manifold in vacuum. In Whitehead's language, discrete states originate from instantaneous spreads of actual entities in an actual world. In the language of string theory, the geomodal model shows how discrete states originate from Dn-branes of open strings. Multiplicities of initial data immediately ingress to form a collective-valued quasiparticle (phonon) bursting off a stretched horizon, like the graviton of a D0-brane, as an emergent phonon with a resounding gravitas. Verlinde links the phonon to a graviton as a closed string and explains that in all three cases we must regard this value as an emergent quantity from the collection of open strings, and therefore not as a fundamental value (see Verlinde, 2010).

Taking place in a holographic scenario, each snapshot generates a dual copy projecting onto a local horizon. Following the AdS/CFT and open/closed string scenarios, the open strings on a Dn-brane represent snapshots, while the corresponding elements are represented on the horizon like little drops in the form of free quarks in a chiral-bag model. 82 We apply the holographic principle to snapshots and resolve Whitehead's data/datum dilemma such that the original datum and the multiplicity of data both appear in the process, as Whitehead originally conceived, without halting it. The simple question boils down to: how can an original datum occur and yet the process still continue on in development? The nuance and seeming redundancy is explicated by Whitehead's description of the phases of prehension and concrescence. In order for initial data to become ready for prehension, Whitehead says they first require a conversion into "objective data," to be brought into the proper mode. Thus, each snapshot generates a chirally asymmetric, holographic dual, evolving differently in values from the snapshot.

Through the identification of the phonon as coupling, the interaction with the coupling by each datum is converted into either a positive 'feeling' given a positive expectation value → satisfaction, or else integrated out of the phases of concrescence → φ of EG. This describes a simple +/- selection operation in both scenarios (EG and AE's). The geomodal model and Whitehead clarify the 1's and 0's of information theory into what they represent; namely, positive or negative prehensions. Positive values further synthesize within bounds of symmetry to form a collective-valued 'satisfaction.' In Verlinde's model, when a value reaches the screen it becomes thermalized into a real expression.

82 The model consists of putting some version of a quark model in a perturbative vacuum inside a volume of space called a bag. Outside this bag is the real QCD vacuum, whose effect is taken into account through boundary conditions on the quark wave functions. The geomodal model compares the bag model to the dense group of hadrons that emerges from the quark-matter group of the snapshot. "The chiral bag model couples the axial vector current of the quarks at the bag boundary to a pionic field outside of the bag. In the most common formulation, the chiral bag model basically replaces the interior of the skyrmion with the bag of quarks. Because the quarks are treated as free quarks inside the bag, the radius-independence in a sense validates the idea of asymptotic freedom." (Chodos, 1974)

[Diagrams 1, 2, and 3: the snapshot, the phonon, and the chiral bag]

These three values (geomodally rendered as a snapshot, phonon, and bag) represent the conditions required for initiating the second phase of the process: renormalization qua coarse-graining in Verlinde, or in Whitehead as ‘prehension’ and the phases of concrescence. The first snapshot forms into a collective boson of the initial data: a phonon. We refer to renormalization (prehension and concrescence) as the second phase of the process qua holographic dual of snapshot as a chiral bag of quark matter. This refers to Whitehead’s objective data of the initial data, now subsequently eligible for prehension by the product of the first snapshot: the phonon as coupling constant.

Both cases describe a process of selectively removing some values while keeping and transforming others. In Whitehead, the values removed are called 'negative prehensions,' while the values kept for concrescence are 'positive prehensions.' In Verlinde's model, some of the open strings in the off-diagonal mode of the matrix acquire expectation values whilst others are eliminated from the matrix (Verlinde, 2011). Each element of the data set is considered individually against the coupling to determine its positive or negative relational status. Positive prehensions describe the gravitational self-energy, or eigenstate evolution, over the phases of concrescence into the 'satisfaction' of positive-valued feelings synthesized into a collective value. This leads to a bifurcation of the narrative effectively overlooked in both Whitehead's and Verlinde's models, each focusing on only one of the two roads. Verlinde's and Whitehead's models thus elicit a back-scratching reciprocity to the degree that each describes a critical feature under-developed in the other.

Dictionary

Actual Entities | Event-Ontology Cycle
Eternal Objects | Microscopic Strands in Flux
Objective Species of Eternal Objects | The 'core' of an inertially-streaming strand
Subjective Species of Eternal Objects | The 'dynamical tail' of a strand
Multiplicity of Initial Data | The initial set of snapshot elements: a sample
Adjectives; Factors of a Fact | Each string element as an approximation of a strand
Objective Data | Holographic dual of snapshot elements in bag
Subjective Forms | Elements of Prehension in relation to Phonon
Subjective Aim | Harmonic Aim/Potential of each Element/Set
Mutual Sensitivity | Harmonic relations between prehended feelings
Universe (Actual World) of AE | The elements captured in the manifold, per snapshot


8.3.3 — Two Aspects of One Process?

The final result we wish to promote in this study is born of a recognition of Verlinde's emergent gravity in the context of Whitehead's paradigm of an experiential, process-ontology comprising AE's. When we examined both theories side by side, we discovered a resolute concomitance between the specific logic and descriptions used by each author. These have been spelled out in the dictionaries of chapters five and six. This leads us to seriously consider the possibility that these two models might be even more special than we at first realized. Given the precise way in which each theory appears to selectively scaffold the other, we look even deeper and ask whether we are in fact observing two aspects, or branches, of one and the same process! We characterize the two models as distinguished operations within the same process.

Observing the parallel sequence of contiguously bearing connections slowly reinforces the consideration of whether a theory of emergent gravity might naturally arise in Whitehead's process ontology of AE's. From Verlinde's perspective, what if the positively valued, gravitational self-energy sector can be developed in a processual range of significance? 83 From Whitehead's perspective, what if negative prehensions could finally be given an adjoint role in the AE's, leading to a mechanism for making the 'satisfaction' real? Are both descriptions really part of one, deeper process? Is there precedent for such a claim? Whitehead believes so; as Leclerc explains: "Whitehead's fundamental divergence from the philosophical tradition lies in his thoroughgoing acceptance of process as a basic metaphysical feature of actuality" (Leclerc, 1958). Applying the geomodal method, we are able to clarify this sequence within Whitehead's texts to provide a meta-ordered re-narrative.

Both Verlinde's and Whitehead's accounts begin from the same basis but then diverge descriptively at coarse-graining/prehension: where Verlinde's narrative follows negative prehensions to arrive at an adiabatic reaction force qua gravitational acceleration effect, Whitehead follows positive prehensions (feelings) through the phases of concrescence into a synthesis of values as a satisfaction qua gravitational self-energy.

In Whitehead this refers to negative prehension; in Verlinde, to the values integrated out of the matrix during renormalization via the coarse-graining operation, linked to prehension as a selection between positive and negative. This is shown to underwrite the gravitational effect as an adiabatic reaction force generating an acceleration of the 'satisfaction' (GSE) to the horizon, like a matrix or polymer, where it becomes a real value added to it (see Verlinde, 2010). Thus, Whitehead's notion of negative prehension in the AE's is given a proper role and dynamics within Verlinde's model as the basis for emergent gravity.

83 Whitehead claims the AE's are "a cell theory of actuality" (PR 242) with significance for consciousness and biology. The geomodal model is also poised to show how both are correct. In this position, the gravitational self-energy represents a proto-experiential element of consciousness.

Verlinde identifies the gravitational self-energy of the matrix on account of the renormalization dynamics (in string theory, matrix dynamics) of off-diagonal open strings acquiring positive expectation values in the Higgs branch and moving into the Coulomb branch (see Verlinde, 2011). While he identifies it, he doesn't develop it, just as Whitehead leaves negative prehensions undeveloped. Instead, Verlinde focuses on the emergent gravitational effect (an acceleration) as the result of an adiabatic reaction force due to differences in the mass distribution of the matrix group after coarse-graining, as kept track of by Newton's potential.

Whitehead's reciprocal contribution to Verlinde's model is recognized on account of providing a narrative identity to the gravitational self-energy (GSE) as a 'satisfaction': a maximized intensity and harmony of the positive values attributed an expectation value in the matrix group, which go on to synthesize during the phases of renormalization qua concrescence and foliation into the ultimate satisfaction and GSE. Thus, the narrative runs from positive prehensions as feelings, to more complex relationships giving rise to a collective-valued satisfaction at the end of the phases, as a maximization linked to the maximization of coarse-graining in Verlinde's model, bearing both a gravitational self-energy and the more pronounced, emergent gravitational effect.

What's more, these two procedures remain contemporaneous in process on the basis that both are said to resolve into final results only after all values have been maximally coarse-grained (prehended) into an enumerated, phase-accumulation product. In positive prehension this leads to the satisfaction, whereas in negative prehension it leads to the emergent gravity effect as an adiabatic reaction force (see Verlinde, 2011).

In Verlinde's account, the components taken out by coarse-graining are kept track of by Newton's potential, φ. This translates into a component excluded from concrescence, as a negative prehension. In Verlinde this represents an off-diagonal open string integrated out of the matrix. The gravitational effect is emergently powered by the (coarse-grained) negative prehensions, while AE generation is empowered by the positively prehended values conformally enumerating as the gravitational self-energy on the eigenvalues. This demonstrates an ultimate efficiency in nature. As in Native American and Inuit practice, nothing is wasted; the discarded elements are still worthwhile in themselves, just not in tune with the present phase, and so they are recycled as values making up the gravitational effect. Nothing is discarded; it all goes into the total process.

It’s like working in an apple orchard and having to pick all the apples in a certain grid in order to bring them back to make apple pies. The grid is an abstracted space and time section of the orchard. The good apples get thrown into the buggy for later synthesis, meanwhile the ‘sub-prime’ apples are fed to the horses for fuel, to keep them powering the operation. This is like the role that negative prehension plays for the phases of concrescence: negative prehensions are the sub-prime apples that you feed to your horse to power the operation of lugging all the good apples (positive prehensions) back to the kitchen for synthesis/baking (concrescence) into a final product (satisfaction): a delicious apple pie.


Examining Verlinde's account, we identify the narrative where the values integrated out propel the polymer (bag) to the horizon, where it becomes a real value added to it in definiteness. Again, in both cases we consider a two-fold dynamics where the phase accumulation of positive values signifies an ongoing synthesis whilst the phase accumulation of negative values is kept track of (by Newton's potential) but isn't valued until the very end, as an adiabatic reaction force → acceleration effect of bag to screen.

Within an event-logic we find a narrative that could explain this; namely, while the positive prehensions synthesize over concrescence to characterize the gravitational self-energy of the matrix (or satisfaction), the negative prehensions follow Verlinde's account, only valuing in sum (φ) at the maximization of coarse-graining (at the end of prehension). We learn from Verlinde that gravity as an adiabatic reaction force is involved in this process. Here, gravity results as an 'acceleration' effect given an adiabatic reaction force precipitated by the removal of values from the matrix. If gravity is emergent, as Verlinde's model suggests, then it should also be the result of a process, as with space-time and string theory, where both are understood as arising from something more fundamental. This discloses an intermediary process. Whitehead refers to the contents associated with this process as the actual entities.

Even more generally, we consider Verlinde's model in the range of a process philosophy to the extent that the key components of string theory (qua emergent gravity) are shown to participate in the generative cycle of AE's. The significance of such a move would 'satisfy' (pun intended) Whitehead's call for a theory of experience underwriting nature and science. With this in place, all that's left is to relocate Verlinde's approach under the guise of a process physics, using it as a cornerstone for an event-ontology. Retroactively, Verlinde's model could be recognized as the tipping point that leverages modern science into a process paradigm, through the accompaniment of Whitehead's event-ontology and a little Higgs → emergent, collective Higgs boson signifying a process-event ontology.

This study recognizes Verlinde’s model as having delivered a philosophy of physics suitable to the outcome of Whitehead’s AE’s; correspondingly, Whitehead’s AE’s are finally given a physical theory in which to find camaraderie with Verlinde’s emergent gravity model. This suggests that emergent gravity arises as the end-result of a process.

Taken to their furthest reaches, both Whitehead's AE's and Verlinde's EG models are shown to designate the features of a more fundamental process underwriting nature. This represents the formal process that Whitehead predicted would underwrite (his) speculative philosophy qua experiential metaphysics; in other words, underwriting nature with a foundational process at the seat of both material and experiential systems.


8.4 — Closing Remarks

Here at the end it is valuable to step back and to consider four implications of this project, assuming it has been successful. First, we should view science and its theories more generally from the standpoint of an emergent paradigm. This stems all the way back to our initial selection of non-quantization vs. quantization methods of quantum gravity in chapter two. Following this branch of interpretation we encountered the ‘induced’ and ‘emergent’ models of gravity by Sakharov and Verlinde, respectively.

In the process, as Verlinde insists, string theory and space-time must also be recognized as emergent. Given that Berenstein (2006), Seiberg (2006), and others have also suggested that quantum mechanics and field theory are emergent, and given the historical significance of derived and patently emergent S-matrix approaches in both quantum mechanics and string theory (see, e.g., Chew and Stapp), at some point we're compelled to consider that perhaps science is more naturally inclined to a process paradigm out of which material physics (qua the Standard Model) thereby derives.

Second, one of the main results of this study has been to philosophically situate Verlinde’s emergent gravity model within a comprehensive, process ontology underwriting experiential and material systems. In such a design, emergence and ‘dependent origination’ are both natural and necessary conditions. In addition, Verlinde’s string-theoretic approach finds epical comport within Whitehead’s process philosophy of AE’s. Through an alignment of the central components of string theory and the AE’s into a parallel sequence in an event-based logic and narrative, we are then able to cross-verify the initial connection through links between the components in both models. Primarily, these have been ordered and identified in the dictionaries at the ends of chapters five and six; however, we’re also able to state the correlations in a succinct manner.

To these ends, bosonic (tachyonic) strings in 25d are considered as a representation of Whitehead's continuous potentialities, or eternal objects, as a sea of strands streaming through a 5d vacuum in Chew's derivation (2004), and in a kinematic capacity (see Finkelstein, 2004). Through the application of a snapshot mechanism in the event logic, we are able to provide an interpretation for how 'continuous potentialities' give rise to 'atomic actualities.' These arise in the capacity of a Dn-brane of open strings as the physical interpretation of Whitehead's 'duration' or 'event' as an "instantaneous spread of the actual world of that occasion" (PRel 54). In both cases this object is interpreted to signify the initial conditions of a generative event-cycle.

The closed strings as phonons (coupling) in Verlinde are linked to the revived 'original datum' of Whitehead's Gifford draft, sequenced into the identity of a quasiparticle (phonon) such that they don't halt the process but instead further the action by providing the coupling for prehension, by which values are distinguished into those passing into the phases of concrescence for positive synthesis into a final satisfaction, and those integrated out in coarse-graining, as negative prehensions, clarified in Verlinde as leading to a cumulative gravitational effect at the end (the maximum) of coarse-graining.

Third, by virtue of an event-ontology in Whitehead's primordial process, underwriting experience and material systems, we are able to proffer an incorporated narrative applied to Verlinde's EG in the context of string theory, such that string theory is given a basis, logic, and internal consistency as a set of phenomena at the Planck scale.

Altogether this provides a narrative incorporating bosonic strings as strands; fermionic strings as open strings; a Dn-brane as the medium to which a set of open strings is attached; closed strings as phonons; closed strings as the reenactment of open strings qua the open/closed string correspondence; and, generalized to AdS/QCD, the closed strings as free quarks in a chiral bag. The reason for this is given in Whitehead: it is to transform the raw, initial data into values prêt-à-prehension, or into a mode such that they can be prehended by the coupling (see Christian, 1977; Lango, 1972).

Out of this, the coupling separates values into positive and negative; the positive values acquire an expectation value and move from the Higgs branch into the smaller Coulomb branch, in Verlinde's model (2011), or into the phases of concrescence, in Whitehead's model. The final result of the positive values in the matrix characterizes the derivative gravitational self-energy of the matrix group, while the values taken out of the group are accumulated by nature (qua φ) and summed at the end to generate the emergent gravitational effect as an adiabatic reaction force (Verlinde, 2011).

Finally, fourth: upon the same basis (as above), Whitehead's AE's are also brought into an epical alignment through the event-logic enunciated in chapters five and six. To construct this sequence, we first recognize Whitehead's position that 'a multiplicity of initial data' represents the initial conditions for the phases of concrescence. The event-logic corroborates this by a 'snapshot of strands' representing the initial conditions; in string theory we link this to the fundamental element of a Dn-brane of open strings.

From here, the next move is critical to the whole venture and requires us to revive an element of the Gifford lectures that led to PR. This applies to the notion of the 'primary' or 'original datum' vs. a multiplicity of initial data. We refer to this conundrum as Whitehead's 'data/datum dilemma' for determining the origin of the phases of concrescence. 84 In order to resolve this we consider, naturally enough, a both/and scenario: following Whitehead, 'a multiplicity of data' initiates the phases of concrescence; however, as Whitehead also suspected, this leads to the formation of an 'original datum' in the capacity of an 'integral feeling' of the initial data. Whitehead's concern stemmed from an inability to see how the datum wouldn't terminate the process.

84 This is discussed in the Appendix


The answer comes from reinforcing the selection of emergent descriptions of closed strings, not as gravitons but as phonons (Verlinde, 2011). Applied presently, the datum is adjusted from a fundamental value, capable of terminating the process, into an emergent phonon serving as the coupling for holographic renormalization to fine-tune a set of values into either (+) or (-) species. The datum overcomes terminating the process through the identity of a phonon as coupling in the holographic dual (AdS/CFT) scenario. When the coarse-graining procedure of renormalization reaches maximization, this produces two effects: a gravitational self-energy of the matrix, as Whitehead's 'satisfaction,' and an emergent gravitational force kept track of by Newton's potential φ, as the reactive effect of the combined negative prehensions coarse-grained, or integrated out, of the matrix.

Who would have thought that Whitehead's AE's could provide precisely the framework to philosophically situate gravity in an emergent approach? Given his own attempt to derive gravity in 1922, we examine this to highlight the places where his approach can be shown to predict characteristic features of the emergent approach. Gravity is still the result of a real force, as Whitehead conjectured, but, as it were, a real force one level prior in fundamentality, where the AE's are described in a corollary framework for emergent gravity.

As we look forward there is compelling argumentation for considering a worldview predicated not on substances but on processes and events. The time is ripe for a new renaissance of information, a re-birth of worldview, and a commitment to a world that values processes over substances. Whitehead’s AE’s, rightly situated in an emergent physics, provide an ample basis for such an approach. The geomodal model, in years to come, will serve as a reference and a guide for this journey to rediscover the world as an experiential venture that reestablishes the role of soul in personhood. We will soon remember that life is a process, not an alchemical quest for ‘first’ materials. The point of this life is to cultivate the garden of one’s soul. To these ends we look forward.

Se acabó. Se comenzará.

It ends; and so it begins.


BIBLIOGRAPHY

1. Emerson, Ralph Waldo; Delivered before the Senior Class in Divinity College, Cambridge, Sunday Evening, July 15, 1838.
2. Verlinde, E. P.; "On the Origin of Gravity and the Laws of Newton," arXiv:1001.0785 [hep-th].
3. Whitehead, Alfred North; Process and Reality: Corrected Edition, ed. David Ray Griffin and Donald W. Sherburne (New York: Free Press, 1978).
4. Sakharov, Andrei D. "Vacuum quantum fluctuations in curved space and the theory of gravitation." Dokl. Akad. Nauk SSSR 177: 70-1 (Nov.-Dec. 1967).
5. Jacobson, T.; "Thermodynamics of space-time: The Einstein equation of state," Phys. Rev. Lett. 75, 1260 (1995).
6. Frampton, Paul H., and Thomas W. Kephart. "Primordial black holes, Hawking radiation and the early universe." Modern Physics Letters A 20, no. 21 (2005): 1573-1576.
7. Easson, D. A., P. H. Frampton, and G. F. Smoot, arXiv:1002.4278 (2010).
8. Dijkgraaf, 2012.
9. Chivukula, A. (2010). Gravity as an Entropic Phenomenon, 22. High Energy Physics - Theory; General Relativity and Quantum Cosmology. Retrieved from http://arxiv.org/abs/1011.4106
10. 't Hooft, Gerard (1993). Dimensional Reduction in Quantum Gravity. arXiv:gr-qc/9310026.
11. 't Hooft, Gerard; Classical Quantum Gravity 16, 3263 (1999) [gr-qc/9903084].
12. 't Hooft, Gerard; Determinism beneath Quantum Mechanics, [quant-ph/0212095].
13. Mäkelä, Jarmo. "Notes Concerning 'On the Origin of Gravity and the Laws of Newton' by E. Verlinde (arXiv:1001.0785)." arXiv preprint arXiv:1001.3808 (2010).
14. Polchinski, J., String Theory. Vol. I: An Introduction to the Bosonic String, Cambridge Monographs on Mathematical Physics, Cambridge University Press, Cambridge, 1998.
15. Rovelli, Carlo. Quantum Gravity. Cambridge University Press, 2004.
16. Gao, Shan; Entropy 2011, 13, 936-948; doi:10.3390/e13050936 (2011).
17. Witten, Edward. "Is supersymmetry really broken?" International Journal of Modern Physics A 10, no. 08 (1995): 1247-1248.
18. Padmanabhan, T. "Equipartition of energy in the horizon degrees of freedom and the emergence of gravity." Modern Physics Letters A 25, no. 14 (2010): 1129-1136.
19. Liu, Zhao. "Hidden symmetries for thermodynamics and emergence of relativity." Communications in Theoretical Physics 54, no. 4 (2010): 641.
20. Aristotle. Metaphysics, ed. with an introduction by David Ross. Edited and translated by John Warrington. New York: E. P. Dutton & Co., 1956. Ramal, Randy. "In What Sense is Whitehead's Speculative Philosophy a First Philosophy?" Concrescence 4 (2003).
21. Whitehead, PRel, 1922.
22. Berenstein, David. "Large N BPS states and emergent quantum gravity." Journal of High Energy Physics 2006, no. 01 (2006): 125.
23. Wei, S. W., Y. X. Liu, and Y. Q. Wang, "Friedmann equation of FRW universe in deformed Horava-Lifshitz gravity from entropic force," arXiv:1001.5238 [hep-th] (2012).
24. Zhang, Jingyi, and Zheng Zhao. "Hawking radiation of charged particles via tunneling from the Reissner-Nordström black hole." Journal of High Energy Physics 2005, no. 10 (2005): 055.
25. Desmet, Ronny. "Whitehead and the British reception of Einstein's relativity: An addendum to Victor Lowe's Whitehead biography." Process Studies Supplements 11 (2007): 1-44.
26. Desmet, Ronny. The Minkowskian Background of Whitehead's Theory of Gravitation. Springer Berlin Heidelberg, 2010.
27. Fowler, Dean R., "Disconfirmation of Whitehead's Relativity Theory - A Critical Reply," Process Studies 4(4): 288-290 (1974).
28. Bain, Jonathan. "Whitehead's theory of gravity." Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 29, no. 4 (1998): 547-574.
29. Eddington, Arthur S. "A Comparison of Whitehead's and Einstein's Formulae." Nature 113 (1924): 192.
30. Whitehead, Alfred North. The Principles of Natural Knowledge. Cambridge University Press, 1919.

31. Yutaka, Tanaka. "Einstein and Whitehead: The Comparison between Einstein's and Whitehead's Theories of Relativity." Historia Scientiarum 32 (1987). 32. Cobb, John. "Whitehead word book." sq.(trad. française légèrement modifiée: Lexique whiteheadien, Editions Chromatika, 2010, pp. 87-88). Cette caractéristique s’ appliquant à tous les animaux dotés d’un système nerveux central (2008): 45. 33. Danielewski, Marek. "The Planck--Kleinert Crystal." Zeitschrift fur Naturforschung A-Journal of Physical Sciences 62, no. 10-11 (2007): 564-568. 34. Kleinert, H. "Spontaneous generation of string tension and quark potential." Physical review letters 58, no. 19 (1987): 1915. 35. Bardeen, James M., Brandon Carter, and Stephen W. Hawking. "The four laws of black hole mechanics." Communications in Mathematical Physics 31, no. 2 (1973): 161-170. 36. Chew, Geoffrey; in Whitehead and Quantum theory; eds. Eastman, T. and Keeton, H., 2004 37. Moskowitz, 2014 38. Palma, Gonzalo A., and Subodh P. Patil. "UV/IR mode mixing and the CMB." Physical Review D 80, no. 8 (2009): 083010. 39. Chen, De-You, Haitang Yang, and Xiao-Tao Zu. "Hawking radiation of black holes in the z= 4 Horava– Lifshitz gravity." Physics Letters B 681, no. 5 (2009): 463-468. 40. Liberati, Stefano, Matt Visser, and Silke Weinfurtner. "Analogue quantum gravity phenomenology from a two-component Bose–Einstein condensate." Classical and Quantum Gravity 23, no. 9 (2006): 3129. 41. Cai, Rong-Gen, Li-Ming Cao, and Ya-Peng Hu. "Hawking radiation of an apparent horizon in a FRW universe." Classical and Quantum Gravity 26, no. 15 (2009): 155018. 42. Hamma, Alioscia, Fotini Markopoulou, Seth Lloyd, Francesco Caravelli, Simone Severini, and Klas Markström. "Quantum Bose-Hubbard model with an evolving graph as a toy model for emergent spacetime." Physical Review D 81, no. 10 (2010): 104032. 43. Dreyer, Olaf, Fotini Markopoulou, and Lee Smolin. "Symmetry and entropy of black hole horizons." Nuclear Physics B 744, no. 1 (2006): 1-13. 44. Dreyer, O. "Emergent relativity." Approaches to Quantum Gravity: Towards a New Understanding of Space, Time and Matter, Editor D. Oriti, Cambridge University Press, Cambridge (2009): 99-110. 45. El-Showk, Sheer, and Kyriakos Papadodimas. "Emergent spacetime and holographic CFTs." Journal of High Energy Physics 2012, no. 10 (2012): 1-72. 46. Gross, David; Talk at Strings 2014 held at Princeton University and the Institute for Advanced Study, Princeton, June23-27, 2014 47. Wuthrich, Christian (2006) Approaching the Planck Scale from a Generally Relativistic Point of View: A Philosophical Appraisal of Loop Quantum Gravity. Doctoral Dissertation, University of Pittsburgh. 48. Wald, Robert M. "General Relativity, 1984." (1984). 49. Weinberg, Steven. "Effects of a neutral intermediate boson in semileptonic processes." Physical Review D 5, no. 6 (1972): 1412. 50. Yang, Hyun Seok. "Emergent spacetime and the origin of gravity." Journal of High Energy Physics 2009, no. 05 (2009): 012. 51. JA Wheeler et al., eds., pp. 443-454. ... SPIE Vol. 1351 Digital Image Synthesis and Inverse Optics (1990) 52. Kuhlmann, Meinard, "Quantum Field Theory", The Stanford Encyclopedia of Philosophy (Spring 2014 Edition), Edward N. Zalta (ed.) Wheeler, 1990 53. Peltola, Ari. Studies on the Hawking Radiation and Gravitational Entropy . University of Jyväskylä, 2007. 54. Mason, Lionel J., and Ezra T. Newman. "A connection between the Einstein and Yang-Mills equations." Communications in mathematical physics 121, no. 4 (1989): 659-668. 55. 
Chamseddine, Ali H. "Complexified gravity in noncommutative spaces." Communications in Mathematical Physics 218, no. 2 (2001): 283-292. 56. Srednicki, Mark. "Chaos and quantum thermalization." Physical Review E 50, no. 2 (1994): 888. 57. Pessa; in Globus, Gordon G., Karl H. Pribram, and Giuseppe Vitiello, eds. Brain and being: at the boundary between science, philosophy, language and arts . Vol. 58. John Benjamins Publishing, 2004. 58. Dirac, Paul AM. "On the theory of quantum mechanics." In Proceedings of the Royal Society of London A: Mathematical, Physical and Engineering Sciences , vol. 112, no. 762, pp. 661-677. The Royal Society, 1926.

59. Jaffe, A., and E. Witten. "Yang-Mills Existence and Mass Gap." Millenium Prize Problems, Clay Mathematics Institute, Cambridge, MA (www. claymath. org/prize-problems/yangmills. htm) (2000). 60. Halvorson, Hans, and Michael Müger. "Algebraic quantum field theory." arXiv preprint math- ph/0602036 (2006). 61. R. Haag, Dan. Mat. Fys. Medd. ... Zh.13, No. 4 109 (1961) 62. K. Hepp and E. Lieb, Ann. Phys. 76, 360, (1972) 63. Itzykson, Claude, and J-B. Zuber. "Two-dimensional conformal invariant theories on a torus." Nuclear Physics B 275, no. 4 (1986): 580-616. 64. Umezawa, 1993; Advanced Field Theory: Micro, Macro and the Thermal Concepts; American Institute of Physics, NY, USA (1993) 65. Baumgartl, Joerg, Maria Zvyagolskaya, and Clemens Bechinger. "Tailoring of phononic band structures in colloidal crystals." Physical review letters 99, no. 20 (2007): 205503. 66. Wu, Yue-Liang, and Zhi-Feng Xie. "A three-flavor AdS/QCD model with a back-reacted geometry." Journal of High Energy Physics 2007, no. 10 (2007): 009. 67. Renate Loll; Utrecht University's Institute for Theoretical Physics delivers a lecture on Searching for the Quantum Origins of Space and Time. Perimeter Institute in Waterloo, Ontario, on May 5, 2010 68. Liu, Zhao. "Hidden symmetries for thermodynamics and emergence of relativity." Communications in Theoretical Physics 54, no. 4 (2010): 641. 69. Duff, M. J. "String and M-theory: answering the critics." Foundations of Physics 43, no. 1 (2013): 182-200. 70. Culetu, Hristu. "Comments on" On the Origin of Gravity and the Laws of Newton", by Erik Verlinde." arXiv preprint arXiv:1002.3876 (2010). 71. Lakatos, Imre, and Alan Musgrave. "Criticism and the Growth of Knowledge." (1970). 72. Norton, John D. "General covariance and the foundations of general relativity: eight decades of dispute." Reports on progress in physics 56, no. 7 (1993): 791. 73. Butterfield, Jeremy, and Chris J. Isham. "Spacetime and the philosophical challenge of quantum gravity." Physics meets philosophy at the Planck scale (2001): 33-89. 74. Hedrich, 2008 75. Jacobson, T. A., and G. E. Volovik. "Event horizons and ergoregions in 3 He." Physical Review D 58, no. 6 (1998): 064021. 76. Fedichev, Petr O., and Uwe R. Fischer. "Gibbons-Hawking effect in the sonic de Sitter space-time of an expanding Bose-Einstein-condensed gas." Physical review letters 91, no. 24 (2003): 240407. 77. Wilczek, Frank. "Superfluidity and space-time translation symmetry breaking." Physical review letters 111, no. 25 (2013): 250402. 78. Chapline, George, Pawel O. Mazur, Ignatios Antoniadis, and Emil Mottola. "Superfluid Picture for Rotating Space-Times." general relativity 2 (2014): 7. 79. Bhattacharya, Sourav (2007). "Black-Hole No-Hair Theorems for a Positive Cosmological Constant". Physical Review Letters 99 (20). 80. Mohaupt, Thomas. "Black hole entropy, special geometry and strings." arXiv preprint hep-th/0007195 (2000). 81. Weinstein, S.; Rickles, D. Quantum Gravity. In The Stanford Encyclopedia of Philosophy; Zalta, E.N., Ed.; Spring: Berlin, Germany, 2011. 82. J.Y. Zhang and Z. Zhao, Hawking radiation of charged particles via tunneling from the Reissner- Nordstrom black hole, JHEP 10 (2005) 055. J.Y. Zhang and Z. Zhao, Charged particles’ tunnelling from the Kerr-Newman black hole, Phys. Lett. B 638 (2006) 110 [arXiv:0512153[gr-qc]]. 83. Q.Q. Jiang, S.Q. Wu and X. Cai, Hawking radiation as tunneling from the Kerr and Kerr-Newman black holes, Phys. Rev. D 73 (2006) 064003 84. Q.Q. Jiang and S.Q. 
Wu, Hawking radiation of charged particles as tunneling from Reissner- Nordstrom-de Sitter black holes with a global monopole, Phys. Lett. B 635 (2006) 151 85. Liberati, Stefano, Matt Visser, and Silke Weinfurtner. "Naturalness in an emergent analogue spacetime." Physical review letters 96, no. 15 (2006): 151301. 86. Grygiel, Wojciech P. "Consistent Quantum Histories: Towards a Universal Language of Physics." Old and New Concepts of Physics 4 (2007): 71-98.

87. Lahav, Oren, Amir Itah, Alex Blumkin, Carmit Gordon, Shahar Rinott, Alona Zayats, and Jeff Steinhauer. "Realization of a sonic black hole analog in a Bose-Einstein condensate." Physical review letters 105, no. 24 (2010): 240401. 88. David Deutsch. Towards a quantum theory without “quantization”. In Steven M Christensen, editor, Quantum Theory of Gravity: Essays in Honor of the 60th Birthday of Bryce S DeWitt, pages 421–430. Hilger, Bristol, 1984. 89. Matusis, Alec, Leonard Susskind, and Nicolaos Toumbas. "The IR/UV connection in the non-commutative gauge theories." Journal of High Energy Physics 2000, no. 12 (2000): 002. 90. Minwalla, Shiraz, Mark Van Raamsdonk, and Nathan Seiberg. "Noncommutative perturbative dynamics." Journal of High Energy Physics 2000, no. 02 (2000): 020. 91. Planck, M. S.-B. Preuss. Akad. Wiss. (1899): 440; Ann. Phys. 1 (1900): 69. 92. Barceló, Carlos, Stefano Liberati, Sebastiano Sonego, and Matt Visser. "Revisiting the semiclassical gravity scenario for gravitational collapse." arXiv preprint arXiv:0909.4157 (2009). 93. Di Casola, Eolo, Stefano Liberati, and Sebastiano Sonego. "Nonequivalence of equivalence principles." American Journal of Physics 83, no. 1 (2015): 39-46. 94. Schwarzschild, K. "On the gravitational field of a sphere of incompressible fluid according to Einstein’s theory." arXiv preprint physics/9912033 (1916). 95. Horowitz, Gary T., and Saul A. Teukolsky. Black holes. Springer New York, 1999. 96. Horowitz, Gary T., ed. Black holes in higher dimensions. Cambridge University Press, 2012. 97. Soloviev, Black Hole Statistical Physics: Entropy; arXiv; 2005. 98. Gyftopoulos, Elias P., and Gian Paolo Beretta. Thermodynamics: foundations and applications. Courier Corporation, 2005. 99. Chakrabarti, C. G., and Kajal De. "Boltzmann-Gibbs entropy: axiomatic characterization and application." International Journal of Mathematics and Mathematical Sciences 23, no. 4 (2000): 243-251. 100. James J. Kelly, Semiclassical Statistical Mechanics; from Statistical Physics using Mathematica; © 1996-2002. 101. Dieks, Dennis. "Is There a Unique Physical Entropy? Micro versus Macro." In New Challenges to Philosophy of Science, pp. 23-34. Springer Netherlands, 2013. 102. Willie, Diana; Stephen Hawking Unabridged Guide; Emereo Publishing, Oct 24, 2012. 103. Jibu, Mari, and Kunio Yasue. "Magic without magic: Meaning of quantum brain dynamics." (1997). 104. Camenzind, Max. Compact objects in astrophysics. Springer Berlin Heidelberg, 2007. 105. Majhi, Bibhas Ranjan. "Emergent gravity: From statistical point of view." In Journal of Physics: Conference Series, vol. 405, no. 1, p. 012020. IOP Publishing, 2012. 106. Aid, S., V. Andreev, B. Andrieu, R-D. Appuhn, M. Arpagaus, A. Babaev, J. Bähr et al. "A measurement and QCD analysis of the proton structure function F2(x, Q2) at HERA." Nuclear Physics B 470, no. 1 (1996): 3-38. 107. Wang, Chiou-Fu, Ronald Hanson, D. D. Awschalom, E. L. Hu, T. Feygelson, J. Yang, and J. E. Butler. "Fabrication and characterization of two-dimensional photonic crystal microcavities in nanocrystalline diamond." Applied Physics Letters 91, no. 20 (2007): 201112. 108. Marolf, Donald. "Unitarity and holography in gravitational physics." Physical Review D 79, no. 4 (2009): 044010. 109. Zhang, Jingyi, and Zheng Zhao. "Hawking radiation of charged particles via tunneling from the Reissner-Nordström black hole." Journal of High Energy Physics 2005, no. 10 (2005): 055. 110. Ashtekar, Abhay, and Martin Bojowald. "Black hole evaporation: A paradigm."
Classical and Quantum Gravity 22, no. 16 (2005): 3349. 111. Ashtekar, Abhay, John Baez, Alejandro Corichi, and Kirill Krasnov. "Quantum geometry and black hole entropy." Physical Review Letters 80, no. 5 (1998): 904. 112. Wald, Robert M. "On particle creation by black holes." Communications in Mathematical Physics 45, no. 1 (1975): 9-34. 113. Wald, Robert M. "The thermodynamics of black holes." (2001). 114. Hull, Chris, and Barton Zwiebach. "Double field theory." Journal of High Energy Physics 2009, no. 09 (2009): 099. 115. Shannon, Claude E. "A Mathematical Theory of Communication." Bell System Technical Journal 27 (1948): 379-423, 623-656.

116. Misner, Charles W., Kip S. Thorne, and John Archibald Wheeler. Gravitation. W. H. Freeman, 1973. 117. Jing, Jiliang, Liancheng Wang, Qiyuan Pan, and Songbai Chen. "Holographic superconductors in Gauss-Bonnet gravity with Born-Infeld electrodynamics." Physical Review D 83, no. 6 (2011): 066010. 118. Damasco, John Jeffrey. "Gravity as an Emergent Phenomenon." (2012). 119. Kowall, James. "On the Nature of Physical and Non-Physical Reality (Part II)." Scientific GOD Journal 6, no. 2 (2015). 120. Gao, Shan. "Is gravity an entropic force?." Entropy 13, no. 5 (2011): 936-948. 121. Maldacena, Juan. "The illusion of gravity." Scientific American 293, no. 5 (2005): 56-63. 122. Alday, Luis F., and Juan Maldacena. "Gluon scattering amplitudes at strong coupling." Journal of High Energy Physics 2007, no. 06 (2007): 064. 123. Heisenberg, ___ 124. Yoneya, Tamiaki. "String theory and the space-time uncertainty principle." Progress of Theoretical Physics 103, no. 6 (2000): 1081-1125. 125. Blumenhagen, Ralph, Dieter Lüst, and Stefan Theisen. Basic concepts of string theory. Springer Science & Business Media, 2012. 126. Ananth, Sudarshan, and Stefan Theisen. "KLT relations from the Einstein–Hilbert Lagrangian." Physics Letters B 652, no. 2 (2007): 128-134. 127. Cooper, Necia Grant, and Geoffrey B. West, eds. Particle physics: a Los Alamos primer. Vol. 11. CUP Archive, 1988. 128. McGarrie, Moritz. "Gauge mediated supersymmetry breaking in five dimensions." arXiv preprint arXiv:1109.6245 (2011). 129. Salisbury, Donald C. "The quantization of the relativistic string." General relativity and gravitation 16, no. 10 (1984): 955-978. 130. Shapiro, Ilya L., and Joan Sola. "On the possible running of the cosmological “constant”." Physics Letters B 682, no. 1 (2009): 105-113. 131. Cappelli, Andrea, Elena Castellani, Filippo Colomo, and Paolo Di Vecchia, eds. The Birth of string theory. Cambridge University Press, 2012. 132. Michael B. Schulz, Elliott F. Tammaro, "M-theory/type IIA duality and K3 in the ... Changing Bubbles in String Theory," JHEP 0305, 014 (2003). 133. Frautschi, Steven Clark. "Regge poles and S-matrix theory." (1963). 134. Giulini, Domenico JW, Claus Kiefer, and Claus Lämmerzahl, eds. Quantum gravity: from theory to experimental search. Vol. 631. Springer Science & Business Media, 2003. 135. Balasubramanian, Vijay, Steven B. Giddings, and Albion Lawrence. "What do CFTs tell us about anti-de Sitter spacetimes?." Journal of High Energy Physics 1999, no. 03 (1999): 001. 136. Aref’eva, Irina, Andrey Bagrov, and Alexey S. Koshelev. "Holographic thermalization from Kerr-AdS." Journal of High Energy Physics 2013, no. 7 (2013): 1-16. 137. Fitzpatrick, A. Liam, and Jared Kaplan. "Unitarity and the holographic S-matrix." Journal of High Energy Physics 2012, no. 10 (2012): 1-47. 138. Veneziano, Gabriele. "A stringy nature needs just two constants." EPL (Europhysics Letters) 2, no. 3 (1986): 199. 139. Szabó, R., Szabados, L., Ngeow, C.-C., et al. 2011, MNRAS, 413, 2709. 140. Tong, David. "Lectures on string theory." arXiv preprint arXiv:0908.0333 (2009). 141. Antoniadis, Ignatios, E. Dudas, and A. Sagnotti. "Supersymmetry breaking, open strings and M-theory." Nuclear Physics B 544, no. 3 (1999): 469-502. 142. Riggins, Paul, and Vatche Sahakian. "Black hole thermalization, D0 brane dynamics, and emergent spacetime." Physical Review D 86, no. 4 (2012): 046005. 143. J. H. Schwarz, “Evidence For Non-Perturbative String Symmetries,” hep-th/9411178. 144. Rudolph, M. "String theory and beyond." arXiv preprint hep-th/9812201 (1998). 145.
Gaberdiel, Matthias R., and Rajesh Gopakumar. "An AdS 3 dual for minimal model CFTs." Physical Review D 83, no. 6 (2011): 066007. 146. Casalderrey-Solana, Jorge, Hong Liu, David Mateos, Krishna Rajagopal, and Urs Achim Wiedemann. "Gauge/string duality, hot QCD and heavy ion collisions." arXiv preprint arXiv:1101.0618 (2011).

147. Rickles, Dean. "AdS/CFT duality and the emergence of spacetime." Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 44, no. 3 (2013): 312-320. 148. Dawid, Richard. "Scientific realism in the age of string theory." (2007). 149. Font, Anamarıa, and Stefan Theisen. "Introduction to string compactification." In Geometric and Topological Methods for Quantum Field Theory , pp. 101-181. Springer Berlin Heidelberg, 2005. 150. Grimm, Thomas W., Max Kerstan, Eran Palti, and Timo Weigand. "Massive Abelian gauge symmetries and fluxes in F-theory." Journal of High Energy Physics 2011, no. 12 (2011): 1-51. 151. Compere, Geoffrey, and Donald Marolf. "Setting the boundary free in AdS/CFT." Classical and Quantum Gravity 25, no. 19 (2008): 195014. 152. Ivancevic, Vladimir. "Generalized Hamiltonian biodynamics and topology invariants of humanoid robots." International Journal of Mathematics and Mathematical Sciences 31, no. 9 (2002): 555-565. 153. Peeters, Kasper, and Marija Zamaklar. "The string/gauge theory correspondence in QCD." The European Physical Journal-Special Topics 152, no. 1 (2007): 113-138. 154. ‘t Hooft, Gerard. "A planar diagram theory for strong interactions." Nuclear Physics B 72 (1974): 461-473. 155. Scarpettini, A., D. Gomez Dumm, and Norberto N. Scoccola. "Light pseudoscalar mesons in a nonlocal SU (3) chiral quark model." Physical Review D 69, no. 11 (2004): 114018. 156. Frolov, Sergey, and Arkady A. Tseytlin. "Semiclassical quantization of rotating superstring in AdS5× S5." Journal of High Energy Physics 2002, no. 06 (2002): 007. 157. Ishizeki, R., M. Kruczenski, and A. Tirziu. "New open string solutions in AdS 5." Physical Review D 77, no. 12 (2008): 126018. 158. C. Chu and P. Ho, Nucl.Phys. B550 (1999 159. D. Kutasov, M. Marino and GW Moore, JHEP 0010, 045 (2000). 72. ... 2005 160. Aspinwall, Paul. Dirichlet branes and mirror symmetry . Vol. 4. American Mathematical Soc., 2009. 161. Iizuka, Norihiro, and Sandip P. Trivedi. "An inflationary model in string theory." Physical Review D 70, no. 4 (2004): 043519. 162. Giveon, Amit, and David Kutasov. "Brane dynamics and gauge theory." Reviews of Modern Physics 71, no. 4 (1999): 983. 163. Harvey, Jeffrey A., and Gregory Moore. "Five-brane instantons and R 2 couplings in N= 4 string theory." Physical Review D 57, no. 4 (1998): 2323. 164. Arnold, Peter, Guy D. Moore, and Laurence G. Yaffe. "Fate of non-Abelian plasma instabilities in 3+ 1 dimensions." Physical Review D 72, no. 5 (2005): 054003. 165. Zabusky, Norman J., and Martin D. Kruskal. "Interaction of solitons in a collisionless plasma and the recurrence of initial states." Phys. Rev. Lett 15, no. 6 (1965): 240-243. 166. Scott, Mark M., Mikhail P. Kostylev, Boris A. Kalinikos, and Carl E. Patton. "Excitation of bright and dark envelope solitons for magnetostatic waves with attractive nonlinearity." Physical Review B 71, no. 17 (2005): 174440. 167. Eilbeck, J. C., P. S. Lomdahl, and A. C. Scott. "Soliton structure in crystalline acetanilide." Physical Review B 30, no. 8 (1984): 4703. 168. Antoniadis, Ignatios, and Constantin Bachas. "Branes and the gauge hierarchy." Physics Letters B 450, no. 1 (1999): 83-91. 169. Globus, Gordon. "The being/brain problem." NeuroQuantology 3, no. 4 (2007). 170. Davydov, Aleksandr Sergeevich. "Solitons in molecular systems." Physica scripta 20, no. 3-4 (1979): 387. 171. Balasubramanian, Vijay, Alice Bernamonti, Johannes de Boer, Neil Copland, Ben Craps, Esko Keski- Vakkuri, B. 
Müller, Andreas Schäfer, Masaki Shigemori, and Wieland Staessens. "Thermalization of strongly coupled field theories." Physical review letters 106, no. 19 (2011): 191601. 172. Koch and Murugan, 2009 173. Gubser, Steven S., Igor R. Klebanov, and Alexander M. Polyakov. "Gauge theory correlators from non- critical string theory." Physics Letters B 428, no. 1 (1998): 105-114. 174. Polchinski, Joseph. "Critical behavior of random surfaces in one dimension." Nuclear Physics B 346, no. 2 (1990): 253-263. 175. Martinec, Emil, Peter Adshead, and Mark Wyman. "Chern-Simons EM-flation." Journal of High Energy Physics 2013, no. 2 (2013): 1-27. 176. Hořava, Petr. "Quantum gravity at a Lifshitz point." Physical Review D 79, no. 8 (2009): 084008.

177. Skenderis, Kostas. "Lecture notes on holographic renormalization." Classical and Quantum Gravity 19, no. 22 (2002): 5849. 178. Frampton, Paul H. "Consequences of vacuum instability in quantum field theory." Physical Review D 15, no. 10 (1977): 2922. 179. Karch, Andreas, and Lisa Randall. "Locally localized gravity." Journal of High Energy Physics 2001, no. 05 (2001): 008. 180. Aharony, Ofer, and David Kutasov. "Holographic duals of long open strings." Physical Review D 78, no. 2 (2008): 026005. 181. Wüthrich, C. (2005). To Quantize or Not to Quantize: Fact and Folklore in Quantum Gravity. Philosophy of Science , 72 (5), 777–788. 182. Padmanabhan, T. (2009a). Equipartition of energy in the horizon degrees of freedom and the emergence of gravity, 1129–1136. General Relativity and Quantum Cosmology; High Energy Physics - Theory. 183. Li, M. and Wang, Y. “Quantum UV/IR Relations and Holographic Dark Energy from Entropic Force,” arXiv:1001.4466 [hep-th]. 184. Wang, T. “The Coulomb Force as an Entropic Force,” arXiv:1001.4965 [hep-th]. 185. Witten, Edward. "Perturbative gauge theory as a string theory in twistor space." Communications in Mathematical Physics 252, no. 1-3 (2004): 189-258. 186. Rovelli, C. and Smolin, L. “Discreteness of area and volume in quantum gravity,” Nucl. Phys. B 442, 593 (1995); [Erratum-ibid. B 456, 753 (1995)] [arXiv:gr-qc/9411005]. 187. Hawking, S.W., and Hunter, C.J. “Gravitational entropy and global structure,” Phys. Rev. D 59, 044025 (1999) [arXiv:hep-th/9808085]. 188. Hawking, S.W., Hunter, C.J., and Page, D.N. “Nut charge, anti-de Sitter space and entropy,” Phys. Rev. D 59, 044033 (1999) [arXiv:hep-th/9809035]. 189. Carlip, S. “Entropy from conformal field theory at Killing horizons,” Class. Quant. Grav. 16 (1999) 3327 [arXiv:gr-qc/9906126]. 190. Padmanabhan, T. Mod. Phys. Lett. A 25 (2010) 1129 [arXiv:0912.3165 [gr-qc]]. 191. Padmanabhan, “Why Does the Universe Expand? ”[arXiv gr-qc (Jan, 2010) 1001.3380v1]. 192. Bousso, Raphael (2002). "The holographic principle". Reviews of Modern Physics 74 (3): 825–874. arXiv:hep-th/0203101. Bibcode 2002RvMP...74..825B. doi:10.1103/RevModPhys.74.825. 193. Polchinski J., String theory. Vol. I. An introduction to the bosonic string, Cambridge Monographs on Mathematical Physics, Cambridge University Press, Cambridge, 1998. 194. Rovelli C., Quantum gravity, Cambridge Monographs on Mathematical Physics, Cambridge University Press, Cambridge, 2004. 195. Sakharov Conference on Physics, Moscow, (91):447-454 196. Jacobson, Ted. "Trans-Planckian redshifts and the substance of the space-time river." Progress of Theoretical Physics Supplement 136 (1999): 1-17. 197. Bardeen, Carter, and Hawking, “The Four Laws of Black Hole Mechanics,” Commun. Math. Phys. 31 (1973) 161. 198. Bekenstein, Jacob D. “Black holes and entropy,” Phys. Rev. D 7 (1973) 2333. 199. Hawking, “Particle Creation by Black Holes,” Commun. Math. Phys. 43 (1975) 199 [Erratum-ibid. 46 (1976) 206]. 200. Wang, T. (2012). Modified entropic gravity revisited, 12. High Energy Physics - Theory; General Relativity and Quantum Cosmology. 201. Bekenstein, Jacob D. (January 1981). "Universal upper bound on the entropy-to-energy ratio for bounded systems". Physical Review D 23 (215): 287–298. Bibcode 1981PhRvD..23..287B. 202. Bekenstein, Jacob D. "Novel ‘‘no-scalar-hair’’theorem for black holes." Physical Review D 51, no. 12 (1995): R6608. 203. Bekenstein, Jacob D. (August 2003). 
"Information in the Holographic Universe — Theoretical results about black holes suggest that the universe could be like a gigantic hologram". Scientific American 17: p. 59. doi:10.1093/shm/17.1.145. 204. Majumdar, Parthasarathi (1998). "Black Hole Entropy and Quantum Gravity". ArXiv: General Relativity and Quantum Cosmology 73: 147. arXiv:gr-qc/9807045. Bibcode 1999InJPB..73..147M.

205. Lloyd, Seth (2002-05-24). "Computational Capacity of the Universe". Physical Review Letters 88 (23): 237901. arXiv:quant-ph/0110141. Bibcode 2002PhRvL..88w7901L. doi:10.1103/PhysRevLett.88.237901. PMID 12059399. 206. Davies, Paul. Cosmic Blueprint: New Discoveries In Natures Ability To Order Universe . Templeton Foundation Press, 2004. 207. Davies, Paul. "Multiverse Cosmological Models and the Anthropic Principle". CTNS. Retrieved 2008-03- 14. 208. Gubser, Steven S. "Superluminal neutrinos and extra dimensions: Constraints from the null energy condition." Physics Letters B 705, no. 3 (2011): 279-281. 209. Holographic Universe by Tega Jessa on March 22, 2010; http://www.universetoday.com/59921/holographic-universe/#ixzz2Co1URAr1 210. Timmer, 2011 211. Munkhammar, Joakim. "Is Holographic Entropy and Gravity the result of Quantum Mechanics?." arXiv preprint arXiv:1003.1262 (2010). 212. Hartnoll, Sean A. "Horizons, holography and condensed matter." arXiv preprint arXiv:1106.4324 (2011). 213. Hubeny, V. E. (2010). The Fluid/Gravity Correspondence: a new perspective on the Membrane Paradigm, 20. General Relativity and Quantum Cosmology; High Energy Physics - Theory; Fluid Dynamics. doi:10.1088/0264-9381/28/11/114007 214. Seiberg, Nathan. "Emergent spacetime." arXiv preprint hep-th/0601234 (2006). 215. Yang, H. S. (2006). Emergent Gravity from Noncommutative Spacetime, 50. High Energy Physics - Theory; General Relativity and Quantum Cosmology; High Energy Physics - Phenomenology. doi:10.1142/S0217751X0904587X 216. Maldacena, J. M. (1996). Black Holes in String Theory, (June 1996), 80. High Energy Physics - Theory; General Relativity and Quantum Cosmology. 217. Freund, 218. Hawking, Stephen W. "Gravitational radiation from colliding black holes." Physical Review Letters 26, no. 21 (1971): 1344. 219. Hawking, Stephen W. "Information loss in black holes." Physical Review D 72, no. 8 (2005): 084013. 220. Banerjee, R. (2010). From Black Holes To Emergent Gravity. International Journal of Modern Physics D , 19 (14), 2365–2369. 221. David Deutsch. Towards a quantum theory without “quantization”. In Steven M Christensen, editor, Quantum Theory of Gravity: Essays in Honor of the 60th Birthday of Bryce S DeWitt, pages 421–430. Hilger, Bristol, 1984. 222. Patton, Charles M., and John A. Wheeler. "Is physics legislated by cosmogony." In Quantum gravity , vol. 1, pp. 538-605. 1975. 223. Easson, D. A., Frampton, P. H., & Smoot, G. F. (2011). Entropic accelerating universe. Physics Letters, Section B: Nuclear, Elementary Particle and High-Energy Physics , 696 (3), 273–277. High Energy Physics - Theory; Cosmology and Extragalactic Astrophysics; High Energy Physics - Phenomenology. doi:10.1016/j.physletb.2010.12.025 224. Visser, Matt. "Sakharov's induced gravity: a modern perspective." Modern Physics Letters A 17, no. 15n17 (2002): 977-991. 225. Visser, M. (2011). Conservative entropic forces. Journal of High Energy Physics , 2011 (10), 21. High Energy Physics - Theory; General Relativity and Quantum Cosmology. Retrieved from http://arxiv.org/abs/1108.5240 226. Padmanabhan, T. “Thermodynamical Aspects of Gravity: New insights,” arXiv:0911.5004. 227. Lee, J. W. (2012). On the Origin of Entropic Gravity and Inertia. Foundations of Physics , 42 (9), 1153–1164. High Energy Physics - Theory; General Relativity and Quantum Cosmology; Quantum Physics. 228. Kowalski-Glikman, J. (2010). Note on gravity, entropy, and BF topological field theory. 
Physical Review D - Particles, Fields, Gravitation and Cosmology , 81 (8), 5. High Energy Physics - Theory. doi:10.1103/PhysRevD.81.084038 229. Susskind, Leonard, 1995, "The World as a Hologram". Journal of Mathematical Physics 36 (11): 6377– 6396. arXiv:hep-th/9409089. Bibcode 1995JMP....36.6377S.

230. Susskind, Leonard, "The Black Hole War – My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics", Little, Brown and Company (2008) 231. Roveto, Jonathan J., and Gerardo Munoz. "A challenge to entropic gravity." arXiv preprint arXiv:1201.2475 (2012). 232. Cai, Y. F., Liu, J., & Li, H. (2010). Entropic cosmology: A unified model of inflation and late-time acceleration. Physics Letters, Section B: Nuclear, Elementary Particle and High-Energy Physics , 690 (3), 213–219. Cosmology and Extragalactic Astrophysics; General Relativity and Quantum Cosmology; High Energy Physics - Phenomenology; High Energy Physics - Theory. doi:10.1016/j.physletb.2010.05.033 233. Zhao, L. (2010). Hidden symmetries for thermodynamics and emergence of relativity, 4. High Energy Physics - Theory. Retrieved from http://arxiv.org/abs/1002.0488 234. Chen, C., & Force, E. (2010). Emergent Gravity from the Entropic Force, 1–8. 235. Nicolini, P. (2010). Entropic force, noncommutative gravity, and ungravity. Physical Review D - Particles, Fields, Gravitation and Cosmology , 82 (4), 8. General Relativity and Quantum Cosmology; High Energy Physics - Theory. doi:10.1103/PhysRevD.82.044030 236. Carlip, Steven. "Near-Horizon Conformal Symmetry Revisited." In APS Meeting Abstracts , vol. 1, p. 10001. 2013. 237. Becker, Katrin, Melanie Becker, and John H. Schwarz. String theory and M-theory: A modern introduction . Cambridge University Press, 2006. 238. Khurgin, Jacob B. "Viewpoint: phonon lasers gain a sound foundation." Physics 3 (2010): 16. 239. Fujii, Yasunori, and Kei-ichi Maeda. The scalar-tensor theory of gravitation . Cambridge University Press, 2003. 240. Davies, Paul CW. "Scalar production in Schwarzschild and Rindler metrics." Journal of Physics A: Mathematical and General 8, no. 4 (1975): 609. 241. Van Raamsdonk, Mark. "BUILDING UP SPACE–TIME WITH QUANTUM ENTANGLEMENT." International Journal of Modern Physics D 19, no. 14 (2010): 2429-2435. 242. Gherghetta, Tony, Marco Peloso, and Erich Poppitz. "Emergent gravity from a mass deformation in warped spacetime." Physical Review D 72, no. 10 (2005): 104003. 243. Eingorn, Maxim V., and Vitaliy D. Rusov. "Emergent Quantum Euler Equation and Bose–Einstein Condensates." Foundations of Physics 44, no. 2 (2014): 183-191. 244. Moldoveanu, Florin. "Heuristic rule for constructing physics axiomatization." arXiv preprint arXiv:1001.4586 (2010). 245. Unruh, William G. "Notes on black-hole evaporation." Physical Review D 14, no. 4 (1976): 870. 246. Sheykhi, A., and K. Rezazadeh Sarab. "Einstein equations and MOND theory from Debye entropic gravity." Journal of Cosmology and Astroparticle Physics 2012, no. 10 (2012): 012. 247. Nicolini, Piero. "Entropic force, noncommutative gravity, and ungravity." Physical Review D 82, no. 4 (2010): 044030. 248. Vedral, Vlatko. Decoding reality: the universe as quantum information . Oxford University Press, 2010. 249. Sherburne, Donald W., ed. A key to Whitehead's process and reality . Indiana University Press, 1966. 250. Whitehead, Alfred North. Adventures of ideas . Vol. 9317. Simon and Schuster, 1967. 251. Whitehead, Alfred North. Religion in the making: Lowell lectures 1926 . Fordham Univ Press, 1926. 252. Stapp, Henry P. "Attention, intention, and will in quantum physics." Journal of Consciousness studies 6, no. 8-9 (1999): 143-143. 253. Stapp, Henry. "Whitehead, James, and the Ontology of Quantum Theory." Mind and Matter 5, no. 1 (2007): 83-109. 254. Shimony, Abner, and Howard Stein. 
"Comment on “Nonlocal character of quantum theory,” by Henry P. Stapp [Am. J. Phys. 65 (4), 300–304 (1997)]." American Journal of Physics 69, no. 8 (2001): 848-853. 255. Eastman, T. E., and Hank Keeton. "Physics and Whitehead: Process, Quantum and Experience." (2003). 256. Eastman, Timothy E. "Our cosmos, from substance to process." World Futures 64, no. 2 (2008): 84-93. 257. Clayton, Philip; in Whitehead and Quantum theory; eds. Eastman, T. and Keeton, H., 2004 258. Nobo, Jorge; in Whitehead and Quantum theory; eds. Eastman, T. and Keeton, H., 2004 259. Lucas, George R. The rehabilitation of Whitehead: an analytic and historical assessment of process philosophy . SUNY Press, 1989.

260. Hartshorne, Charles, and Creighton Peden. Whitehead's view of reality. Pilgrim Press, 1981. 261. Leclerc, Ivor. "Whitehead's metaphysics: an introductory exposition." (1958). 262. Christian, William A. "An Interpretation of Whitehead's Metaphysics." (1977). 263. McHenry, Leemon. "Whitehead, quantum mechanics and local realism." Process Studies 31, no. 1 (2002): 164-170. 264. Silberstein, Michael, and John McGeever. "The search for ontological emergence." The Philosophical Quarterly 49, no. 195 (1999): 201-214. 265. Kauffman, Stuart. At home in the universe: The search for the laws of self-organization and complexity. Oxford University Press, 1995. 266. Levin, Michael A., and Xiao-Gang Wen. "String-net condensation: A physical mechanism for topological phases." Physical Review B 71, no. 4 (2005): 045110. 267. Morowitz, Harold J. The emergence of everything: How the world became complex. Oxford University Press, 2002. 268. Murphy, Nancey C., and George Francis Rayner Ellis. On the moral nature of the universe: Theology, cosmology, and ethics. Vol. 137. Fortress Press, 1996. 269. Hansen, 2004; in Whitehead and Quantum theory; eds. Eastman, T. and Keeton, H., 2004. 270. Hartshorne, Charles. "Whitehead’s revolutionary concept of prehension." International Philosophical Quarterly 19, no. 3 (1979): 253-263. 271. Jungerman, John A. World in process: Creativity and interconnection in the new physics. SUNY Press, 2000. 272. Ford, Lewis S. The Emergence of Whitehead's Metaphysics, 1925-1929. SUNY Press, 1984. 273. Wüthrich, C. (2005). To Quantize or Not to Quantize: Fact and Folklore in Quantum Gravity. Philosophy of Science, 72 (5), 777–788. 274. Finkelstein, David; in Whitehead and Quantum theory; eds. Eastman, T. and Keeton, H., 2004. 275. Laszlo, Istvan, and Rachel Bean. "Nonlinear growth in modified gravity theories of dark energy." Physical Review D 77, no. 2 (2008): 024048. 276. Whitehead, Alfred North. Science and the Modern World (1925). New York: Free Press, 1967, 51. 277. Depraz, Nathalie. «La réduction phénoménologique comme praxis» [The phenomenological reduction as praxis], Les Carnets du Centre de Philosophie du Droit, no. 74, Louvain-la-Neuve: CPDR, 1999. 278. Synge, John Lighton. "The relativity theory of A. N. Whitehead." (1951). 279. S.R. Coleman and J. Mandula, Phys. Rev. 159 (1967) 1251-1256. 280. Jammer, Max. Einstein and Religion. Princeton: Princeton University Press, 1999. 281. Schwanauer, Francis. "No Many is Not a One (for the Case is a Comparison)." (1981). 282. Craig, AD Bud. "How do you feel--now? The anterior insula and human awareness." Nature Reviews Neuroscience 10 (2009): 59-70. 283. Allman, John M., Nicole A. Tetreault, Atiya Y. Hakeem, Kebreten F. Manaye, Katerina Semendeferi, Joseph M. Erwin, Soyoung Park, Virginie Goubert, and Patrick R. Hof. "The von Economo neurons in frontoinsular and anterior cingulate cortex in great apes and humans." Brain Structure and Function 214, no. 5-6 (2010): 495-517. 284. Mayer, Emeran A. "Gut feelings: the emerging biology of gut–brain communication." Nature Reviews Neuroscience 12, no. 8 (2011): 453-466. 285. Krause. "On Nothing." Heythrop Journal 54 (4): 678-690. 286. Krause, Elizabeth (1997). The Metaphysics of Experience: A Companion to Whitehead's Process and Reality. New York: Fordham University Press. 287. Krips, Henry, "Measurement in Quantum Theory", The Stanford Encyclopedia of Philosophy (Fall 2013 Edition), Edward N. Zalta (ed.). 288. Albert, DZ (1992). Quantum mechanics and experience (pp. 88–92, 161–164). 289. Grünbaum, Adolf. "Whitehead's method of extensive abstraction."
The British Journal for the Philosophy of Science 4, no. 15 (1953): 215-226. 290. Noyes, H. Pierre. Bit-string physics: A Finite and discrete approach to natural philosophy . Vol. 27. World Scientific, 2001.

291. Bastin, Ted, H. Pierre Noyes, John Amson, and Clive W. Kilmister. "On the physical interpretation and the mathematical structure of the combinatorial hierarchy." International Journal of Theoretical Physics 18, no. 7 (1979): 445-488. 292. Noyes, H. Pierre. "Bit-string physics: a novel “theory of everything”." In Physics and Computation, 1994. PhysComp'94, Proceedings., Workshop on , pp. 88-94. IEEE, 1994. 293. Peirce, Charles Sanders, Charles Hartshorne, and Paul Weiss, eds. Collected papers of charles sanders peirce . Vol. 5. Harvard University Press, 1935. 294. James, William. "II.—What is an emotion?." Mind 34 (1884): 188-205. 295. Peskin, Michael E., and Daniel V. Schroeder. An introduction to quantum field theory . Westview, 1995. 296. Schuijer, Michiel. Analyzing atonal music: pitch-class set theory and its contexts . Vol. 60. University Rochester Press, 2008. 297. Viehweg, Eckart. Quasi-projective moduli for polarized manifolds . Vol. 30. Berlin: Springer, 1995. 298. Simpson, Carlos T. "Moduli of representations of the fundamental group of a smooth projective variety I." Publications mathématiques de l'IHES 79, no. 1 (1994): 47-129. 299. Hume, David. An enquiry concerning human understanding . Broadview Press, 2011. 300. Whitehead, Gifford Lectures; 1927-1928 301. Whitehead, Alfred North. The principle of relativity: with applications to physical science . Cambridge University Press, 2011. 302. Einstein, Albert. "On the method of theoretical physics." Philosophy of science 1, no. 2 (1934): 163-169. 303. Fowler, Dean; Relativity Physics and the Doctrine of God; a comparative study of Whitehead and Einstein; PhD Dissertation; Claremont Graduate University; 1975 304. Kiley, John F. "The metaphysical foundations of the epistemology of Albert Einstein." (1961). 305. Palter, Robert. "Whitehead's philosophy of science." (1960). 306. Tipler, Paul A., and Ralph Llewellyn. Modern physics . Macmillan, 2003. 307. Ushenko, A. P. "Einstein’s influence on philosophy." Albert Einstein: Philosopher-Scientist (1949). 308. Whitehead, Alfred North. The concept of nature . University of Michigan Press, 1959. 309.Whitehead, Alfred North. Religion in the making: Lowell lectures 1926 . Fordham Univ Press, 1926. 310. Whitehead, Alfred N. "An enquiry concerning the principles of human knowledge." (1919). 311. Whitehead, Alfred North. "Modes of Thought. 1938." New York: The Free Press 3 (1968): 434-438. 312. Stengers, Isabelle. Thinking With Whitehead: A Free and Wild Creation of Concepts . Translated by Michael Chase. Cambridge, MA: Harvard University Press, 2011 313. Gibbons, Gary, and Clifford M. Will. "On the multiple deaths of Whitehead's theory of gravity." Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 39, no. 1 (2008): 41-61. 314. Lango, John W. Whitehead's ontology . SUNY Press, 1972. 315. Lango, J. W. (2013). The Logic of Simultaneity, 66 (11), 340–350. 316. Čapek, M. (1961). The Philosophical Impact of Contemporary Physics. Princeton, NJ: Van Nostrand. 317. Schild, A. "On gravitational theories of Whitehead's type." Proceedings of the Royal Society of London. Series A. Mathematical and Physical Sciences 235, no. 1201 (1956): 202-209. 318. Whitehead, Alfred North; Function of Reason; Princeton; Princeton University Press; 1929 319. Jung, Carl Gustav. The archetypes and the collective unconscious . No. 20. Princeton University Press, 1981. 320. Plato. The republic of Plato . Vol. 30. New York: Oxford University Press, 1945. 321. 
Von Neumann, John. Mathematical foundations of quantum mechanics. No. 2. Princeton University Press, 1955. 322. Tsu, Lao. Tao Te Ching. Vintage, 1989. 323. D’Hoker, Eric, and Daniel Z. Freedman. "Supersymmetric gauge theories and the AdS/CFT correspondence." Theoretical Advanced Study Institute in Elementary Particle Physics (TASI 2001): Strings, Branes and Extra Dimensions (2002): 3-158.
