Riemannian Geometry and Statistical Machine Learning

Total Pages: 16

File Type: PDF, Size: 1020 KB

Riemannian Geometry and Statistical Machine Learning

Doctoral Thesis

Guy Lebanon
Language Technologies Institute
School of Computer Science
Carnegie Mellon University
[email protected]

January 31, 2005

Abstract

Statistical machine learning algorithms deal with the problem of selecting an appropriate statistical model from a model space $\Theta$ based on a training set $\{x_i\}_{i=1}^N \subset \mathcal{X}$ or $\{(x_i, y_i)\}_{i=1}^N \subset \mathcal{X} \times \mathcal{Y}$. In doing so they either implicitly or explicitly make assumptions on the geometries of the model space $\Theta$ and the data space $\mathcal{X}$. Such assumptions are crucial to the success of the algorithms, as different geometries are appropriate for different models and data spaces. By studying these assumptions we are able to develop new theoretical results that enhance our understanding of several popular learning algorithms. Furthermore, using geometrical reasoning we are able to adapt existing algorithms, such as radial basis kernels and linear margin classifiers, to non-Euclidean geometries. Such adaptation is shown to be useful when the data space does not exhibit Euclidean geometry. In particular, we focus in our experiments on the space of text documents, which is naturally associated with the Fisher information metric on the corresponding multinomial models.

Thesis Committee: John Lafferty (chair), Geoffrey J. Gordon, Michael I. Jordan, Larry Wasserman

Acknowledgements

This thesis contains work that I did during the years 2001-2004 at Carnegie Mellon University. During that period I received a lot of help from faculty, students and friends. However, I feel that I should start by thanking my family: Alex, Anat and Eran Lebanon, for their support during my time at the Technion and Carnegie Mellon. Similarly, I thank Alfred Bruckstein, Ran El-Yaniv, Michael Lindenbaum and Hava Siegelmann from the Technion for helping me get started with research in computer science.

At Carnegie Mellon University I received help from a number of people, most importantly my advisor John Lafferty. John helped me in many ways. He provided excellent technical hands-on assistance, as well as help on high-level and strategic issues. Working with John was a very pleasant and educational experience. It fundamentally changed the way I do research and turned me into a better researcher. I also thank John for providing an excellent environment for research without distractions and for making himself available whenever I needed him.

I thank my thesis committee members Geoffrey J. Gordon, Michael I. Jordan and Larry Wasserman for their helpful comments. I benefited from interactions with several graduate students at Carnegie Mellon University. Risi Kondor, Leonid Kontorovich, Luo Si, Jian Zhang and Xiaojin Zhu provided helpful comments on different parts of the thesis. Although they did not help directly on topics related to the thesis, I also benefited from interactions with Chris Meek at Microsoft Research, Yoram Singer at the Hebrew University, and Joe Verducci and Douglas Critchlow at Ohio State University. These interactions improved my understanding of machine learning and helped me write a better thesis.

Finally, I want to thank Katharina Probst, my girlfriend for the past two years. Her help and support made my stay at Carnegie Mellon very pleasant, despite the many stressful factors in the life of a PhD student. I also thank her for patiently putting up with me during these years.
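As background for the abstract's reference to adapting radial basis kernels to non-Euclidean geometries (a standard Riemannian formula stated here for orientation, not quoted from the thesis), the Gaussian kernel generalizes by replacing the Euclidean distance with the geodesic distance $d_g$ of the data manifold, which also gives the leading term of the heat-kernel parametrix expansion treated in Section 8:

$$
K_t(x, x') \;\approx\; (4\pi t)^{-n/2} \exp\!\left( -\frac{d_g(x, x')^2}{4t} \right),
$$

where $n$ is the dimension of the manifold and $t > 0$ plays the role of the usual bandwidth parameter.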
Contents

1 Introduction  7
2 Relevant Concepts from Riemannian Geometry  10
  2.1 Topological and Differentiable Manifolds  10
  2.2 The Tangent Space  13
  2.3 Riemannian Manifolds  14
3 Previous Work  16
4 Geometry of Spaces of Probability Models  19
  4.1 Geometry of Non-Parametric Spaces  20
  4.2 Geometry of Non-Parametric Conditional Spaces  22
  4.3 Geometry of Spherical Normal Spaces  25
5 Geometry of Conditional Exponential Models and AdaBoost  26
  5.1 Definitions  28
  5.2 Correspondence Between AdaBoost and Maximum Likelihood  29
    5.2.1 The Dual Problem $(P_1^*)$  30
    5.2.2 The Dual Problem $(P_2^*)$  31
    5.2.3 Special Cases  32
  5.3 Regularization  34
  5.4 Experiments  36
6 Axiomatic Geometry for Conditional Models  39
  6.1 Congruent Embeddings by Markov Morphisms of Conditional Models  41
  6.2 A Characterization of Metrics on Conditional Manifolds  43
    6.2.1 Three Useful Transformations  44
    6.2.2 The Characterization Theorem  46
    6.2.3 Normalized Conditional Models  52
  6.3 A Geometric Interpretation of Logistic Regression and AdaBoost  53
  6.4 Discussion  55
7 Data Geometry Through the Embedding Principle  56
  7.1 Statistical Analysis of the Embedding Principle  58
8 Diffusion Kernels on Statistical Manifolds  60
  8.1 Riemannian Geometry and the Heat Kernel  62
    8.1.1 The Heat Kernel  63
    8.1.2 The Parametrix Expansion  65
  8.2 Rounding the Simplex  67
  8.3 Spectral Bounds on Covering Numbers and Rademacher Averages  68
    8.3.1 Covering Numbers  68
    8.3.2 Rademacher Averages  70
  8.4 Experimental Results for Text Classification  73
  8.5 Experimental Results for Gaussian Embedding  84
  8.6 Discussion  86
9 Hyperplane Margin Classifiers  87
  9.1 Hyperplanes and Margins on $S^n$  88
  9.2 Hyperplanes and Margins on $S^n_+$  90
  9.3 Logistic Regression on the Multinomial Manifold  94
  9.4 Hyperplanes in Riemannian Manifolds  95
  9.5 Experiments  97
10 Metric Learning  101
  10.1 The Metric Learning Problem  102
  10.2 A Parametric Class of Metrics  103
  10.3 An Inverse-Volume Probabilistic Model on the Simplex  108
    10.3.1 Computing the Normalization Term  109
  10.4 Application to Text Classification  109
  10.5 Summary  113
11 Discussion  113
A Derivations Concerning Boosting and Exponential Models  115
  A.1 Derivation of the Parallel Updates  115
    A.1.1 Exponential Loss  115
    A.1.2 Maximum Likelihood for Exponential Models  116
  A.2 Derivation of the Sequential Updates  117
    A.2.1 Exponential Loss  117
    A.2.2 Log-Loss  118
  A.3 Regularized Loss Functions  118
    A.3.1 Dual Function for Regularized Problem  118
    A.3.2 Exponential Loss – Sequential Update Rule  119
  A.4 Divergence Between Exponential Models  119
B Gibbs Sampling from the Posterior of a Dirichlet Process Mixture Model Based on a Spherical Normal Distribution  120
C The Volume Element of a Family of Metrics on the Simplex  121
  C.1 The Determinant of a Diagonal Matrix plus a Constant Matrix  121
  C.2 The Differential Volume Element of $F_\lambda^*\mathcal{J}$  123
D Summary of Major Contributions  124

Mathematical Notation

The following is a list of the most frequent mathematical notations in the thesis.

$\mathcal{X}$ : Set/manifold of data points
$\Theta$ : Set/manifold of statistical models
$\mathcal{X} \times \mathcal{Y}$ : Cartesian product of sets/manifolds
$\mathcal{X}^k$ : The Cartesian product of $\mathcal{X}$ with itself $k$ times
$\|\cdot\|$ : The Euclidean norm
$\langle \cdot, \cdot \rangle$ : Euclidean dot product between two vectors
$p(x;\theta)$ : A probability model for $x$ parameterized by $\theta$
$p(y \mid x;\theta)$ : A probability model for $y$, conditioned on $x$ and parameterized by $\theta$
$D(\cdot,\cdot)$, $D_r(\cdot,\cdot)$ : I-divergence between two models
$T_x\mathcal{M}$ : The tangent space to the manifold $\mathcal{M}$ at $x \in \mathcal{M}$
$g_x(u,v)$ : A Riemannian metric at $x$ associated with the tangent vectors $u, v \in T_x\mathcal{M}$
$\{\partial_i\}_{i=1}^n$, $\{e_i\}_{i=1}^n$ : The standard basis associated with the vector space $T_x\mathcal{M} \cong \mathbb{R}^n$
$G(x)$ : The Gram matrix of the metric $g$, $[G(x)]_{ij} = g_x(e_i, e_j)$
$\mathcal{J}_\theta(u,v)$ : The Fisher information metric at the model $\theta$ associated with the vectors $u, v$
$\delta_x(u,v)$ : The induced Euclidean local metric $\delta_x(u,v) = \langle u, v \rangle$
$\delta_{x,y}$ : Kronecker's delta, $\delta_{x,y} = 1$ if $x = y$ and $0$ otherwise
$\iota: A \to X$ : The inclusion map $\iota(x) = x$ from $A \subset X$ to $X$
$\mathbb{N}, \mathbb{Q}, \mathbb{R}$ : The natural, rational and real numbers, respectively
$\mathbb{R}_+$ : The set of positive real numbers
$\mathbb{R}^{k \times m}$ : The set of real $k \times m$ matrices
$[A]_i$ : The $i$-th row of the matrix $A$
$\{\partial_{ab}\}_{a,b=1}^{k,m}$ : The standard basis associated with $T_x\mathbb{R}^{k \times m}$
$\overline{X}$ : The topological closure of $X$
$\mathbb{H}^n$ : The upper half-plane $\{x \in \mathbb{R}^n : x_n \geq 0\}$
$S^n$ : The $n$-sphere $S^n = \{x \in \mathbb{R}^{n+1} : \sum_i x_i^2 = 1\}$
$S^n_+$ : The positive orthant of the $n$-sphere, $S^n_+ = \{x \in \mathbb{R}^{n+1}_+ : \sum_i x_i^2 = 1\}$
$\mathbb{P}_n$ : The $n$-simplex $\mathbb{P}_n = \{x \in \mathbb{R}^{n+1}_+ : \sum_i x_i = 1\}$
$f \circ g$ : Function composition, $f \circ g(x) = f(g(x))$
$C^\infty(\mathcal{M}, \mathcal{N})$ : The set of infinitely differentiable functions from $\mathcal{M}$ to $\mathcal{N}$
$f_* u$ : The push-forward map $f_*: T_x\mathcal{M} \to T_{f(x)}\mathcal{N}$ of $u$ associated with $f: \mathcal{M} \to \mathcal{N}$
$f^* g$ : The pull-back metric on $\mathcal{M}$ associated with $(\mathcal{N}, g)$ and $f: \mathcal{M} \to \mathcal{N}$
$d_g(\cdot,\cdot)$, $d(\cdot,\cdot)$ : The geodesic distance associated with the metric $g$
$d\mathrm{vol}\,g(\theta)$ : The volume element of the metric $g_\theta$, $d\mathrm{vol}\,g(\theta) = \sqrt{\det g_\theta} = \sqrt{\det G(\theta)}$
$\nabla_X Y$ : The covariant derivative of the vector field $Y$ in the direction of the vector field $X$, associated with the connection $\nabla$
$\ell(\theta)$ : The log-likelihood function
$\mathcal{E}(\theta)$ : The AdaBoost.M2 exponential loss function
$\tilde{p}$ : Empirical distribution associated with a training set $D \subset \mathcal{X} \times \mathcal{Y}$
$\hat{u}$ : The $L_2$-normalized version of the vector $u$
$d(x,S)$ : The distance from a point to a set, $d(x,S) = \inf_{y \in S} d(x,y)$

1 Introduction

There are two fundamental spaces in machine learning. The first space, $\mathcal{X}$, consists of data points, and the second space, $\Theta$, consists of possible learning models. In statistical learning, $\Theta$ is usually a space of statistical models: $\{p(x;\theta) : \theta \in \Theta\}$ in the generative case or $\{p(y \mid x;\theta) : \theta \in \Theta\}$ in the discriminative case. The space $\Theta$ can be either a low-dimensional parametric space, as in parametric statistics, or the space of all possible models, as in non-parametric statistics.
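As a concrete instance of the notation above (the standard Fisher geometry of the multinomial family, stated here for orientation rather than quoted from the thesis), the Fisher information metric on the $n$-simplex $\mathbb{P}_n$ of multinomial models and its geodesic distance are

$$
\mathcal{J}_\theta(u, v) \;=\; \sum_{i=1}^{n+1} \frac{u_i v_i}{\theta_i},
\qquad
d(\theta, \theta') \;=\; 2 \arccos\!\left( \sum_{i=1}^{n+1} \sqrt{\theta_i \theta'_i} \right),
$$

for tangent vectors $u, v \in T_\theta\mathbb{P}_n$ (so $\sum_i u_i = \sum_i v_i = 0$). The distance formula follows from the isometry $\theta \mapsto (2\sqrt{\theta_1}, \ldots, 2\sqrt{\theta_{n+1}})$, which maps $\mathbb{P}_n$ with the Fisher metric onto the positive orthant of a sphere of radius 2; this is the geometry the abstract associates with the space of text documents.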