Mathematical Advances in Manifold Learning


Nakul Verma
University of California, San Diego
[email protected]
June 03, 2008

Abstract

Manifold learning has recently gained a lot of interest among machine learning practitioners. Here we provide a mathematically rigorous treatment of some of the techniques in unsupervised learning in the context of manifolds. We will study the problems of dimension reduction and density estimation and present some recent results in terms of fast convergence rates when the data lie on a manifold.

1 Introduction

With the increase in the volume of data, both in terms of the number of observations as well as the number of measurements, traditional learning algorithms are now faced with new challenges. One may expect that more data should lead to more accurate models; however, a large collection of irrelevant and correlated features just adds to the computational complexity of the algorithm without helping much to solve the task at hand. This makes the learning task especially difficult. In an attempt to alleviate such problems, a new model in terms of manifolds for finding relevant features and representing the data by a few parameters is gaining interest in the machine learning and signal processing communities.

The most common examples of superficially high-dimensional data are found in the fields of data mining and computer vision. Consider the problem of estimating the face and body pose of humans. Knowing where a person is looking gives a wealth of information to an automated agent regarding where the object of interest is – whether the person wants to interact with the agent or whether she is conversing with another person. The task of deciding where someone is looking seems quite challenging given that the agent only receives a large array of pixels. However, knowing that a person's orientation has only one degree of freedom, the relevant information in this data can be expressed by just a single number – the angle of the turn, i.e. the orientation of the body.

In a typical learning scenario the task is slightly more complicated, as the agent only gets to see a few samples from which it somehow needs to interpolate and generalize to various possible scenarios. In our example this translates to the agent having access to only a few of the body poses, from which it needs to predict where the person is looking. Thus the agent is faced with the difficulty of finding an appropriate (possibly non-linear) basis to represent this data compactly. Manifold learning can be broadly described as the study of algorithms that use and infer the properties of data sampled from an underlying manifold.

The goal of this survey is to study different mathematical techniques by which we can estimate some global properties of a manifold from a few samples. We will start by studying random projections as a nonadaptive linear dimensionality reduction procedure, which provides a probabilistic guarantee on preserving the interpoint distances between all points on a manifold. We will then focus on analyzing the spectrum of the Laplace-Beltrami operator on functions on a manifold for finding non-linear embeddings and simplifying its structure. Lastly we will look at kernel density estimation to estimate high-density regions on a manifold.

It is worth mentioning that our survey is by no means comprehensive; we simply highlight some of the recent theoretical advances in manifold learning. Most notably, we do not cover the topics of regularization, regression and clustering of data belonging to manifolds. In the topic of dimensionality reduction, we skip the analysis of classic techniques such as LLE (Locally Linear Embedding), Isomap and their variants.
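As a concrete illustration of the random-projection guarantee mentioned above, the following is a minimal NumPy sketch (ours, not part of the original paper): points are sampled from a curve (a 1-manifold) embedded in a high-dimensional space, mapped through a Gaussian random matrix, and the interpoint distances before and after projection are compared. The specific curve, the dimensions D and d, and the 1/sqrt(d) scaling are illustrative assumptions rather than the construction analyzed in section 2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample n points from a curve (a 1-manifold) embedded in R^D.
n, D, d = 500, 100, 15
t = rng.uniform(0, 2 * np.pi, size=n)
X = np.zeros((n, D))
X[:, 0], X[:, 1], X[:, 2] = np.cos(t), np.sin(t), t  # helix-like curve
# The remaining D - 3 coordinates stay zero: the data is intrinsically low-dimensional.

# Nonadaptive linear map: a Gaussian random matrix, scaled so that
# squared lengths are preserved in expectation.
Phi = rng.normal(size=(d, D)) / np.sqrt(d)
Y = X @ Phi.T

def pairwise_dists(Z):
    """All pairwise Euclidean distances between the rows of Z."""
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

before, after = pairwise_dists(X), pairwise_dists(Y)
mask = ~np.eye(n, dtype=bool)
ratios = after[mask] / before[mask]
print(f"distance ratios after projection: min={ratios.min():.3f}, max={ratios.max():.3f}")
```

With high probability the ratios concentrate near 1, which is the kind of statement made precise for manifold data in section 2.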
1.1 Preliminaries

We begin by introducing our notation, which we will use throughout the paper.

[Figure 1: A 1-manifold in R^3]
[Figure 2: Movement of a robot's arm traces out a 2-manifold in R^4; the plot labels the points (x_1, y_1) and (x_2, y_2) and the quantities a and b]

Definition 1. We say a function f : U → V is a diffeomorphism if it is smooth¹ and invertible with a smooth inverse.

¹Recall that a function is smooth if all its partial derivatives ∂^n f / ∂x_{i_1} ⋯ ∂x_{i_n} exist and are continuous.

Definition 2. A subset M ⊂ R^D is said to be a smooth n-manifold if M is locally diffeomorphic to R^n, that is, at each p ∈ M we can find an open neighborhood U ⊂ R^D such that there exists a diffeomorphic map between U ∩ M and R^n.

It is always helpful to have a picture in mind. See figure 1 for an example of a 1-manifold in R^3. Notice that locally any small segment of the manifold "looks like" an interval in R^1.

Definition 3. A tangent space at a point p ∈ M, denoted by T_p M, is the affine subspace formed by the collection of all tangent vectors to M at p.

For the purposes of this survey we will restrict ourselves to the discussion of manifolds whose tangent space at each point is equipped with an inner product. Such manifolds are called Riemannian manifolds and allow us to define various notions of length, angles, curvature, etc. on the manifold.

Since we will largely be dealing with samples from a manifold, we need to define

Definition 4. A sequence x_1, ..., x_n ⊂ M ⊂ R^D is called independent and identically distributed (i.i.d.) when each x_i is picked independently from a fixed distribution D over M.

With this mathematical machinery in hand, we can now demonstrate that manifolds incorporate a wide array of important examples – we present two such examples that serve as a motivation to study these objects.

1.2 Some examples of manifolds

Movement of a robotic arm: Consider the problem of modelling the movement of a robotic arm with two joints (see figure 2). For simplicity, let's restrict the movement to the 2D plane. Since there are two degrees of freedom, intuitively one should suspect that the movement traces out a 2-manifold. We now confirm this in detail.

Let's denote the fixed shoulder joint as the origin, the position of the elbow joint as (x_1, y_1), and the position of the wrist as (x_2, y_2). To see that the movement of the robotic arm traces out a 2-manifold, consider the map f : R^4 → R^2 defined as

(x_1, y_1, x_2, y_2) ↦ (x_1^2 + y_1^2, (x_2 − x_1)^2 + (y_2 − y_1)^2).

Then M = f^{−1}(b^2, a^2) ⊂ R^4 is the desired manifold, where b and a denote the lengths of the two segments of the arm (see figure 2). We can verify that locally M is diffeomorphic to R^2 by looking at its derivative map

Df = [ 2x_1          2y_1          0             0
       2(x_1 − x_2)  2(y_1 − y_2)  2(x_2 − x_1)  2(y_2 − y_1) ]

and observing that it has maximal rank for non-degenerate values of a and b.
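As a quick numerical sanity check of the derivative just computed (our addition, not part of the paper), the sketch below evaluates Df at a few random non-degenerate arm configurations and confirms that it has the maximal rank 2; the segment lengths a and b are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(1)
b, a = 1.0, 0.7  # illustrative segment lengths (any non-degenerate values work)

def Df(x1, y1, x2, y2):
    """Derivative of f(x1, y1, x2, y2) = (x1^2 + y1^2, (x2 - x1)^2 + (y2 - y1)^2)."""
    return np.array([
        [2 * x1,        2 * y1,        0.0,           0.0],
        [2 * (x1 - x2), 2 * (y1 - y2), 2 * (x2 - x1), 2 * (y2 - y1)],
    ])

for _ in range(5):
    t1, t2 = rng.uniform(0, 2 * np.pi, size=2)          # joint angles
    x1, y1 = b * np.cos(t1), b * np.sin(t1)              # elbow position
    x2, y2 = x1 + a * np.cos(t2), y1 + a * np.sin(t2)    # wrist position
    print(np.linalg.matrix_rank(Df(x1, y1, x2, y2)))     # prints 2 each time
```

The first row can never vanish (its norm is 2b > 0), and the last two entries of the second row can never both vanish (their squared norm is 4a^2 > 0), which is why the rank is maximal whenever a and b are non-degenerate.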
Set of orthogonal n × n matrices: We present this example to demonstrate that manifolds are not only good for representing physical processes with a small number of degrees of freedom, but also help us better understand some of the abstract objects that we regularly encounter. Consider the problem of understanding the geometry of the set of orthonormal matrices in the space of real n × n matrices. Note that the set of n × n orthonormal matrices is also called the orthogonal group and is denoted by O(n). We claim that this set forms an n(n − 1)/2-manifold in R^{n^2}.

To see this, consider the map f : R^{n^2} → R^{n(n+1)/2} defined by A ↦ A^T A (the image lies in the space of symmetric n × n matrices, which has dimension n(n+1)/2). Now M ⊂ R^{n^2} such that M = f^{−1}(I_{n×n}) is exactly O(n). To see that M is in fact a manifold, observe that the derivative map Df_A · B = B^T A + A^T B is regular.

Observe that the examples above required us to know the mapping f a priori. However, in the context of machine learning the task is typically to estimate properties of M without having access to f.

1.3 Outline

The paper is organized as follows. We will discuss some linear and non-linear dimensionality reduction methods on manifolds, with a special focus on random projections, in section 2. We will then study Laplacian eigenmaps as a process to simplify manifold structure in section 3, followed by nonparametric density estimation techniques on manifolds in section 4. We will finally conclude by discussing the significance of the results and some directions for future work in section 5.

As one might expect, finding a mapping that preserves all distances of an arbitrary dataset can be a difficult task. Luckily, in our case the saving grace comes from observing that the data has a manifold structure: we are only required to preserve distances between points that lie on the manifold, and not on the whole ambient space.

2.1.1 Dimension reduction of manifold data

In the past decade, numerous methods for manifold dimension reduction have been proposed. The classic techniques such as Locally Linear Embedding (LLE) and Isomap, and newer ones such as Laplacian Eigenmaps and Hessian Eigenmaps, all share a common intuition: all these methods try to capture the local manifold geometry by constructing an adjacency graph on the sampled data. They all benefit from the observation that inference done on this neighborhood graph corresponds approximately to inference on the underlying manifold.
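To make this shared intuition concrete, here is a small sketch (ours, not from the paper) that builds a symmetric k-nearest-neighbor graph on points sampled from a 1-manifold and forms the unnormalized graph Laplacian, the object whose spectrum the Laplacian eigenmaps of section 3 operate on; the sample, the value of k, and the unnormalized variant are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sample n points from a 1-manifold (a circle) sitting in R^3.
n, k = 200, 8
t = np.sort(rng.uniform(0, 2 * np.pi, size=n))
X = np.stack([np.cos(t), np.sin(t), np.zeros(n)], axis=1)

# Symmetric k-nearest-neighbor adjacency graph on the sampled data.
diff = X[:, None, :] - X[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))
np.fill_diagonal(dist, np.inf)                  # exclude self-neighbors
neighbors = np.argsort(dist, axis=1)[:, :k]     # indices of the k closest samples
W = np.zeros((n, n))
W[np.repeat(np.arange(n), k), neighbors.ravel()] = 1.0
W = np.maximum(W, W.T)                          # i ~ j if either is a neighbor of the other

# Unnormalized graph Laplacian L = D - W, a discrete stand-in for the
# Laplace-Beltrami operator on the underlying manifold.
L = np.diag(W.sum(axis=1)) - W
eigvals = np.linalg.eigvalsh(L)
print("smallest Laplacian eigenvalues:", np.round(eigvals[:4], 4))
```

For a connected neighborhood graph the smallest eigenvalue is 0, and the eigenvectors belonging to the next few smallest eigenvalues supply the low-dimensional embedding coordinates.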