Robust and Scalable Learning of Complex Intrinsic Dataset Geometry via ElPiGraph

Luca Albergante 1,2,3,4,*, Evgeny Mirkes 5,6, Jonathan Bac 1,2,3,7, Huidong Chen 8,9, Alexis Martin 1,2,3,10, Louis Faure 1,2,3,11, Emmanuel Barillot 1,2,3, Luca Pinello 8,9, Alexander Gorban 5,6 and Andrei Zinovyev 1,2,3,*

1 Institut Curie, PSL Research University, 75005 Paris, France; [email protected] (J.B.); [email protected] (A.M.); [email protected] (L.F.); [email protected] (E.B.)
2 INSERM U900, 75248 Paris, France
3 CBIO-Centre for Computational Biology, Mines ParisTech, PSL Research University, 75006 Paris, France
4 Sensyne Health, Oxford OX4 4GE, UK
5 Center for Mathematical Modeling, University of Leicester, Leicester LE1 7RH, UK; [email protected] (E.M.); [email protected] (A.G.)
6 Lobachevsky University, 603000 Nizhny Novgorod, Russia
7 Centre de Recherches Interdisciplinaires, Université de Paris, F-75000 Paris, France
8 Molecular Pathology Unit & Cancer Center, Massachusetts General Hospital Research Institute and Harvard Medical School, Boston, MA 02114, USA; [email protected] (H.C.); [email protected] (L.P.)
9 Broad Institute of MIT and Harvard, Cambridge, MA 02142, USA
10 ECE Paris, F-75015 Paris, France
11 Center for Brain Research, Medical University of Vienna, 22180 Vienna, Austria
* Correspondence: [email protected] (L.A.); [email protected] (A.Z.)

Received: 6 December 2019; Accepted: 2 March 2020; Published: 4 March 2020

Abstract: Multidimensional datapoint clouds representing large datasets are frequently characterized by non-trivial low-dimensional geometry and topology, which can be recovered by unsupervised machine learning approaches, in particular by principal graphs. Principal graphs approximate the multivariate data by a graph injected into the data space, with some constraints imposed on the node mapping. Here we present ElPiGraph, a scalable and robust method for constructing principal graphs.
ElPiGraph exploits and further develops the concept of elastic energy, the topological graph grammar approach, and a gradient descent-like optimization of the graph topology. The method is able to withstand high levels of noise and is capable of approximating data point clouds via principal graph ensembles. This strategy can be used to estimate the statistical significance of complex data features and to summarize them into a single consensus principal graph. ElPiGraph deals efficiently with large datasets in various fields such as biology, where it can be used, for example, with single-cell transcriptomic or epigenomic datasets to infer gene expression dynamics and recover differentiation landscapes.

Keywords: data approximation; principal graphs; principal trees; topological grammars; software

Entropy 2020, 22, 296; doi:10.3390/e22030296; www.mdpi.com/journal/entropy

1. Introduction

One of the major trends in modern machine learning is the increasing use of ideas borrowed from multi-dimensional and information geometry. Thus, geometric data analysis (GDA) approaches treat datasets of various nature (multi-modal measurements, images, graph embeddings) as clouds of points in multi-dimensional space equipped with appropriate distance or similarity measures [1]. Many classical methods of data analysis, starting from principal component analysis, correspondence analysis, K-means clustering, and their generalizations to non-Euclidean metrics, or non-linear objects such as Hastie's principal curves, can be considered part of GDA [2]. Topological data analysis (TDA) focuses on extracting persistent homologies of simplicial complexes derived from a datapoint cloud and also exploits data space geometry [3]. Information geometry, understood broadly, can be defined as the field that applies a geometrical approach to the study of spaces of statistical models and of the relations between these models and the data [4].
Local and global intrinsic dimensionalities are important geometrical characteristics of multi-dimensional data point clouds [5,6]. Large real-life datasets frequently contain both low-dimensional and essentially high-dimensional components. One of the useful concepts in modern data analysis is the complementarity principle formulated in [7]: the data space can be split into a low-volume (low-dimensional) subset, which requires nonlinear methods for constructing complex data approximators, and a high-dimensional subset, characterized by measure concentration and simplicity, allowing the effective application of linear methods. Manifold learning methods and their generalizations using different mathematical objects (such as graphs) aim at revealing and characterizing the geometry and topology of the low-dimensional part of the datapoint cloud.

Manifold learning methods model multidimensional data as a noisy sample from an underlying generating manifold, usually of a relatively small dimensionality. A classical linear manifold learning method is principal component analysis (PCA), introduced more than 100 years ago [8]. From the 1990s, multiple generalizations of PCA to non-linear manifolds have been suggested, including injective methods (when the manifold exists in the same space as the data themselves), such as self-organizing maps (SOMs) [9], elastic maps [10,11], and regularized principal curves and manifolds [12], and projective ones (when the projection forms its own space), such as ISOMAP [13], local linear embedding (LLE) [14], t-distributed stochastic neighbor embedding (t-SNE) [15], UMAP [16], and many others. The manifold hypothesis can be challenged by the complexity of real-life data, frequently characterized by clusters having complex shapes, branching or converging (i.e., forming loops) non-linear trajectories, regions of varying local dimensionality, a high level of noise, etc.
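The gap between linear and non-linear approximators can be illustrated with a minimal numerical sketch (a hypothetical example, not taken from the paper): applying PCA to a noisy one-dimensional curve embedded in 2-D. The curve has intrinsic dimensionality one, yet a single straight line cannot follow it, so the first principal component explains only part of the variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample a noisy 1-D non-linear curve (a half-circle) embedded in 2-D.
t = rng.uniform(0.0, np.pi, 500)
X = np.c_[np.cos(t), np.sin(t)] + rng.normal(scale=0.05, size=(500, 2))

# Linear manifold learning: PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

# The intrinsic dimensionality is 1, but because the generating curve is
# not a straight line, the first linear component falls well short of
# explaining all the variance.
print(explained)
```

For this half-circle the first component captures roughly 80–85% of the variance; a non-linear approximator such as a principal curve can follow the arc itself and leave only the noise unexplained.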
In many contexts, the rigorous definition of a manifold as the underlying model of data structure may be too restrictive, and it might be advantageous to construct data approximators in the form of more general mathematical varieties, for example, by gluing manifolds on their boundaries, thus introducing singularities which can correspond to branching points. Many methods currently used to learn complex non-linear data approximators are based on an auxiliary object called the k-nearest neighbor (kNN) graph, constructed by connecting each data point to its k closest (in a chosen metric) neighboring points, or similar objects such as ε-proximity graphs. These graphs can be used to re-define the similarity measures between data points (e.g., along the geodesic paths in the graph), and to infer the underlying data geometry.

By contrast, principal graphs, being generalizations of principal curves, are data approximations constructed by injecting graphs into the data space 'passing through the middle of data' and possessing some regular properties, such that the complexity of the injection map is constrained [17–19]. Construction of principal graphs might not involve computing kNN graph-like objects or a complete data distance matrix. Principal tree, a term coined by Gorban and Zinovyev in 2007, is the simplest and most tractable type of principal graph [20–23]. The historical sequence of the main steps of the principal graph methodology development, underlining the contribution and novelty of this paper, is depicted in Table 1.

Table 1. Elements of the principal graph methodology.
| Element | Initial Publication | Principal Advances |
|---|---|---|
| Principal curves | Hastie and Stuetzle, 1989 [24] | Definition of principal curves based on self-consistency |
| Piece-wise linear principal curves | Kégl et al., 1999 [25] | Length constrained principal curves, polygonal line algorithm |
| Elastic energy functional for elastic principal curves and manifolds | Gorban, Rossiev, Wunsch II, 1999 [26] | Fast iterative splitting algorithm to minimize the elastic energy, based on a sequence of solutions of simple quadratic minimization problems |
| Method of elastic maps | Zinovyev, 2000 [27], Gorban and Zinovyev, 2001 [28] | Construction of principal manifold approximations possessing various topologies |
| Principal Oriented Points | Delicado, 2001 [29] | Principal curves passing through a set of principal oriented points |
| Principal graphs specialized for image skeletonization | Kégl and Krzyzak, 2002 [30] | Coining the term principal graph; an algorithm extending the polygonal line algorithm, specialized on image skeletonization |
| Self-assembling principal graphs | Gorban and Zinovyev, 2005 [10] | Simple principal graph algorithm, based on application of the elastic map method, specialized on image skeletonization |
| General purpose elastic principal graphs | Gorban, Sumner, Zinovyev, 2007 [20] | Suggesting the principle of (pluri-)harmonic graph embedding; coining the terms 'principal tree' and 'principal cubic complex' with algorithms for their construction |
| Topological grammars | Gorban, Sumner, Zinovyev, 2007 [20] | Exploring multiple principal graph topologies via gradient descent-like search in the space of admissible structures |
| Explicit control of principal graph complexity | Gorban and Zinovyev, 2009 [17] | Introducing three types of principal graph complexity and ways to constrain it |
| Regularized principal graphs | Mao et al., 2015 [22] | Formulating the reverse graph embedding problem; suggesting the SimplePPT algorithm; further development in Mao et al., 2017 [19] |
| Robust principal graphs | Gorban, Mirkes, Zinovyev, 2015 | Using a trimmed version of the mean squared error, resulting in the 'local' growth of the principal graphs |
