A Manifold Approach to Learning Mutually Orthogonal Subspaces

Stephen Giguere [email protected]
Francisco Garcia [email protected]
College of Information and Computer Sciences, University of Massachusetts, Amherst, Massachusetts 01003

Sridhar Mahadevan [email protected]
Stanford Research Institute, Menlo Park, California 94025

Abstract

Although many machine learning algorithms involve learning subspaces with particular characteristics, optimizing a parameter matrix that is constrained to represent a subspace can be challenging. One solution is to use Riemannian optimization methods that enforce such constraints implicitly, leveraging the fact that the feasible parameter values form a manifold. While Riemannian methods exist for some specific problems, such as learning a single subspace, there are more general subspace constraints that offer additional flexibility when setting up an optimization problem but have not been formulated as a manifold. We propose the partitioned subspace (PS) manifold for optimizing matrices that are constrained to represent one or more subspaces. Each point on the manifold defines a partitioning of the input space into mutually orthogonal subspaces, where the number of partitions and their sizes are defined by the user. As a result, distinct groups of features can be learned by defining different objective functions for each partition. We illustrate the properties of the manifold through experiments on multiple dataset analysis and domain adaptation.

1. Introduction

The process of designing a model and learning its parameters by numerically optimizing a loss function is a cornerstone of machine learning. In this setting, it is common to place constraints on the model's parameters to ensure that they are valid, to promote desirable characteristics such as sparsity, to incorporate domain knowledge, or to make learning more efficient. Because constraints can significantly impact the difficulty of an optimization problem, it is useful to understand the properties of particular types of constraints and to develop optimization techniques that preserve them.

Recently, there has been interest in developing Riemannian optimization methods that enforce certain constraints by leveraging the geometry of the parameters that satisfy them, called the feasible set. Specifically, if the feasible set forms a smooth manifold in the original parameter space, then Riemannian optimization can be applied. Unlike other strategies for enforcing constraints, such as augmenting the loss function with penalty terms or projecting the parameters back onto the feasible set, this approach eliminates the need to deal with constraints explicitly by performing optimization on the constraint manifold directly. In many cases, using Riemannian optimization can simplify algorithms, provide convergence guarantees on learning, and ensure that constraints are satisfied exactly rather than approximately.

In this work, we investigate Riemannian optimization methods for enforcing subspace constraints, which we define as any constraint that forces a matrix of parameters to represent one or more subspaces. There are two commonly used subspace constraints with well-established Riemannian optimization methods, provided in (Edelman, Arias, and Smith, 1998). The first is applicable when optimizing over matrices that define k-dimensional bases in an n-dimensional ambient space. In this case, the feasible set consists of all n × k matrices Y that satisfy Y^T Y = I, which corresponds to the Stiefel manifold. If the optimization is instead taken over all distinct k-dimensional subspaces, then in addition to the constraint that Y^T Y = I, two matrices must be considered identical if they have the same span. This second condition is important because it implies that during optimization, estimates must be updated not only to change the parameter matrix, but to do so in a way that changes the subspace it represents. The feasible set for this constraint corresponds to the Grassmannian manifold, which has proven useful for a large number of applications, including background separation in video, human activity analysis, subspace tracking, and others (He, Balzano, and Lui, 2011; Turaga and Chellappa, 2009; He, Balzano, and Szlam, 2012).
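To make these two feasible sets concrete, the following NumPy sketch (illustrative code, not from the paper) constructs a Stiefel point with a QR decomposition and then shows why the Grassmannian requires the extra identification: right-multiplying a basis Y by an orthogonal k × k matrix R gives a different Stiefel point that nevertheless spans the same subspace.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3

# Stiefel point: the Q factor of a reduced QR decomposition is an
# n x k matrix with orthonormal columns, so it satisfies Y^T Y = I.
Y, _ = np.linalg.qr(rng.standard_normal((n, k)))
assert np.allclose(Y.T @ Y, np.eye(k))

# Rotate the basis by an orthogonal k x k matrix R.
R, _ = np.linalg.qr(rng.standard_normal((k, k)))
Y2 = Y @ R

# Y2 is a different Stiefel point...
assert not np.allclose(Y, Y2)
# ...but it spans the same subspace: the projector Y Y^T depends only
# on the span, so Y and Y2 are the same point on the Grassmannian.
assert np.allclose(Y @ Y.T, Y2 @ Y2.T)
```

This identification is what forces Grassmannian update rules to move the span itself, rather than merely re-expressing the same subspace in a new basis.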
Figure 1. Visualization of Q matrices showing the subspaces encoded by the Stiefel, Grassmannian, and partitioned subspace manifolds. Within each matrix, the colored segments represent mutually orthogonal subspaces, and the values in each segment denote the dimensionality of the subspace. All three matrices are the same size and have k columns of interest, but the properties of their respective manifolds cause the subspaces to be partitioned differently. While the Stiefel manifold represents k one-dimensional subspaces and the Grassmannian represents a single k-dimensional subspace, the PS manifold allows the number of partitions and their sizes to be chosen as needed for a given problem.

While the constraints satisfied by optimizing on the Stiefel or Grassmannian manifolds are useful in practice, there are more general subspace constraints that are not captured by these manifolds and offer significant flexibility when setting up an optimization problem. To see this, first consider that a point on a Stiefel manifold can be viewed as a collection of k one-dimensional subspaces that are constrained to be mutually orthogonal. Similarly, a point on a Grassmannian manifold represents a single k-dimensional subspace. It is therefore natural to consider a general constraint in which both the number of subspaces and their sizes can be specified by the user. These relationships are shown in Figure 1. Importantly, subspace constraints of this form allow different sets of features to be learned according to different criteria. In analogy to optimization on the Stiefel and Grassmannian manifolds, we aim to optimize over distinct sets of m mutually orthogonal subspaces with sizes [k_1, k_2, ..., k_m], which we refer to as partitions.
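A feasible point for this more general constraint can be constructed in the same way as a Stiefel point. The sketch below (a hypothetical construction for illustration, not code from the paper) slices the columns of one orthonormal n × k matrix into blocks of sizes k_1, ..., k_m; because the full matrix has orthonormal columns, the blocks automatically span mutually orthogonal subspaces.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sizes = 8, [2, 3, 1]   # m = 3 partitions with sizes k_1, ..., k_m
k = sum(sizes)

# Any orthonormal n x k matrix induces a valid set of partitions.
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))

# Slice the columns into the m partitions.
splits = np.cumsum(sizes)[:-1]
partitions = np.split(Q, splits, axis=1)

# Mutual orthogonality: each block is an orthonormal basis, and the
# basis of one partition is orthogonal to the basis of every other.
for i, Yi in enumerate(partitions):
    for j, Yj in enumerate(partitions):
        G = Yi.T @ Yj
        expected = np.eye(sizes[i]) if i == j else np.zeros((sizes[i], sizes[j]))
        assert np.allclose(G, expected)
```

Only the span of each block matters for the partition it defines, so, as on the Grassmannian, rotating the columns within a block should be treated as leaving the point unchanged.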
In this paper, we introduce a novel manifold that generalizes the Stiefel and Grassmannian manifolds to implicitly enforce the general subspace constraints described above. Individual points on our proposed manifold define partitions of n-dimensional space into mutually orthogonal subspaces of particular sizes. In addition, we derive update rules for performing Riemannian optimization on the manifold. This allows features of the original space to be grouped in useful ways by defining separate objective functions on each subspace. For example, given two datasets X0 and X1, the linear features that best describe the two can easily be partitioned into a pair of subspaces containing features unique to each dataset and a subspace of features that are shared between them. Because of these characteristics, we refer to this manifold as the partitioned subspace (PS) manifold. Finally, we provide several examples using both real and synthetic data to illustrate how the manifold can be applied in practice, and to establish intuition for its properties.

Related work. The problem of learning parameters subject to subspace constraints has been investigated widely and for a variety of applications. In general, these applications use only simple subspace constraints, which effectively cause the feasible set to be either the Grassmannian or the Stiefel manifold. Optimization subject to general subspace constraints has been investigated much less. Kim, Kittler, and Cipolla (2010) proposed an algorithm for face recognition that is similar to our work in that they also learn a set of mutually orthogonal subspaces. However, their approach learns subspaces by incrementally updating and reorthogonalizing the subspace matrix, whereas we optimize on the constraint manifold itself using Riemannian optimization. Our work presents, to our knowledge, the first formulation of general subspace constraints as a manifold.

The rest of this paper is organized as follows. In section 2, we describe background concepts and establish notation. Next, we define the partitioned subspace manifold and provide the details necessary to use it for optimization in section 3. Finally, section 4 illustrates the various properties of the manifold and analyzes them through experiments on both real and synthetic data.

2. Background

Before defining the partitioned subspace manifold, we briefly introduce some necessary background concepts from differential geometry.

2.1. Riemannian Manifolds

A Riemannian manifold, or simply a manifold, can be described as a continuous set of points that appears locally Euclidean at every location. More specifically, a manifold is a topological space and a collection of differentiable, one-to-one mappings called charts. At any point on a d-dimensional manifold, there is a chart that maps a neighborhood containing that point to the Euclidean space R^d (Absil, Mahony, and Sepulchre, 2009).

Because they encode the relationship between matrices that map to the same point on the quotient, we refer to the sets E_St and E_Gr as the equivalence sets for their respective manifolds.

Representative elements. As defined above, points on the Stiefel and Grassmannian manifolds are equivalence classes [Q] of orthogonal matrices Q ∈ O_n. However, in practice, these points are represented by single n × k matrices Y.
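Assuming E_St and E_Gr take the standard block-diagonal forms from the quotient construction of (Edelman, Arias, and Smith, 1998), an assumption on our part since their definitions appear elsewhere in the paper, the following sketch shows how a single n × k representative Y stands in for an entire equivalence class [Q] of orthogonal matrices.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 5, 2

def rand_orth(m):
    """Random m x m orthogonal matrix."""
    Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
    return Q

def block_diag(A, B):
    """Block-diagonal matrix [[A, 0], [0, B]]."""
    out = np.zeros((A.shape[0] + B.shape[0], A.shape[1] + B.shape[1]))
    out[:A.shape[0], :A.shape[1]] = A
    out[A.shape[0]:, A.shape[1]:] = B
    return out

# A point [Q] is an equivalence class of full orthogonal matrices; the
# representative Y is the first k columns of any member Q.
Q = rand_orth(n)
Y = Q[:, :k]

# Assumed forms of the equivalence sets (following the standard quotient
# construction; the paper's exact definitions are not reproduced here):
#   Stiefel:      Q ~ Q @ block_diag(I_k, R2)  for R2 in O_{n-k}
#   Grassmannian: Q ~ Q @ block_diag(R1, R2)   for R1 in O_k, R2 in O_{n-k}
Q_st = Q @ block_diag(np.eye(k), rand_orth(n - k))
Q_gr = Q @ block_diag(rand_orth(k), rand_orth(n - k))

# Members of the same Stiefel class share the representative Y exactly.
assert np.allclose(Q_st[:, :k], Y)
# Members of the same Grassmannian class share only the span of Y.
assert np.allclose(Q_gr[:, :k] @ Q_gr[:, :k].T, Y @ Y.T)
```

Under these assumptions, any member of the class yields the same representative (Stiefel) or the same span (Grassmannian), which is why optimization can be carried out on n × k matrices rather than on full n × n orthogonal matrices.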
