Lecture 5: Isotropic Grassmannians
Recommended publications
The Grassmann Manifold
1. For vector spaces V and W denote by L(V, W) the vector space of linear maps from V to W. Thus L(R^k, R^n) may be identified with the space R^{n×k} of n × k matrices. An injective linear map u : R^k → V is called a k-frame in V. The set

GF_{k,n} = { u ∈ L(R^k, R^n) : rank(u) = k }

of k-frames in R^n is called the Stiefel manifold. Note that the special case k = n is the general linear group:

GL_k = { a ∈ L(R^k, R^k) : det(a) ≠ 0 }.

The set of all k-dimensional (vector) subspaces λ ⊂ R^n is called the Grassmann manifold of k-planes in R^n and denoted by GR_{k,n}, or sometimes GR_{k,n}(R) or GR_k(R^n). Let

π : GF_{k,n} → GR_{k,n},   π(u) = u(R^k)

denote the map which assigns to each k-frame u the subspace u(R^k) it spans. For λ ∈ GR_{k,n} the fiber (preimage) π^{-1}(λ) consists of those k-frames which form a basis for the subspace λ, i.e. for any u ∈ π^{-1}(λ) we have

π^{-1}(λ) = { u ∘ a : a ∈ GL_k }.

Hence we can (and will) view GR_{k,n} as the orbit space of the group action

GF_{k,n} × GL_k → GF_{k,n} : (u, a) ↦ u ∘ a.

The exercises below will prove the following

Theorem 2. The Stiefel manifold GF_{k,n} is an open subset of the set R^{n×k} of all n × k matrices. There is a unique differentiable structure on the Grassmann manifold GR_{k,n} such that the map π is a submersion.
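As a concrete companion to the notes above (a NumPy sketch of my own, not taken from the source), the following checks that a k-frame u and the frame u ∘ a, for invertible a, project to the same point of GR_{k,n}, by comparing the orthogonal projections onto their column spaces:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 3

# A k-frame in R^n: an injective linear map R^k -> R^n, i.e. an n x k matrix of rank k.
u = rng.standard_normal((n, k))
assert np.linalg.matrix_rank(u) == k          # u lies in the Stiefel manifold GF_{k,n}

# Act by an invertible a in GL_k: u and u @ a are different frames ...
a = rng.standard_normal((k, k))
assert abs(np.linalg.det(a)) > 1e-12
v = u @ a

# ... but they span the same k-plane, i.e. pi(u) = pi(v) in GR_{k,n}.
# Compare column spaces via their orthogonal projection matrices, which depend
# only on the subspace, not on the chosen frame.
def projector(frame):
    q, _ = np.linalg.qr(frame)                # orthonormal basis of the column space
    return q @ q.T

assert np.allclose(projector(u), projector(v))
print("u and u @ a represent the same point of GR_{3,5}")
```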
2 Hilbert Spaces
• A complex Hilbert space H is a complete normed space over C whose norm is derived from an inner product. That is, we assume that there is a sesquilinear form (·, ·) : H × H → C, linear in the first variable and conjugate linear in the second, such that

(f, g) = \overline{(g, f)},   (f, f) ≥ 0 for all f ∈ H,   and (f, f) = 0 ⇒ f = 0.

The norm and inner product are related by (f, f) = ‖f‖². We will always assume that H is separable (has a countable dense subset).

• As usual for a normed space, the distance on H is given by d(f, g) = ‖f − g‖ = √((f − g, f − g)).

• The Cauchy-Schwarz and triangle inequalities,

|(f, g)| ≤ ‖f‖ ‖g‖,   ‖f + g‖ ≤ ‖f‖ + ‖g‖,

can be derived fairly easily from the inner product.

You should have seen some examples last semester. The simplest (finite-dimensional) example is C^n with its standard inner product. It's worth recalling from linear algebra that if V is an n-dimensional (complex) vector space, then from any set of n linearly independent vectors we can manufacture an orthonormal basis e_1, e_2, ..., e_n using the Gram-Schmidt process. In terms of this basis we can write any v ∈ V in the form

v = Σ a_i e_i,   a_i = (v, e_i),

which can be derived by taking the inner product of the equation v = Σ a_i e_i with e_i. We also have

‖v‖² = Σ_{i=1}^n |a_i|².

• Standard infinite-dimensional examples are l²(N) or l²(Z), the space of square-summable sequences, and L²(Ω) where Ω is a measurable subset of R^n.

2.1 Orthogonality
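A minimal sketch of the Gram-Schmidt process mentioned above (illustrative code, not part of the lecture notes), using the same convention that the inner product is linear in the first variable and conjugate linear in the second:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors in C^n with respect to the
    standard inner product (x, y) = sum_i x_i * conj(y_i)."""
    basis = []
    for v in vectors:
        w = v.astype(complex)
        for e in basis:
            w = w - np.vdot(e, w) * e       # subtract the component (w, e) e along e
        basis.append(w / np.linalg.norm(w))
    return basis

rng = np.random.default_rng(1)
vecs = [rng.standard_normal(4) + 1j * rng.standard_normal(4) for _ in range(4)]
e = gram_schmidt(vecs)

# Expansion of an arbitrary v: v = sum_i a_i e_i with a_i = (v, e_i).
v = rng.standard_normal(4) + 1j * rng.standard_normal(4)
a = [np.vdot(ei, v) for ei in e]            # np.vdot conjugates its first argument
assert np.allclose(sum(ai * ei for ai, ei in zip(a, e)), v)
assert np.isclose(np.linalg.norm(v) ** 2, sum(abs(ai) ** 2 for ai in a))  # ||v||^2 = sum |a_i|^2
```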
Classification on the Grassmannians: Theory and Applications
Jen-Mei Chang
Department of Mathematics and Statistics, California State University, Long Beach
[email protected]
AMS Fall Western Section Meeting, Special Session on Metric and Riemannian Methods in Shape Analysis
October 9, 2010 at UCLA

Outline
1 Geometric Framework: Evolution of Classification Paradigms; Grassmann Framework; Grassmann Separability
2 Some Empirical Results: Illumination; Illumination + Low Resolutions
3 Compression on G(k, n): Motivations, Definitions, and Algorithms; Karcher Compression for Face Recognition

Architectures
[Figure: classification architectures. Historically: single-to-single and single-to-many matching of individual images x^(1), ..., x^(N) against a model P via distances d(P, x^(i)). Currently: subspace-to-subspace and many-to-many matching, with image sets X^(1), ..., X^(N) projected (e.g. via eigenfaces) to points P, q on the Grassmann manifold.]

Some Approaches
• Single-to-Single
  1 Euclidean distance of feature points.
  2 Correlation.
• Single-to-Many
  1 Subspace method [Oja, 1983].
  2 Eigenfaces (a.k.a. Principal Component Analysis, KL-transform) [Sirovich & Kirby, 1987], [Turk & Pentland, 1991].
  3 Linear/Fisher Discriminant Analysis, Fisherfaces [Belhumeur et al., 1997].
  4 Kernel PCA [Yang et al., 2000].
• Many-to-Many
  1 Tangent Space and Tangent Distance: Tangent Distance [Simard et al., 2001], Joint Manifold Distance [Fitzgibbon & Zisserman, 2003], Subspace Distance [Chang, 2004].
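For context, the subspace-to-subspace comparisons in the talk can be realized through principal angles between column spaces. The following is an illustrative sketch (not code from the slides) of a geodesic distance on G(k, n) computed that way:

```python
import numpy as np

def grassmann_distance(A, B):
    """Geodesic distance on G(k, n) between the column spaces of the
    n x k matrices A and B, computed from the principal angles."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    # Singular values of Qa^T Qb are the cosines of the principal angles.
    cosines = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    angles = np.arccos(np.clip(cosines, -1.0, 1.0))
    return np.linalg.norm(angles)

rng = np.random.default_rng(2)
n, k = 100, 5                                   # e.g. k images of n pixels per subject
A = rng.standard_normal((n, k))
B = A + 0.1 * rng.standard_normal((n, k))       # a nearby image set
C = rng.standard_normal((n, k))                 # an unrelated image set
print(grassmann_distance(A, B), "<", grassmann_distance(A, C))
```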
Residues Formulas for the Push-Forward in K-Theory, the Case of G2/P (arXiv:1711.05949v2 [math.RT], 27 May 2018)
Andrzej Weber and Magdalena Zielenkiewicz

Abstract. We study residue formulas for push-forward in the equivariant K-theory of homogeneous spaces. For the classical Grassmannian the residue formula can be obtained from the cohomological formula by a substitution. We also give another proof using symplectic reduction and the method involving the localization theorem of Jeffrey-Kirwan. We review formulas for other classical groups, and we derive them from the formulas for the classical Grassmannian. Next we consider the homogeneous spaces for the exceptional group G2. One of them, G2/P2, corresponding to the shorter root, embeds in the Grassmannian Gr(2, 7). We find its fundamental class in the equivariant K-theory K_T(Gr(2, 7)). This allows us to derive a residue formula for the push-forward. It has a significantly different character compared to the classical groups case. The factor involving the fundamental class of G2/P2 depends on the equivariant variables. To perform computations more efficiently we apply the basis of K-theory consisting of Grothendieck polynomials. The residue formula for the push-forward remains valid for the homogeneous space G2/B as well.

1. Introduction

Suppose an algebraic torus T = (C^*)^r acts on a complex variety X which is smooth and complete. Let E be an equivariant complex vector bundle over X. Then E defines an element of the equivariant K-theory of X. In our setting there is no difference whether one considers the algebraic K-theory or the topological one.
Orthogonal Complements (Revised Version)
Math 108A: May 19, 2010
John Douglas Moore

1 The dot product

You will recall that the dot product was discussed in earlier calculus courses. If x = (x_1, ..., x_n) and y = (y_1, ..., y_n) are elements of R^n, we define their dot product by

x · y = x_1 y_1 + ··· + x_n y_n.

The dot product satisfies several key axioms:

1. it is symmetric: x · y = y · x;
2. it is bilinear: (a x + x′) · y = a(x · y) + x′ · y;
3. and it is positive-definite: x · x ≥ 0, and x · x = 0 if and only if x = 0.

The dot product is an example of an inner product on the vector space V = R^n over R; inner products will be treated thoroughly in Chapter 6 of [1]. Recall that the length of an element x ∈ R^n is defined by

|x| = √(x · x).

Note that the length of an element x ∈ R^n is always nonnegative.

Cauchy-Schwarz Theorem. If x ≠ 0 and y ≠ 0, then

−1 ≤ (x · y) / (|x| |y|) ≤ 1.    (1)

Sketch of proof: If v is any element of R^n, then v · v ≥ 0. Hence

(x(y · y) − y(x · y)) · (x(y · y) − y(x · y)) ≥ 0.

Expanding using the axioms for dot product yields

(x · x)(y · y)² − 2(x · y)²(y · y) + (x · y)²(y · y) ≥ 0

or

(x · x)(y · y)² ≥ (x · y)²(y · y).

Dividing by y · y, we obtain

|x|² |y|² ≥ (x · y)²   or   (x · y)² / (|x|² |y|²) ≤ 1,

and (1) follows by taking the square root.
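As a quick numerical illustration of the theorem and of the key inequality in the sketch of proof (my own check, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(3)
for _ in range(1000):
    x = rng.standard_normal(7)
    y = rng.standard_normal(7)

    # |x| = sqrt(x . x); the normalized dot product must lie in [-1, 1].
    cos_theta = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    assert -1.0 - 1e-12 <= cos_theta <= 1.0 + 1e-12

    # The inequality obtained in the sketch of proof: (x . x)(y . y) >= (x . y)^2.
    assert np.dot(x, x) * np.dot(y, y) >= np.dot(x, y) ** 2 - 1e-12

print("Cauchy-Schwarz verified on 1000 random pairs")
```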
Filtering Grassmannian Cohomology Via K-Schur Functions
The 2020 Polymath Jr. "Q-B-and-G" Group

Abstract. This paper is concerned with studying the Hilbert series of the subalgebras of the cohomology rings of the complex Grassmannians and Lagrangian Grassmannians. We build upon a previous conjecture by Reiner and Tudose for the complex Grassmannian and present an analogous conjecture for the Lagrangian Grassmannian. Additionally, we summarize several potential approaches involving Gröbner bases, homological algebra, and algebraic topology. Then we give a new interpretation to the conjectural expression for the complex Grassmannian by utilizing k-conjugation. This leads to two conjectural bases that would imply the original conjecture for the Grassmannian. Finally, we comment on how this new approach foreshadows the possible existence of analogous k-Schur functions in other Lie types.

Contents
1. Introduction
2. Conjectures
  2.1. The Grassmannian
  2.2. The Lagrangian Grassmannian
3. A Gröbner Basis Approach
  3.1. Patterns, Patterns and more Patterns
  3.2. A Lex-greedy Approach
4. An Approach Using Natural Maps Between Rings
  4.1. Maps Between Grassmannians
  4.2. Maps Between Lagrangian Grassmannians
  4.3. Diagrammatic Reformulation for the Two Main Conjectures
  4.4. Approach via Coefficient-Wise Inequalities
  4.5. Recursive Formula for R_{k,ℓ}
  4.6. A "q-Pascal Short Exact Sequence"
  4.7. Recursive Formula for R_{LG}^{n,m}
  4.8. Doubly Filtered Basis
5. Schur and k-Schur Functions
  5.1. Schur and Pieri for the Grassmannian
  5.2. Schur and Pieri for the Lagrangian Grassmannian

Date: August 2020.
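For orientation (a standard background fact about the full cohomology ring, not a statement about the subalgebras or conjectures studied in the paper): the Hilbert series of H*(Gr(k, n); Q), with q standing for the degree-2 generator, is the Gaussian binomial coefficient [n choose k]_q, the generating function of partitions fitting in a k × (n−k) box. A small sketch computing its coefficient list via the q-Pascal rule:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def q_binomial(n, k):
    """Coefficient list (indexed by q-degree) of the Gaussian binomial [n choose k]_q,
    computed with the q-Pascal rule [n, k] = [n-1, k-1] + q^k [n-1, k]."""
    if k == 0 or k == n:
        return [1]
    a = q_binomial(n - 1, k - 1)
    b = q_binomial(n - 1, k)
    out = [0] * max(len(a), len(b) + k)
    for d, c in enumerate(a):
        out[d] += c
    for d, c in enumerate(b):
        out[d + k] += c
    return out

# Hilbert series of H*(Gr(2, 4); Q): 1 + q + 2q^2 + q^3 + q^4.
print(q_binomial(4, 2))        # [1, 1, 2, 1, 1]
print(sum(q_binomial(4, 2)))   # 6 = C(4, 2), the number of Schubert cells
```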
§4 Grassmannians
4.1 Plücker quadric and Gr(2,4). Let V be a 4-dimensional vector space. The set of all 2-dimensional vector subspaces U ⊂ V is called the grassmannian Gr(2, 4) = Gr(2, V). More geometrically, the grassmannian Gr(2, V) is the set of all lines ℓ ⊂ P_3 = P(V). Sending a 2-dimensional vector subspace U ⊂ V to the 1-dimensional subspace Λ²U ⊂ Λ²V or, equivalently, sending a line (ab) ⊂ P(V) to 𝕜·a∧b ⊂ Λ²V, we get the Plücker embedding

픲 : Gr(2, 4) ↪ P_5 = P(Λ²V).    (4-1)

Its image consists of all decomposable grassmannian quadratic forms ω = a∧b, a, b ∈ V. Clearly, any such form has zero square: ω∧ω = a∧b∧a∧b = 0. Since an arbitrary form ξ ∈ Λ²V can be written in an appropriate basis of V either as e_1∧e_2 or as e_1∧e_2 + e_3∧e_4, and in the latter case ξ is not decomposable because ξ∧ξ = 2 e_1∧e_2∧e_3∧e_4 ≠ 0, we conclude that ω ∈ Λ²V is decomposable if and only if ω∧ω = 0. Thus, the image of (4-1) is the Plücker quadric

P ≝ { ω ∈ Λ²V | ω∧ω = 0 }.    (4-2)

If we choose a basis e_1, e_2, e_3, e_4 ∈ V, the monomial basis e_{ij} = e_i∧e_j in Λ²V, and write x_{ij} for the homogeneous coordinates along e_{ij}, then the straightforward computation

(Σ_{i<j} x_{ij} e_i∧e_j) ∧ (Σ_{i<j} x_{ij} e_i∧e_j) = 2 (x_{12}x_{34} − x_{13}x_{24} + x_{14}x_{23}) e_1∧e_2∧e_3∧e_4

implies that P is given by the non-degenerate quadratic equation x_{13}x_{24} = x_{12}x_{34} + x_{14}x_{23}.
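A numerical sketch of the Plücker embedding just described (illustrative, not from the text): compute the six coordinates x_{ij} of a∧b for a 2-plane in R^4, check the quadric relation, and check that changing the basis of the plane only rescales the coordinates.

```python
import numpy as np
from itertools import combinations

def pluecker(a, b):
    """Plücker coordinates x_ij (i < j, 0-based) of the 2-plane spanned by a, b in R^4,
    i.e. the coefficients of a ^ b in the basis e_i ^ e_j."""
    return {(i, j): a[i] * b[j] - a[j] * b[i] for i, j in combinations(range(4), 2)}

rng = np.random.default_rng(4)
a, b = rng.standard_normal(4), rng.standard_normal(4)
x = pluecker(a, b)

# The image lies on the Plücker quadric: x12*x34 - x13*x24 + x14*x23 = 0.
lhs = x[(0, 1)] * x[(2, 3)] - x[(0, 2)] * x[(1, 3)] + x[(0, 3)] * x[(1, 2)]
assert abs(lhs) < 1e-12

# Another basis of the same plane rescales every x_ij by the determinant of the
# change of basis, so the point of P_5 = P(Lambda^2 V) is well defined.
c, d = 2.0 * a + 3.0 * b, -1.0 * a + 1.0 * b
y = pluecker(c, d)
key = max(x, key=lambda t: abs(x[t]))
ratio = y[key] / x[key]                        # should equal det [[2, 3], [-1, 1]] = 5
assert all(abs(y[t] - ratio * x[t]) < 1e-9 for t in x)
```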
MATH 110: Linear Algebra (Fall 2007/08), Problem Set 9 Solutions
In the following V will denote a finite-dimensional vector space over R that is also an inner product space with inner product denoted by ⟨·, ·⟩. The norm induced by ⟨·, ·⟩ will be denoted ‖·‖.

1. Let S be a subset (not necessarily a subspace) of V. We define the orthogonal annihilator of S, denoted S^⊥, to be the set

S^⊥ = { v ∈ V | ⟨v, w⟩ = 0 for all w ∈ S }.

(a) Show that S^⊥ is always a subspace of V.

Solution. Let v_1, v_2 ∈ S^⊥. Then ⟨v_1, w⟩ = 0 and ⟨v_2, w⟩ = 0 for all w ∈ S. So for any α, β ∈ R,

⟨αv_1 + βv_2, w⟩ = α⟨v_1, w⟩ + β⟨v_2, w⟩ = 0

for all w ∈ S. Hence αv_1 + βv_2 ∈ S^⊥.

(b) Show that S ⊆ (S^⊥)^⊥.

Solution. Let w ∈ S. For any v ∈ S^⊥, we have ⟨v, w⟩ = 0 by definition of S^⊥. Since this is true for all v ∈ S^⊥ and ⟨w, v⟩ = ⟨v, w⟩, we see that ⟨w, v⟩ = 0 for all v ∈ S^⊥, i.e. w ∈ (S^⊥)^⊥.

(c) Show that span(S) ⊆ (S^⊥)^⊥.

Solution. Since (S^⊥)^⊥ is a subspace by (a) (it is the orthogonal annihilator of S^⊥) and S ⊆ (S^⊥)^⊥ by (b), we have span(S) ⊆ (S^⊥)^⊥.

(d) Show that if S_1 and S_2 are subsets of V and S_1 ⊆ S_2, then S_2^⊥ ⊆ S_1^⊥.

Solution. Let v ∈ S_2^⊥. Then ⟨v, w⟩ = 0 for all w ∈ S_2 and so for all w ∈ S_1 (since S_1 ⊆ S_2).
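A small numerical companion to these exercises (not part of the problem set): S^⊥ can be computed as the null space of the matrix whose rows are the elements of S, and parts (b)-(c) can then be checked on an example.

```python
import numpy as np

def orthogonal_annihilator(S, tol=1e-10):
    """Orthonormal basis (as columns) of S^perp = {v : <v, w> = 0 for all w in S},
    computed as the null space of the matrix whose rows are the elements of S."""
    A = np.atleast_2d(np.array(S, dtype=float))
    _, sing, Vt = np.linalg.svd(A)
    rank = int(np.sum(sing > tol))
    return Vt[rank:].T                 # right singular vectors for the zero singular values

rng = np.random.default_rng(5)
S = [rng.standard_normal(5) for _ in range(2)]   # a 2-element subset of R^5
P = orthogonal_annihilator(S)                    # basis of S^perp, here 3-dimensional

# Every element of S is orthogonal to every basis vector of S^perp.
assert np.allclose(np.array(S) @ P, 0.0)

# (b)/(c): span(S) is contained in (S^perp)^perp; since S here is linearly
# independent, the two subspaces coincide.
Q = orthogonal_annihilator(list(P.T))            # basis of (S^perp)^perp
proj = Q @ Q.T                                   # orthogonal projection onto (S^perp)^perp
assert np.allclose(proj @ np.array(S).T, np.array(S).T)
```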
8. Grassmannians
After having introduced (projective) varieties — the main objects of study in algebraic geometry — let us now take a break in our discussion of the general theory to construct an interesting and useful class of examples of projective varieties. The idea behind this construction is simple: since the definition of projective spaces as the sets of 1-dimensional linear subspaces of K^n turned out to be a very useful concept, let us now generalize this and consider instead the sets of k-dimensional linear subspaces of K^n for an arbitrary k = 0, ..., n.

Definition 8.1 (Grassmannians). Let n ∈ N_{>0}, and let k ∈ N with 0 ≤ k ≤ n. We denote by G(k,n) the set of all k-dimensional linear subspaces of K^n. It is called the Grassmannian of k-planes in K^n.

Remark 8.2. By Example 6.12 (b) and Exercise 6.32 (a), the correspondence of Remark 6.17 shows that k-dimensional linear subspaces of K^n are in natural one-to-one correspondence with (k−1)-dimensional linear subspaces of P^{n−1}. We can therefore consider G(k,n) alternatively as the set of such projective linear subspaces. As the dimensions k and n are reduced by 1 in this way, our Grassmannian G(k,n) of Definition 8.1 is sometimes written in the literature as G(k−1, n−1) instead. Of course, as in the case of projective spaces our goal must again be to make the Grassmannian G(k,n) into a variety — in fact, we will see that it is even a projective variety in a natural way.
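To preview how G(k,n) acquires coordinates (an illustrative sketch of my own, not from Gathmann's notes, computed over R): a k-plane, written as the row space of a k × n matrix whose leading k × k block is invertible, has a unique spanning matrix of the form (I_k | B), and the entries of the k × (n−k) matrix B serve as affine coordinates on that chart, so dim G(k,n) = k(n−k).

```python
import numpy as np

def chart_coordinates(A):
    """Affine-chart coordinates of the row space of the k x n matrix A, assuming the
    leading k x k block is invertible: the unique representative (I_k | B) gives B."""
    k = A.shape[0]
    head, tail = A[:, :k], A[:, k:]
    return np.linalg.solve(head, tail)       # B = head^{-1} @ tail, shape k x (n - k)

rng = np.random.default_rng(6)
k, n = 2, 5
A = rng.standard_normal((k, n))              # a generic 2-plane in R^5, given as a row space
g = rng.standard_normal((k, k))              # change of basis of the same plane
B1, B2 = chart_coordinates(A), chart_coordinates(g @ A)

# The coordinates depend only on the plane, not on the chosen basis,
# and there are k(n - k) of them: the dimension of G(k, n).
assert np.allclose(B1, B2)
assert B1.size == k * (n - k)
```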
Grassmannians Via Projection Operators and Some of Their Special Submanifolds ∗
Ivko Dimitrić
Department of Mathematics, Penn State University Fayette, Uniontown, PA 15401-0519, USA
E-mail: [email protected]

We study submanifolds of a general Grassmann manifold which are of 1-type in a suitably defined Euclidean space of F-Hermitian matrices (such a submanifold is minimal in some hypersphere of that space and, apart from a translation, the immersion is built using eigenfunctions from a single eigenspace of the Laplacian). We show that a minimal 1-type hypersurface of a Grassmannian is mass-symmetric and has the type-eigenvalue which is a specific number in each dimension. We derive some conditions on the mean curvature of a 1-type hypersurface of a Grassmannian and show that such a hypersurface with a nonzero constant mean curvature has at most three constant principal curvatures. At the end, we give some estimates of the first nonzero eigenvalue of the Laplacian on a compact submanifold of a Grassmannian.

1. Introduction

The Grassmann manifold of k-dimensional subspaces of the vector space F^m over a field F ∈ {R, C, H} can be conveniently defined as the base manifold of the principal fibre bundle of the corresponding Stiefel manifold of F-unitary frames, where a frame is projected onto the F-plane spanned by it. As observed already in 1927 by Cartan, all Grassmannians are symmetric spaces and they have been studied in such context ever since, using the root systems and the Lie algebra techniques. However, the submanifold geometry in Grassmannians of higher rank is much less developed than the corresponding geometry in rank-1 symmetric spaces, the notable exceptions being the classification of the maximal totally geodesic submanifolds (the well known results of J.A.
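The projection-operator picture in the title can be made concrete as follows (an illustrative sketch for F = C, not code from the paper): a k-plane in C^m with orthonormal basis given by the columns of U corresponds to the Hermitian idempotent P = UU*, independently of the chosen orthonormal basis.

```python
import numpy as np

def plane_to_projector(U):
    """Hermitian projection operator onto the column space of U (orthonormal columns)."""
    return U @ U.conj().T

rng = np.random.default_rng(7)
m, k = 6, 2
X = rng.standard_normal((m, k)) + 1j * rng.standard_normal((m, k))
U, _ = np.linalg.qr(X)                       # orthonormal basis of a k-plane in C^m
P = plane_to_projector(U)

assert np.allclose(P, P.conj().T)            # Hermitian
assert np.allclose(P @ P, P)                 # idempotent
assert np.isclose(np.trace(P).real, k)       # trace = dimension of the plane

# A different orthonormal basis of the same plane (U times a unitary) gives the same P.
W, _ = np.linalg.qr(rng.standard_normal((k, k)) + 1j * rng.standard_normal((k, k)))
assert np.allclose(plane_to_projector(U @ W), P)
```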
Essential Concepts of Projective Geometry
Course Notes, MA 561, Purdue University, August 1973
Corrected and supplemented, August 1978
Reprinted and revised, 2007
Department of Mathematics, University of California, Riverside, 2007

Table of Contents
Preface
Prerequisites
Suggestions for using these notes
I. Synthetic and analytic geometry
  1. Axioms for Euclidean geometry
  2. Cartesian coordinate interpretations
  3. Lines and planes in R^2 and R^3
II. Affine geometry
  1. Synthetic affine geometry
  2. Affine subspaces of vector spaces
  3. Affine bases
  4. Properties of coordinate
A Guided Tour to the Plane-Based Geometric Algebra PGA
Leo Dorst, University of Amsterdam
Version 1.15, July 6, 2020

Planes are the primitive elements for the constructions of objects and operators in Euclidean geometry. Triangulated meshes are built from them, and reflections in multiple planes are a mathematically pure way to construct Euclidean motions. A geometric algebra based on planes is therefore a natural choice to unify objects and operators for Euclidean geometry. The usual claims of 'completeness' of the GA approach lead us to hope that it might contain, in a single framework, all representations ever designed for Euclidean geometry, including normal vectors, directions as points at infinity, Plücker coordinates for lines, quaternions as 3D rotations around the origin, dual quaternions for rigid body motions, and even spinors.

This text provides a guided tour to this algebra of planes, PGA. It indeed shows how all such computationally efficient methods are incorporated and related. We will see how the PGA elements naturally group into blocks of four coordinates in an implementation, and how this more complete understanding of the embedding suggests some handy choices to avoid extraneous computations. In the unified PGA framework, one never switches between efficient representations for subtasks, and this obviously saves any time spent on data conversions. Relative to other treatments of PGA, this text is rather light on the mathematics. Where you see careful derivations, they involve the aspects of orientation and magnitude. These features have been neglected by authors focussing on the mathematical beauty of the projective nature of the algebra.
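To illustrate the claim that reflections in planes build Euclidean motions (a plain matrix sketch, not PGA code from the text): composing reflections in two planes through the origin gives a rotation about their intersection line by twice the angle between the planes.

```python
import numpy as np

def reflection(normal):
    """Matrix of the reflection in the plane through the origin with the given normal."""
    n = normal / np.linalg.norm(normal)
    return np.eye(3) - 2.0 * np.outer(n, n)

n1 = np.array([1.0, 0.0, 0.0])
n2 = np.array([np.cos(np.pi / 6), np.sin(np.pi / 6), 0.0])   # plane tilted by 30 degrees

R = reflection(n2) @ reflection(n1)          # reflect in plane 1, then in plane 2

# R is a rotation: orthogonal with determinant +1 ...
assert np.allclose(R.T @ R, np.eye(3)) and np.isclose(np.linalg.det(R), 1.0)
# ... about the intersection line of the two planes (the z-axis here) ...
assert np.allclose(R @ np.array([0.0, 0.0, 1.0]), [0.0, 0.0, 1.0])
# ... by twice the 30-degree dihedral angle: trace = 1 + 2 cos(60 degrees).
assert np.isclose(np.trace(R), 1.0 + 2.0 * np.cos(np.pi / 3))
```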