
Lectures on Differential Geometry

Wulf Rossmann

(Updated October 2003)

To the student

This is a collection of lecture notes which I put together while teaching courses on manifolds, tensor analysis, and differential geometry. I offer them to you in the hope that they may help you, and to complement the lectures. The style is uneven, sometimes pedantic, sometimes sloppy, sometimes telegram style, sometimes long-winded, etc., depending on my mood when I was writing those particular lines. At least this set of notes is visibly finite. There are a great many meticulous and voluminous books written on the subject of these notes, and there is no point in writing another one of that kind. After all, we are talking about some fairly old mathematics, still useful, even essential, as a tool, and still fun, I think, at least some parts of it.

A comment about the nature of the subject (elementary differential geometry and tensor calculus) as presented in these notes. I see it as a natural continuation of analytic geometry and calculus. It provides some basic equipment, which is indispensable in many areas of mathematics (e.g. analysis, topology, differential equations, Lie groups) and physics (e.g. classical mechanics, general relativity, all kinds of field theories). If you want to have another view of the subject you should by all means look around, but I suggest that you don't attempt to use other sources to straighten out problems you might have with the material here. It would probably take you much longer to familiarize yourself sufficiently with another book to get your question answered than to work out your problem on your own.

Even though these notes are brief, they should be understandable to anybody who knows calculus and linear algebra to the extent usually seen in second-year courses. There are no difficult theorems here; it is rather a matter of providing a framework for various known concepts and theorems in a more general and more natural setting.
Unfortunately, this requires a large number of definitions and constructions which may be hard to swallow and even harder to digest. (In this subject the definitions are much harder than the theorems.) In any case, just by randomly leafing through the notes you will see many complicated-looking expressions. Don't be intimidated: this stuff is easy. When you looked at a calculus text for the first time in your life it probably looked complicated as well. Perhaps it will help to contemplate this piece of advice by Hermann Weyl from his classic Raum-Zeit-Materie of 1918 (my translation).

    Many will be horrified by the flood of formulas and indices which here drown the main idea of differential geometry (in spite of the author's honest effort for conceptual clarity). It is certainly regrettable that we have to enter into purely formal matters in such detail and give them so much space; but this cannot be avoided. Just as we have to spend laborious hours learning language and writing to freely express our thoughts, so the only way that we can lessen the burden of formulas here is to master the tool of tensor analysis to such a degree that we can turn to the real problems that concern us without being bothered by formal matters.

W. R.

Flow chart

1. Manifolds
   1.1 Review of linear algebra and calculus
   1.2 Manifolds: definitions and examples
   1.3 Vectors and differentials
   1.4 Submanifolds
   1.5 Riemann metrics
   1.6 Tensors

2. Connections and curvature
   2.1 Connections
   2.2 Geodesics
   2.3 Riemann curvature
   2.4 Gauss curvature
   2.5 Levi-Civita's connection
   2.6 Curvature identities

3. Calculus on manifolds
   3.1 Differential forms
   3.2 Differential calculus
   3.3 Integral calculus
   3.4 Lie derivatives

4. Special topics
   4.1 General Relativity
   4.2 The Schwarzschild metric
   4.3 The rotation group SO(3)
   4.4 Cartan's mobile frame
   4.5 Weyl's gauge theory paper of 1929

Chapter 3 is independent of chapter 2 and is used only in section 4.3.

Contents

1 Manifolds
2 Connections and curvature
3 Calculus on manifolds
4 Special Topics
Time chart
Annotated bibliography
Index

Chapter 1

Manifolds

1.1 Review of linear algebra and calculus

A. Linear algebra. A (real) vector space is a set V together with two operations, vector addition u + v (u, v ∈ V) and scalar multiplication αv (α ∈ R, v ∈ V). These operations have to satisfy those axioms you know (and can find spelled out in your linear algebra text). Example: R^n is the vector space of real n-tuples $(x^1, \cdots, x^n)$, $x^i \in \mathbb{R}$, with componentwise vector addition and scalar multiplication. The basic fact is that every vector space has a basis, meaning a set of vectors $\{v_i\}$ so that any other vector v can be written uniquely as a linear combination $\sum_i \alpha^i v_i$ of the $v_i$'s. We shall always assume that our space V is finite-dimensional, which means that it admits a finite basis, consisting of say n elements. In that case any other basis also has n elements, and n is called the dimension of V.
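The unique expansion of a vector in a basis can be computed concretely: the components are found by solving a linear system whose coefficient matrix has the basis vectors as columns. A minimal Python sketch, where the basis {v1, v2} of R^2 and the vector x are made-up values chosen purely for illustration:

```python
# Components of a vector in a chosen basis of R^2.
# The basis {v1, v2} and the vector x are hypothetical values for illustration.
v1, v2 = (1.0, 1.0), (1.0, -1.0)   # an ordered basis of R^2
x = (3.0, 1.0)                     # the vector to expand as x = a1*v1 + a2*v2

# a1*v1 + a2*v2 = x is a 2x2 linear system; its coefficient matrix has
# v1 and v2 as columns, so we can solve it directly.
d = v1[0] * v2[1] - v2[0] * v1[1]  # determinant, nonzero precisely because {v1, v2} is a basis
a1 = (x[0] * v2[1] - v2[0] * x[1]) / d
a2 = (v1[0] * x[1] - x[0] * v1[1]) / d

# Uniqueness in action: this pair (a1, a2) reproduces x exactly.
assert (a1 * v1[0] + a2 * v2[0], a1 * v1[1] + a2 * v2[1]) == x
```

Changing the basis changes the components (a1, a2), which is exactly why the identification of V with R^n below depends on the choice of basis.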
For example, R^n comes equipped with a standard basis $e_1, \cdots, e_n$ characterized by the property that $(x^1, \cdots, x^n) = x^1 e_1 + \cdots + x^n e_n$. We may say that we can "identify" V with R^n after we fix an ordered basis $\{v_1, \cdots, v_n\}$, since the x ∈ V correspond one-to-one to their n-tuples of components $(x^1, \cdots, x^n) \in \mathbb{R}^n$. But note(!): this identification depends on the choice of the basis $\{v_i\}$, which "becomes" the standard basis $\{e_i\}$ of R^n. The indices i on α and $v_i$ are placed the way they are with the following rule in mind.

1.1.1 Summation convention. Any index occurring twice, once up, once down, is summed over. For example $x^i e_i = \sum_i x^i e_i = x^1 e_1 + \cdots + x^n e_n$. We may still keep the $\sum$'s if we want to remind ourselves of the summation.

A linear transformation (or linear map) A : V → W between two vector spaces is a map which respects the two operations, i.e. A(u + v) = Au + Av and A(αv) = αAv. One often writes Av instead of A(v) for linear maps. In terms of a basis $\{v_1, \cdots, v_n\}$ for V and $\{w_1, \cdots, w_m\}$ for W this implies that $Av = \sum_{ij} \alpha^i a_i^j w_j$ if $v = \sum_i \alpha^i v_i$, for some indexed system of scalars $(a_i^j)$ called the matrix of A with respect to the bases $\{v_i\}$, $\{w_j\}$. With the summation convention the equation w = Av becomes $\beta^j = a_i^j \alpha^i$. Example: the matrix of the identity transformation 1 : V → V (with respect to any basis) is the Kronecker delta $\delta^i_j$ defined by

$$\delta^i_j = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}$$

The inverse (if it exists) of a linear transformation A with matrix $(a_i^j)$ is the linear transformation B whose matrix $(b_i^j)$ satisfies

$$a_k^i b_j^k = \delta^i_j.$$

If A : V → V is a linear transformation of V into itself, then the determinant of A is defined by the formula

$$\det(A) = \sum \epsilon_{i_1 \cdots i_n} a_1^{i_1} \cdots a_n^{i_n} \tag{1}$$

where $\epsilon_{i_1 \cdots i_n} = \pm 1$ is the sign of the permutation $(i_1, \cdots, i_n)$ of $(1, \cdots, n)$. This seems to depend on the basis we use to write A as a matrix $(a_j^i)$, but in fact it doesn't. Recall the following theorem.

1.1.2 Theorem. A : V → V is invertible if and only if det(A) ≠ 0.

There is a formula for $A^{-1}$ (Cramer's Formula) which says that

$$A^{-1} = \frac{1}{\det(A)} \tilde{A}, \qquad \tilde{a}_i^j = (-1)^{i+j} \det[a_k^l \mid kl \neq ji].$$

The ij entry $\tilde{a}_i^j$ of $\tilde{A}$ is called the ji cofactor of A, as you can look up in your linear algebra text. This formula is rarely practical for the actual calculation of $A^{-1}$ for a particular A, but it is sometimes useful for theoretical considerations or for matrices with variable entries.

The rank of a linear transformation A : V → W is the dimension of the image of A, i.e. of im(A) = {w ∈ W : w = Av for some v ∈ V}. This rank is equal to the maximal number of linearly independent columns of the matrix $(a_j^i)$, and equals the maximal number of linearly independent rows as well. The linear map A : V → W is surjective (i.e. onto) iff rank(A) = m and injective (i.e.
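Formula (1) and Cramer's formula can be checked numerically. The following Python sketch (not part of the notes; it uses only the standard library) computes det(A) as a signed sum over permutations and assembles the inverse from cofactors, with a small made-up matrix as the test case:

```python
import itertools
import math

def perm_sign(perm):
    # sign eps_{i1...in} of a permutation of (0, ..., n-1): +1 or -1
    s = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                s = -s
    return s

def det(a):
    # formula (1): det(A) = sum over permutations of eps_{i1...in} a_1^{i1} ... a_n^{in}
    n = len(a)
    return sum(perm_sign(p) * math.prod(a[p[k]][k] for k in range(n))
               for p in itertools.permutations(range(n)))

def minor_det(a, row, col):
    # determinant of A with the given row and column deleted
    return det([[a[r][c] for c in range(len(a)) if c != col]
                for r in range(len(a)) if r != row])

def cramer_inverse(a):
    # A^{-1} = (1/det A) * Atilde, where the ij entry of Atilde is the
    # ji cofactor (-1)^{i+j} det[A with row j and column i deleted]
    d = det(a)
    n = len(a)
    return [[(-1) ** (i + j) * minor_det(a, j, i) / d for j in range(n)]
            for i in range(n)]

A = [[2, 1], [5, 3]]            # hypothetical example matrix
print(det(A))                   # determinant via formula (1)
print(cramer_inverse(A))        # inverse via Cramer's formula
```

As the notes warn, this is hopeless as a practical algorithm (the sum in (1) has n! terms), but it makes the index bookkeeping in the cofactor formula completely explicit.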