8.5 Unitary and Hermitian Matrices

CHAPTER 8 COMPLEX VECTOR SPACES

Problems involving the diagonalization of complex matrices and the associated eigenvalue problems require the concepts of unitary and Hermitian matrices. These matrices roughly correspond to orthogonal and symmetric real matrices. In order to define unitary and Hermitian matrices, the concept of the conjugate transpose of a complex matrix must first be introduced.

Definition of the Conjugate Transpose of a Complex Matrix. The conjugate transpose of a complex matrix $A$, denoted by $A^*$, is given by

$$A^* = \overline{A}^{\,T}$$

where the entries of $\overline{A}$ are the complex conjugates of the corresponding entries of $A$.

Note that if $A$ is a matrix with real entries, then $A^* = A^T$. To find the conjugate transpose of a matrix, first calculate the complex conjugate of each entry and then take the transpose of the matrix, as shown in the following example.

EXAMPLE 1  Finding the Conjugate Transpose of a Complex Matrix

Determine $A^*$ for the matrix

$$A = \begin{bmatrix} 3 + 7i & 0 \\ 2i & 4 - i \end{bmatrix}.$$

Solution.

$$\overline{A} = \begin{bmatrix} 3 - 7i & 0 \\ -2i & 4 + i \end{bmatrix}, \qquad A^* = \overline{A}^{\,T} = \begin{bmatrix} 3 - 7i & -2i \\ 0 & 4 + i \end{bmatrix}$$

Several properties of the conjugate transpose of a matrix are listed in the following theorem. The proofs of these properties are straightforward and are left for you to supply in Exercises 49–52.

Theorem 8.8  Properties of the Conjugate Transpose

If $A$ and $B$ are complex matrices and $k$ is a complex number, then the following properties are true.
1. $(A^*)^* = A$
2. $(A + B)^* = A^* + B^*$
3. $(kA)^* = \overline{k}\,A^*$
4. $(AB)^* = B^* A^*$

Unitary Matrices

Recall that a real matrix $A$ is orthogonal if and only if $A^{-1} = A^T$. In the complex system, matrices having the property that $A^{-1} = A^*$ are more useful, and such matrices are called unitary.

Definition of a Unitary Matrix. A complex matrix $A$ is unitary if

$$A^{-1} = A^*.$$

EXAMPLE 2  A Unitary Matrix

Show that the following matrix is unitary.
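The conjugate-transpose computation in Example 1 and the properties in Theorem 8.8 are easy to verify numerically. The following NumPy sketch is not part of the original text; the matrix `B` is an arbitrarily chosen example used only to spot-check property 4. Note that Python writes the imaginary unit $i$ as `j`.

```python
import numpy as np

# Matrix from Example 1; Python writes the imaginary unit i as j.
A = np.array([[3 + 7j, 0],
              [2j, 4 - 1j]])

# Conjugate transpose A*: conjugate every entry, then transpose.
A_star = A.conj().T
assert np.allclose(A_star, np.array([[3 - 7j, -2j],
                                     [0, 4 + 1j]]))

# Spot-check property 4 of Theorem 8.8, (AB)* = B*A*,
# on an arbitrarily chosen second matrix B.
B = np.array([[1j, 2],
              [0, 1 - 1j]])
assert np.allclose((A @ B).conj().T, B.conj().T @ A.conj().T)
```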
$$A = \frac{1}{2}\begin{bmatrix} 1 + i & 1 - i \\ 1 - i & 1 + i \end{bmatrix}$$

Solution. Because

$$AA^* = \frac{1}{2}\begin{bmatrix} 1 + i & 1 - i \\ 1 - i & 1 + i \end{bmatrix} \cdot \frac{1}{2}\begin{bmatrix} 1 - i & 1 + i \\ 1 + i & 1 - i \end{bmatrix} = \frac{1}{4}\begin{bmatrix} 4 & 0 \\ 0 & 4 \end{bmatrix} = I_2,$$

you can conclude that $A^* = A^{-1}$. So, $A$ is a unitary matrix.

In Section 7.3, you saw that a real matrix is orthogonal if and only if its row (or column) vectors form an orthonormal set. For complex matrices, this property characterizes matrices that are unitary. Note that a set of vectors $\{v_1, v_2, \ldots, v_m\}$ in $C^n$ (complex Euclidean space) is called orthonormal if the following are true.
1. $\|v_i\| = 1, \quad i = 1, 2, \ldots, m$
2. $v_i \cdot v_j = 0, \quad i \ne j$

The proof of the following theorem is similar to the proof of Theorem 7.8 given in Section 7.3.

Theorem 8.9  Unitary Matrices

An $n \times n$ complex matrix $A$ is unitary if and only if its row (or column) vectors form an orthonormal set in $C^n$.

EXAMPLE 3  The Row Vectors of a Unitary Matrix

Show that the following complex matrix is unitary by showing that its set of row vectors forms an orthonormal set in $C^3$.

$$A = \begin{bmatrix} \dfrac{1}{2} & \dfrac{1+i}{2} & -\dfrac{1}{2} \\[6pt] -\dfrac{i}{\sqrt{3}} & \dfrac{i}{\sqrt{3}} & \dfrac{1}{\sqrt{3}} \\[6pt] \dfrac{5i}{2\sqrt{15}} & \dfrac{3+i}{2\sqrt{15}} & \dfrac{4+3i}{2\sqrt{15}} \end{bmatrix}$$

Solution. Let $r_1$, $r_2$, and $r_3$ be defined as follows.

$$r_1 = \left(\frac{1}{2},\ \frac{1+i}{2},\ -\frac{1}{2}\right), \qquad r_2 = \left(-\frac{i}{\sqrt{3}},\ \frac{i}{\sqrt{3}},\ \frac{1}{\sqrt{3}}\right), \qquad r_3 = \left(\frac{5i}{2\sqrt{15}},\ \frac{3+i}{2\sqrt{15}},\ \frac{4+3i}{2\sqrt{15}}\right)$$

The length of $r_1$ is

$$\|r_1\| = (r_1 \cdot r_1)^{1/2} = \left[\left(\frac{1}{2}\right)\left(\frac{1}{2}\right) + \left(\frac{1+i}{2}\right)\overline{\left(\frac{1+i}{2}\right)} + \left(-\frac{1}{2}\right)\left(-\frac{1}{2}\right)\right]^{1/2} = \left[\frac{1}{4} + \frac{2}{4} + \frac{1}{4}\right]^{1/2} = 1.$$

The vectors $r_2$ and $r_3$ can also be shown to be unit vectors. The inner product of $r_1$ and $r_2$ is given by

$$r_1 \cdot r_2 = \left(\frac{1}{2}\right)\overline{\left(-\frac{i}{\sqrt{3}}\right)} + \left(\frac{1+i}{2}\right)\overline{\left(\frac{i}{\sqrt{3}}\right)} + \left(-\frac{1}{2}\right)\overline{\left(\frac{1}{\sqrt{3}}\right)}$$
$$= \left(\frac{1}{2}\right)\left(\frac{i}{\sqrt{3}}\right) + \left(\frac{1+i}{2}\right)\left(\frac{-i}{\sqrt{3}}\right) + \left(-\frac{1}{2}\right)\left(\frac{1}{\sqrt{3}}\right)$$
$$= \frac{i}{2\sqrt{3}} - \frac{i}{2\sqrt{3}} + \frac{1}{2\sqrt{3}} - \frac{1}{2\sqrt{3}}$$
$$= 0.$$

Similarly, $r_1 \cdot r_3 = 0$ and $r_2 \cdot r_3 = 0$. So, you can conclude that $\{r_1, r_2, r_3\}$ is an orthonormal set. (Try showing that the column vectors of $A$ also form an orthonormal set in $C^3$.)

Hermitian Matrices

A real matrix is called symmetric if it is equal to its own transpose.
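Examples 2 and 3 can both be checked with one matrix identity: by Theorem 8.9, a square complex matrix is unitary exactly when $AA^* = I$, which says precisely that its rows are orthonormal. A NumPy sketch (not part of the original text):

```python
import numpy as np

# Matrix from Example 2.
A = np.array([[1 + 1j, 1 - 1j],
              [1 - 1j, 1 + 1j]]) / 2

# Unitary means A A* = I, i.e. A* = A^(-1).
assert np.allclose(A @ A.conj().T, np.eye(2))

# Theorem 8.9: the rows of the matrix in Example 3 form an
# orthonormal set in C^3, which is equivalent to U U* = I.
s3, s15 = np.sqrt(3), 2 * np.sqrt(15)
U = np.array([[1 / 2, (1 + 1j) / 2, -1 / 2],
              [-1j / s3, 1j / s3, 1 / s3],
              [5j / s15, (3 + 1j) / s15, (4 + 3j) / s15]])
assert np.allclose(U @ U.conj().T, np.eye(3))

# The columns are orthonormal as well (U* U = I), the check the
# text invites the reader to try.
assert np.allclose(U.conj().T @ U, np.eye(3))
```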
In the complex system, the more useful type of matrix is one that is equal to its own conjugate transpose. Such a matrix is called Hermitian after the French mathematician Charles Hermite (1822–1901).

Definition of a Hermitian Matrix. A square matrix $A$ is Hermitian if

$$A = A^*.$$

As with symmetric matrices, you can easily recognize Hermitian matrices by inspection. To see this, consider the $2 \times 2$ matrix

$$A = \begin{bmatrix} a_1 + a_2 i & b_1 + b_2 i \\ c_1 + c_2 i & d_1 + d_2 i \end{bmatrix}.$$

The conjugate transpose of $A$ has the form

$$A^* = \overline{A}^{\,T} = \overline{\begin{bmatrix} a_1 + a_2 i & c_1 + c_2 i \\ b_1 + b_2 i & d_1 + d_2 i \end{bmatrix}} = \begin{bmatrix} a_1 - a_2 i & c_1 - c_2 i \\ b_1 - b_2 i & d_1 - d_2 i \end{bmatrix}.$$

If $A$ is Hermitian, then $A = A^*$. So, you can conclude that $A$ must be of the form

$$A = \begin{bmatrix} a_1 & b_1 + b_2 i \\ b_1 - b_2 i & d_1 \end{bmatrix}.$$

Similar results can be obtained for Hermitian matrices of order $n \times n$. In other words, a square matrix $A$ is Hermitian if and only if the following two conditions are met.
1. The entries on the main diagonal of $A$ are real.
2. The entry $a_{ij}$ in the $i$th row and $j$th column is the complex conjugate of the entry $a_{ji}$ in the $j$th row and $i$th column.

EXAMPLE 4  Hermitian Matrices

Which of the following matrices are Hermitian?

(a) $\begin{bmatrix} 1 & 3 - i \\ 3 + i & i \end{bmatrix}$  (b) $\begin{bmatrix} 0 & 3 - 2i \\ 3 - 2i & 4 \end{bmatrix}$  (c) $\begin{bmatrix} 3 & 2 - i & -3i \\ 2 + i & 0 & 1 - i \\ 3i & 1 + i & 0 \end{bmatrix}$  (d) $\begin{bmatrix} -1 & 2 & 3 \\ 2 & 0 & -1 \\ 3 & -1 & 4 \end{bmatrix}$

Solution.
(a) This matrix is not Hermitian because it has an imaginary entry on its main diagonal.
(b) This matrix is symmetric but not Hermitian because the entry in the first row and second column is not the complex conjugate of the entry in the second row and first column.
(c) This matrix is Hermitian.
(d) This matrix is Hermitian, because all real symmetric matrices are Hermitian.

One of the most important characteristics of Hermitian matrices is that their eigenvalues are real. This is formally stated in the next theorem.

Theorem 8.10  The Eigenvalues of a Hermitian Matrix

If $A$ is a Hermitian matrix, then its eigenvalues are real numbers.

Proof. Let $\lambda$ be an eigenvalue of $A$ and let
$$v = \begin{bmatrix} a_1 + b_1 i \\ a_2 + b_2 i \\ \vdots \\ a_n + b_n i \end{bmatrix}$$

be its corresponding eigenvector. If both sides of the equation $Av = \lambda v$ are multiplied by the row vector $v^*$, then

$$v^* A v = v^*(\lambda v) = \lambda (v^* v) = \lambda \left(a_1^2 + b_1^2 + a_2^2 + b_2^2 + \cdots + a_n^2 + b_n^2\right).$$

Furthermore, because

$$(v^* A v)^* = v^* A^* (v^*)^* = v^* A v,$$

it follows that $v^* A v$ is a Hermitian $1 \times 1$ matrix. This implies that $v^* A v$ is a real number, and thus $\lambda$ is real.

REMARK: Note that this theorem implies that the eigenvalues of a real symmetric matrix are real, as stated in Theorem 7.7.

To find the eigenvalues of complex matrices, follow the same procedure as for real matrices.

EXAMPLE 5  Finding the Eigenvalues of a Hermitian Matrix

Find the eigenvalues of the matrix

$$A = \begin{bmatrix} 3 & 2 - i & -3i \\ 2 + i & 0 & 1 - i \\ 3i & 1 + i & 0 \end{bmatrix}.$$

Solution. The characteristic polynomial of $A$ is

$$|\lambda I - A| = \begin{vmatrix} \lambda - 3 & -2 + i & 3i \\ -2 - i & \lambda & -1 + i \\ -3i & -1 - i & \lambda \end{vmatrix}$$
$$= (\lambda - 3)(\lambda^2 - 2) - (-2 + i)\left[(-2 - i)\lambda - (3i + 3)\right] + 3i\left[(1 + 3i) + 3\lambda i\right]$$
$$= (\lambda^3 - 3\lambda^2 - 2\lambda + 6) - (5\lambda + 9 + 3i) + (3i - 9 - 9\lambda)$$
$$= \lambda^3 - 3\lambda^2 - 16\lambda - 12$$
$$= (\lambda + 1)(\lambda - 6)(\lambda + 2).$$

This implies that the eigenvalues of $A$ are $-1$, $6$, and $-2$.

To find the eigenvectors of a complex matrix, use a procedure similar to that used for a real matrix. For instance, in Example 5, the eigenvector corresponding to the eigenvalue $\lambda = -1$ is obtained by solving the following equation.

$$\begin{bmatrix} \lambda - 3 & -2 + i & 3i \\ -2 - i & \lambda & -1 + i \\ -3i & -1 - i & \lambda \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$

$$\begin{bmatrix} -4 & -2 + i & 3i \\ -2 - i & -1 & -1 + i \\ -3i & -1 - i & -1 \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$

Using Gauss-Jordan elimination, or a computer or calculator, obtain the eigenvector corresponding to $\lambda_1 = -1$.
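The matrix of Example 5 is the Hermitian matrix of Example 4(c), so it also illustrates Theorem 8.10 numerically. The sketch below (not part of the original text) confirms that the matrix equals its conjugate transpose, that its eigenvalues are the real numbers $-2$, $-1$, and $6$, and that the eigenvector returned for $\lambda = -1$ satisfies $Av = -v$.

```python
import numpy as np

# The matrix from Examples 4(c) and 5.
A = np.array([[3, 2 - 1j, -3j],
              [2 + 1j, 0, 1 - 1j],
              [3j, 1 + 1j, 0]])

# Hermitian: equal to its own conjugate transpose (Example 4(c)).
assert np.allclose(A, A.conj().T)

# Theorem 8.10: the eigenvalues are real; the characteristic
# polynomial factors as (x + 1)(x - 6)(x + 2), giving -1, 6, -2.
eigvals = np.linalg.eigvals(A)
assert np.allclose(eigvals.imag, 0)
assert np.allclose(np.sort(eigvals.real), [-2, -1, 6])

# np.linalg.eigh is specialized to Hermitian input: it returns real
# eigenvalues in ascending order and orthonormal eigenvectors.
w, V = np.linalg.eigh(A)
v = V[:, 1]                # eigenvector for the eigenvalue -1
assert np.allclose(A @ v, -v)
```

This replaces the by-hand Gauss-Jordan step with a library solver; the hand computation and `eigh` agree up to a scalar multiple of the eigenvector.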