International Journal of Pure and Applied Mathematics, Volume 111, No. 4 (2016), 643-646. ISSN: 1311-8080 (printed version); ISSN: 1314-3395 (on-line version). url: http://www.ijpam.eu, doi: 10.12732/ijpam.v111i4.11

A NOTE ON THE MATRIX DETERMINANT LEMMA

Robert Vrabel
Faculty of Materials Science and Technology
Slovak University of Technology in Bratislava
J. Bottu 25, 917 24 Trnava, SLOVAKIA

Abstract: In this note, a generalization of the Matrix Determinant Lemma to a finite sum of dyadic products of vectors is derived.

AMS Subject Classification: 15A24, 15A15
Key Words: matrix determinant lemma

1. Generalized Matrix Determinant Lemma

The Matrix Determinant Lemma (MDL) is an important analytical tool in matrix theory and is usually formulated as follows.

Theorem 1 (Matrix Determinant Lemma, [1]). If H is an invertible n × n matrix, and u and v are two n-dimensional column vectors, then

\[
\det\bigl(H + uv^T\bigr) = \bigl(1 + v^T H^{-1} u\bigr)\det(H).
\]

For the wider context of the MDL, see [2] and [3]. In this short note we generalize this theorem to noninvertible matrices H and to a finite sum of dyadic products of vectors.

Theorem 2 (Generalized Matrix Determinant Lemma). Suppose H is a square matrix of dimension n and u_i, v_i are n × 1 column vectors,

Received: October 4, 2016. Revised: November 22, 2016. Published: December 30, 2016. © 2016 Academic Publications, Ltd., url: www.acadpubl.eu

i = 1, . . . , k. Then for every k ≥ 1 we have the equality

\[
\det(H + \Delta_k) = \det(H) + \sum_{i=1}^{k} v_i^T \,\mathrm{adj}(H + \Delta_{i-1})\, u_i, \tag{1}
\]
where
\[
\Delta_i =
\begin{cases}
\text{the } n \times n \text{ zero matrix} & \text{for } i = 0, \\[4pt]
\sum_{j=1}^{i} u_j v_j^T & \text{for } i = 1, \dots, k.
\end{cases}
\]

Remark 3. For k = 1 and an invertible H we obtain the classical MDL for the dyadic product of two vectors u_1 and v_1 (Theorem 1).

Proof. We use the induction principle to prove Theorem 2. We define as a predicate P(k) the statement of Theorem 2.

Step 1: We prove that formula (1) is true for k = 1, that is,

\[
P(1):\quad \det\bigl(H + u_1 v_1^T\bigr) = \det(H) + v_1^T\,\mathrm{adj}(H)\, u_1.
\]

First, let us assume that the matrix H is invertible. From the matrix identity

\[
\begin{pmatrix} I_n & 0 \\ v_1^T & 1 \end{pmatrix}
\begin{pmatrix} I_n + u_1 v_1^T & u_1 \\ 0 & 1 \end{pmatrix}
\begin{pmatrix} I_n & 0 \\ -v_1^T & 1 \end{pmatrix}
=
\begin{pmatrix} I_n & u_1 \\ 0 & 1 + v_1^T u_1 \end{pmatrix}
\]
we obtain that
\[
\det\bigl(I_n + u_1 v_1^T\bigr) = 1 + v_1^T u_1.
\]
Hence

\[
\begin{aligned}
\det\bigl(H + u_1 v_1^T\bigr) &= \det(H)\,\det\bigl(I_n + (H^{-1} u_1) v_1^T\bigr) \\
&= \det(H)\bigl(1 + v_1^T (H^{-1} u_1)\bigr) \\
&= \det(H) + v_1^T\,\mathrm{adj}(H)\, u_1.
\end{aligned}
\]
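As an illustration (not part of the proof), the block-matrix identity and the resulting classical MDL can be checked numerically; the sketch below uses plain NumPy with random data, so any numerical agreement is up to floating-point tolerance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
H = rng.standard_normal((n, n))        # generically invertible
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))

# Block identity used in Step 1:
# [[I,0],[v^T,1]] [[I+uv^T,u],[0,1]] [[I,0],[-v^T,1]] = [[I,u],[0,1+v^T u]]
I = np.eye(n)
M1 = np.block([[I, np.zeros((n, 1))], [v.T, np.ones((1, 1))]])
M2 = np.block([[I + u @ v.T, u], [np.zeros((1, n)), np.ones((1, 1))]])
M3 = np.block([[I, np.zeros((n, 1))], [-v.T, np.ones((1, 1))]])
R  = np.block([[I, u], [np.zeros((1, n)), 1 + v.T @ u]])
assert np.allclose(M1 @ M2 @ M3, R)

# Classical MDL: det(H + u v^T) = (1 + v^T H^{-1} u) det(H)
lhs = np.linalg.det(H + u @ v.T)
rhs = (1 + (v.T @ np.linalg.inv(H) @ u).item()) * np.linalg.det(H)
assert abs(lhs - rhs) < 1e-8
```

The two outer block factors are unit lower/upper triangular, so they leave the determinant unchanged; taking determinants of both sides of the identity gives det(I_n + u_1 v_1^T) = 1 + v_1^T u_1 directly.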

Now let det(H) = 0. Let us consider a small perturbation of H in the form H + εI_n. Then det(H + εI_n) is a polynomial in ε which has at most n roots on the real axis. Thus there exists ε_0 > 0 such that the matrices H + εI_n are invertible for all ε ∈ (0, ε_0), and so

\[
\det\bigl((H + \varepsilon I_n) + u_1 v_1^T\bigr) = \det(H + \varepsilon I_n) + v_1^T\,\mathrm{adj}(H + \varepsilon I_n)\, u_1.
\]

Now, in the limit ε → 0+, taking into consideration that the polynomials on both sides of the last equality are continuous functions of ε, we obtain the statement P(1). This completes the proof of P(1) for an arbitrary square matrix H.

Step 2: (Proof that the implication P(k = s) =⇒ P(k = s + 1) is true.) The induction hypothesis is that (1) is true for some k = s ≥ 1. We have

\[
\begin{aligned}
\det(H + \Delta_{s+1}) &= \det\bigl([H + \Delta_s] + u_{s+1} v_{s+1}^T\bigr) \\
&= \det(H + \Delta_s) + v_{s+1}^T\,\mathrm{adj}(H + \Delta_s)\, u_{s+1} \\
&= \det(H) + \sum_{i=1}^{s} v_i^T\,\mathrm{adj}(H + \Delta_{i-1})\, u_i + v_{s+1}^T\,\mathrm{adj}(H + \Delta_s)\, u_{s+1} \\
&= \det(H) + \sum_{i=1}^{s+1} v_i^T\,\mathrm{adj}(H + \Delta_{i-1})\, u_i,
\end{aligned}
\]
where the second equality is P(1) applied to the matrix H + ∆_s, and the third is the induction hypothesis. Thus (1) is true for all k ≥ 1.

Remark 4. From the just proved Theorem 2, some useful corollaries follow:

(a) The product UV^T of the matrices

\[
U = \bigl(\, u_1 \;\big|\; u_2 \;\big|\; \cdots \;\big|\; u_r \,\bigr) \quad \text{and} \quad V = \bigl(\, v_1 \;\big|\; v_2 \;\big|\; \cdots \;\big|\; v_r \,\bigr),
\]

where u_i and v_i are n × 1 column vectors, i = 1, . . . , r (r ≥ 1), may be expressed in the form of the sum of dyadic products, UV^T = \sum_{i=1}^{r} u_i v_i^T. Thus from (1) we have the matrix determinant identity
\[
\det\bigl(H + UV^T\bigr) = \det(H) + \sum_{i=1}^{r} v_i^T\,\mathrm{adj}(H + \Delta_{i-1})\, u_i,
\]
where H is an arbitrary n × n matrix and the ∆_i are defined as in Theorem 2. This equality can also be used to derive further results when the matrix U and/or V has a special form;

(b) For H = 0 (the zero matrix) we obtain a generalized formula for the determinant of the product of two (in general non-square) matrices:
\[
\det\bigl(UV^T\bigr) = \sum_{i=1}^{r} v_i^T\,\mathrm{adj}(\Delta_{i-1})\, u_i.
\]
For r = 1 we get the obvious fact that the matrix formed by the dyadic product of two vectors has a determinant equal to zero.
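Formula (1) and corollary (b) can likewise be checked numerically; the sketch below (an illustration, not part of the paper) uses a cofactor-based adjugate so that singular matrices are handled, and deliberately takes det(H) = 0, where Theorem 1 does not apply but Theorem 2 does.

```python
import numpy as np

def adjugate(A):
    """Adjugate via cofactors; valid even when A is singular."""
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T  # the adjugate is the transpose of the cofactor matrix

rng = np.random.default_rng(1)
n = 3
k = 3
H = rng.standard_normal((n, n))
H[:, 0] = H[:, 1]                      # force det(H) = 0
u = [rng.standard_normal((n, 1)) for _ in range(k)]
v = [rng.standard_normal((n, 1)) for _ in range(k)]

# Right-hand side of (1): det(H) + sum_i v_i^T adj(H + Delta_{i-1}) u_i
Delta = np.zeros((n, n))               # Delta_0
rhs = np.linalg.det(H)                 # equals 0 here
for i in range(k):
    rhs += (v[i].T @ adjugate(H + Delta) @ u[i]).item()
    Delta += u[i] @ v[i].T             # now Delta_{i+1}
lhs = np.linalg.det(H + Delta)         # Delta is now Delta_k
assert abs(lhs - rhs) < 1e-8

# Corollary (b): with H = 0, det(U V^T) equals the sum over dyads
U = np.hstack(u)                       # n x r matrices built from the same vectors
V = np.hstack(v)
Delta = np.zeros((n, n))
total = 0.0
for i in range(k):
    total += (v[i].T @ adjugate(Delta) @ u[i]).item()
    Delta += u[i] @ v[i].T
assert abs(np.linalg.det(U @ V.T) - total) < 1e-8
```

With r = n, the product UV^T is generically nonsingular, so the sum in corollary (b) is nontrivial; for r < n every term vanishes, consistent with rank(UV^T) < n.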

Acknowledgments

This publication is the result of the implementation of the project "University Scientific Park: Campus MTF STU - CAMBO" (26220220179), supported by the Research & Development Operational Program funded by the ERDF.

References

[1] J. Ding, A. Zhou, Eigenvalues of rank-one updated matrices with some applications, Applied Mathematics Letters, 20, No. 12 (2007), 1223-1226, doi: 10.1016/j.aml.2006.11.016.

[2] D.A. Harville, Matrix Algebra From a Statistician's Perspective, Springer-Verlag, New York (2008), doi: 10.1007/b98818.

[3] J.N. Franklin, Matrix Theory, Dover Publications, Mineola, New York (1993).